DocumentCode :
3330956
Title :
Learning noisy perceptrons by a perceptron in polynomial time
Author :
Cohen, Edith
Author_Institution :
AT&T Labs-Research, Florham Park, NJ, USA
fYear :
1997
fDate :
20-22 Oct 1997
Firstpage :
514
Lastpage :
523
Abstract :
Learning perceptrons (linear threshold functions) from labeled examples is an important problem in machine learning. We consider the problem where labels are subject to random classification noise. The problem was known to be PAC learnable via a hypothesis that consists of a polynomial number of linear thresholds (due to A. Blum, A. Frieze, R. Kannan, and S. Vempala (1996)). The question of whether a hypothesis that is itself a perceptron (a single threshold function) can be found in polynomial time was open. We show that indeed, noisy perceptrons are PAC learnable with a hypothesis that is a perceptron.
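A minimal sketch of the learning setting described in the abstract: a target perceptron (linear threshold function) labels examples, and each label is flipped independently with probability eta (random classification noise). The names (`w_target`, `eta`, the Gaussian example distribution) are illustrative assumptions, not from the paper.

```python
import random

random.seed(0)

def threshold(w, x):
    """Linear threshold function: sign of the inner product <w, x>."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) >= 0 else -1

def noisy_sample(w, dim, eta):
    """Draw one example, label it by w, and flip the label with probability eta."""
    x = [random.gauss(0, 1) for _ in range(dim)]
    y = threshold(w, x)
    if random.random() < eta:
        y = -y  # random classification noise
    return x, y

# Hypothetical target weight vector and noise rate for illustration.
w_target = [1.0, -2.0, 0.5]
examples = [noisy_sample(w_target, 3, eta=0.1) for _ in range(1000)]

# Fraction of labels that still agree with the noiseless target;
# with eta = 0.1 this is roughly 90% on average.
clean_agreement = sum(
    1 for x, y in examples if threshold(w_target, x) == y
) / len(examples)
```

The learner sees only the noisy `(x, y)` pairs; the paper's result is that a single perceptron hypothesis achieving low error can nonetheless be found in polynomial time.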
Keywords :
computational complexity; learning by example; perceptrons; random noise; random processes; PAC learnable; hypothesis; labeled examples; linear threshold functions; linear thresholds; machine learning; noisy perceptron learning; polynomial time; random classification noise; single threshold function; Linear programming; Machine learning; Noise generators; Noise level; Polynomials; Probability distribution; Vectors;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the 38th Annual Symposium on Foundations of Computer Science (FOCS 1997)
Conference_Location :
Miami Beach, FL
ISSN :
0272-5428
Print_ISBN :
0-8186-8197-7
Type :
conf
DOI :
10.1109/SFCS.1997.646140
Filename :
646140