Title :
Learning noisy perceptrons by a perceptron in polynomial time
Author_Institution :
AT&T Labs-Research, Florham Park, NJ, USA
Abstract :
Learning perceptrons (linear threshold functions) from labeled examples is an important problem in machine learning. We consider the setting in which labels are subject to random classification noise. The problem was known to be PAC learnable via a hypothesis consisting of a polynomial number of linear thresholds (due to A. Blum, A. Frieze, R. Kannan, and S. Vempala (1996)). Whether a hypothesis that is itself a perceptron (a single threshold function) can be found in polynomial time was an open question. We show that noisy perceptrons are indeed PAC learnable with a hypothesis that is a perceptron.
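Problem_Setting_Sketch :
A minimal Python sketch of the learning model described in the abstract, not of the paper's algorithm: a hidden perceptron labels examples, each label is flipped independently with probability eta < 1/2 (random classification noise), and the learner must output a single threshold function. The classic perceptron update appears only as an illustrative baseline; all names and parameter values (w_star, eta, d, n, the pass count) are assumptions, not taken from the paper.

import numpy as np

# Sketch of the setting (assumed names/values; not the paper's method).
rng = np.random.default_rng(0)

d, n, eta = 10, 5000, 0.2          # dimension, sample size, noise rate < 1/2
w_star = rng.standard_normal(d)    # hidden target perceptron (threshold 0)

X = rng.standard_normal((n, d))            # examples drawn from a distribution
y_clean = np.sign(X @ w_star)              # true linear-threshold labels
flips = rng.random(n) < eta                # random classification noise
y = np.where(flips, -y_clean, y_clean)     # observed noisy labels

# Hypothesis class of the result: a single perceptron w predicting sign(w.x).
# Plain perceptron update as a baseline (again, not the paper's algorithm):
w = np.zeros(d)
for _ in range(20):                        # a few passes over the noisy sample
    for x_i, y_i in zip(X, y):
        if y_i * (w @ x_i) <= 0:           # mistake on the (possibly noisy) label
            w += y_i * x_i                 # standard perceptron update

# Agreement of the learned single-threshold hypothesis with the noise-free target.
acc = np.mean(np.sign(X @ w) == y_clean)
print(f"agreement with the noise-free target: {acc:.3f}")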
Keywords :
computational complexity; learning by example; perceptrons; random noise; random processes; PAC learnable; hypothesis; labeled examples; linear threshold functions; linear thresholds; machine learning; noisy perceptron learning; polynomial time; random classification noise; single threshold function; linear programming; noise generators; noise level; polynomials; probability distribution; vectors
Conference_Title :
Proceedings of the 38th Annual Symposium on Foundations of Computer Science (FOCS 1997)
Conference_Location :
Miami Beach, FL, USA
Print_ISBN :
0-8186-8197-7
DOI :
10.1109/SFCS.1997.646140