DocumentCode :
1242354
Title :
Classification of linearly nonseparable patterns by linear threshold elements
Author :
Roychowdhury, Vwani P. ; Siu, Kai-Yeung ; Kailath, Thomas
Author_Institution :
Sch. of Electr. Eng., Purdue Univ., West Lafayette, IN, USA
Volume :
6
Issue :
2
fYear :
1995
fDate :
3/1/1995 12:00:00 AM
Firstpage :
318
Lastpage :
331
Abstract :
Learning and convergence properties of linear threshold elements, or perceptrons, are well understood for the case where the input vectors (or the training sets) to the perceptron are linearly separable. Little is known, however, about the behavior of the perceptron learning algorithm when the training sets are linearly nonseparable. We present the first known results on the structure of linearly nonseparable training sets and on the behavior of perceptrons when the set of input vectors is linearly nonseparable. More precisely, we show that using the well-known perceptron learning algorithm, a linear threshold element can learn the input vectors that are provably learnable, and identify those vectors that cannot be learned without committing errors. We also show how a linear threshold element can be used to learn large linearly separable subsets of any given nonseparable training set. In order to develop our results, we first establish formal characterizations of linearly nonseparable training sets and define learnable structures for such patterns. We also prove computational complexity results for the related learning problems. Next, based on such characterizations, we show that a perceptron does the best one can expect for linearly nonseparable sets of input vectors and learns as much as is theoretically possible.
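For reference, the sketch below illustrates the classical perceptron learning rule that the abstract refers to, run on a small linearly separable toy set where it is guaranteed to converge. It is not the authors' characterization of nonseparable sets; the function name, toy data, and epoch limit are illustrative assumptions.

import numpy as np

def perceptron_learn(X, y, max_epochs=100):
    # Classical perceptron rule: whenever an example is misclassified,
    # add (or subtract) that input vector to the weight vector.
    # Converges in finitely many updates if the data are linearly separable.
    n_samples, n_features = X.shape
    w = np.zeros(n_features)
    b = 0.0
    for _ in range(max_epochs):
        errors = 0
        for xi, target in zip(X, y):              # target in {-1, +1}
            if target * (np.dot(w, xi) + b) <= 0:  # misclassified example
                w += target * xi
                b += target
                errors += 1
        if errors == 0:                            # separating hyperplane found
            break
    return w, b

# Toy linearly separable set (AND-like labels in {-1, +1})
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, -1, -1, 1])
w, b = perceptron_learn(X, y)
print(w, b)

On a linearly nonseparable set the same loop never reaches zero errors, which is the regime the paper analyzes.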
Keywords :
computational complexity; learning (artificial intelligence); pattern classification; perceptrons; computational complexity results; convergence properties; formal characterizations; input vectors; large linearly separable subsets; learnable structures; linear threshold elements; linearly nonseparable pattern classification; linearly nonseparable training sets; perceptron learning algorithm; training sets; Computational complexity; Computational modeling; Convergence; Information systems; Laboratories; Machine learning; NASA; Vectors;
fLanguage :
English
Journal_Title :
Neural Networks, IEEE Transactions on
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.363468
Filename :
363468