DocumentCode :
3289668
Title :
Backpropagation separates when perceptrons do
Author :
Sontag, Eduardo D.; Sussmann, Héctor J.
Author_Institution :
Dept. of Math., Rutgers Univ., New Brunswick, NJ, USA
fYear :
1989
fDate :
0-0 1989
Firstpage :
639
Abstract :
Consideration is given to the behavior of the least-squares problem that arises when one attempts to train a feedforward net with no hidden neurons. It is assumed that the net has monotonic nonlinear output units. Under the assumption that a training set is separable, that is, that there is a set of achievable outputs for which the error is zero, the authors show that there are no nonglobal minima. More precisely, they assume that the error is of a threshold least-mean-square (LMS) type, in that the error function is zero for values beyond the target value. The authors' proof gives, in addition, the following stronger result: the continuous gradient adjustment procedure is such that, from any initial weight configuration, a separating set of weights is obtained in finite time. Thus they have a precise analog of the perceptron learning theorem. The authors contrast their results with the more classical pattern recognition problem of threshold LMS with linear output units.
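The threshold-LMS setup described above can be made concrete. The following is a minimal sketch (an illustration, not the authors' code): a net with no hidden neurons and a monotonic sigmoid output unit, trained by gradient descent on a separable toy set, with an error that is zero once the output passes the target value. The target levels (0.9 and 0.1), the learning rate, and the toy data are assumptions made for illustration only.

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def threshold_lms_error(w, X, y, t_lo=0.1, t_hi=0.9):
    # Threshold-LMS-type error: a sample contributes nothing once its output
    # is at or beyond the target value (>= t_hi for y=1, <= t_lo for y=0).
    out = sigmoid(X @ w)
    shortfall = np.where(y == 1, np.maximum(0.0, t_hi - out),
                                 np.maximum(0.0, out - t_lo))
    return 0.5 * np.sum(shortfall ** 2)

def gradient_step(w, X, y, lr=2.0, t_lo=0.1, t_hi=0.9):
    out = sigmoid(X @ w)
    # Derivative of each sample's error with respect to the output ...
    de_dout = np.where(y == 1, -np.maximum(0.0, t_hi - out),
                                np.maximum(0.0, out - t_lo))
    # ... chained through the sigmoid's derivative out * (1 - out).
    grad = X.T @ (de_dout * out * (1.0 - out))
    return w - lr * grad

# Separable toy problem (logical OR; the last column is a bias input).
X = np.array([[0, 0, 1], [0, 1, 1], [1, 0, 1], [1, 1, 1]], dtype=float)
y = np.array([0, 1, 1, 1], dtype=float)
w = np.zeros(3)
for _ in range(5000):
    w = gradient_step(w, X, y)
print(w, threshold_lms_error(w, X, y))  # error shrinks toward zero on separable data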
Keywords :
learning systems; least squares approximations; neural nets; backpropagation; feedforward net; neurons; perceptron learning theorem; threshold least-mean square; training set; Learning systems; Least squares methods; Neural networks
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
International Joint Conference on Neural Networks (IJCNN), 1989
Conference_Location :
Washington, DC, USA
Type :
conf
DOI :
10.1109/IJCNN.1989.118644
Filename :
118644