DocumentCode
1817371
Title
Theoretical derivation of momentum term in back-propagation
Author
Hagiwara, Masafumi
Author_Institution
Psychol. Dept., Stanford Univ., CA, USA
Volume
1
fYear
1992
fDate
7-11 Jun 1992
Firstpage
682
Abstract
The theoretical origin of the momentum term in the backpropagation algorithm is explained. It is proved that the backpropagation algorithm with a momentum term can be derived from the following assumptions: (1) the cost function is E_n = Σ_{p=1}^{n} E_p, where E_p is the sum of squared errors at the output layer for pattern p, and (2) the most recent weights are assumed in calculating E_n.
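For orientation, the sketch below shows the familiar momentum form of the back-propagation weight update that the abstract says can be derived from the cumulative cost E_n; it is an illustrative example only, not the paper's derivation, and the learning rate eta, momentum coefficient alpha, and the toy quadratic error are assumptions chosen for the demo.

```python
import numpy as np

def momentum_update(w, grad_Ep, velocity, eta=0.1, alpha=0.9):
    """One weight update with a momentum term:
       delta_w(t) = -eta * dE_p/dw + alpha * delta_w(t-1).
    grad_Ep is the gradient of the per-pattern squared error E_p."""
    velocity = -eta * grad_Ep + alpha * velocity
    return w + velocity, velocity

# Toy usage (assumed example): minimise E_p = 0.5 * w**2, whose gradient is w.
w, v = np.array([1.0]), np.zeros(1)
for _ in range(200):
    w, v = momentum_update(w, grad_Ep=w, velocity=v)
print(w)  # decays toward 0
```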
Keywords
backpropagation; learning (artificial intelligence); neural nets; momentum term; training; Acceleration; Artificial neural networks; Cost function; Gradient methods; Psychology; Resonance light scattering
fLanguage
English
Publisher
IEEE
Conference_Titel
International Joint Conference on Neural Networks (IJCNN), 1992
Conference_Location
Baltimore, MD
Print_ISBN
0-7803-0559-0
Type
conf
DOI
10.1109/IJCNN.1992.287108
Filename
287108