DocumentCode :
1627318
Title :
An adaptive training algorithm for back-propagation neural networks
Author :
Hsin, Hsi-Chin ; Li, Ching-Chung ; Sun, Mingui ; Sclabassi, Robert J.
Author_Institution :
Pittsburgh Univ., PA, USA
fYear :
1992
Firstpage :
1049
Abstract :
To improve the convergence speed of the backpropagation training algorithm, the authors chose a dynamic learning rate given by a weighted average of the direction cosines of successive incremental weight vectors ΔW at the current and several previous iterations. These adjacent direction cosines reflect the local curvature of the error surface, along which an "optimum" search for the minimum error determines the weight adjustment at the next iteration. The authors tested this approach on a real problem: training a three-layer feedforward artificial neural network for REM (rapid eye movement) sleep stage recognition. Training performance improved significantly, in terms of both faster convergence and smaller error, when the last three direction cosines were included in determining the dynamic learning rate.
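The abstract's idea can be sketched in a few lines. The paper does not publish its exact update rule or weighting coefficients, so the weighting scheme, the scaling formula, and all names below are illustrative assumptions: each direction cosine measures how well two successive weight increments agree, and their weighted average grows the learning rate on smooth stretches of the error surface and shrinks it when the search oscillates.

```python
import math

def direction_cosine(dw_prev, dw_curr):
    """Cosine of the angle between two successive weight-increment
    vectors (lists of floats).  Returns 0.0 for a zero vector."""
    dot = sum(a * b for a, b in zip(dw_prev, dw_curr))
    denom = (math.sqrt(sum(a * a for a in dw_prev)) *
             math.sqrt(sum(b * b for b in dw_curr)))
    return dot / denom if denom > 0 else 0.0

def adaptive_rate(base_rate, cosines, weights=(0.5, 0.3, 0.2)):
    """Scale base_rate by a weighted average of the most recent
    direction cosines.  The weights (most recent first) are a
    hypothetical choice, not the paper's coefficients.

    Cosines near +1 mean successive steps point the same way
    (flat error surface): the rate grows, up to 2 * base_rate.
    Cosines near -1 mean oscillation: the rate shrinks toward 0.
    """
    k = min(len(cosines), len(weights))
    recent = cosines[-k:][::-1]            # most recent cosine first
    avg = sum(w * c for w, c in zip(weights, recent)) / sum(weights[:k])
    return base_rate * (1.0 + avg)
```

With the last three cosines all near +1 the effective rate roughly doubles, which matches the abstract's observation that using three direction cosines accelerated convergence.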
Keywords :
backpropagation; biology computing; convergence; feedforward neural nets; pattern recognition; adaptive training algorithm; dynamic learning rate; multilayer feedforward neural nets; neural networks; rapid eye movement; sleep stage recognition; Acceleration; Artificial neural networks; Signal processing; Sleep
fLanguage :
English
Publisher :
ieee
Conference_Titel :
1992 IEEE International Conference on Systems, Man, and Cybernetics
Conference_Location :
Chicago, IL
Print_ISBN :
0-7803-0720-8
Type :
conf
DOI :
10.1109/ICSMC.1992.271653
Filename :
271653