Title :
Convergence of the generalized back-propagation algorithm with constant learning rates
Author :
Ng, S.C. ; Leung, S.H. ; Luk, A.
Author_Institution :
Dept. of Electron. Eng., City Univ. of Hong Kong, Hong Kong
Abstract :
A new generalized back-propagation algorithm that can effectively speed up the convergence rate and reduce the chance of being trapped in local minima was previously proposed by the authors (1996). In this paper, we analyze the convergence of the generalized back-propagation algorithm with constant learning rates. The weight sequences generated by the generalized back-propagation algorithm can be approximated by a certain ordinary differential equation (ODE). As the learning rate tends to zero, the interpolated weight sequences of generalized back-propagation converge weakly to the solution of the associated ODE.
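To illustrate the ODE approximation described in the abstract, the sketch below compares constant-learning-rate weight iterates with the trajectory of the associated ODE as the learning rate shrinks. The authors' specific generalized back-propagation update rule is not reproduced in this record, so the example assumes a plain gradient-descent update on a hypothetical quadratic error surface; the names grad_E, discrete_iterates and ode_trajectory are illustrative only.

```python
# A minimal numerical sketch, not the authors' algorithm: we assume a generic
# constant-learning-rate update  w_{k+1} = w_k - eta * grad_E(w_k)  and compare
# its interpolated weight sequence with the associated ODE  dw/dt = -grad_E(w)
# on a hypothetical quadratic error surface.

import numpy as np

def grad_E(w):
    """Gradient of a hypothetical quadratic error surface E(w) = 0.5 * w.T @ A @ w."""
    A = np.array([[3.0, 1.0], [1.0, 2.0]])
    return A @ w

def discrete_iterates(w0, eta, n_steps):
    """Weight sequence under a constant learning rate eta."""
    w = np.array(w0, dtype=float)
    path = [w.copy()]
    for _ in range(n_steps):
        w = w - eta * grad_E(w)
        path.append(w.copy())
    return np.array(path)

def ode_trajectory(w0, t_grid, dt=1e-4):
    """Fine-step Euler integration of dw/dt = -grad_E(w), standing in for the exact flow."""
    w = np.array(w0, dtype=float)
    t, sol = 0.0, []
    for target in t_grid:
        while t < target:
            w = w - dt * grad_E(w)
            t += dt
        sol.append(w.copy())
    return np.array(sol)

if __name__ == "__main__":
    w0 = [1.0, -1.0]
    for eta in (0.1, 0.01, 0.001):
        n_steps = int(round(1.0 / eta))        # cover the same ODE time span t in [0, 1]
        iterates = discrete_iterates(w0, eta, n_steps)
        t_grid = eta * np.arange(n_steps + 1)  # interpolation time scale t_k = k * eta
        flow = ode_trajectory(w0, t_grid)
        gap = np.max(np.linalg.norm(iterates - flow, axis=1))
        print(f"eta = {eta:6.3f}   max deviation from ODE trajectory = {gap:.2e}")
```

Running this with smaller and smaller eta should show the maximum deviation between the interpolated iterates and the ODE trajectory shrinking, in the spirit of the weak-convergence result stated above.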
Keywords :
backpropagation; differential equations; interpolation; neural nets; ODE; constant learning rates; generalized back-propagation algorithm convergence; interpolated weight sequences; local minima; ordinary differential equation; weight sequences; Algorithm design and analysis; Convergence; Differential equations; Electron traps; Gradient methods; Iterative algorithms; Neurons; Signal processing;
Conference_Title :
1998 IEEE International Joint Conference on Neural Networks (IJCNN 1998) Proceedings, IEEE World Congress on Computational Intelligence
Conference_Location :
Anchorage, AK
Print_ISBN :
0-7803-4859-1
DOI :
10.1109/IJCNN.1998.685924