DocumentCode :
1365670
Title :
Advanced neural-network training algorithm with reduced complexity based on Jacobian deficiency
Author :
Zhou, Guian ; Si, Jennie
Author_Institution :
Dept. of Electr. Eng., Arizona State Univ., Tempe, AZ, USA
Volume :
9
Issue :
3
fYear :
1998
fDate :
5/1/1998 12:00:00 AM
Firstpage :
448
Lastpage :
453
Abstract :
We introduce an advanced supervised training method for neural networks. It is based on Jacobian rank deficiency and is formulated, in some sense, in the spirit of the Gauss-Newton algorithm. The Levenberg-Marquardt algorithm, a modified Gauss-Newton method, has been used successfully to solve nonlinear least-squares problems, including neural-network training. It significantly outperforms basic backpropagation and its variable-learning-rate variants, but at the cost of higher computation and memory complexity per iteration. The new method developed in this paper aims to improve convergence properties while reducing the memory and computation complexity of supervised neural-network training. Extensive simulation results demonstrate the superior performance of the new algorithm over the Levenberg-Marquardt algorithm.
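As background to the comparison in the abstract, the following is a minimal sketch of the standard Levenberg-Marquardt update (the baseline the paper improves on), applied to a hypothetical toy least-squares fit. The model, data, and damping value are illustrative assumptions, not taken from the paper; the paper's own algorithm, which exploits Jacobian rank deficiency to reduce per-iteration cost, is not shown here.

```python
import numpy as np

def lm_step(w, x, y, lam):
    """One Levenberg-Marquardt step for fitting y = w[1]*x + w[0].

    Solves (J^T J + lam*I) delta = -J^T r, where r is the residual
    vector and J its Jacobian with respect to the parameters. The
    damping lam interpolates between Gauss-Newton (lam -> 0) and
    gradient descent (large lam).
    """
    r = (w[1] * x + w[0]) - y                     # residuals
    J = np.stack([np.ones_like(x), x], axis=1)    # Jacobian, shape (n, 2)
    A = J.T @ J + lam * np.eye(2)                 # damped normal matrix
    delta = np.linalg.solve(A, -J.T @ r)
    return w + delta

# Toy data generated from y = 2x + 1 (assumed example, not from the paper)
x = np.linspace(0.0, 1.0, 20)
y = 2.0 * x + 1.0
w = np.zeros(2)
for _ in range(10):
    w = lm_step(w, x, y, lam=1e-3)
```

Forming and factoring the damped normal matrix J^T J + lam*I is what drives the per-iteration computation and memory cost the abstract refers to; for a network with p weights this is an O(p^2) object, which motivates methods that update only a subset of parameters per iteration.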
Keywords :
Jacobian matrices; computational complexity; convergence of numerical methods; iterative methods; learning (artificial intelligence); neural nets; Gauss-Newton algorithm; Jacobian deficiency; Levenberg-Marquardt algorithm; computational complexity; convergence; iterative method; learning; neural-network; subset updating; trust region algorithm; Backpropagation algorithms; Computational modeling; Computer networks; Convergence; Gaussian processes; Iterative algorithms; Jacobian matrices; Least squares methods; Neural networks; Newton method;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.668886
Filename :
668886