DocumentCode :
2750603
Title :
A learning algorithm for multilayered neural networks: a Newton method using automatic differentiation
Author :
Yoshida, Takafumi
Author_Institution :
Dept. of Comput. Sci., Gunma Univ.
fYear :
1991
fDate :
8-14 Jul 1991
Abstract :
Summary form only given, as follows. A learning algorithm for multilayered neural networks, implemented as a Newton method using automatic differentiation, was compared to the back-propagation method. It has been thought that the computational cost of obtaining second-order derivatives of an error function is very high, and that the resulting system of linear equations (the Newton equations) cannot be solved practically for large-scale neural networks. However, the forward mode of automatic differentiation enables one to calculate the product of the Hessian of the error function and a search direction vector, without calculating the Hessian itself, at a cost proportional to that of evaluating the error function. Therefore, even if the network is large, the Newton equations can be solved. Computer simulations show that this method converges more rapidly than the back-propagation method.
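The Hessian-vector product described in the abstract can be sketched with forward-mode automatic differentiation: seeding each weight's tangent with the search direction v and differentiating an analytic gradient yields H·v in a single forward pass. The dual-number class and the toy error function E(w) = (w0·w1 − 1)² below are illustrative assumptions, not the paper's network or code.

```python
class Dual:
    """Forward-mode AD number: carries a value and a directional derivative."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def _lift(self, o):
        return o if isinstance(o, Dual) else Dual(o)
    def __add__(self, o):
        o = self._lift(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __sub__(self, o):
        o = self._lift(o)
        return Dual(self.val - o.val, self.dot - o.dot)
    def __mul__(self, o):
        o = self._lift(o)
        # Product rule propagates the tangent alongside the value.
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__

def grad_E(w0, w1):
    # Analytic gradient of the toy error E(w) = (w0*w1 - 1)^2 (an assumption
    # standing in for the network's error function).
    r = w0 * w1 - 1.0
    return [2.0 * r * w1, 2.0 * r * w0]

def hessian_vector_product(w, v):
    # Seed each weight with the search direction v as its tangent; the
    # tangents of the gradient components are then H @ v -- computed in one
    # forward pass, without ever forming the Hessian.
    g = grad_E(*(Dual(wi, vi) for wi, vi in zip(w, v)))
    return [gi.dot for gi in g]

# At w = (1, 2) the Hessian is [[8, 6], [6, 2]], so H @ (1, 0) = (8, 6).
print(hessian_vector_product([1.0, 2.0], [1.0, 0.0]))  # -> [8.0, 6.0]
```

In a Newton step, a matrix-free product like this would be handed to an iterative linear solver (e.g. conjugate gradients) for the Newton equations, which is what makes the approach feasible for large networks.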
Keywords :
differentiation; iterative methods; learning systems; neural nets; Newton method; automatic differentiation; computational cost; learning algorithm; multilayered neural networks; Computational efficiency; Computer errors; Computer simulation; Cost function; Equations; Large-scale systems; Multi-layer neural network; Neural networks; Newton method; Vectors;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
IJCNN-91-Seattle: International Joint Conference on Neural Networks, 1991
Conference_Location :
Seattle, WA
Print_ISBN :
0-7803-0164-1
Type :
conf
DOI :
10.1109/IJCNN.1991.155623
Filename :
155623