DocumentCode :
2343156
Title :
Second order back-propagation learning algorithm and its application for neural network
Author :
Tienan, Liu ; Weijian, Ren ; Guangyi, Chen ; Xu Baochang ; Di, Yu
Author_Institution :
Dept. of Autom. & Control Eng., Daqing Pet. Inst., Heilongjiang, China
Volume :
2
fYear :
2000
fDate :
2000
Firstpage :
817
Abstract :
In this paper, a new second-order recursive learning algorithm for multilayer feedforward networks is proposed. The algorithm backpropagates not only each layer's errors but also second-order derivative information. It is proved to be equivalent to the Newton iterative algorithm and to have second-order convergence speed. The new algorithm computes the Newton search directions and the inverses of the Hessian matrices recursively, and its computational complexity is comparable to that of the common recursive least-squares algorithm. An analysis of their properties shows that the new algorithm is superior to Karayiannis' second-order algorithm (1994).
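The abstract describes Newton-like steps whose inverse Hessian is maintained recursively at recursive-least-squares cost. The paper's actual multilayer recursion is not given here, so the sketch below only illustrates that general idea for a single linear layer: a rank-one Sherman-Morrison update of an inverse-Hessian estimate, followed by a Newton-like weight step. All names (P, w, lam) and the single-layer setting are illustrative assumptions, not the authors' notation.

```python
import numpy as np

# Minimal sketch (assumed setting, not the paper's algorithm): maintain an
# inverse-Hessian estimate P via the Sherman-Morrison identity, as in
# recursive least squares, and use it for Newton-like weight updates.

rng = np.random.default_rng(0)
n_inputs, n_samples = 4, 200
lam = 1e-2                      # regularization keeping P well-conditioned

w_true = rng.normal(size=n_inputs)
w = np.zeros(n_inputs)          # weights to be learned
P = np.eye(n_inputs) / lam      # running inverse-Hessian estimate

for _ in range(n_samples):
    x = rng.normal(size=n_inputs)
    y = w_true @ x + 0.01 * rng.normal()

    # Rank-one Sherman-Morrison update of P, O(n^2) per step:
    # the same order as ordinary recursive least squares.
    Px = P @ x
    P -= np.outer(Px, Px) / (1.0 + x @ Px)

    # Newton-like step: prediction error along the P-weighted input direction.
    err = y - w @ x
    w += (P @ x) * err

print("recovered weights:", np.round(w, 3))
```

In the paper this recursion is claimed to extend to every layer of a multilayer feedforward network, backpropagating second-order information; the sketch above only shows why the per-step cost can stay at recursive-least-squares level.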
Keywords :
Hessian matrices; Newton method; backpropagation; computational complexity; feedforward neural nets; multilayer perceptrons; Hessian matrix inverses; Newton iterative algorithm; Newton search directions; calculation complexity; error backpropagation; multilayer feedforward network; neural network; recurrence; second-order back-propagation learning algorithm; second-order convergence speed; second-order derivative information factor backpropagation; second-order recursive learning algorithm; Algorithm design and analysis; Automation; Control engineering; Feedforward neural networks; Iterative algorithms; Least squares methods; Multi-layer neural network; Neural networks; Nonhomogeneous media; Petroleum;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Intelligent Control and Automation, 2000. Proceedings of the 3rd World Congress on
Conference_Location :
Hefei
Print_ISBN :
0-7803-5995-X
Type :
conf
DOI :
10.1109/WCICA.2000.863343
Filename :
863343