Title :
TR-BFGS Algorithm for Multi-layer Feed-Forward Neural Networks
Author :
Donglian, Gao ; Rui, Shan
Author_Institution :
Dept. of Sci., Yanshan Univ., Qinhuangdao, China
Abstract :
The most popular algorithm for training feed-forward neural networks is the back-propagation (BP) algorithm, which minimizes the error function along the steepest descent direction. In practice, however, even with a small learning rate that slows down training, the BP algorithm can exhibit oscillatory behavior when it encounters steep valleys of the error surface. The trust region method offers global convergence and a fast convergence rate. The Broyden-Fletcher-Goldfarb-Shanno (BFGS) method computes only an approximation to the inverse of the Hessian matrix, so no second derivatives need to be calculated; moreover, the BFGS update keeps the approximate Hessian symmetric and positive definite, which makes the algorithm numerically stable. This article presents an efficient training algorithm, the TR-BFGS algorithm, that combines the advantages of both methods. We evaluate the convergence speed of the proposed algorithm on a function-approximation example. The experimental results show that the new TR-BFGS algorithm is effective.
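The abstract's claim that BFGS needs no second derivatives rests on the standard rank-two update of the inverse-Hessian approximation, H+ = (I - ρ s yᵀ) H (I - ρ y sᵀ) + ρ s sᵀ with ρ = 1/(yᵀs). The sketch below is an assumed illustration of that textbook update (not the authors' code); `bfgs_update`, `s`, and `y` are names introduced here for clarity.

```python
# Minimal sketch of the BFGS inverse-Hessian update referenced in the abstract.
# Vectors are plain lists, matrices are lists of rows; only the standard
# library is used so the example is self-contained.

def dot(a, b):
    return sum(ai * bi for ai, bi in zip(a, b))

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def matvec(A, x):
    return [sum(A[i][j] * x[j] for j in range(len(x))) for i in range(len(A))]

def bfgs_update(H, s, y):
    """Return the updated inverse-Hessian approximation H+.

    s = x_{k+1} - x_k (step taken), y = g_{k+1} - g_k (gradient change).
    When the curvature condition y^T s > 0 holds, H+ remains symmetric
    positive definite -- the stability property the abstract mentions.
    Only first-order (gradient) information is used: no second derivatives.
    """
    n = len(s)
    rho = 1.0 / dot(y, s)
    I = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]
    A = [[I[i][j] - rho * s[i] * y[j] for j in range(n)] for i in range(n)]  # I - rho*s*y^T
    B = [[I[i][j] - rho * y[i] * s[j] for j in range(n)] for i in range(n)]  # I - rho*y*s^T
    HB = matmul(matmul(A, H), B)
    # Add the rank-one correction rho*s*s^T.
    return [[HB[i][j] + rho * s[i] * s[j] for j in range(n)] for i in range(n)]
```

By construction the update satisfies the secant equation H+ y = s, which is one quick way to sanity-check an implementation. In the paper's TR-BFGS scheme, this H (or the corresponding Hessian approximation) would supply the quadratic model minimized inside the trust region at each training step.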
Keywords :
Hessian matrices; approximation theory; backpropagation; feedforward neural nets; gradient methods; BFGS Hessian matrix; Broyden-Fletcher-Goldfarb-Shanno; TR-BFGS algorithm; backpropagation algorithm; multi-layer feedforward neural networks; steepest descent direction; Algorithm design and analysis; Approximation algorithms; Approximation methods; Artificial neural networks; Convergence; Neurons; Training; TR-BFGS algorithm; approximation of function; feed-forward neural networks; trust region technology; weights learning;
Conference_Titel :
Cryptography and Network Security, Data Mining and Knowledge Discovery, E-Commerce & Its Applications and Embedded Systems (CDEE), 2010 First ACIS International Symposium on
Conference_Location :
Qinhuangdao
Print_ISBN :
978-1-4244-9595-5
DOI :
10.1109/CDEE.2010.38