Title :
Fast convergence for backpropagation network with magnified gradient function
Author :
Ng, S.C. ; Cheung, C.C. ; Leung, S.H. ; Luk, A.
Author_Institution :
School of Science & Technology, Open University of Hong Kong, Hong Kong, China
Abstract :
This paper presents a modified back-propagation algorithm using a magnified gradient function (MGFPROP), which can effectively speed up the convergence rate and improve the global convergence capability of back-propagation. MGFPROP increases the convergence rate by magnifying the gradient function of the activation function. The convergence analysis shows that the new algorithm retains the gradient descent property yet converges faster than standard back-propagation. Simulation results show that, in terms of convergence rate and percentage of global convergence, the new algorithm consistently outperforms the standard back-propagation algorithm and other competing techniques.
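To illustrate the gradient-magnification idea described in the abstract, below is a minimal sketch of a two-layer network trained on XOR. It assumes a sigmoid activation and replaces the usual derivative term o(1-o) with (o(1-o))^(1/S) for a magnification factor S >= 1; the network size, learning rate, and task are illustrative choices, and the exact update rules in the paper may differ, so this demonstrates the concept rather than the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

rng = np.random.default_rng(0)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])  # XOR inputs
T = np.array([[0.], [1.], [1.], [0.]])                  # XOR targets

W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)  # input -> hidden
W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)  # hidden -> output
lr, S = 0.5, 2.0  # learning rate; S = 1 recovers standard back-propagation

for epoch in range(5000):
    H = sigmoid(X @ W1 + b1)          # hidden activations
    O = sigmoid(H @ W2 + b2)          # network outputs
    err = T - O
    # Magnified gradient (assumed form): the sigmoid derivative o(1-o)
    # never exceeds 0.25, so raising it to the power 1/S (S >= 1)
    # enlarges the effective gradient and eases the "flat spot"
    # that slows standard back-propagation.
    d_out = err * (O * (1 - O)) ** (1.0 / S)
    d_hid = (d_out @ W2.T) * (H * (1 - H)) ** (1.0 / S)
    W2 += lr * H.T @ d_out; b2 += lr * d_out.sum(axis=0)
    W1 += lr * X.T @ d_hid; b1 += lr * d_hid.sum(axis=0)

print(np.round(O.ravel(), 3))  # outputs should approach [0, 1, 1, 0]
```

Note that because o(1-o) lies in (0, 0.25], the power 1/S strictly increases it while preserving its sign, which is consistent with the abstract's claim that the gradient descent property is retained.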
Keywords :
backpropagation; convergence; feedforward neural nets; gradient methods; transfer functions; MGFPROP algorithm; activation function; backpropagation network; convergence analysis; fast convergence rate; feedforward neural networks; global convergence; gradient descent property; magnified gradient function; Acceleration; Algorithm design and analysis; Convergence; Neural networks; Neurons
Conference_Title :
Proceedings of the International Joint Conference on Neural Networks (IJCNN), 2003
Print_ISBN :
0-7803-7898-9
DOI :
10.1109/IJCNN.2003.1223698