Title :
Backpropagation with two-phase magnified gradient function
Author :
Cheung, Chi-Chung ; Ng, Sin-Chun
Author_Institution :
Dept. of Electron. & Inf. Eng., Hong Kong Polytech. Univ., Hong Kong
Abstract :
The backpropagation (BP) learning algorithm is the most widely used supervised learning technique for training multi-layer feed-forward neural networks. Many modifications have been proposed to improve the performance of BP, and BP with Magnified Gradient Function (MGFPROP) is one of the fast learning algorithms that improve both the convergence rate and the global convergence capability of BP [19]. MGFPROP outperforms many benchmark fast learning algorithms on different adaptive problems [19]. However, the performance of MGFPROP is limited by the error overshooting problem. This paper presents a new approach, BP with Two-Phase Magnified Gradient Function (2P-MGFPROP), to overcome the error overshooting problem and hence speed up the convergence of MGFPROP. 2P-MGFPROP is a modification of MGFPROP: it divides the learning process into two phases and adjusts the parameter setting of MGFPROP according to the nature of the current phase. Simulation results on two different adaptive problems show that 2P-MGFPROP outperforms MGFPROP with its optimal parameter setting in terms of convergence rate, and the improvement can be up to 50%.
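The magnified-gradient idea and the two-phase switch described above can be illustrated with a minimal sketch. The Python fragment below is an assumption-laden illustration, not the authors' exact formulation: it assumes the magnification raises the sigmoid-derivative factor o(1-o) in the BP delta to a power 1/S (S >= 1), and it switches S from a phase-1 value to a phase-2 value once the mean squared error falls below a chosen threshold. The function name train_2p_mgf, the phase boundary, and all parameter values are hypothetical.

# Hedged sketch of the two-phase magnified-gradient idea (assumed form, not the
# paper's exact algorithm): the derivative factor o*(1-o) in the BP delta is
# raised to 1/S to magnify small gradients; S is switched when the MSE drops
# below a hypothetical phase threshold (S = 1 recovers plain BP).
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_2p_mgf(X, T, hidden=4, lr=0.5, epochs=2000,
                 S_phase1=2.0, S_phase2=1.0, phase_threshold=0.05, seed=0):
    rng = np.random.default_rng(seed)
    W1 = rng.normal(scale=0.5, size=(X.shape[1], hidden))
    W2 = rng.normal(scale=0.5, size=(hidden, T.shape[1]))
    S = S_phase1
    mse = np.inf
    for _ in range(epochs):
        H = sigmoid(X @ W1)           # hidden-layer activations
        O = sigmoid(H @ W2)           # network outputs
        E = T - O
        mse = np.mean(E ** 2)
        if mse < phase_threshold:     # second phase: fall back toward plain BP
            S = S_phase2
        # magnified gradient: sigmoid-derivative factor raised to the power 1/S
        dO = E * (O * (1.0 - O)) ** (1.0 / S)
        dH = (dO @ W2.T) * (H * (1.0 - H)) ** (1.0 / S)
        W2 += lr * H.T @ dO
        W1 += lr * X.T @ dH
    return W1, W2, mse

# Usage example: learn XOR as a small adaptive problem.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)
_, _, final_mse = train_2p_mgf(X, T)
print("final MSE:", final_mse)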
Keywords :
backpropagation; convergence; gradient methods; backpropagation learning algorithm; convergence capability; supervised learning technique; training; two-phase magnified gradient function
Conference_Title :
2008 IEEE International Joint Conference on Neural Networks (IJCNN 2008), IEEE World Congress on Computational Intelligence
Conference_Location :
Hong Kong
Print_ISBN :
978-1-4244-1820-6
Electronic_ISSN :
1098-7576
DOI :
10.1109/IJCNN.2008.4633873