Title :
A new adaptive learning algorithm using magnified gradient function
Author :
Ng, S.C. ; Cheung, C.C. ; Leung, S.H. ; Luk, A.
Abstract :
An algorithm is proposed to solve the “flat spot” problem in backpropagation networks by magnifying the gradient function. The idea of the learning algorithm is to vary the gradient of the activation function so as to magnify the backward-propagated error signal, especially when the output approaches a wrong value; in this way the convergence rate is accelerated and the flat spot problem is eliminated. Simulation results show that, in terms of convergence rate and global search capability, the new algorithm consistently outperforms the traditional methods.
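The abstract describes magnifying the activation-function gradient so the error signal stays strong where the standard derivative is near zero. Below is a minimal, hedged sketch of the idea for a logistic activation: the usual derivative term o(1-o) is replaced by [o(1-o)]^(1/S) with S > 1, which inflates the term exactly where it would otherwise vanish. The network shape, the XOR task, the learning rate, and the exponent S are all illustrative assumptions, not details taken from the record.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def magnified_grad(o, S=2.0):
    # Standard logistic derivative is o*(1-o); raising it to the power 1/S
    # (S > 1) magnifies the backward error signal where o*(1-o) is near
    # zero, i.e. in the "flat spot" region. S is an assumed hyperparameter.
    return (o * (1.0 - o)) ** (1.0 / S)

def train_xor(S=2.0, lr=0.5, epochs=5000, seed=0):
    # Tiny 2-4-1 network on XOR, trained with plain gradient descent but
    # using the magnified gradient term in both layers' deltas.
    rng = np.random.default_rng(seed)
    X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
    T = np.array([[0], [1], [1], [0]], dtype=float)
    W1 = rng.normal(scale=0.5, size=(2, 4)); b1 = np.zeros(4)
    W2 = rng.normal(scale=0.5, size=(4, 1)); b2 = np.zeros(1)
    for _ in range(epochs):
        H = sigmoid(X @ W1 + b1)                    # hidden activations
        O = sigmoid(H @ W2 + b2)                    # output activations
        dO = (T - O) * magnified_grad(O, S)         # magnified output delta
        dH = (dO @ W2.T) * magnified_grad(H, S)     # magnified hidden delta
        W2 += lr * H.T @ dO; b2 += lr * dO.sum(axis=0)
        W1 += lr * X.T @ dH; b1 += lr * dH.sum(axis=0)
    return O

outputs = train_xor()  # network outputs for the four XOR patterns
```

Setting S = 1 recovers ordinary backpropagation, so the magnification can be compared directly against the baseline on the same initialization.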
Keywords :
backpropagation; convergence; feedforward neural nets; multilayer perceptrons; activation function; adaptive learning algorithm; convergence rate; flat spot problem; global search capability; magnified gradient function
Conference_Title :
Proceedings of the International Joint Conference on Neural Networks (IJCNN '01), 2001
Conference_Location :
Washington, DC
Print_ISBN :
0-7803-7044-9
DOI :
10.1109/IJCNN.2001.939009