DocumentCode :
1749048
Title :
A new adaptive learning algorithm using magnified gradient function
Author :
Ng, S.C. ; Cheung, C.C. ; Leung, S.H. ; Luk, A.
Volume :
1
fYear :
2001
fDate :
2001
Firstpage :
156
Abstract :
An algorithm is proposed to solve the "flat spot" problem in backpropagation networks by magnifying the gradient function. The idea is to vary the gradient of the activation function so as to magnify the backward-propagated error signal, especially when the output approaches an incorrect value; this accelerates the convergence rate and eliminates the flat spot problem. Simulation results show that, in terms of convergence rate and global search capability, the new algorithm consistently outperforms traditional methods.
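The magnification idea described in the abstract can be sketched as follows. This is a minimal illustration only: the exponent form `1/S` and the parameter `S` are assumptions chosen to demonstrate the effect, not necessarily the exact function from the paper.

```python
import numpy as np

def sigmoid(x):
    """Logistic activation commonly used in backpropagation networks."""
    return 1.0 / (1.0 + np.exp(-x))

def magnified_grad(o, S=2.0):
    """Magnified version of the sigmoid derivative o*(1 - o).

    The plain derivative vanishes as o -> 0 or 1 (the "flat spot"),
    stalling weight updates even when the output is badly wrong.
    Raising it to the power 1/S with S > 1 keeps the gradient larger
    near saturation. NOTE: this exponent form is an illustrative
    assumption, not the paper's exact formulation.
    """
    return (o * (1.0 - o)) ** (1.0 / S)

# Near-saturated and mid-range outputs
o = sigmoid(np.array([-4.6, 0.0, 4.6]))
std = o * (1.0 - o)             # standard gradient: tiny at the extremes
mag = magnified_grad(o, S=2.0)  # magnified gradient: larger near saturation
```

Because `o * (1 - o)` never exceeds 0.25, raising it to a power below 1 always increases it, with the largest relative boost exactly where the flat spot occurs.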
Keywords :
backpropagation; convergence; feedforward neural nets; multilayer perceptrons; activation function; adaptive learning algorithm; convergence rate; flat spot problem; global search capability; magnified gradient function
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
IJCNN '01: Proceedings of the International Joint Conference on Neural Networks, 2001
Conference_Location :
Washington, DC
ISSN :
1098-7576
Print_ISBN :
0-7803-7044-9
Type :
conf
DOI :
10.1109/IJCNN.2001.939009
Filename :
939009