Title :
An integrated algorithm of magnified gradient function and weight evolution for solving local minima problem
Author :
Ng, S.C. ; Leung, S.H. ; Luk, A.
Date :
2002
Abstract :
This paper presents the integration of the magnified gradient function and weight evolution algorithms to solve the local minima problem. The combination of the two algorithms gives a significant improvement in convergence rate and global search capability compared with common fast learning algorithms such as standard backpropagation, Quickprop, resilient propagation, SARPROP, and genetic algorithms.
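The sketch below illustrates how the two ideas described in the abstract can be combined in a plain backpropagation loop, assuming the magnified gradient function is realized by raising the sigmoid derivative term to a power 1/S and assuming a simple stall-detection heuristic triggers the weight-evolution step. The exponent S, stall window, and perturbation scale are illustrative placeholders, not the paper's exact settings.

```python
import numpy as np

# Minimal sketch of MGF-style backpropagation with a weight-evolution step,
# demonstrated on the XOR problem. S, stall_window, and the perturbation
# scale are illustrative choices, not the paper's exact values.

rng = np.random.default_rng(0)

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Network: 2 inputs -> 2 hidden units -> 1 output, sigmoid activations.
W1 = rng.uniform(-1, 1, (2, 2))
b1 = rng.uniform(-1, 1, 2)
W2 = rng.uniform(-1, 1, (2, 1))
b2 = rng.uniform(-1, 1, 1)

eta = 0.5          # learning rate
S = 2.0            # magnification factor (S >= 1 enlarges small derivatives)
stall_window = 200
prev_err, stalled = np.inf, 0

for epoch in range(20000):
    # Forward pass.
    H = sigmoid(X @ W1 + b1)
    Y = sigmoid(H @ W2 + b2)
    err = 0.5 * np.sum((T - Y) ** 2)

    # Backward pass with a magnified gradient function: the sigmoid
    # derivative y(1 - y) is raised to the power 1/S, which keeps the
    # update direction but enlarges near-zero gradients in flat regions.
    delta_out = (T - Y) * (Y * (1 - Y)) ** (1.0 / S)
    delta_hid = (delta_out @ W2.T) * (H * (1 - H)) ** (1.0 / S)

    W2 += eta * H.T @ delta_out
    b2 += eta * delta_out.sum(axis=0)
    W1 += eta * X.T @ delta_hid
    b1 += eta * delta_hid.sum(axis=0)

    # Simplified stand-in for weight evolution: if the error has not
    # improved for stall_window epochs, perturb a random subset of weights.
    stalled = stalled + 1 if err >= prev_err - 1e-6 else 0
    prev_err = min(prev_err, err)
    if stalled >= stall_window:
        mask = rng.random(W1.shape) < 0.5
        W1 += mask * rng.normal(0, 0.3, W1.shape)
        stalled = 0

    if err < 1e-3:
        break

print(f"epochs={epoch}, final SSE={err:.5f}, outputs={Y.ravel().round(3)}")
```

The magnified derivative accelerates learning where the sigmoid saturates, while the random perturbation here stands in for the weight-evolution step that the paper applies when plain gradient descent stalls.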
Keywords :
backpropagation; convergence of numerical methods; feedforward neural nets; genetic algorithms; gradient methods; mathematics computing; Quickprop; SARPROP; convergence; feedforward neural networks; global search; learning algorithms; local minima problem; magnified gradient function; resilient propagation; weight evolution algorithms; Australia; Computational modeling; Equations; Investments; Neural networks; Optimization methods; Simulated annealing
Conference_Title :
Proceedings of the 2002 International Joint Conference on Neural Networks (IJCNN '02)
Conference_Location :
Honolulu, HI
Print_ISBN :
0-7803-7278-6
DOI :
10.1109/IJCNN.2002.1005570