DocumentCode :
1747743
Title :
On solving the local minima problem of adaptive learning by using deterministic weight evolution algorithm
Author :
Ng, S.C. ; Leung, S.H.
Author_Institution :
Sch. of Sci. & Technol., Open Univ. of Hong Kong, Homantin, Hong Kong
Volume :
1
fYear :
2001
fDate :
2001
Firstpage :
251
Abstract :
This paper continues the discussion of the weight evolution algorithm for solving the local minimum problem of back-propagation by changing the weights of a multi-layer neural network in a deterministic way. During the learning phase of back-propagation, the network weights are adjusted deliberately so as to improve system performance. The idea is to work backward from the error components and the system outputs to deduce a deterministic perturbation of particular network weights for optimization purposes. Simulation results show that the weight evolution algorithm consistently outperforms the other traditional methods in achieving global convergence. Mathematical analysis shows that weight evolution between the hidden and output layers accelerates convergence, whereas weight evolution between the input and hidden layers helps to solve the local minima problem.
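The abstract does not give the update equations, so the following Python sketch only illustrates the general idea under stated assumptions: a standard back-propagation loop for a small multi-layer network on the XOR task, with a hypothetical deterministic perturbation (the sign of the back-propagated error terms, scaled by a step size mu) applied to the input-to-hidden and hidden-to-output weights whenever the error stalls. The network size, the stall test, and the perturbation rule are illustrative assumptions, not the authors' algorithm.

# Minimal sketch (assumptions noted above), not the authors' exact formulation.
import numpy as np

rng = np.random.default_rng(0)

# XOR data set: 2 inputs, 1 output
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# 2-2-1 network: W1 (input->hidden), W2 (hidden->output); last row of each holds biases
W1 = rng.normal(scale=0.5, size=(3, 2))
W2 = rng.normal(scale=0.5, size=(3, 1))

eta, mu = 0.5, 0.05        # learning rate and (assumed) perturbation step
prev_err, stalled = np.inf, 0

for epoch in range(20000):
    # Forward pass with bias column appended
    Xb = np.hstack([X, np.ones((len(X), 1))])
    H = sigmoid(Xb @ W1)
    Hb = np.hstack([H, np.ones((len(H), 1))])
    Y = sigmoid(Hb @ W2)

    # Standard back-propagation (gradient descent on squared error)
    E = T - Y
    err = float(np.mean(E ** 2))
    dY = E * Y * (1 - Y)
    dH = (dY @ W2[:-1].T) * H * (1 - H)
    W2 += eta * Hb.T @ dY
    W1 += eta * Xb.T @ dH

    # Hypothetical "weight evolution" step: if the error has stopped improving
    # (possible local minimum), nudge the weights in the direction indicated by
    # the back-propagated error components instead of restarting at random.
    stalled = stalled + 1 if prev_err - err < 1e-7 else 0
    prev_err = err
    if stalled > 50:
        W2 += mu * np.sign(Hb.T @ dY)   # hidden-to-output evolution
        W1 += mu * np.sign(Xb.T @ dH)   # input-to-hidden evolution
        stalled = 0

    if err < 1e-3:
        break

print(f"epoch={epoch}  mse={err:.5f}  outputs={Y.ravel().round(3)}")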
Keywords :
backpropagation; deterministic algorithms; neural nets; adaptive learning; deterministic weight evolution algorithm; global convergence; local minima problem; mathematical analysis; multilayer neural network; optimization; simulation results; system performance; Acceleration; Convergence; Evolutionary computation; Feedforward neural networks; Indexing; Mathematical analysis; Multi-layer neural network; Neural networks; Paper technology; System performance;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Proceedings of the 2001 Congress on Evolutionary Computation
Conference_Location :
Seoul
Print_ISBN :
0-7803-6657-3
Type :
conf
DOI :
10.1109/CEC.2001.934397
Filename :
934397