Title :
Weight evolution algorithm with dynamic offset range
Author :
Ng, S.C. ; Leung, S.H. ; Luk, A.
Author_Institution :
Dept. of Electron. Eng., City Univ. of Hong Kong, Hong Kong
Abstract :
The main problems with gradient-descent algorithms such as backpropagation are their slow convergence rate and the possibility of being trapped in local minima. In this paper, a weight evolution algorithm with a dynamic offset range is proposed to remedy these problems. The idea of weight evolution is to evolve the network weights in a controlled manner during the learning phase of backpropagation so as to jump to a region of smaller mean squared error whenever backpropagation stops at a local minimum. If the algorithm remains trapped in a local minimum, the offset range for weight evolution is incremented to allow a larger weight space to be searched. Once the local minimum is bypassed, the offset range is reset to its initial value. It can be proved that this method always escapes local minima and guarantees convergence to the global solution. Simulation results show that the weight evolution algorithm with dynamic offset range gives a faster convergence rate and global search capability.
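The abstract gives only a high-level description, so the following Python sketch is one possible reading of the procedure, not the authors' implementation: an ordinary gradient step, a stall test on the mean squared error, random weight offsets drawn from the current range, widening of the range while the search stays trapped, and a reset to the initial range once a lower-error point is found. The helper names (mse_fn, grad_fn) and the parameters r0, r_step, n_candidates, stall_tol, and target_mse are illustrative assumptions, not the paper's notation.

import numpy as np

def weight_evolution(weights, mse_fn, offset_range, n_candidates=20, rng=None):
    # Illustrative stand-in for the controlled weight evolution step:
    # try random offsets within +/- offset_range and keep the best candidate.
    rng = rng or np.random.default_rng()
    best_w, best_mse = weights, mse_fn(weights)
    for _ in range(n_candidates):
        candidate = weights + rng.uniform(-offset_range, offset_range, weights.shape)
        cand_mse = mse_fn(candidate)
        if cand_mse < best_mse:
            best_w, best_mse = candidate, cand_mse
    return best_w, best_mse

def train(weights, grad_fn, mse_fn, lr=0.1, r0=0.01, r_step=0.01,
          max_epochs=10000, stall_tol=1e-6, target_mse=1e-4):
    offset = r0                                      # dynamic offset range, initial value
    prev_mse = mse_fn(weights)
    for _ in range(max_epochs):
        weights = weights - lr * grad_fn(weights)    # ordinary backpropagation step
        mse = mse_fn(weights)
        if mse <= target_mse:
            break
        if prev_mse - mse < stall_tol:               # apparently stuck at a local minimum
            evolved, evolved_mse = weight_evolution(weights, mse_fn, offset)
            if evolved_mse < mse:                    # local minimum bypassed
                weights, mse, offset = evolved, evolved_mse, r0   # reset the range
            else:
                offset += r_step                     # still trapped: widen the search
        prev_mse = mse
    return weights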
Keywords :
backpropagation; convergence; feedforward neural nets; multilayer perceptrons; dynamic offset range; global search capability; gradient-descent algorithms; learning phase; local minimum; mean squared error; weight evolution algorithm; Convergence; Dynamic range; Electron traps; Error correction; Heuristic algorithms; Multi-layer neural network; Neural networks; Neurons; Supervised learning; Weight control
Conference_Title :
International Conference on Neural Networks, 1997
Conference_Location :
Houston, TX
Print_ISBN :
0-7803-4122-8
DOI :
10.1109/ICNN.1997.616181