Title :
Local minima free neural network learning
Author :
Jordanov, Ivan N. ; Rafik, Tahseen A.
Author_Institution :
Dept. of Comput. Sci. & Software Eng., Portsmouth Univ., UK
Abstract :
A global optimization algorithm applied to supervised learning of feedforward neural networks (NN) is investigated. The network weights are determined by minimizing the traditional backpropagation error function. The difference is that the optimization-based learning algorithm utilizes a stochastic technique based on low-discrepancy sequences. This technique searches the parameter space defined by the network weights to identify initial regions of attraction containing candidates for local minima, and then exploits each region to locate its minimum and determine a global minimum. The proposed technique is first tested on multimodal mathematical functions and subsequently applied to training NNs of moderate size on simple benchmark problems. Finally, the results are analysed, discussed, and compared with those of other methods.
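A minimal sketch of the two-phase idea described in the abstract: low-discrepancy samples cover the weight space, the best samples seed candidate regions of attraction, and a local minimizer exploits each region, keeping the overall best. The 2-2-1 sigmoid network, the XOR task, the search box, and the use of SciPy's Sobol sampler and L-BFGS-B are illustrative assumptions, not the authors' exact procedure.

```python
# Sketch only: Sobol (low-discrepancy) sampling of the weight space to seed
# local searches; all problem sizes and library choices are assumptions.
import numpy as np
from scipy.stats import qmc
from scipy.optimize import minimize

# Simple benchmark task (XOR) used purely for illustration.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

def sse(w):
    """Sum-of-squares error of a 2-2-1 sigmoid network, weights flattened in w."""
    W1 = w[:4].reshape(2, 2); b1 = w[4:6]
    W2 = w[6:8];              b2 = w[8]
    h = 1.0 / (1.0 + np.exp(-(X @ W1 + b1)))     # hidden layer
    out = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))   # output neuron
    return np.sum((out - y) ** 2)

dim, box = 9, 10.0                                # weight count and search box [-10, 10]^dim
sampler = qmc.Sobol(d=dim, scramble=True, seed=0)
points = box * (2.0 * sampler.random(64) - 1.0)   # low-discrepancy coverage of the box

# Phase 1: rank samples by error and keep a few as candidate regions of attraction.
seeds = points[np.argsort([sse(p) for p in points])[:5]]

# Phase 2: exploit each candidate region with a local minimizer; keep the global best.
best = min((minimize(sse, s, method="L-BFGS-B") for s in seeds), key=lambda r: r.fun)
print("best SSE found:", best.fun)
```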
Keywords :
backpropagation; feedforward neural nets; optimisation; stochastic processes; backpropagation error function; feedforward neural networks; free neural network learning; global optimization algorithm; local minima; low-discrepancy sequences; multimodal mathematical functions; network weights; parameter space; stochastic global optimisation; stochastic technique; supervised learning; Backpropagation algorithms; Benchmark testing; Convergence; Function approximation; Neural networks; Pattern recognition; Robot control; Speech recognition; Stochastic processes; Supervised learning;
Conference_Titel :
Intelligent Systems, 2004. Proceedings. 2004 2nd International IEEE Conference
Print_ISBN :
0-7803-8278-1
DOI :
10.1109/IS.2004.1344633