DocumentCode :
827026
Title :
Deterministic global optimization for FNN training
Author :
Toh, Kar-Ann
Volume :
33
Issue :
6
fYear :
2003
Firstpage :
977
Lastpage :
983
Abstract :
This paper addresses the issue of training feedforward neural networks by global optimization. The main contributions include characterization of global optimality of a network error function, and formulation of a global descent algorithm to solve the network training problem. A network with a single hidden layer and a single output unit is considered. By means of a monotonic transformation, a sufficient condition for global optimality of a network error function is presented. Based on this, a penalty-based algorithm is derived to direct the search toward regions likely to contain the global minima. Numerical comparison on benchmark problems from the neural network literature shows the superiority of the proposed algorithm over several local methods, in terms of the percentage of trials attaining the desired solutions. The algorithm is also shown to be effective for several pattern recognition problems.
Keywords :
backpropagation; feedforward neural nets; nonlinear programming; optimisation; pattern recognition; convex functions; deterministic global optimization; feedforward neural networks training; global descent algorithm; monotonic transformation; network error function; penalty-based algorithm; sufficient condition; constrained optimization; Backpropagation algorithms; Computational complexity; Computers; Feedforward neural networks; Functional programming; Neural networks; Optimization methods; Pattern recognition; Sufficient conditions; Surges;
fLanguage :
English
Journal_Title :
IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics
Publisher :
IEEE
ISSN :
1083-4419
Type :
jour
DOI :
10.1109/TSMCB.2002.804366
Filename :
1245272