DocumentCode :
2445130
Title :
Learning without local minima
Author :
Barhen, J. ; Toomarian, N. ; Fijany, A.
Author_Institution :
Jet Propulsion Lab., California Inst. of Technol., Pasadena, CA, USA
Volume :
7
fYear :
1994
fDate :
27 Jun-2 Jul 1994
Firstpage :
4592
Abstract :
A computationally efficient methodology for overcoming local minima in nonlinear neural network learning is presented. This methodology is based on the newly discovered TRUST global optimization paradigm. Enhancements to the backpropagation scheme in feedforward multilayer architectures, and to adjoint-operator learning in recurrent networks, are discussed. Extensions to TRUST now formally guarantee reaching a global minimum in the multidimensional case. Results for a standard benchmark are included to illustrate the theoretical developments.
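The tunneling idea summarized in the abstract can be illustrated with a toy sketch. The code below is not the paper's TRUST formulation (which combines a subenergy transformation with terminal-repeller dynamics and carries formal guarantees); it is only a minimal, hypothetical alternation of gradient descent with a repeller-driven tunneling phase on a 1-D function, with all names and parameters invented for illustration.

```python
def f(x):
    # 1-D test function with two minima: a local one near x = 1.13
    # and the global one near x = -1.30.
    return x**4 - 3*x**2 + x

def grad(x, h=1e-6):
    # Central-difference derivative (numerical, for simplicity).
    return (f(x + h) - f(x - h)) / (2 * h)

def descend(x, lr=0.01, steps=2000):
    # Plain gradient descent: converges to the nearest minimum.
    for _ in range(steps):
        x -= lr * grad(x)
    return x

def tunnel(x_star, rho=1.0, lr=0.01, steps=5000):
    # Tunneling phase: starting just off the known minimum x_star, a
    # repeller term ~ |x - x_star|**(1/3) pushes the state away from it.
    # The phase ends as soon as f(x) drops below f(x_star), i.e. a basin
    # with a lower minimum has been found; both directions are tried.
    f_star = f(x_star)
    for direction in (1.0, -1.0):
        x = x_star + direction * 1e-3
        for _ in range(steps):
            if f(x) < f_star:
                return x            # seed point inside a lower basin
            x += lr * rho * direction * abs(x - x_star) ** (1 / 3)
    return None                     # x_star was (numerically) global

x_local = descend(2.0)      # lands in the local minimum near x = 1.13
x_seed = tunnel(x_local)    # tunnels into the lower (global) basin
x_global = descend(x_seed)  # descends to the global minimum near x = -1.30
```

Alternating these two phases repeats until the tunneling step finds no lower basin, at which point the last minimum is accepted; the paper's contribution is making such a procedure efficient and provably convergent in the multidimensional case.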
Keywords :
backpropagation; convergence of numerical methods; feedforward neural nets; optimisation; recurrent neural nets; TRUST global optimization paradigm; backpropagation; feedforward multilayer architectures; global minimum; learning; nonlinear neural network; recurrent networks; Backpropagation algorithms; Computational efficiency; Computer architecture; Computer networks; Minimization methods; Multi-layer neural network; Neural networks; Nonlinear dynamical systems; Optimization methods; Tunneling;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
Conference_Location :
Orlando, FL
Print_ISBN :
0-7803-1901-X
Type :
conf
DOI :
10.1109/ICNN.1994.375015
Filename :
375015