DocumentCode :
856598
Title :
Optimization for training neural nets
Author :
Barnard, Etienne
Author_Institution :
Dept. of Electron. & Comput. Eng., Pretoria Univ., South Africa
Volume :
3
Issue :
2
fYear :
1992
fDate :
3/1/1992 12:00:00 AM
Firstpage :
232
Lastpage :
240
Abstract :
Various techniques for optimizing the criterion functions used to train neural-net classifiers are investigated. These include three standard deterministic techniques (variable metric, conjugate gradient, and steepest descent) and a new stochastic technique. The stochastic technique is found to be preferable on problems with large training sets, and the variable metric and conjugate gradient techniques are found to have similar convergence rates.
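The abstract contrasts deterministic steepest descent with a stochastic (per-sample) update. A minimal sketch of that contrast, on a hypothetical toy problem not taken from the paper (a single logistic neuron, mean-squared-error criterion, with learning rates and data chosen purely for illustration):

```python
import math
import random

# Hypothetical toy data (not from the paper): two 1-D clusters, labels 0 and 1.
random.seed(0)
data = [(x, 0) for x in (-2.0, -1.5, -1.0)] + [(x, 1) for x in (1.0, 1.5, 2.0)]

def predict(w, b, x):
    """Single logistic neuron."""
    return 1.0 / (1.0 + math.exp(-(w * x + b)))

def mse(w, b):
    """Criterion function: mean squared error over the training set."""
    return sum((predict(w, b, x) - y) ** 2 for x, y in data) / len(data)

def grad(w, b, batch):
    """Gradient of the MSE criterion over `batch`, via the chain rule."""
    gw = gb = 0.0
    for x, y in batch:
        p = predict(w, b, x)
        d = 2.0 * (p - y) * p * (1.0 - p)  # dE/dz through the sigmoid
        gw += d * x
        gb += d
    return gw / len(batch), gb / len(batch)

def steepest_descent(epochs=200, lr=0.5):
    """Deterministic technique: each step uses the full training set."""
    w = b = 0.0
    for _ in range(epochs):
        gw, gb = grad(w, b, data)
        w, b = w - lr * gw, b - lr * gb
    return mse(w, b)

def stochastic(epochs=200, lr=0.5):
    """Stochastic technique: each step uses one randomly drawn example."""
    w = b = 0.0
    for _ in range(epochs):
        for sample in random.sample(data, len(data)):
            gw, gb = grad(w, b, [sample])
            w, b = w - lr * gw, b - lr * gb
    return mse(w, b)
```

The stochastic variant pays one gradient evaluation per example per step, which is why it scales better to large training sets, the regime the abstract highlights.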
Keywords :
computerised pattern recognition; learning systems; minimisation; neural nets; conjugate gradient; convergence rates; deterministic techniques; neural-net classifiers; optimisation; steepest descent; stochastic technique; training; variable metric; Convergence; Error analysis; Frequency domain analysis; Maintenance engineering; Neural networks; Neurons; Polynomials; Robustness; Stochastic processes;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
ieee
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.125864
Filename :
125864