Title :
Optimization for training neural nets
Author :
Barnard, Etienne
Author_Institution :
Dept. of Electron. & Comput. Eng., Pretoria Univ., South Africa
Date :
3/1/1992 12:00:00 AM
Abstract :
Various techniques for optimizing criterion functions to train neural-net classifiers are investigated: three standard deterministic techniques (variable metric, conjugate gradient, and steepest descent) and a new stochastic technique. The stochastic technique is found to be preferable on problems with large training sets, and the convergence rates of the variable metric and conjugate gradient techniques are found to be similar.
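To illustrate the distinction the abstract draws, the following is a minimal sketch, not taken from the paper, contrasting full-batch steepest descent with a stochastic (one-example-per-step) variant on a toy least-squares criterion. The problem setup, step sizes, and iteration counts are illustrative assumptions only.

```python
import numpy as np

# Toy training criterion: minimise (1/n)||Xw - y||^2 over w.
# (Illustrative stand-in for a neural-net criterion function.)
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
w_true = rng.normal(size=5)
y = X @ w_true

def full_gradient(w):
    # Deterministic gradient computed over the whole training set.
    return 2 * X.T @ (X @ w - y) / len(X)

# Steepest descent: one full-batch gradient step per iteration,
# fixed step size (chosen by hand for this toy problem).
w_sd = np.zeros(5)
for _ in range(500):
    w_sd -= 0.1 * full_gradient(w_sd)

# Stochastic technique: a gradient estimate from one randomly chosen
# training example per step, with a decaying step size. Each step is
# cheap regardless of the training-set size, which is why such methods
# scale better to large training sets.
w_sgd = np.zeros(5)
for t in range(1, 5001):
    i = rng.integers(len(X))
    grad_i = 2 * X[i] * (X[i] @ w_sgd - y[i])
    w_sgd -= (0.05 / t**0.55) * grad_i
```

The deterministic iterate `w_sd` converges tightly on this noiseless problem, while the stochastic iterate `w_sgd` trades per-step accuracy for a per-step cost that is independent of the number of training examples.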
Keywords :
computerised pattern recognition; learning systems; minimisation; neural nets; conjugate gradient; convergence rates; deterministic techniques; neural-net classifiers; optimisation; steepest descent; stochastic technique; training; variable metric; Africa; Convergence; Error analysis; Frequency domain analysis; Maintenance engineering; Neural networks; Neurons; Polynomials; Robustness; Stochastic processes;
Journal_Title :
Neural Networks, IEEE Transactions on