DocumentCode :
288428
Title :
Optimization schemes for neural network training
Author :
Chen, Oscal T.-C. ; Sheu, Bing J.
Author_Institution :
Dept. of Electr. Eng., Univ. of Southern California, Los Angeles, CA, USA
Volume :
2
fYear :
1994
fDate :
27 Jun-2 Jul 1994
Firstpage :
817
Abstract :
Neural networks are parameterized by a set of synaptic weights. The task of an optimization scheme for neural network training is to find the synaptic weights that make the network perform the desired function. The backpropagation learning method, quasi-Newton method, non-derivative quasi-Newton method, Gauss-Newton method, secant method, and simulated Cauchy annealing method have been investigated. These learning methods are compared in terms of computation time, convergence speed, and mean-squared error between the network outputs and the desired results. For learning a sine function, the quasi-Newton method yields the best performance, and the Gauss-Newton method also provides promising results.
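A minimal sketch of the kind of experiment the abstract describes (not the authors' code): a one-hidden-layer network is fit to a sine function, with its synaptic weights found by a quasi-Newton method (BFGS via scipy.optimize.minimize) minimizing the mean-squared error. The network size, data range, and initialization are illustrative assumptions.

import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 50)        # training inputs
y = np.sin(x)                             # desired outputs (sine function)

H = 8                                     # hidden units (assumption)

def unpack(w):
    # Split the flat parameter vector into layer weights and biases.
    w1, b1 = w[:H], w[H:2*H]
    w2, b2 = w[2*H:3*H], w[3*H]
    return w1, b1, w2, b2

def forward(w, x):
    w1, b1, w2, b2 = unpack(w)
    h = np.tanh(np.outer(x, w1) + b1)     # hidden-layer activations
    return h @ w2 + b2                    # linear output unit

def mse(w):
    # Mean-squared error between network outputs and desired results.
    return np.mean((forward(w, x) - y) ** 2)

w0 = rng.normal(scale=0.5, size=3*H + 1)  # random initial synaptic weights
res = minimize(mse, w0, method="BFGS")    # quasi-Newton weight optimization
print(f"final MSE: {res.fun:.6f}")

Here BFGS approximates the gradient by finite differences when none is supplied, which also loosely mirrors the non-derivative quasi-Newton variant the paper investigates; supplying an analytic gradient would correspond to the standard quasi-Newton setting.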
Keywords :
Newton method; backpropagation; computational complexity; convergence of numerical methods; learning (artificial intelligence); neural nets; optimisation; simulated annealing; Gauss-Newton method; backpropagation learning method; computation time; convergence speed; mean-squared error; network outputs; neural network training; nonderivative quasi-Newton method; optimization schemes; quasi-Newton method; secant method; simulated Cauchy annealing method; sine function; synaptic weights; Backpropagation; Computational modeling; Computer networks; Convergence; Learning systems; Least squares methods; Neural networks; Newton method; Recursive estimation; Simulated annealing;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
1994 IEEE International Conference on Neural Networks (IEEE World Congress on Computational Intelligence)
Conference_Location :
Orlando, FL
Print_ISBN :
0-7803-1901-X
Type :
conf
DOI :
10.1109/ICNN.1994.374284
Filename :
374284