Title :
Backpropagation using generalized least squares
Author :
Loh, A.P. ; Fong, K.F.
Author_Institution :
Dept. of Electr. Eng., Singapore Univ., Singapore
Abstract :
The backpropagation algorithm is essentially a steepest-descent optimization routine that minimizes a quadratic performance index at each step. In this paper, the algorithm is recast in the framework of generalized least squares. The main advantage is that this eliminates the need to select an optimal step size, as required in the standard backpropagation algorithm. A simulation result on the approximation of a nonlinear dynamical system is presented to show the rapid rate of convergence compared with the standard backpropagation algorithm.
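The abstract does not reproduce the derivation, but the step-size advantage it claims can be illustrated with a minimal sketch: a recursive least-squares update computes its own gain from the data, whereas a gradient step needs a hand-tuned learning rate. The sketch below uses a single linear output layer; the function names (`rls_update`, `sgd_update`), the forgetting factor `lam`, and the toy data are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

def rls_update(w, P, x, y, lam=1.0):
    """One recursive least-squares step for weights w, covariance P,
    input x, target y, and forgetting factor lam (assumed names).
    The gain k is computed from the data; no learning rate is needed."""
    Px = P @ x
    k = Px / (lam + x @ Px)          # gain vector
    e = y - w @ x                    # prediction error
    w = w + k * e                    # weight correction
    P = (P - np.outer(k, Px)) / lam  # covariance update
    return w, P

def sgd_update(w, x, y, eta):
    """Standard gradient-descent step; eta must be chosen by hand."""
    e = y - w @ x
    return w + eta * e * x

# Toy comparison on a noisy linear target.
rng = np.random.default_rng(0)
w_true = np.array([2.0, -1.0, 0.5])
w_rls, P = np.zeros(3), 1e3 * np.eye(3)
w_sgd = np.zeros(3)
for _ in range(200):
    x = rng.normal(size=3)
    y = w_true @ x + 0.01 * rng.normal()
    w_rls, P = rls_update(w_rls, P, x, y)
    w_sgd = sgd_update(w_sgd, x, y, eta=0.05)
print("RLS estimate:", w_rls)
print("SGD estimate:", w_sgd)
```

In this sketch the least-squares estimate typically converges in far fewer samples than the fixed-step gradient update, which loosely mirrors the convergence comparison described in the abstract; the paper itself applies the idea to the weights of a multilayer network rather than a linear model.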
Keywords :
backpropagation; convergence of numerical methods; least squares approximations; neural nets; nonlinear systems; optimisation; performance index; backpropagation algorithm; convergence rate; generalized least squares; nonlinear dynamical system; optimization; quadratic performance index; steepest gradient descent type; Artificial neural networks; Backpropagation algorithms; Convergence; Least squares approximation; Least squares methods; Neural networks; Optimization methods; Performance analysis; Power generation; Power system modeling;
Conference_Title :
IEEE International Conference on Neural Networks, 1993
Conference_Location :
San Francisco, CA
Print_ISBN :
0-7803-0999-5
DOI :
10.1109/ICNN.1993.298624