Title :
A new local linearized least squares algorithm for training feedforward neural networks
Author :
Stan, Octavian ; Kamen, Edward W.
Author_Institution :
Sch. of Electr. & Comput. Eng., Georgia Inst. of Technol., Atlanta, GA, USA
Abstract :
The global extended Kalman filter (GEKF) algorithm is used to train the weights of a feedforward neural network and offers much better convergence and solution quality than the popular gradient descent with error backpropagation. However, the GEKF algorithm is computationally intensive, which has led to the development of simplified algorithms that partition the global nonlinear optimization problem into a set of local nonlinear problems at the neuron level. In this paper, a new training algorithm is developed by viewing the local subproblems as recursive linearized least squares problems. The objective function of the least squares problem for each neuron is the sum of the squares of the linearized backpropagated error signals. The new algorithm is shown to give better convergence results than existing local algorithms on two benchmark problems.
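The neuron-level recursive least squares step underlying such local algorithms can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the forgetting factor `lam`, and the interface are assumptions, and the linearized backpropagated error `e_lin` is assumed to be supplied by the surrounding training loop.

```python
import numpy as np

def rls_update(w, P, x, e_lin, lam=1.0):
    """One recursive least squares step for a single neuron.

    w     : current weight vector of the neuron
    P     : inverse-correlation (covariance) matrix
    x     : input vector seen by the neuron
    e_lin : linearized backpropagated error signal for this neuron
    lam   : forgetting factor (1.0 = ordinary growing-window RLS)
    """
    Px = P @ x
    k = Px / (lam + x @ Px)          # gain vector
    w = w + k * e_lin                # weight update driven by the linearized error
    P = (P - np.outer(k, Px)) / lam  # rank-one covariance update
    return w, P
```

Applying this update per neuron replaces the single large GEKF covariance matrix with one small matrix per neuron, which is the source of the computational savings over the global algorithm.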
Keywords :
Kalman filters; feedforward neural nets; learning (artificial intelligence); least squares approximations; linearisation techniques; GEKF; convergence; feedforward neural network training; global extended Kalman filter; global nonlinear optimization problem; linearized backpropagated error signals; local linearized least squares algorithm; local nonlinear problems; recursive linearized least squares problems; sum of squares; Backpropagation algorithms; Computational complexity; Differential equations; Feedforward neural networks; Least squares methods; Multilayer perceptrons; Neural networks; Neurons; Nonlinear equations; Partitioning algorithms;
Conference_Title :
International Conference on Neural Networks (ICNN), 1997
Conference_Location :
Houston, TX
Print_ISBN :
0-7803-4122-8
DOI :
10.1109/ICNN.1997.614184