Title :
Online least-squares training for the underdetermined case
Author :
Schultz, R.L. ; Hagan, Martin T.
Author_Institution :
Halliburton Energy Services, Houston, TX, USA
Abstract :
We describe an online method for training neural networks that is based on solving the linearized least-squares problem using the pseudo-inverse for the underdetermined case. This underdetermined linearized least squares (ULLS) method requires significantly less computation and memory to implement than standard higher-order methods such as the Gauss-Newton method or the extended Kalman filter. This reduction is possible because the method allows training to proceed with fewer samples than parameters. Simulation results comparing the performance of the ULLS algorithm with the recursive linearized least squares (RLLS) algorithm and the gradient descent algorithm are presented. Results showing the effect of varying the number of terms in the Jacobian matrix on the computational complexity and squared-error performance of the ULLS method are also presented.
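A minimal sketch of the core idea described in the abstract, assuming the ULLS update takes the minimum-norm pseudo-inverse solution of the linearized problem J dw = e; the function name ulls_step, the error sign convention, and the small regularization term mu are illustrative assumptions, not details taken from the paper:

import numpy as np

def ulls_step(w, J, e, mu=1e-3):
    # Minimum-norm solution of the underdetermined linearized problem
    # J dw = e, i.e. dw = J^T (J J^T)^(-1) e, computed with a small
    # regularization term mu for numerical stability (mu is assumed here).
    J = np.atleast_2d(J)          # k x n Jacobian, k samples < n parameters
    e = np.asarray(e).ravel()     # k-vector of network output errors
    k = J.shape[0]
    gram = J @ J.T + mu * np.eye(k)   # k x k Gram matrix, much smaller than the n x n J^T J of Gauss-Newton
    dw = J.T @ np.linalg.solve(gram, e)
    return w + dw

# Toy usage: 4 samples, 20 parameters (underdetermined case)
rng = np.random.default_rng(0)
w = rng.standard_normal(20)
J = rng.standard_normal((4, 20))
e = rng.standard_normal(4)
w = ulls_step(w, J, e)

Because only the k x k Gram matrix is factored, the per-step cost and memory grow with the (small) number of samples rather than the number of parameters, which is the source of the savings over Gauss-Newton and the extended Kalman filter noted above.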
Keywords :
Jacobian matrices; computational complexity; learning (artificial intelligence); least squares approximations; neural nets; optimisation; real-time systems; Jacobian matrix; neural networks; online learning; squared-error performance; underdetermined linearized least squares; Computational modeling; Computer aided software engineering; Least squares approximation; Least squares methods; Newton method; Optimization methods; Recursive estimation; Vectors
Conference_Title :
International Joint Conference on Neural Networks (IJCNN '99), 1999
Conference_Location :
Washington, DC
Print_ISBN :
0-7803-5529-6
DOI :
10.1109/IJCNN.1999.832665