Title :
Separable nonlinear least-squares methods for on-line estimation of neural nets Hammerstein models
Author :
Ngia, Lester S H
Author_Institution :
Dept. of Signals & Syst., Chalmers Univ. of Technol., Goteborg, Sweden
Abstract :
When the estimation of Hammerstein models is based on minimization of a least-squares error criterion, the minimization problem is separable with respect to the linear parameters. The original minimization problem can therefore be reduced to a minimization over the nonlinear parameters only. The proposed recursive algorithms result from this separated minimization problem. They have computational loads similar to those of the algorithms derived from the original, unseparated problem, but they converge faster and track better. In a system identification example, the proposed algorithms are shown to have better convergence and tracking properties than alternative algorithms.
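The separation idea described in the abstract can be illustrated with a minimal sketch (not the paper's on-line algorithm): for a Hammerstein model, a static nonlinearity f(u; theta) followed by an FIR filter w, the filter coefficients enter linearly, so for any fixed theta the optimal w is a closed-form least-squares solution. The outer search then runs over theta alone. The one-neuron tanh nonlinearity, the filter length, and the use of a batch Nelder-Mead search are all illustrative assumptions, not the paper's neural-net structure or recursive update.

```python
# Sketch of separable (variable-projection style) nonlinear least squares
# for a Hammerstein model: u -> f(u; theta) -> FIR filter w -> y.
# Assumed toy nonlinearity: f(u; theta) = tanh(theta[0]*u + theta[1]).
import numpy as np
from scipy.optimize import minimize

def fir_regressor(x, M):
    """Regressor matrix whose columns are x delayed by 0..M-1 samples."""
    N = len(x)
    X = np.zeros((N, M))
    for k in range(M):
        X[k:, k] = x[:N - k]
    return X

def reduced_cost(theta, u, y, M):
    """Cost after eliminating the linear parameters: for this theta,
    the LS-optimal filter w is computed in closed form, so the outer
    minimization is over theta only."""
    x = np.tanh(theta[0] * u + theta[1])          # nonlinear block output
    X = fir_regressor(x, M)
    w, *_ = np.linalg.lstsq(X, y, rcond=None)     # linear params: closed form
    r = y - X @ w
    return 0.5 * float(r @ r)

# Synthetic noiseless data from a known Hammerstein system.
rng = np.random.default_rng(0)
u = rng.standard_normal(400)
w_true = np.array([1.0, 0.5, -0.25])
y = fir_regressor(np.tanh(2.0 * u - 0.5), 3) @ w_true

# Outer search over the two nonlinear parameters only.
res = minimize(reduced_cost, x0=[1.0, 0.0], args=(u, y, 3),
               method="Nelder-Mead")
```

Because the data are noiseless, the reduced cost reaches (near) zero at the true nonlinear parameters (2.0, -0.5); the abstract's point is that this reduced problem has roughly the same per-step cost as the unseparated one but better convergence behavior.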
Keywords :
convergence; error analysis; identification; least squares approximations; minimisation; neural nets; nonlinear estimation; online operation; tracking; Hammerstein model estimation; computational load; least-squares error criterion minimization; linear parameters; nonlinear parameters; online estimation; recursive algorithms; separable nonlinear least-squares methods; separated minimization problem; system identification; tracking properties; Delay effects; Delay estimation; Feedforward neural networks; Feedforward systems; Minimization methods; Neural networks; Polynomials; Recursive estimation; Vectors
Conference_Title :
Neural Networks for Signal Processing X, 2000. Proceedings of the 2000 IEEE Signal Processing Society Workshop
Conference_Location :
Sydney, NSW
Print_ISBN :
0-7803-6278-0
DOI :
10.1109/NNSP.2000.889363