DocumentCode :
1219232
Title :
Modified recursive least squares (RLS) algorithm for neural networks using piecewise linear function
Author :
Gokhale, A.P. ; Nawghare, P.M.
Author_Institution :
Dept. of Electron. & Comput. Sci., Visvesvaraya Nat. Inst. of Technol., Nagpur, India
Volume :
151
Issue :
6
fYear :
2004
Firstpage :
510
Lastpage :
518
Abstract :
The recursive least squares (RLS) learning algorithm for multilayer feedforward neural networks conventionally uses a sigmoid nonlinearity at node outputs. It is shown that using a piecewise linear function at node outputs makes the algorithm faster. The modified algorithm improves computational efficiency, and by preserving matrix symmetry it avoids the explosive divergence normally seen in the conventional RLS algorithm due to finite-precision effects. The piecewise linear function also avoids the approximation that is otherwise necessary in deriving the conventional algorithm with a sigmoid nonlinearity. Simulation results on the XOR problem, the 4-2-4 encoder and a function approximation problem indicate that the modified algorithm reduces the occurrence of local minima and improves convergence speed compared with the conventional RLS algorithm. A nonlinear system identification and control problem is considered to demonstrate the application of the algorithm to complex problems.
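The two ingredients the abstract highlights can be illustrated with a minimal sketch: a piecewise linear ("hard sigmoid") activation whose derivative is exact rather than approximated, and an RLS weight update that explicitly re-symmetrizes the inverse-correlation matrix to curb finite-precision divergence. This is not the paper's full multilayer algorithm; the function names, the slope/offset of the piecewise linear function, and the single-node setting are illustrative assumptions.

```python
import numpy as np

def pwl(x, slope=0.5):
    # Piecewise linear "hard sigmoid" (assumed form): linear in the
    # middle, saturating at 0 and 1, mimicking a sigmoid's range.
    return np.clip(slope * x + 0.5, 0.0, 1.0)

def pwl_deriv(x, slope=0.5):
    # Exact derivative: constant slope inside the linear region,
    # zero in the saturated regions -- no approximation needed,
    # unlike the sigmoid case discussed in the abstract.
    y = slope * x + 0.5
    return np.where((y > 0.0) & (y < 1.0), slope, 0.0)

def rls_update(w, P, x, d, lam=0.99):
    """One RLS step for a single linear node (illustrative).

    w   -- weight vector
    P   -- inverse input-correlation matrix estimate
    x   -- input vector
    d   -- desired output
    lam -- forgetting factor
    """
    Px = P @ x
    k = Px / (lam + x @ Px)        # gain vector
    e = d - w @ x                  # a-priori error
    w = w + k * e                  # weight update
    P = (P - np.outer(k, Px)) / lam
    P = 0.5 * (P + P.T)            # enforce symmetry to help avoid
    return w, P                    # finite-precision divergence
```

The final symmetrization step is the key numerical safeguard: round-off errors can make P drift away from symmetry (and positive definiteness), which is one route to the explosive divergence the abstract mentions.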
Keywords :
feedforward neural nets; least squares approximations; piecewise linear techniques; recursive functions; 4-2-4 encoder; RLS algorithm; XOR problem; computational efficiency; convergence speed; explosive divergence; finite precision effects; function approximation; local minima; matrix symmetry; multilayer feedforward neural networks; node outputs; piecewise linear function; recursive least squares algorithm; sigmoid nonlinearity;
fLanguage :
English
Journal_Title :
IEE Proceedings - Circuits, Devices and Systems
Publisher :
IET
ISSN :
1350-2409
Type :
jour
DOI :
10.1049/ip-cds:20040614
Filename :
1387796