Title :
Two regularizers for recursive least squared algorithms in feedforward multilayered neural networks
Author :
Leung, Chi-sing ; Tsoi, Ah-Chung ; Chan, Lai Wan
Author_Institution :
Dept. of Electron. Eng., City Univ. of Hong Kong, China
Date :
11/1/2001 12:00:00 AM
Abstract :
Recursive least squares (RLS)-based algorithms are a class of fast online training algorithms for feedforward multilayered neural networks (FMNNs). Although the standard RLS algorithm has an implicit weight decay term in its energy function, the weight decay effect decreases linearly as the number of learning epochs increases, so the regularization diminishes as training progresses. In this paper, we derive two modified RLS algorithms to tackle this problem. The first, the true weight decay RLS (TWDRLS) algorithm, uses a modified energy function in which the weight decay effect remains constant, irrespective of the number of learning epochs. The second, the input perturbation RLS (IPRLS) algorithm, is derived by requiring that its prediction performance be robust to input perturbations. Simulation results show that both algorithms improve the generalization capability of the trained network.
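The diminishing weight decay effect described in the abstract can be illustrated with a minimal linear-model sketch (an assumption for illustration only; the paper treats multilayered networks, and the TWDRLS/IPRLS derivations are not reproduced here). Initializing the inverse covariance matrix as P = I/λ makes standard RLS exactly equivalent to ridge regression with a fixed penalty λ‖w‖², so the penalty's influence relative to the growing data term shrinks as more samples arrive:

```python
import numpy as np

def rls_fit(X, y, lam=1.0):
    """Standard RLS for a linear model y ≈ w·x.

    Setting P = I/lam encodes an implicit weight-decay prior of
    lam * ||w||^2. Because this penalty is applied once (at
    initialization) while the data term keeps growing, its relative
    effect fades as training proceeds -- the problem the paper's
    modified algorithms address.
    """
    d = X.shape[1]
    w = np.zeros(d)
    P = np.eye(d) / lam              # implicit weight-decay prior
    for x, t in zip(X, y):
        Px = P @ x
        k = Px / (1.0 + x @ Px)      # gain vector
        w = w + k * (t - x @ w)      # correct weights toward target
        P = P - np.outer(k, Px)      # rank-one inverse-covariance update
    return w
```

After processing n samples, the returned weights equal the batch ridge solution (λI + XᵀX)⁻¹Xᵀy, which makes the fixed λ's diminishing relative influence explicit as n grows.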
Keywords :
feedforward neural nets; least squares approximations; feedforward multilayered neural networks; modified energy function; online training algorithms; prediction performance; recursive least squares algorithms; regularizers; weight decay effect; Backpropagation algorithms; Feedforward neural networks; Least squares methods; Multi-layer neural networks; Neural networks; Performance evaluation; Robustness; Testing;
Journal_Title :
IEEE Transactions on Neural Networks