Title :
Stochastic LMS with self adaptive forgetting factor
Author :
Chung, Sanguoon ; Mcleod, Paul
Author_Institution :
Adv. Micro Devices Inc., Austin, TX, USA
Abstract :
A least-mean-square algorithm, the stochastic gradient search least-mean-square (SGSLMS) algorithm, is proposed. It is robust to noise in the gradient estimate and converges quickly without requiring an optimal step size. The SGSLMS algorithm is realized by estimating the correlation between the adaptive error and the input signals using a proposed new adaptive one-pole correlator. The correlator is used for fast, efficient gradient estimation rather than to control the step size, as in some other LMS variants. During convergence, the adaptive error is initially large and highly nonstationary, producing a large gradient estimate. After convergence, the adaptive error is small, nearly random, and nearly stationary, yielding a small gradient estimate. The correlator is designed to optimize the rate of convergence under these two conditions. The performance of the algorithm is compared with that of the conventional and normalized LMS algorithms.
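The abstract does not give the SGSLMS recursions, so the following Python sketch only illustrates the general idea it describes: driving the LMS weight update with a one-pole correlator estimate of the error/input correlation rather than with the instantaneous gradient. The function name sgslms_sketch, the step size mu, and the fixed forgetting factor lam are assumptions for illustration; the paper's algorithm makes the forgetting factor self-adaptive, which is not reproduced here.

```python
import numpy as np

def sgslms_sketch(x, d, num_taps=8, mu=0.01, lam=0.9):
    """Illustrative LMS variant whose weight update is driven by a one-pole
    correlator estimate of the error/input correlation instead of the
    instantaneous gradient. The recursion and the fixed forgetting factor
    `lam` are assumptions for this sketch, not the paper's SGSLMS."""
    w = np.zeros(num_taps)   # adaptive FIR weights
    g = np.zeros(num_taps)   # one-pole correlator state (smoothed gradient estimate)
    y = np.zeros(len(x))     # filter output
    e = np.zeros(len(x))     # adaptive error
    for n in range(num_taps - 1, len(x)):
        u = x[n - num_taps + 1:n + 1][::-1]   # regressor [x[n], ..., x[n-num_taps+1]]
        y[n] = w @ u
        e[n] = d[n] - y[n]
        # One-pole correlator: leaky average of the instantaneous gradient e[n]*u.
        g = lam * g + (1.0 - lam) * e[n] * u
        # Weight update driven by the smoothed gradient estimate.
        w = w + mu * g
    return w, y, e

if __name__ == "__main__":
    # Hypothetical system-identification demo: recover the taps of an unknown FIR filter.
    rng = np.random.default_rng(0)
    x = rng.standard_normal(5000)
    h = np.array([0.5, -0.3, 0.2, 0.1])                        # "unknown" system
    d = np.convolve(x, h)[:len(x)] + 0.01 * rng.standard_normal(len(x))
    w, _, _ = sgslms_sketch(x, d, num_taps=4, mu=0.02, lam=0.9)
    print("estimated taps:", np.round(w, 3))
```

The early iterations, where the error is large and nonstationary, produce a large smoothed gradient and fast adaptation; near convergence the error is small and nearly white, so the correlator output shrinks, which is the qualitative behavior the abstract describes.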
Keywords :
convergence of numerical methods; filtering and prediction theory; least squares approximations; signal processing; LMS algorithms; SGSLMS algorithm; adaptive error; adaptive one-pole correlator; fast convergence; filter weights; gradient estimation; input signals; self adaptive forgetting factor; signal processing; stochastic gradient search least-mean-square; Adaptive filters; Convergence; Correlators; Error correction; Finite impulse response filter; Iterative algorithms; Least squares approximation; Programmable control; Signal processing algorithms; Stochastic processes;
Conference_Title :
IEEE Pacific Rim Conference on Communications, Computers and Signal Processing, 1989. Conference Proceedings
Conference_Location :
Victoria, BC, Canada
DOI :
10.1109/PACRIM.1989.48398