Title :
On the convergence behavior of the LMS and the normalized LMS algorithms
Author_Institution :
Inst. Eurecom, Sophia Antipolis, France
Date :
9/1/1993
Abstract :
It is shown that the normalized least mean square (NLMS) algorithm is potentially faster converging than the LMS algorithm when the design of the adaptive filter is based on the usually quite limited knowledge of its input signal statistics. A very simple model for the input signal vectors is proposed that greatly simplifies the analysis of the convergence behavior of the LMS and NLMS algorithms. Using this model, answers can be obtained to questions for which no answers are currently available with other (perhaps more realistic) models. Examples illustrate that, even quantitatively, the answers obtained can be good approximations. It is emphasized that the convergence of the NLMS algorithm can be sped up significantly by employing a time-varying step size. The optimal step-size sequence can be specified a priori for the case of a white input signal with an arbitrary distribution.
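To make the two recursions concrete, the following is a minimal sketch, not taken from the paper, of the LMS and NLMS weight updates in a system-identification setting with a white input. The filter length, step sizes, noise level, and "unknown" system are illustrative assumptions only; the paper's optimal time-varying step-size sequence is not reproduced here.

```python
import numpy as np

# Minimal sketch: LMS vs. NLMS adaptation of an FIR filter identifying an
# unknown system driven by white noise. All numerical values are
# illustrative assumptions, not values from the paper.

rng = np.random.default_rng(0)
N = 16                              # adaptive FIR filter length (assumed)
n_iter = 2000
w_true = rng.standard_normal(N)     # "unknown" system to identify (assumed)
mu = 0.01                           # LMS step size (assumed)
mu_bar = 0.5                        # NLMS normalized step size (assumed)
eps = 1e-8                          # small regularizer for the squared norm

w_lms = np.zeros(N)
w_nlms = np.zeros(N)
x = np.zeros(N)                     # input regressor (tapped delay line)

for k in range(n_iter):
    # shift a new white input sample into the regressor
    x = np.roll(x, 1)
    x[0] = rng.standard_normal()
    d = w_true @ x + 1e-3 * rng.standard_normal()   # desired signal

    # LMS update: w <- w + mu * e * x
    e_lms = d - w_lms @ x
    w_lms += mu * e_lms * x

    # NLMS update: w <- w + (mu_bar / ||x||^2) * e * x
    e_nlms = d - w_nlms @ x
    w_nlms += (mu_bar / (x @ x + eps)) * e_nlms * x

print("LMS  weight-error norm:", np.linalg.norm(w_true - w_lms))
print("NLMS weight-error norm:", np.linalg.norm(w_true - w_nlms))
```

The only structural difference between the two updates is the division by the regressor energy ||x||^2 in the NLMS step, which is what makes its effective step size data-dependent and, per the abstract, amenable to further acceleration via a time-varying step-size sequence.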
Keywords :
adaptive filters; convergence of numerical methods; filtering and prediction theory; least squares approximations; signal processing; LMS algorithm; NLMS algorithm; adaptive filter; convergence; input signal vectors; normalized least mean square; time-varying step size; white input signal; Adaptive filters; Algorithm design and analysis; Convergence; Finite impulse response filter; Least squares approximation; Signal analysis; Signal design; Signal processing algorithms; Statistics; Steady-state
Journal_Title :
IEEE Transactions on Signal Processing