Title :
An optimum NLMS algorithm: performance improvement over LMS
Author :
Douglas, S.C. ; Meng, T.H.-Y.
Author_Institution :
Inf. Syst. Lab., Stanford Univ., CA, USA
Abstract :
The authors prove that, for zero-mean white input data, the optimum algorithm that modifies the data vector in the LMS (least-mean-square) gradient estimate to achieve the lowest excess mean-square error for a given convergence rate is the normalized LMS (NLMS) algorithm. This adaptive filtering algorithm is shown to be equivalent to recursive least-squares adaptation with a known diagonal data covariance matrix. Moreover, the algorithm can be interpreted as a modified LMS algorithm in which the iterated weight vector is used to form the error estimate. Both theoretical calculations and simulations for white Gaussian data show that this NLMS algorithm performs as much as 3.6 dB better than standard LMS for the input
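The NLMS update described in the abstract differs from plain LMS only in that the step size is normalized by the energy of the current data vector. A minimal sketch of this standard update in a system-identification setting (the unknown filter `h`, filter length, and step size below are illustrative assumptions, not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical unknown system to identify (for illustration only)
h = np.array([0.5, -0.3, 0.2, 0.1])
N = len(h)
n_samples = 2000
x = rng.standard_normal(n_samples)      # zero-mean white input, as in the paper
d = np.convolve(x, h)[:n_samples]       # desired (reference) signal

def nlms(x, d, N, mu=0.5, eps=1e-8):
    """Normalized LMS: LMS with the step scaled by the data-vector energy."""
    w = np.zeros(N)
    err = np.zeros(len(x))
    for n in range(N, len(x)):
        u = x[n - N + 1:n + 1][::-1]    # most recent N input samples
        e = d[n] - w @ u                # a-priori error estimate
        # The division by u @ u is what distinguishes NLMS from plain LMS
        w += (mu / (eps + u @ u)) * e * u
        err[n] = e
    return w, err

w, err = nlms(x, d, N)
```

For zero-mean white input the normalization term `u @ u` concentrates around N times the input variance, which is why, per the abstract, NLMS can be analyzed as recursive least squares with a known diagonal data covariance matrix.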
Keywords :
filtering and prediction theory; least squares approximations; LMS gradient estimate; NLMS algorithm; adaptive filtering; convergence rate; data vector; diagonal data covariance matrix; error estimate; iterated weight vector; least mean square; mean-square error; normalized LMS; recursive least-squares adaptation; simulations; white Gaussian data; zero-mean white data input; Control systems; Convergence; Covariance matrix; Echo cancellers; Error correction; Filtering algorithms; Filters; Information systems; Laboratories; Least squares approximation
Conference_Title :
ICASSP-91: 1991 International Conference on Acoustics, Speech, and Signal Processing
Conference_Location :
Toronto, Ont., Canada
Print_ISBN :
0-7803-0003-3
DOI :
10.1109/ICASSP.1991.150826