DocumentCode
1998581
Title
An optimum NLMS algorithm: performance improvement over LMS
Author
Douglas, S.C. ; Meng, T.H.-Y.
Author_Institution
Inf. Syst. Lab., Stanford Univ., CA, USA
fYear
1991
fDate
14-17 Apr 1991
Firstpage
2125
Abstract
The authors prove that, for zero-mean white data input, the optimum algorithm that modifies the data vector in the LMS (least mean square) gradient estimate to achieve the lowest excess mean-square error for a given convergence rate is the normalized LMS (NLMS) algorithm. It is shown that this adaptive filtering algorithm is equivalent to recursive least-squares adaptation with a known diagonal data covariance matrix. Moreover, the algorithm can be interpreted as a modified LMS algorithm in which the iterated weight vector is used to form the error estimate. Both theoretical calculations and simulations for white Gaussian data show that this NLMS algorithm performs as much as 3.6 dB better than standard LMS for the input
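The NLMS update summarized in the abstract can be illustrated with a minimal sketch. This is a generic textbook NLMS recursion, not a reproduction of the paper's exact formulation; the step size, regularization constant `eps`, and the 2-tap identification example below are illustrative assumptions.

```python
import random

def nlms_step(w, x, d, mu=1.0, eps=1e-8):
    """One NLMS iteration: an LMS step whose gradient is normalized
    by the instantaneous input energy (illustrative sketch)."""
    y = sum(wi * xi for wi, xi in zip(w, x))      # filter output
    e = d - y                                     # a priori error
    norm = sum(xi * xi for xi in x) + eps         # input power (regularized)
    w = [wi + mu * e * xi / norm for wi, xi in zip(w, x)]
    return w, e

# Identify a hypothetical 2-tap system h from noiseless white Gaussian input,
# matching the white-data setting the abstract analyzes.
random.seed(0)
h = [0.5, -0.25]          # unknown plant (assumed for this demo)
w = [0.0, 0.0]            # adaptive filter weights
buf = [0.0, 0.0]          # input tap-delay line
for _ in range(500):
    buf = [random.gauss(0.0, 1.0), buf[0]]        # shift in a new sample
    d = h[0] * buf[0] + h[1] * buf[1]             # desired (plant) output
    w, e = nlms_step(w, buf, d, mu=1.0)
print([round(wi, 3) for wi in w])                 # converges toward h
```

With mu = 1 and noiseless data, each NLMS step projects the weight vector onto the hyperplane of solutions consistent with the current sample, which is why convergence here is rapid.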
Keywords
filtering and prediction theory; least squares approximations; LMS gradient estimate; NLMS algorithm; adaptive filtering; convergence rate; data vector; diagonal data covariance matrix; error estimate; iterated weight vector; least mean square; mean-square error; normalized LMS; recursive least-squares adaptation; simulations; white Gaussian data; zero-mean white data input; Control systems; Convergence; Covariance matrix; Echo cancellers; Error correction; Filtering algorithms; Filters; Information systems; Laboratories; Least squares approximation
fLanguage
English
Publisher
ieee
Conference_Titel
Acoustics, Speech, and Signal Processing, 1991. ICASSP-91., 1991 International Conference on
Conference_Location
Toronto, Ont.
ISSN
1520-6149
Print_ISBN
0-7803-0003-3
Type
conf
DOI
10.1109/ICASSP.1991.150826
Filename
150826