Title :
Stochastic gradient adaptation under general error criteria
Author :
Douglas, Scott C. ; Meng, Teresa H.-Y.
Author_Institution :
Dept. of Electr. Eng., Utah Univ., Salt Lake City, UT, USA
Date :
6/1/1994
Abstract :
Examines a family of adaptive filter algorithms of the form W_{k+1} = W_k + μf(d_k − W_k^T X_k)X_k, in which f(·) is a memoryless odd-symmetric nonlinearity acting upon the error. Such algorithms are a generalization of the least-mean-square (LMS) adaptive filtering algorithm to even-symmetric error criteria. For this algorithm family, the authors derive general expressions for the mean and mean-square convergence of the filter coefficients for both arbitrary stochastic input data and Gaussian input data. They then provide methods for optimizing the nonlinearity to minimize the algorithm misadjustment for a given convergence rate. Using the calculus of variations, it is shown that the optimum nonlinearity to minimize misadjustment near convergence under slow adaptation conditions is independent of the statistics of the input data and can be expressed as −p'(x)/p(x), where p(x) is the probability density function of the uncorrelated plant noise. For faster adaptation under the white Gaussian input and noise assumptions, the optimum nonlinearity is shown to be x/(1 + μλx^2/σ_k^2), where λ is the input signal power and σ_k^2 is the conditional error power. Thus, the optimum stochastic gradient error criterion for Gaussian noise is not mean-square. It is shown that the equations governing the convergence of the nonlinear algorithm are exactly those which describe the behavior of the optimum scalar data-nonlinear adaptive algorithm for white Gaussian input. Simulations verify the results for a host of noise interferences and indicate the improvement obtained using non-mean-square error criteria.
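A minimal sketch of the adaptation rule described in the abstract, written in Python; the function name, step size, and the illustrative nonlinearities below are this sketch's own assumptions, not the authors' implementation:

    import numpy as np

    def nonlinear_lms(x, d, L, mu, f):
        """Adapt an L-tap filter with W_{k+1} = W_k + mu * f(e_k) * X_k,
        where e_k = d_k - W_k^T X_k and f is a memoryless error nonlinearity."""
        W = np.zeros(L)
        e = np.zeros(len(d))
        for k in range(L, len(d)):
            X = x[k - L:k][::-1]      # regressor vector X_k (most recent sample first)
            e[k] = d[k] - W @ X       # a priori error e_k = d_k - W_k^T X_k
            W = W + mu * f(e[k]) * X  # generalized stochastic gradient update
        return W, e

    # Illustrative error nonlinearities (assumed examples):
    f_lms  = lambda e: e             # f(e) = e recovers the standard LMS algorithm
    f_sign = lambda e: np.sign(e)    # sign-error update (least-mean-absolute criterion)
    # Nonlinearity of the form x/(1 + mu*lam*x^2/sigma2) quoted in the abstract,
    # with the input power lam and conditional error power sigma2 frozen at
    # assumed constants purely for illustration (in the paper sigma_k^2 evolves):
    f_gauss = lambda e, mu=0.01, lam=1.0, sigma2=0.1: e / (1.0 + mu * lam * e**2 / sigma2)

For example, nonlinear_lms(x, d, L=16, mu=0.01, f=f_sign) adapts a 16-tap filter with the sign-error variant, while passing f_lms reproduces ordinary LMS.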
Keywords :
adaptive filters; convergence of numerical methods; filtering and prediction theory; least squares approximations; stochastic processes; Gaussian input data; adaptive filter algorithms; adaptive filtering algorithm; algorithm misadjustment; convergence rate; even-symmetric error criteria; general error criteria; least-mean-squares; mean-square convergence; memoryless odd-symmetric nonlinearity; noise interference; nonmean-square error criteria; probability density function; stochastic gradient adaptation; stochastic input data; uncorrelated plant noise; white Gaussian input; Adaptive filters; Calculus; Convergence; Filtering algorithms; Gaussian noise; Least squares approximation; Optimization methods; Statistics; Stochastic processes;
Journal_Title :
Signal Processing, IEEE Transactions on