Abstract:
We consider exponentially weighted recursive least squares (RLS) computations with forgetting factor γ (0 < γ < 1). The least squares estimator can be found by solving a matrix system A(t)x(t) = b(t) at each adaptive time step t. Unlike the sliding window RLS computation, the matrix A(t) is not a “near-Toeplitz” matrix (a sum of products of Toeplitz matrices). However, we show that a scaled version of A(t) is a “near-Toeplitz” matrix, and hence the matrix-vector multiplication can be performed efficiently by using fast Fourier transforms (FFTs). We apply the FFT-based preconditioned conjugate gradient method to solve such systems. When the input stochastic process is stationary, we prove that both E[‖A(t) − E(A(t))‖₂] and Var[‖A(t) − E(A(t))‖₂] tend to zero, provided that the number of data samples taken is sufficiently large. Here E(•) and Var(•) are the expectation and variance operators respectively. Hence the expected values of the eigenvalues of the preconditioned matrices are close to 1 except for a finite number of outlying eigenvalues. This result is stronger than those proved by Ng, Chan, and Plemmons, namely that the spectra of the preconditioned matrices are clustered around 1 with probability 1.
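The efficiency claim rests on a standard fact used throughout this line of work: a Toeplitz matrix-vector product can be computed in O(n log n) time by embedding the Toeplitz matrix in a circulant matrix of twice the size and diagonalizing the circulant with the FFT. The following is a minimal sketch of that technique (not the paper's own code; the function name and test data are illustrative):

```python
import numpy as np

def toeplitz_matvec(c, r, x):
    """Compute T @ x in O(n log n) via circulant embedding and FFT.

    T is the n-by-n Toeplitz matrix with first column c and first row r
    (so r[0] must equal c[0]). T is embedded in the upper-left corner of
    a 2n-by-2n circulant matrix, whose eigenvalues are the FFT of its
    first column; the product then reduces to two FFTs and one inverse FFT.
    """
    n = len(x)
    # First column of the circulant embedding: [c, 0, r[n-1], ..., r[1]].
    col = np.concatenate([c, [0.0], r[:0:-1]])
    # Zero-pad x to length 2n and multiply in the Fourier domain.
    x_pad = np.concatenate([x, np.zeros(n)])
    y = np.fft.ifft(np.fft.fft(col) * np.fft.fft(x_pad))
    # The first n entries recover the Toeplitz product.
    return y[:n].real

# Usage: compare against a dense Toeplitz multiplication.
rng = np.random.default_rng(0)
n = 8
c = rng.standard_normal(n)
r = rng.standard_normal(n)
r[0] = c[0]
x = rng.standard_normal(n)
T = np.array([[c[i - j] if i >= j else r[j - i] for j in range(n)]
              for i in range(n)])
print(np.allclose(toeplitz_matvec(c, r, x), T @ x))
```

Since A(t) here is "near-Toeplitz" only after scaling, each conjugate gradient iteration applies this kernel to the Toeplitz factors of the scaled matrix.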