Abstract:
It is well known that, for deterministic optimization problems, the conjugate gradient method has superior convergence rates compared with ordinary gradient methods. For quadratic problems, the conjugate gradient method has the finite termination property, which makes it one of the most favorable iterative methods. However, both the fast convergence and the finite termination property break down easily when the function to be optimized is noisy, since conjugacy among the search directions can no longer be maintained over the course of the iterations. In this letter, a conjugation procedure is applied in an adaptive filtering algorithm: instead of producing a full set of mutually conjugate search directions, only pairwise conjugation between consecutive gradients is enforced during each filter update. Simulations show that the algorithm converges faster than the stochastic gradient descent algorithm and comparably to existing conjugate gradient-based adaptive filtering algorithms, but at a lower computational cost.
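To make the idea concrete, the following is a minimal sketch (not the letter's exact algorithm) of a conjugate gradient-style adaptive filter for system identification, in which each new search direction is made conjugate only to the previous one with respect to a recursively estimated autocorrelation matrix. The filter length, forgetting factor, noise level, and the unknown system are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

M = 8                             # filter length (assumed)
w_true = rng.standard_normal(M)   # hypothetical unknown FIR system

lam = 0.99                        # forgetting factor (assumed)
R = 1e-2 * np.eye(M)              # regularized autocorrelation estimate
r = np.zeros(M)                   # cross-correlation estimate
w = np.zeros(M)                   # adaptive filter weights
p_prev = np.zeros(M)              # previous search direction
x_line = np.zeros(M)              # tapped delay line

for n in range(2000):
    # New input sample and noisy desired signal
    x_line = np.roll(x_line, 1)
    x_line[0] = rng.standard_normal()
    d = w_true @ x_line + 1e-3 * rng.standard_normal()

    # Exponentially weighted estimates of R = E[x x^T] and r = E[d x]
    R = lam * R + np.outer(x_line, x_line)
    r = lam * r + d * x_line

    # Gradient of the quadratic cost 0.5 w^T R w - r^T w
    g = R @ w - r

    # Pairwise conjugation: choose beta so p^T R p_prev = 0
    denom = p_prev @ R @ p_prev
    beta = (g @ R @ p_prev) / denom if denom > 1e-12 else 0.0
    p = -g + beta * p_prev

    # Exact line search along p for the quadratic cost
    pRp = p @ R @ p
    if pRp > 1e-12:
        w = w - ((g @ p) / pRp) * p
    p_prev = p

misalignment = np.linalg.norm(w - w_true)
```

Because the estimated autocorrelation matrix changes at every sample, conjugacy to directions older than the previous one is not preserved; enforcing only the pairwise condition avoids storing or re-orthogonalizing a full direction set, which is the source of the lower computational cost noted above.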
Keywords:
adaptive filters; gradient methods; stochastic processes; adaptive filtering; deterministic optimization problem; finite termination property; global convergent stochastic algorithm; iterative methods; pairwise conjugate gradient-based algorithm; stochastic gradient descent algorithm; computational efficiency; computational modeling; convergence; filtering algorithms; iterative algorithms; optimization methods; adaptive equalizer; adaptive filtering algorithms; conjugate gradients; minimal residual method; system identification