Title :
Convergence of exponentiated gradient algorithms
Author :
Hill, Simon I. ; Williamson, Robert C.
Author_Institution :
Res. Sch. of Inf. Sci. & Eng., Australian Nat. Univ., Canberra, ACT, Australia
Date :
1 June 2001
Abstract :
This paper studies three related algorithms: the (traditional) gradient descent (GD) algorithm, the exponentiated gradient algorithm with positive and negative weights (EG± algorithm), and the exponentiated gradient algorithm with unnormalized positive and negative weights (EGU± algorithm). These algorithms have previously been analyzed using the “mistake-bound framework” in the computational learning theory community. We perform a traditional signal processing analysis in terms of the mean squared error (MSE). A relationship between the learning rate and the MSE of predictions is found for the family of algorithms. This is used to compare the performance of the algorithms by choosing learning rates such that they converge to the same steady-state MSE. We demonstrate that if the target weight vector is sparse, the EG± algorithm typically converges more quickly than the GD or EGU± algorithms, which perform very similarly to each other. A side effect of our analysis is a reparametrization of the algorithms that provides insights into their behavior. The general form of the results we obtain is consistent with those obtained in the mistake-bound framework. The application of the algorithms to acoustic echo cancellation is then studied, and it is shown that in some circumstances the EG± algorithm will converge faster than the other two algorithms.
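For readers unfamiliar with the update rules being compared, the following is a minimal sketch of one step of each algorithm as commonly presented in the mistake-bound literature (Kivinen and Warmuth style), not the authors' exact formulation; the learning rate eta and the total weight budget U are illustrative parameters.

```python
import numpy as np

def gd_step(w, x, y, eta):
    """One gradient descent (GD) update for squared loss."""
    err = w @ x - y                      # prediction error
    return w - eta * err * x

def eg_pm_step(w_pos, w_neg, x, y, eta, U):
    """One EG± update: multiplicative weight changes, renormalized so the
    total positive-plus-negative weight mass stays at U."""
    err = (w_pos - w_neg) @ x - y
    r_pos = w_pos * np.exp(-eta * err * x)
    r_neg = w_neg * np.exp(+eta * err * x)
    Z = (r_pos + r_neg).sum()            # normalization constant
    return U * r_pos / Z, U * r_neg / Z

def egu_pm_step(w_pos, w_neg, x, y, eta):
    """One EGU± update: same multiplicative factors, no normalization."""
    err = (w_pos - w_neg) @ x - y
    return (w_pos * np.exp(-eta * err * x),
            w_neg * np.exp(+eta * err * x))
```

In this sketch the prediction is (w_pos - w_neg)·x for the exponentiated variants and w·x for GD; the paper's analysis relates the choice of eta in each rule to the steady-state MSE.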
Keywords :
convergence of numerical methods; echo suppression; gradient methods; learning systems; mean square error methods; signal processing; acoustic echo cancellation; algorithms performance; computational learning theory; exponentiated gradient algorithm; exponentiated gradient algorithms convergence; gradient descent algorithm; learning rate; mean square error; mistake-bound; negative weight; prediction MSE; reparametrization; signal processing analysis; sparse target weight vector; steady-state MSE; unnormalized negative weights; unnormalized positive weights; Acoustic applications; Algorithm design and analysis; Convergence; Echo cancellers; Least squares approximation; Mean square error methods; Performance analysis; Signal analysis; Signal processing algorithms; Steady-state;
Journal_Title :
Signal Processing, IEEE Transactions on