DocumentCode :
155631
Title :
A delayed proximal gradient method with linear convergence rate
Author :
Feyzmahdavian, Hamid Reza ; Aytekin, Arda ; Johansson, Mikael
Author_Institution :
ACCESS Linnaeus Centre, KTH Royal Institute of Technology, Stockholm, Sweden
fYear :
2014
fDate :
21-24 Sept. 2014
Firstpage :
1
Lastpage :
6
Abstract :
This paper presents a new incremental gradient algorithm, based on delayed partial gradients, for minimizing the average of a large number of smooth component functions. Even with a constant step size, which can be chosen independently of the maximum delay bound and the number of objective function components, the expected objective value is guaranteed to converge linearly to a ball around the optimum. We derive an explicit expression that quantifies how the convergence rate depends on properties of the objective function and on algorithm parameters such as the step size and the maximum delay. An associated upper bound on the asymptotic error reveals the trade-off between convergence speed and residual error. Numerical examples confirm the validity of our results.
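To make the delayed-gradient idea concrete, the following is a minimal Python sketch of an incremental gradient update that evaluates each partial gradient at a stale iterate. The quadratic components, the cyclic component selection, the random bounded delay model, and the step size below are illustrative assumptions, not the paper's exact algorithm or constants.

```python
# Serial simulation of an incremental gradient method with delayed
# (stale) partial gradients, in the spirit of the abstract above.
# All modeling choices here (quadratic f_i, cyclic selection, random
# bounded delay, gamma = 0.02) are assumptions for illustration only.
import numpy as np

rng = np.random.default_rng(0)
n, d = 20, 5                      # number of components, dimension
A = rng.standard_normal((n, d))
b = rng.standard_normal(n)

def grad_i(i, x):
    """Gradient of the i-th smooth component f_i(x) = 0.5*(a_i^T x - b_i)^2."""
    a = A[i]
    return (a @ x - b[i]) * a

x_star = np.linalg.lstsq(A, b, rcond=None)[0]   # minimizer of the average

max_delay = 5                     # assumed bound on gradient staleness
gamma = 0.02                      # constant step size (illustrative choice)
history = []                      # past iterates, used to draw stale copies
x = np.zeros(d)

for k in range(5000):
    history.append(x.copy())
    i = k % n                                     # cyclic component selection
    tau = rng.integers(0, min(max_delay, k) + 1)  # random delay in [0, max_delay]
    g = grad_i(i, history[k - tau])               # partial gradient at a stale iterate
    x = x - gamma * g                             # constant-step gradient update

print("distance to optimum:", np.linalg.norm(x - x_star))
```

Consistent with the abstract's trade-off between convergence speed and residual error, a run of this sketch drives the iterates into a neighborhood of x_star rather than to the exact optimum: a larger constant step size speeds up the initial progress but enlarges the residual ball.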
Keywords :
convergence; gradient methods; learning (artificial intelligence); algorithm parameter; asymptotic error; constant step size; convergence speed; delayed partial gradient; delayed proximal gradient method; incremental gradient algorithm; linear convergence rate; objective function component; objective function property; residual error; smooth component function; Convergence; Delays; Linear programming; Optimization; Radio frequency; Upper bound; Vectors; Incremental gradient; asynchronous parallelism; machine learning;
fLanguage :
English
Publisher :
IEEE
Conference_Title :
2014 IEEE International Workshop on Machine Learning for Signal Processing (MLSP)
Conference_Location :
Reims, France
Type :
conf
DOI :
10.1109/MLSP.2014.6958872
Filename :
6958872