DocumentCode :
2886511
Title :
Ergodic mirror descent
Author :
Duchi, John C. ; Agarwal, Alekh ; Johansson, Mikael ; Jordan, Michael I.
Author_Institution :
Dept. of Electr. Eng. & Comput. Sci., Univ. of California, Berkeley, CA, USA
fYear :
2011
fDate :
28-30 Sept. 2011
Firstpage :
701
Lastpage :
706
Abstract :
We generalize stochastic subgradient methods to situations in which we do not receive independent samples from the distribution over which we optimize, but instead receive samples that are coupled over time. We show that as long as the source of randomness is suitably ergodic, meaning that it converges quickly enough to a stationary distribution, the method enjoys strong convergence guarantees, both in expectation and with high probability. This result has implications for high-dimensional stochastic optimization, peer-to-peer distributed optimization schemes, and stochastic optimization problems over combinatorial spaces.
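The abstract describes mirror descent driven by non-i.i.d. samples from a rapidly mixing (ergodic) Markov process. The sketch below is a hypothetical illustration, not the authors' implementation: it assumes the Euclidean mirror map (so each update reduces to a projected stochastic subgradient step), a made-up two-state Markov chain over per-state least-squares losses, a 1/sqrt(t) step size, and iterate averaging.

# Illustrative sketch only; the chain, losses, and constants are assumptions,
# not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-state Markov chain over sample indices; it mixes quickly,
# so its stationary distribution (0.5, 0.5) is reached fast, which is the
# ergodicity assumption the analysis relies on.
P = np.array([[0.9, 0.1],
              [0.1, 0.9]])

# Per-state least-squares loss F(x; s) = 0.5 * (A[s] @ x - b[s])**2.
A = np.array([[1.0, 2.0],
              [2.0, -1.0]])
b = np.array([1.0, -1.0])

def subgradient(x, s):
    # Gradient of the smooth per-state loss; a subgradient in general.
    return (A[s] @ x - b[s]) * A[s]

def project_ball(x, radius=10.0):
    # Euclidean projection onto a norm ball; this is the mirror step for
    # the squared-Euclidean mirror map.
    norm = np.linalg.norm(x)
    return x if norm <= radius else x * (radius / norm)

def ergodic_mirror_descent(num_steps=20000, step_scale=0.5):
    x = np.zeros(2)
    x_avg = np.zeros(2)
    state = 0
    for t in range(1, num_steps + 1):
        g = subgradient(x, state)          # subgradient at the current chain state (not i.i.d.)
        eta = step_scale / np.sqrt(t)      # diminishing step size
        x = project_ball(x - eta * g)      # mirror-descent update (Euclidean case)
        x_avg += (x - x_avg) / t           # running average of iterates
        state = rng.choice(2, p=P[state])  # advance the Markov chain
    return x_avg

if __name__ == "__main__":
    x_hat = ergodic_mirror_descent()
    # Minimizer of the stationary-average objective 0.5 * E_pi[(A[s] @ x - b[s])**2].
    x_star = np.linalg.solve(A.T @ A, A.T @ b)
    print("averaged iterate:", x_hat)
    print("stationary optimum:", x_star)

In this toy setup the averaged iterate approaches the minimizer of the stationary-average objective; how fast the chain mixes governs how strong the guarantee is, which is the dependence the paper quantifies.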
Keywords :
peer-to-peer computing; statistical analysis; stochastic processes; combinatorial spaces; convergence; ergodic mirror descent; high dimensional stochastic optimization; peer to peer distributed optimization; stationary distribution; stochastic optimization problems; stochastic subgradient methods; Convergence; Convex functions; Digital TV; Markov processes; Mirrors; Optimization;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Communication, Control, and Computing (Allerton), 2011 49th Annual Allerton Conference on
Conference_Location :
Monticello, IL
Print_ISBN :
978-1-4577-1817-5
Type :
conf
DOI :
10.1109/Allerton.2011.6120236
Filename :
6120236