  • DocumentCode
    655181
  • Title
    Efficient Accelerated Coordinate Descent Methods and Faster Algorithms for Solving Linear Systems
  • Author
    Lee, Yin Tat ; Sidford, Aaron
  • Author_Institution
    Math. Dept., Massachusetts Inst. of Technol., Cambridge, MA, USA
  • fYear
    2013
  • fDate
    26-29 Oct. 2013
  • Firstpage
    147
  • Lastpage
    156
  • Abstract
    In this paper we show how to accelerate randomized coordinate descent methods and achieve faster convergence rates without paying per-iteration costs in asymptotic running time. In particular, we show how to generalize and efficiently implement a method proposed by Nesterov, giving faster asymptotic running times for various algorithms that use standard coordinate descent as a black box. In addition to providing a proof of convergence for this new general method, we show that it is numerically stable, efficiently implementable, and, in certain regimes, asymptotically optimal. To highlight the power of this algorithm, we show how it can be used to create faster linear system solvers in several regimes:
    - We show how this method achieves a faster asymptotic runtime than conjugate gradient for solving a broad class of symmetric positive definite systems of equations.
    - We improve the convergence guarantees for Kaczmarz methods, a popular technique for image reconstruction and for solving overdetermined systems of equations, by accelerating an algorithm of Strohmer and Vershynin.
    - We achieve the best known running time for solving Symmetric Diagonally Dominant (SDD) systems of equations in the unit-cost RAM model, obtaining a running time of O(m log^{3/2} n (log log n)^{1/2} log((log n)/eps)) by accelerating a recent solver by Kelner et al.
    Beyond the independent interest of these solvers, we believe they highlight the versatility of the approach of this paper, and we hope that they will open the door for further algorithmic improvements in the future.
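    As a concrete point of reference for the Kaczmarz result mentioned in the abstract, the following is a minimal sketch of the unaccelerated Strohmer-Vershynin randomized Kaczmarz iteration that the paper accelerates; it is not code from the paper, and the function name, parameters, and test problem are illustrative assumptions.

```python
import numpy as np

def randomized_kaczmarz(A, b, iters=5000, seed=0):
    """Illustrative sketch of the Strohmer-Vershynin randomized Kaczmarz
    method for a consistent linear system A x = b (the unaccelerated
    baseline, not the accelerated variant developed in the paper)."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    row_norms_sq = np.einsum("ij,ij->i", A, A)    # ||a_i||^2 for every row
    probs = row_norms_sq / row_norms_sq.sum()     # sample row i with prob. ||a_i||^2 / ||A||_F^2
    x = np.zeros(n)
    for _ in range(iters):
        i = rng.choice(m, p=probs)
        residual = b[i] - A[i] @ x
        x += (residual / row_norms_sq[i]) * A[i]  # project onto the hyperplane a_i^T x = b_i
    return x

# Hypothetical usage on a random overdetermined consistent system.
rng = np.random.default_rng(1)
A = rng.standard_normal((200, 20))
x_true = rng.standard_normal(20)
b = A @ x_true
print(np.linalg.norm(randomized_kaczmarz(A, b) - x_true))
```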
  • Keywords
    computational complexity; gradient methods; linear systems; numerical stability; randomised algorithms; Kaczmarz methods; SDD equations system; accelerated coordinate descent methods; asymptotic running time; asymptotically optimal; black box; conjugate gradient; convergence rates; image reconstruction; linear system solvers; numerical stability; overdetermined equations systems; randomized coordinate descent methods; standard coordinate descent; symmetric diagonally dominant equations system; symmetric positive definite equations systems; unit-cost RAM model; Acceleration; Algorithm design and analysis; Convergence; Iterative methods; Linear systems; Probabilistic logic; Standards; Kaczmarz method; convex optimization; coordinate descent; symmetric diagonally dominant matrix;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Foundations of Computer Science (FOCS), 2013 IEEE 54th Annual Symposium on
  • Conference_Location
    Berkeley, CA
  • ISSN
    0272-5428
  • Type
    conf
  • DOI
    10.1109/FOCS.2013.24
  • Filename
    6686150