  • DocumentCode
    53408
  • Title
    Fast Distributed Gradient Methods
  • Author
    Jakovetic, Dusan; Xavier, Joao; Moura, Jose M. F.

  • Author_Institution
    Inst. for Syst. & Robot., Univ. of Lisbon, Lisbon, Portugal
  • Volume
    59
  • Issue
    5
  • fYear
    2014
  • fDate
    May 2014
  • Firstpage
    1131
  • Lastpage
    1146
  • Abstract
    We study distributed optimization problems in which N nodes minimize the sum of their individual costs subject to a common vector variable. The costs are convex, have Lipschitz continuous gradients (with constant L), and have bounded gradients. We propose two fast distributed gradient algorithms based on the centralized Nesterov gradient algorithm and establish their convergence rates in terms of the per-node communications K and the per-node gradient evaluations k. Our first method, Distributed Nesterov Gradient, achieves rates O(log K / K) and O(log k / k). Our second method, Distributed Nesterov Gradient with Consensus iterations, assumes that all nodes know L and μ(W), the second largest singular value of the N × N doubly stochastic weight matrix W. It achieves rates O(1/K^(2-ξ)) and O(1/k^2), with ξ > 0 arbitrarily small. Further, for both methods we give the explicit dependence of the convergence constants on N and W. Simulation examples illustrate our findings.
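    To make the flavor of the methods concrete, below is a minimal sketch of a distributed Nesterov-type gradient iteration in Python. The quadratic local costs, the ring-based doubly stochastic weight matrix W, and the diminishing step size c/k with c = 1 are assumptions for illustration only; the paper's two methods and their exact update rules and constants are defined in the article itself.

    import numpy as np

    N, d, iters = 10, 3, 200
    rng = np.random.default_rng(0)
    A = [rng.standard_normal((5, d)) for _ in range(N)]   # assumed local data for node i
    b = [rng.standard_normal(5) for _ in range(N)]

    def grad(i, x):
        # Gradient of the assumed local cost f_i(x) = 0.5 * ||A_i x - b_i||^2
        return A[i].T @ (A[i] @ x - b[i])

    # Assumed doubly stochastic weight matrix W on a ring graph
    W = np.zeros((N, N))
    for i in range(N):
        W[i, i] = 0.5
        W[i, (i - 1) % N] = 0.25
        W[i, (i + 1) % N] = 0.25

    x = np.zeros((N, d))   # current iterates x_i(k), one row per node
    y = np.zeros((N, d))   # extrapolated iterates y_i(k)

    for k in range(1, iters + 1):
        alpha = 1.0 / k                     # diminishing step size c/k (c = 1, assumed)
        mixed = W @ y                       # one consensus round with neighbors
        g = np.stack([grad(i, y[i]) for i in range(N)])
        x_new = mixed - alpha * g           # local gradient step on the mixed iterate
        y = x_new + ((k - 1) / (k + 2)) * (x_new - x)   # Nesterov-style momentum
        x = x_new

    print("max disagreement across nodes:", np.max(np.abs(x - x.mean(axis=0))))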
  • Keywords
    computational complexity; convergence; distributed algorithms; gradient methods; matrix algebra; optimisation; singular value decomposition; stochastic processes; Lipschitz continuous gradient; bounded gradient; centralized Nesterov gradient algorithm; common vector variable; consensus iterations; convergence constants; convergence rates; convex costs; distributed Nesterov gradient; distributed gradient algorithms; distributed optimization problems; doubly stochastic weight matrix; fast distributed gradient methods; per-node communications; per-node gradient evaluations; singular value; Acceleration; Algorithm design and analysis; Convergence; Educational institutions; Gradient methods; Vectors; Consensus; Nesterov gradient; convergence rate; distributed optimization;
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Automatic Control
  • Publisher
    IEEE
  • ISSN
    0018-9286
  • Type
    jour
  • DOI
    10.1109/TAC.2014.2298712
  • Filename
    6705625