  • DocumentCode
    253173
  • Title
    Distributed stochastic optimization and learning
  • Author
    Shamir, Ohad; Srebro, Nathan

  • Author_Institution
    Dept. of Comput. Sci. & Appl. Math., Weizmann Inst. of Sci., Rehovot, Israel
  • fYear
    2014
  • fDate
    Sept. 30 - Oct. 3, 2014
  • Firstpage
    850
  • Lastpage
    857
  • Abstract
    We consider the problem of distributed stochastic optimization, where each of several machines has access to samples from the same source distribution, and the goal is to jointly optimize the expected objective w.r.t. the source distribution while minimizing: (1) the overall runtime; (2) the communication costs; and (3) the number of samples used. We study this problem systematically, highlighting fundamental limitations and the differences from distributed consensus problems, where each machine has a different, independent objective. We show how the best known guarantees are obtained by an accelerated mini-batched SGD approach, and contrast the runtime and sample costs of this approach with those of other distributed optimization algorithms.
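    Illustrative sketch (not from the paper): the abstract refers to a mini-batched SGD scheme in which, at each round, every machine computes a stochastic gradient on a fresh mini-batch drawn from the shared source distribution, the gradients are averaged in one communication round, and a single synchronized step is taken. The toy Python code below simulates that round structure for a least-squares objective; all names and constants (m, batch_size, lr, sample_batch) are illustrative assumptions, and it omits the acceleration on which the paper's guarantees rely.

    import numpy as np

    rng = np.random.default_rng(0)
    d, m, batch_size, rounds, lr = 20, 4, 64, 200, 0.1   # illustrative constants

    w_true = rng.normal(size=d)   # "source distribution": y = <w_true, x> + noise
    w = np.zeros(d)               # shared iterate, synchronized every round

    def sample_batch(n):
        # Every machine can draw i.i.d. samples from the same source distribution.
        X = rng.normal(size=(n, d))
        y = X @ w_true + 0.1 * rng.normal(size=n)
        return X, y

    for t in range(rounds):
        # Each of the m machines computes a local mini-batch gradient of the
        # squared loss at the current shared iterate.
        local_grads = []
        for _ in range(m):
            X, y = sample_batch(batch_size)
            local_grads.append(X.T @ (X @ w - y) / batch_size)
        # One communication round: average the m gradients and take a single
        # step, i.e. an SGD step with effective mini-batch size m * batch_size.
        w -= lr * np.mean(local_grads, axis=0)

    print("distance to w_true:", np.linalg.norm(w - w_true))

    The per-round averaging is what trades communication for sample size: each of the m machines contributes batch_size samples per round, but only one vector exchange per round is needed.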
  • Keywords
    communication complexity; distributed algorithms; gradient methods; learning (artificial intelligence); optimisation; accelerated mini-batched SGD approach; communication cost minimization; distributed consensus problems; distributed stochastic optimization algorithms; learning; overall runtime minimization; source distribution; stochastic gradient descent; Approximation methods; Complexity theory; Optimization; Presses; Runtime; Stochastic processes; Vectors;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    2014 52nd Annual Allerton Conference on Communication, Control, and Computing (Allerton)
  • Conference_Location
    Monticello, IL
  • Type
    conf
  • DOI
    10.1109/ALLERTON.2014.7028543
  • Filename
    7028543