• DocumentCode
    730589
  • Title
    Rapid: Rapidly accelerated proximal gradient algorithms for convex minimization
  • Author
    Zhang, Ziming; Saligrama, Venkatesh
  • Author_Institution
    ECE, Boston Univ., Boston, MA, USA
  • fYear
    2015
  • fDate
    19-24 April 2015
  • Firstpage
    3796
  • Lastpage
    3800
  • Abstract
    In this paper, we propose a new algorithm to speed up the convergence of accelerated proximal gradient (APG) methods. To minimize a convex function f(x), our algorithm introduces a simple line search step after each proximal gradient step in APG, so that the biconvex function f(θx) is minimized over the scalar variable θ > 0 while the variable x is held fixed. We propose two new ways of constructing the auxiliary variables in APG based on the intermediate solutions of the proximal gradient and line search steps. We prove that at any iteration step t (t ≥ 1), our algorithm achieves a smaller upper bound on the gap between the current and optimal objective values than traditional APG methods such as FISTA [1], making it converge faster in practice. We apply our algorithm to important convex optimization problems such as sparse linear regression. Our experimental results demonstrate that our algorithm converges faster than APG and is comparable even to some sophisticated solvers.
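  • Illustrative_Sketch
    The line-search idea described in the abstract can be illustrated on sparse linear regression (the lasso). The following is a minimal Python sketch, not the authors' RAPID implementation: the synthetic data (A, b, lam), the helper names (prox_grad_step, scalar_line_search), and the plain FISTA-style momentum update are all illustrative assumptions, and the paper's two new auxiliary-variable constructions are not reproduced here.

    import numpy as np

    # Illustrative lasso instance (all data here is synthetic/assumed).
    rng = np.random.default_rng(0)
    A = rng.standard_normal((200, 500))
    x_true = np.zeros(500)
    x_true[:10] = rng.standard_normal(10)
    b = A @ x_true + 0.01 * rng.standard_normal(200)
    lam = 0.1

    def f(x):
        # Convex objective f(x): smooth least-squares term plus l1 penalty.
        return 0.5 * np.sum((A @ x - b) ** 2) + lam * np.sum(np.abs(x))

    def prox_grad_step(y, L):
        # One proximal gradient step with step size 1/L: gradient of the smooth
        # part, then soft-thresholding (the prox of the l1 term).
        z = y - (A.T @ (A @ y - b)) / L
        return np.sign(z) * np.maximum(np.abs(z) - lam / L, 0.0)

    def scalar_line_search(x):
        # Line search over theta > 0 for f(theta * x) with x fixed; for the
        # lasso this 1-D problem has a closed-form minimizer, clipped positive.
        Ax = A @ x
        denom = Ax @ Ax
        if denom == 0.0:
            return x
        theta = max((b @ Ax - lam * np.sum(np.abs(x))) / denom, 1e-12)
        x_scaled = theta * x
        return x_scaled if f(x_scaled) < f(x) else x

    # APG (FISTA-style) loop with the extra scalar line search after each
    # proximal gradient step.
    L = np.linalg.norm(A, 2) ** 2    # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    y = x.copy()
    t = 1.0
    for _ in range(200):
        x_new = scalar_line_search(prox_grad_step(y, L))
        t_new = (1.0 + np.sqrt(1.0 + 4.0 * t * t)) / 2.0
        y = x_new + ((t - 1.0) / t_new) * (x_new - x)   # auxiliary (momentum) variable
        x, t = x_new, t_new

    print("final objective:", f(x))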
  • Keywords
    compressed sensing; gradient methods; regression analysis; accelerated proximal gradient methods; biconvex function; convex minimization; sparse linear regression; Acceleration; Convergence; Convex functions; Gradient methods; Minimization; Signal processing algorithms; Upper bound;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    2015 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
  • Conference_Location
    South Brisbane, QLD, Australia
  • Type
    conf
  • DOI
    10.1109/ICASSP.2015.7178681
  • Filename
    7178681