• DocumentCode
    3170237
  • Title
    A Globally Convergent Conjugate Gradient Method for Minimizing Self-Concordant Functions with Application to Constrained Optimisation Problems
  • Author
    Ji, Huibo; Huang, Minyi; Moore, John B.; Manton, Jonathan H.
  • Author_Institution
    Australian Nat. Univ., Canberra
  • fYear
    2007
  • fDate
    9-13 July 2007
  • Firstpage
    540
  • Lastpage
    545
  • Abstract
    Self-concordant functions are a special class of convex functions introduced by Nesterov and Nemirovskii and used in interior-point methods. This paper proposes a damped conjugate gradient method for minimizing self-concordant functions. The method is an ordinary conjugate gradient method equipped with a novel step-size selection rule that is proved to guarantee convergence to the global minimum. As an example, the algorithm is applied to a quadratically constrained quadratic optimization problem.
  • Keywords
    convex programming; gradient methods; constrained optimisation problems; convex functions; globally convergent conjugate gradient method; interior point methods; quadratically constrained quadratic optimization problem; self-concordant functions; step-size selection rule; Australia; Cities and towns; Constraint optimization; Convergence; Cost function; Functional programming; Gradient methods; Newton method; Optimization methods; Polynomials
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    American Control Conference, 2007. ACC '07
  • Conference_Location
    New York, NY
  • ISSN
    0743-1619
  • Print_ISBN
    1-4244-0988-8
  • Electronic_ISBN
    0743-1619
  • Type
    conf
  • DOI
    10.1109/ACC.2007.4282797
  • Filename
    4282797
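The abstract's central idea, a conjugate gradient iteration damped by a step-size rule tied to the function's self-concordance, can be sketched as follows. This is an illustrative reconstruction, not the paper's actual rule: it uses the classical damping factor 1/(1 + ||d||_x) from damped-Newton theory, where ||d||_x = sqrt(d^T H(x) d) is the local Hessian norm, applied to Fletcher-Reeves directions on a standard self-concordant test function f(x) = sum(c_i x_i - log x_i). The backtracking safeguard and descent-direction reset are additions for robustness, not part of the paper's method.

```python
import math

def solve(c, x0, iters=2000):
    """Minimize f(x) = sum(c_i*x_i - log(x_i)) over x > 0 (a standard
    self-concordant test function, minimized at x_i = 1/c_i) with a
    Fletcher-Reeves conjugate gradient method and a damped step size
    t = 1 / (1 + ||d||_x).  The damping keeps every iterate strictly
    inside the domain, since t*||d||_x < 1 implies x_i + t*d_i > 0."""
    x = list(x0)
    f = lambda x: sum(ci * xi - math.log(xi) for ci, xi in zip(c, x))
    grad = lambda x: [ci - 1.0 / xi for ci, xi in zip(c, x)]
    g = grad(x)
    d = [-gi for gi in g]
    gg = sum(gi * gi for gi in g)
    for _ in range(iters):
        # Safeguard: reset to steepest descent if d is not a descent direction.
        if sum(di * gi for di, gi in zip(d, g)) >= 0:
            d = [-gi for gi in g]
        # Local Hessian norm ||d||_x with H(x) = diag(1/x_i^2).
        lam = math.sqrt(sum((di / xi) ** 2 for di, xi in zip(d, x)))
        t = 1.0 / (1.0 + lam)  # damped step size
        # Backtracking safeguard (our addition, not the paper's rule):
        # shrink t until f decreases; terminates because d is a descent direction.
        fx = f(x)
        while f([xi + t * di for xi, di in zip(x, d)]) > fx:
            t *= 0.5
        x = [xi + t * di for xi, di in zip(x, d)]
        g_new = grad(x)
        gg_new = sum(gi * gi for gi in g_new)
        if gg_new < 1e-16:  # gradient essentially zero: stop
            break
        beta = gg_new / gg  # Fletcher-Reeves coefficient
        d = [-gi + beta * di for gi, di in zip(g_new, d)]
        g, gg = g_new, gg_new
    return x
```

For c = [2, 0.5] the minimizer is x = [0.5, 2]; starting from [1, 1] the iteration stays feasible and converges there, which illustrates why the damped step size can replace a line search while preserving global convergence on this class of functions.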