• DocumentCode
    719307
  • Title
    Some results concerning rank-one truncated steepest descent directions in tensor spaces
  • Author
    Uschmajew, Andre
  • Author_Institution
    Hausdorff Center for Math. & Inst. for Numerical Simulation, Univ. of Bonn, Bonn, Germany
  • fYear
    2015
  • fDate
    25-29 May 2015
  • Firstpage
    415
  • Lastpage
    419
  • Abstract
    The idea of finding low-rank solutions to matrix or tensor optimization tasks by greedy rank-one methods has appeared repeatedly in the literature. The simplest method, and often a central building block in accelerated methods, consists of performing updates along low-rank approximations of the negative gradient. This is convenient, as it increases the rank in a prescribed manner per step, and because it allows for a surprisingly simple convergence analysis. The main point is that in a tensor product space of finite dimension, the best rank-one approximation of a tensor has a guaranteed minimal overlap with the tensor itself. Thus rank-one approximations of negative gradients provide descent directions. This key concept can also be used in Hilbert space, if the rank growth of the approximation sequence can be balanced with the convergence speed. This work presents a conceptual review of this approach and provides some new insights.
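  • Note
    The abstract outlines the rank-one truncated steepest descent idea: at each step, replace the negative gradient by its best rank-one approximation and update along that direction, so the rank of the iterate grows by at most one per step. The following minimal Python sketch is not from the paper; the matrix least-squares setting, the function name rank_one_truncated_sd, and the exact line search are illustrative assumptions.

        import numpy as np

        def rank_one_truncated_sd(A, B, steps=50):
            """Greedy rank-one gradient sketch for min_X 0.5*||A X - B||_F^2.

            Illustrative only: each step adds the best rank-one approximation
            of the negative gradient (via SVD), scaled by an exact line search.
            """
            X = np.zeros((A.shape[1], B.shape[1]))
            for _ in range(steps):
                R = A @ X - B                      # residual
                G = A.T @ R                        # gradient at X
                U, s, Vt = np.linalg.svd(-G)       # best rank-one part of -G
                D = s[0] * np.outer(U[:, 0], Vt[0, :])
                AD = A @ D
                denom = np.sum(AD * AD)
                if denom == 0:                     # gradient numerically zero
                    break
                alpha = -np.sum(AD * R) / denom    # exact line search
                X = X + alpha * D                  # rank grows by at most one
            return X

        # Tiny usage example with random data (illustrative)
        A = np.random.rand(20, 8)
        B = np.random.rand(20, 5)
        X = rank_one_truncated_sd(A, B)
        print(np.linalg.norm(A @ X - B))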
  • Keywords
    Hilbert spaces; convergence; matrix algebra; tensors; Hilbert space; convergence analysis; greedy rank-one methods; matrix; rank-one truncated steepest descent directions; tensor optimization; Convergence; Least squares approximations; Linear systems; Optimization; Tensile stress
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    2015 International Conference on Sampling Theory and Applications (SampTA)
  • Conference_Location
    Washington, DC
  • Type
    conf
  • DOI
    10.1109/SAMPTA.2015.7148924
  • Filename
    7148924