• DocumentCode
    67534
  • Title
    Bilinear Generalized Approximate Message Passing—Part I: Derivation
  • Author
    Parker, Jason T.; Schniter, Philip; Cevher, Volkan
  • Author_Institution
    Sensors Directorate, Air Force Res. Lab., Wright-Patterson AFB, OH, USA
  • Volume
    62
  • Issue
    22
  • fYear
    2014
  • fDate
    Nov. 15, 2014
  • Firstpage
    5839
  • Lastpage
    5853
  • Abstract
    In this paper, we extend the generalized approximate message passing (G-AMP) approach, originally proposed for high-dimensional generalized-linear regression in the context of compressive sensing, to the generalized-bilinear case, which enables its application to matrix completion, robust PCA, dictionary learning, and related matrix-factorization problems. Here, in Part I of a two-part paper, we derive our Bilinear G-AMP (BiG-AMP) algorithm as an approximation of the sum-product belief propagation algorithm in the high-dimensional limit, where central-limit theorem arguments and Taylor-series approximations apply, and under the assumption of statistically independent matrix entries with known priors. In addition, we propose an adaptive damping mechanism that aids convergence under finite problem sizes, an expectation-maximization (EM)-based method to automatically tune the parameters of the assumed priors, and two rank-selection strategies. In Part II of the paper, we will discuss the specializations of EM-BiG-AMP to the problems of matrix completion, robust PCA, and dictionary learning, and we will present the results of an extensive empirical study comparing EM-BiG-AMP to state-of-the-art algorithms on each problem.
  • Keywords
    approximation theory; compressed sensing; expectation-maximisation algorithm; matrix decomposition; message passing; principal component analysis; regression analysis; EM-based method; Taylor-series approximations; adaptive damping mechanism; BiG-AMP algorithm; bilinear generalized approximate message passing approach; central-limit theorem arguments; compressive sensing; dictionary learning; expectation-maximization-based method; finite problem sizes; high-dimensional generalized-linear regression; high-dimensional limit; matrix completion; rank-selection strategy; related matrix-factorization problems; robust PCA; statistically independent matrix entries; sum-product belief propagation algorithm; Context; Dictionaries; Principal component analysis; Random variables; Robustness; Signal processing algorithms; Approximate message passing; belief propagation; bilinear estimation; dictionary learning; matrix completion; matrix factorization; robust principal components analysis
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Signal Processing
  • Publisher
    IEEE
  • ISSN
    1053-587X
  • Type
    jour
  • DOI
    10.1109/TSP.2014.2357776
  • Filename
    6898015