• DocumentCode
    573308
  • Title
    Expectation-maximization Gaussian-mixture approximate message passing
  • Author
    Vila, Jeremy; Schniter, Philip
  • Author_Institution
    Dept. of ECE, Ohio State Univ., Columbus, OH, USA
  • fYear
    2012
  • fDate
    21-23 March 2012
  • Firstpage
    1
  • Lastpage
    6
  • Abstract
    When recovering a sparse signal from noisy compressive linear measurements, the distribution of the signal's non-zero coefficients can have a profound effect on recovery mean-squared error (MSE). If this distribution were known a priori, one could use efficient approximate message passing (AMP) techniques for nearly minimum-MSE (MMSE) recovery. In practice, though, the distribution is unknown, motivating the use of robust algorithms like Lasso, which is nearly minimax optimal, at the cost of significantly larger MSE for non-least-favorable distributions. As an alternative, we propose an empirical-Bayesian technique that simultaneously learns the signal distribution and MMSE-recovers the signal, according to the learned distribution, using AMP. In particular, we model the non-zero distribution as a Gaussian mixture and learn its parameters through expectation maximization, using AMP to implement the expectation step. Numerical experiments confirm the state-of-the-art performance of our approach on a range of signal classes.
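    The method description above suggests a simple structure: an AMP loop whose nonlinear step is the scalar MMSE denoiser under a Bernoulli-Gaussian-mixture prior, with the posterior statistics from that step feeding EM updates of the mixture parameters. The sketch below illustrates that idea only and is not the authors' implementation; the initializations, the residual-energy estimate of the effective noise variance tau, and the fixed iteration count are assumptions made for the example, and the paper's full algorithm also learns quantities such as the measurement-noise variance, which this sketch omits.

```python
import numpy as np

def gm_denoise(r, tau, lam, omega, theta, phi):
    """Posterior mean/variance of x given r = x + N(0, tau) under the prior
    p(x) = (1-lam)*delta(x) + lam * sum_k omega[k]*N(x; theta[k], phi[k])."""
    r = r[:, None]                                   # (N,1) against (K,) components
    nu = 1.0 / (1.0 / tau + 1.0 / phi)               # per-component posterior variance
    gamma = nu * (r / tau + theta / phi)             # per-component posterior mean
    # log evidence of each active component and of the zero spike
    log_act = (np.log(lam * omega)
               - 0.5 * np.log(2 * np.pi * (phi + tau))
               - 0.5 * (r - theta) ** 2 / (phi + tau))
    log_zero = (np.log(1 - lam)
                - 0.5 * np.log(2 * np.pi * tau)
                - 0.5 * r ** 2 / tau)
    m = np.maximum(log_zero, log_act.max(axis=1, keepdims=True))
    w_act, w_zero = np.exp(log_act - m), np.exp(log_zero - m)
    beta = w_act / (w_zero + w_act.sum(axis=1, keepdims=True))  # active responsibilities
    xhat = (beta * gamma).sum(axis=1)
    xvar = (beta * (nu + gamma ** 2)).sum(axis=1) - xhat ** 2
    return xhat, xvar, beta, gamma, nu

def em_gm_amp(y, A, K=3, n_iter=50):
    """Illustrative EM-GM-AMP-style loop for y = A x + noise (names are assumptions)."""
    M, N = A.shape
    lam, omega = 0.1, np.ones(K) / K                 # sparsity rate and mixture weights
    theta, phi = np.linspace(-1, 1, K), np.ones(K)   # mixture means and variances
    xhat, z = np.zeros(N), y.copy()
    for _ in range(n_iter):
        tau = max(np.sum(z ** 2) / M, 1e-12)         # crude effective-noise-variance estimate
        r = xhat + A.T @ z                           # AMP pseudo-observations
        xhat, xvar, beta, gamma, nu = gm_denoise(r, tau, lam, omega, theta, phi)
        z = y - A @ xhat + (N / M) * z * np.mean(xvar) / tau   # Onsager-corrected residual
        # EM (M-step) updates of the Gaussian-mixture prior parameters
        bk = beta.sum(axis=0) + 1e-12
        lam = np.clip(bk.sum() / N, 1e-6, 1 - 1e-6)
        omega = bk / bk.sum()
        theta = (beta * gamma).sum(axis=0) / bk
        phi = np.maximum((beta * (nu + (gamma - theta) ** 2)).sum(axis=0) / bk, 1e-12)
    return xhat
```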
  • Keywords
    Bayes methods; Gaussian processes; expectation-maximisation algorithm; mean square error methods; message passing; signal processing; AMP; Bayesian technique; MSE; approximate message passing; expectation maximization Gaussian-mixture approximate message passing; mean squared error; minimum MSE; noisy compressive linear measurements; nonzero coefficient signal; signal distribution; sparse signal
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Information Sciences and Systems (CISS), 2012 46th Annual Conference on
  • Conference_Location
    Princeton, NJ
  • Print_ISBN
    978-1-4673-3139-5
  • Electronic_ISBN
    978-1-4673-3138-8
  • Type
    conf
  • DOI
    10.1109/CISS.2012.6310932
  • Filename
    6310932