  • DocumentCode
    79691
  • Title
    A Max-Product EM Algorithm for Reconstructing Markov-Tree Sparse Signals From Compressive Samples
  • Author
    Zhao Song; A. Dogandzic
  • Author_Institution
    Dept. of Electr. & Comput. Eng., Iowa State Univ., Ames, IA, USA
  • Volume
    61
  • Issue
    23
  • fYear
    2013
  • fDate
    Dec. 1, 2013
  • Firstpage
    5917
  • Lastpage
    5931
  • Abstract
    We propose a Bayesian expectation-maximization (EM) algorithm for reconstructing Markov-tree sparse signals via belief propagation. The measurements follow an underdetermined linear model where the regression-coefficient vector is the sum of an unknown approximately sparse signal and zero-mean white Gaussian noise with an unknown variance. The signal is composed of large- and small-magnitude components identified by binary state variables whose probabilistic dependence structure is described by a Markov tree. Gaussian priors are assigned to the signal coefficients given their state variables, and the Jeffreys' noninformative prior is assigned to the noise variance. Our signal reconstruction scheme is based on an EM iteration that aims at maximizing the posterior distribution of the signal and its state variables given the noise variance. We construct the missing data for the EM iteration so that the complete-data posterior distribution corresponds to a hidden Markov tree (HMT) probabilistic graphical model that contains no loops, and implement its maximization (M) step via a max-product algorithm. This EM algorithm estimates the vector of state variables and iteratively solves a linear system of equations to obtain the corresponding signal estimate. We select the noise variance so that the estimated signal and state variables obtained upon convergence of the EM iteration have the largest marginal posterior distribution. We compare the proposed and existing state-of-the-art reconstruction methods via signal and image reconstruction experiments.
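    The two components described in the abstract, a max-product pass over the state variables and a linear-system solve for the signal given those states, can be sketched as follows. This is a minimal illustration only, assuming a binary-state Markov chain (the simplest Markov tree) with hypothetical node/transition potentials and hypothetical prior variances `var_small`/`var_large`; it is not the paper's actual implementation.

    ```python
    import numpy as np

    def max_product_chain(node_pot, trans):
        """MAP binary states on a chain via max-product (Viterbi) in log domain.

        node_pot: (n, 2) per-node potentials; trans: (2, 2) transition potentials.
        A chain is a special case of a Markov tree, so the loop-free message
        passing below terminates in one forward sweep plus a backtrack.
        """
        n = node_pot.shape[0]
        log_msg = np.log(node_pot[0])
        back = np.zeros((n, 2), dtype=int)
        log_trans = np.log(trans)
        for i in range(1, n):
            scores = log_msg[:, None] + log_trans       # (prev_state, cur_state)
            back[i] = scores.argmax(axis=0)             # best predecessor per state
            log_msg = scores.max(axis=0) + np.log(node_pot[i])
        states = np.empty(n, dtype=int)
        states[-1] = int(log_msg.argmax())
        for i in range(n - 1, 0, -1):                   # backtrack to recover MAP path
            states[i - 1] = back[i, states[i]]
        return states

    def map_signal(A, y, states, var_small=0.01, var_large=1.0):
        """MAP signal given states: ridge regression with state-dependent
        Gaussian prior variances (large-state coefficients get the wide prior).
        The update reduces to one linear system, as in the abstract; the
        variance values here are illustrative assumptions.
        """
        prior_var = np.where(states == 1, var_large, var_small)
        return np.linalg.solve(A.T @ A + np.diag(1.0 / prior_var), A.T @ y)
    ```

    Alternating a state update with a signal update, and sweeping the noise variance to keep the solution with the largest marginal posterior, mirrors the overall structure the abstract describes.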
  • Keywords
    Gaussian noise; compressed sensing; expectation-maximisation algorithm; hidden Markov models; signal reconstruction; trees (mathematics); white noise; Bayesian EM algorithm; Bayesian expectation-maximization algorithm; EM iteration; Gaussian priors; HMT probabilistic graphical model; Jeffreys noninformative prior; Markov-tree sparse signal reconstruction; belief propagation; binary state variables; complete-data posterior distribution; compressive samples; hidden Markov tree probabilistic graphical model; image reconstruction; large-magnitude component; marginal posterior distribution; max-product EM algorithm; noise variance; posterior distribution maximization; probabilistic dependence structure; regression-coefficient vector; signal coefficient; small-magnitude component; state variable vector estimation; state variables; zero-mean white Gaussian noise; Decision support systems; Belief propagation; compressed sensing; expectation-maximization algorithms; hidden Markov models; signal reconstruction
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Signal Processing
  • Publisher
    ieee
  • ISSN
    1053-587X
  • Type
    jour
  • DOI
    10.1109/TSP.2013.2277833
  • Filename
    6577604