DocumentCode :
1382223
Title :
Computationally Efficient Sparse Bayesian Learning via Belief Propagation
Author :
Tan, Xing ; Li, Jian
Author_Institution :
Dept. of Electr. & Comput. Eng., Univ. of Florida, Gainesville, FL, USA
Volume :
58
Issue :
4
fYear :
2010
fDate :
April 1, 2010
Firstpage :
2010
Lastpage :
2021
Abstract :
We present a belief propagation (BP)-based sparse Bayesian learning (SBL) algorithm, referred to as BP-SBL, to recover sparse transform coefficients in large-scale compressed sensing problems. BP-SBL is based on a widely used hierarchical Bayesian model, which is converted into a factor graph so that BP can be applied to achieve computational efficiency. We prove that the BP messages are Gaussian probability density functions, and therefore only their means and variances need to be updated. The computational complexity of BP-SBL is proportional to the number of transform coefficients, allowing the algorithm to handle large-scale compressed sensing problems efficiently. Numerical examples are provided to demonstrate the effectiveness of BP-SBL.
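Illustration (not from the paper): the sketch below shows the general idea the abstract describes, namely Gaussian message passing on the factor graph of the linear model y = A x + n with per-coefficient Gaussian priors N(0, 1/alpha_j), where each message is represented only by its mean and variance, combined with an EM-style update of the hyperparameters alpha as in standard SBL. The function name bp_sbl_sketch, the parameters sigma2, n_outer, n_bp, and the message schedule are all assumptions made for this sketch; they are not the authors' exact BP-SBL algorithm.

import numpy as np

def bp_sbl_sketch(A, y, sigma2=1e-2, n_outer=20, n_bp=10):
    """Loopy Gaussian BP on the factor graph of y = A x + n, with an
    EM-style SBL update of the prior precisions alpha.
    Illustrative sketch only; parameter names and schedule are assumptions."""
    m, n = A.shape
    alpha = np.ones(n)             # prior precisions of the coefficients
    prec_fv = np.zeros((m, n))     # factor->variable message precisions
    mean_fv = np.zeros((m, n))     # factor->variable message means
    A2 = A ** 2
    nz = np.abs(A) > 1e-12
    safeA = np.where(nz, A, 1.0)
    for _ in range(n_outer):
        for _ in range(n_bp):
            # Variable->factor messages: combine the Gaussian prior with all
            # incoming factor messages except the one being replied to.
            prec_tot = alpha + prec_fv.sum(axis=0)
            mean_num = (prec_fv * mean_fv).sum(axis=0)
            prec_vf = np.maximum(prec_tot[None, :] - prec_fv, 1e-12)
            mean_vf = (mean_num[None, :] - prec_fv * mean_fv) / prec_vf
            var_vf = 1.0 / prec_vf
            # Factor->variable messages: measurement y_i = sum_j A_ij x_j + n_i
            # sends a Gaussian whose mean/variance exclude the receiving coefficient.
            row_dot = (A * mean_vf).sum(axis=1)
            resid = y[:, None] - row_dot[:, None] + A * mean_vf
            row_var = (A2 * var_vf).sum(axis=1)
            var_sum = sigma2 + row_var[:, None] - A2 * var_vf
            prec_fv = A2 / var_sum
            mean_fv = np.where(nz, resid / safeA, 0.0)
        # Approximate posterior of each coefficient from all incoming messages.
        post_prec = alpha + prec_fv.sum(axis=0)
        post_var = 1.0 / post_prec
        post_mean = post_var * (prec_fv * mean_fv).sum(axis=0)
        # EM-style hyperparameter update: large alpha_j prunes coefficient j.
        alpha = 1.0 / (post_mean ** 2 + post_var)
    return post_mean

# Toy usage: recover a sparse vector from random Gaussian measurements.
rng = np.random.default_rng(0)
n, m, k = 200, 80, 10
x_true = np.zeros(n)
x_true[rng.choice(n, k, replace=False)] = rng.standard_normal(k)
A = rng.standard_normal((m, n)) / np.sqrt(m)
y = A @ x_true + 0.01 * rng.standard_normal(m)
x_hat = bp_sbl_sketch(A, y, sigma2=1e-4)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))

Because every message here is a Gaussian, each update touches only a mean and a variance, which is the source of the per-coefficient cost highlighted in the abstract.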
Keywords :
Bayes methods; Gaussian distribution; belief networks; computational complexity; signal reconstruction; signal sampling; sparse matrices; transform coding; Gaussian probability density functions; belief propagation; computational complexity; computational efficiency; hierarchical Bayesian model; large scale compressed sensing problems; sparse Bayesian learning; sparse signal recovery; sparse transform coefficients; Belief propagation; compressed sensing; expectation maximization; sparse Bayesian learning;
fLanguage :
English
Journal_Title :
IEEE Transactions on Signal Processing
Publisher :
IEEE
ISSN :
1053-587X
Type :
jour
DOI :
10.1109/TSP.2010.2040683
Filename :
5382564