DocumentCode
1286083
Title
Information-Theoretic Limits on Sparsity Recovery in the High-Dimensional and Noisy Setting
Author
Wainwright, Martin J.
Author_Institution
Dept. of Stat., Univ. of California, Berkeley, Berkeley, CA, USA
Volume
55
Issue
12
fYear
2009
Firstpage
5728
Lastpage
5741
Abstract
The problem of sparsity pattern or support set recovery refers to estimating the set of nonzero coefficients of an unknown vector β* ∈ ℝ^p based on a set of n noisy observations. It arises in a variety of settings, including subset selection in regression, graphical model selection, signal denoising, compressive sensing, and constructive approximation. The sample complexity of a given method for subset recovery refers to the scaling of the required sample size n as a function of the signal dimension p, the sparsity index k (the number of nonzeros in β*), the minimum value β_min of β* over its support, and other parameters of the measurement matrix. This paper studies the information-theoretic limits of sparsity recovery: in particular, for a noisy linear observation model based on measurement matrices drawn from general Gaussian ensembles, we derive both a set of sufficient conditions for exact support recovery using an exhaustive search decoder, and a set of necessary conditions that any decoder, regardless of its computational complexity, must satisfy for exact support recovery. This analysis of fundamental limits complements our previous work on sharp thresholds for support set recovery over the same random measurement ensembles using the polynomial-time Lasso method (ℓ1-constrained quadratic programming).
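The observation model and decoder described in the abstract can be sketched at toy scale. This is a hypothetical illustration, not code from the paper: it draws a standard Gaussian measurement matrix X, forms noisy observations y = Xβ* + w, and runs a brute-force version of the exhaustive search decoder that returns the size-k support with the smallest least-squares residual. All dimensions, the noise level, and the function name `exhaustive_decoder` are assumptions chosen for illustration.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)
p, k, n = 8, 2, 64                  # signal dimension, sparsity index, sample size (toy values)
support = {1, 5}                    # true support of beta*
beta = np.zeros(p)
beta[list(support)] = 2.0           # beta_min well above the noise level
X = rng.standard_normal((n, p))     # standard Gaussian measurement ensemble
y = X @ beta + 0.1 * rng.standard_normal(n)  # noisy linear observations

def exhaustive_decoder(X, y, k):
    """Brute-force support recovery: return the k-subset of columns
    whose restricted least-squares fit has the smallest residual."""
    best, best_res = None, np.inf
    for S in itertools.combinations(range(X.shape[1]), k):
        cols = list(S)
        coef, *_ = np.linalg.lstsq(X[:, cols], y, rcond=None)
        res = np.sum((y - X[:, cols] @ coef) ** 2)
        if res < best_res:
            best, best_res = set(S), res
    return best

recovered = exhaustive_decoder(X, y, k)
```

With this favorable scaling of (n, p, k, β_min) the decoder recovers the true support; the paper's contribution is characterizing exactly when such recovery is or is not possible, for any decoder. Note the combinatorial cost: the search visits all C(p, k) subsets, which is why polynomial-time relaxations such as the Lasso are of interest.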
Keywords
computational complexity; decoding; quadratic programming; search problems; set theory; sparse matrices; vectors; Gaussian measurement matrices; Gaussian noise; compressive sensing; compressed sensing; constructive approximation; exhaustive search decoder; graphical model selection; high-dimensional setting; high-dimensional statistical inference; information-theoretic limits; information-theoretic bounds; ℓ1-constrained quadratic programming; ℓ1-relaxation; noisy linear observation model; nonzero coefficients; polynomial-time Lasso method; Lasso; Fano's method; random measurement matrices; signal denoising; sparsity pattern; sparsity recovery; subset selection; support set recovery; support recovery; model selection; sufficient conditions
fLanguage
English
Journal_Title
IEEE Transactions on Information Theory
Publisher
IEEE
ISSN
0018-9448
Type
jour
DOI
10.1109/TIT.2009.2032816
Filename
5319750
Link To Document