DocumentCode :
3502986
Title :
Least favorable compressed sensing problems for first-order methods
Author :
Maleki, Arian ; Baraniuk, Richard
Author_Institution :
Dept. of Electr. & Comput. Eng., Rice Univ., Houston, TX, USA
fYear :
2011
fDate :
July 31 2011-Aug. 5 2011
Firstpage :
134
Lastpage :
138
Abstract :
Compressed sensing (CS) exploits the compressibility of natural signals to reduce the number of samples required for accurate reconstruction. The cost of sub-Nyquist sampling is computationally expensive reconstruction, typically via large-scale ℓ1 optimization. Therefore, first-order optimization methods that exploit only the gradient of the reconstruction cost function have been developed; notable examples include iterative soft thresholding (IST), the fast iterative soft thresholding algorithm (FISTA), and approximate message passing (AMP). The performance of these algorithms has been studied mainly in the standard framework of convex optimization, referred to here as the deterministic framework. In this paper, we first show that the deterministic approach leads to overly pessimistic conclusions that are not indicative of algorithm performance in practice. Second, as an alternative to the deterministic framework, we study the theoretical aspects of the statistical convergence rate, a topic that has remained unexplored in the sparse recovery literature. Our theoretical and empirical studies reveal several hallmark properties of the statistical convergence of first-order methods, including universality over the matrix ensemble and the least favorable coefficient distribution.
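Since the abstract names IST and AMP only by reference, the following is a minimal illustrative sketch (not taken from the paper) of the two iterations for the LASSO-style problem min_x ½‖y − Ax‖² + λ‖x‖₁. The fixed threshold `lam`, the iteration counts, and the zero initialization are assumptions for illustration; in practice the threshold schedule is tuned, which is part of what the paper's statistical analysis addresses.

```python
import numpy as np

def soft_threshold(v, lam):
    """Elementwise soft thresholding: proximal operator of lam * ||.||_1."""
    return np.sign(v) * np.maximum(np.abs(v) - lam, 0.0)

def ist(y, A, lam, iters=200):
    """Iterative soft thresholding (IST); unit step size assumes the
    columns of A have (approximately) unit norm."""
    n, N = A.shape
    x = np.zeros(N)
    for _ in range(iters):
        z = y - A @ x                          # residual
        x = soft_threshold(x + A.T @ z, lam)   # gradient step + shrinkage
    return x

def amp(y, A, lam, iters=30):
    """Approximate message passing (AMP): IST plus an Onsager correction
    on the residual, which keeps the effective noise roughly Gaussian."""
    n, N = A.shape
    x = np.zeros(N)
    z = y.copy()
    for _ in range(iters):
        x_new = soft_threshold(x + A.T @ z, lam)
        onsager = (np.count_nonzero(x_new) / n) * z   # Onsager reaction term
        z = y - A @ x_new + onsager
        x = x_new
    return x
```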
Keywords :
convex programming; gradient methods; signal reconstruction; signal sampling; AMP; FISTA; IST; approximate message passing; convex optimization; deterministic framework; fast iterative soft thresholding algorithm; first-order optimization method; iterative soft thresholding; large-scale optimization; least favorable coefficient distribution; least favorable compressed sensing problem; matrix ensemble; reconstruction cost function gradient; sparse recovery literature; statistical convergence rate; sub-Nyquist sampling cost;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
2011 IEEE International Symposium on Information Theory Proceedings (ISIT)
Conference_Location :
St. Petersburg, Russia
ISSN :
2157-8095
Print_ISBN :
978-1-4577-0596-0
Electronic_ISBN :
2157-8095
Type :
conf
DOI :
10.1109/ISIT.2011.6033758
Filename :
6033758