DocumentCode :
1780593
Title :
Simple error bounds for regularized noisy linear inverse problems
Author :
Thrampoulidis, Christos ; Oymak, Samet ; Hassibi, Babak
Author_Institution :
Dept. of Electr. Eng., Caltech, Pasadena, CA, USA
fYear :
2014
fDate :
June 29 - July 4, 2014
Firstpage :
3007
Lastpage :
3011
Abstract :
Consider estimating a structured signal $x_0$ from linear, underdetermined and noisy measurements $y = Ax_0 + z$ by solving a variant of the lasso algorithm: $\hat{x} = \arg\min_x \{\|y - Ax\|_2 + \lambda f(x)\}$. Here, $f$ is a convex function aiming to promote the structure of $x_0$, say, the $\ell_1$-norm to promote sparsity or the nuclear norm to promote low-rankness. We assume that the entries of $A$ are independent and normally distributed and make no assumptions on the noise vector $z$, other than that it is independent of $A$. Under this generic setup, we derive a general, non-asymptotic and rather tight upper bound on the $\ell_2$-norm of the estimation error $\|\hat{x} - x_0\|_2$. Our bound is geometric in nature and obeys a simple formula; the roles of $\lambda$, $f$ and $x_0$ are all captured by a single summary parameter $\delta(\lambda \partial f(x_0))$, termed the Gaussian squared distance to the scaled subdifferential. We connect our result to the literature and verify its validity through simulations.
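As a quick illustration of the estimator described in the abstract, the sketch below solves the lasso variant with $f$ taken to be the $\ell_1$-norm and reports the resulting estimation error $\|\hat{x} - x_0\|_2$. This is not the authors' code: the solver (CVXPY), the problem sizes, sparsity level, noise scale, and the choice of $\lambda$ are illustrative assumptions only.

```python
# Minimal sketch of the lasso variant from the abstract, with f = l1-norm.
# All problem sizes, the sparsity level, lambda, and the noise scale are
# assumptions for illustration, not values taken from the paper.
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(0)
n, m, k = 200, 100, 10            # ambient dim, measurements (m < n), sparsity (assumed)

x0 = np.zeros(n)                  # structured (sparse) signal
x0[rng.choice(n, k, replace=False)] = rng.standard_normal(k)

A = rng.standard_normal((m, n))   # i.i.d. Gaussian measurement matrix
z = 0.1 * rng.standard_normal(m)  # noise, independent of A
y = A @ x0 + z

lam = 3.0                         # regularization parameter (assumed)
x = cp.Variable(n)
# Note the unsquared l2-norm of the residual, as in the abstract.
objective = cp.Minimize(cp.norm(y - A @ x, 2) + lam * cp.norm1(x))
cp.Problem(objective).solve()

err = np.linalg.norm(x.value - x0)   # estimation error ||x_hat - x0||_2
print(f"||x_hat - x0||_2 = {err:.4f}")
```

The paper's bound would then upper-bound this error in terms of the single summary parameter $\delta(\lambda \partial f(x_0))$, the Gaussian squared distance to the scaled subdifferential.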
Keywords :
Gaussian processes; inverse problems; signal processing; Gaussian squared distance; convex function; error bounds; estimation error; lasso algorithm; noise vector; noisy linear inverse problems; sparsity norm; structured signal; Estimation error; Information theory; Noise; Noise measurement; Upper bound; Vectors
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2014 IEEE International Symposium on Information Theory (ISIT)
Conference_Location :
Honolulu, HI
Type :
conf
DOI :
10.1109/ISIT.2014.6875386
Filename :
6875386