Title :
Asymptotically exact error analysis for the generalized ℓ2²-LASSO
Author :
Christos Thrampoulidis;Ashkan Panahi;Babak Hassibi
Author_Institution :
Department of Electrical Engineering, Caltech, Pasadena, USA
Date :
6/1/2015 12:00:00 AM
Abstract :
Given an unknown signal x0 ∈ Rn and linear noisy measurements y = Ax0 + σv ∈ Rm, the generalized ℓ2²-LASSO solves x̂ := arg minx (1/2)∥y − Ax∥2² + σλf(x). Here, f is a convex regularization function (e.g., ℓ1-norm, nuclear norm) that promotes the structure of x0 (e.g., sparse, low-rank), and λ ≥ 0 is the regularizer parameter. A related, though less popular and less well-known, optimization problem is the generalized ℓ2-LASSO, x̂ := arg minx ∥y − Ax∥2 + λf(x), which has been analyzed by Oymak, Thrampoulidis and Hassibi. Oymak et al. further made conjectures about the performance of the generalized ℓ2²-LASSO. This paper establishes these conjectures rigorously. We measure performance by the normalized squared error NSE(σ) := ∥x̂ − x0∥2²/(mσ²). Assuming the entries of A are i.i.d. Gaussian N(0, 1/m) and those of v are i.i.d. N(0, 1), we precisely characterize the “asymptotic NSE” aNSE := limσ→0 NSE(σ) when the problem dimensions tend to infinity in a proportional manner. The roles of λ, f and x0 are explicitly captured in the derived expression by means of a single geometric quantity, the Gaussian distance to the subdifferential. We conjecture that aNSE = supσ>0 NSE(σ). We include detailed discussions on the interpretation of our result, make connections to relevant literature, and perform computational experiments that validate our theoretical findings.
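A numerical experiment in the spirit of the abstract can be sketched as follows. For f = ℓ1-norm, the generalized ℓ2²-LASSO is solved with plain ISTA (proximal gradient), and NSE(σ) is computed for a small σ. This is a minimal illustrative sketch, not the authors' experimental setup: the problem sizes, sparsity level, λ, and the choice of solver are all assumptions made here for demonstration.

```python
import numpy as np

def soft_threshold(z, t):
    # Proximal operator of t * ||.||_1
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def l22_lasso_ista(A, y, tau, n_iter=500):
    # Solve min_x 0.5*||y - A x||_2^2 + tau*||x||_1 via ISTA
    # (illustrative solver choice; any convex solver would do).
    m, n = A.shape
    L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the smooth part
    x = np.zeros(n)
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)
        x = soft_threshold(x - grad / L, tau / L)
    return x

rng = np.random.default_rng(0)
m, n, k = 128, 256, 10            # illustrative dimensions and sparsity
sigma, lam = 0.05, 1.5            # small noise level; regularizer parameter
x0 = np.zeros(n)
x0[:k] = rng.standard_normal(k)   # k-sparse signal
A = rng.standard_normal((m, n)) / np.sqrt(m)  # i.i.d. N(0, 1/m) entries
v = rng.standard_normal(m)        # i.i.d. N(0, 1) noise
y = A @ x0 + sigma * v

xhat = l22_lasso_ista(A, y, sigma * lam)      # regularizer scaled by sigma
nse = np.sum((xhat - x0) ** 2) / (m * sigma ** 2)
print(f"NSE(sigma={sigma}) = {nse:.3f}")
```

Repeating this over a grid of σ values and comparing against the derived aNSE expression is how one would replicate the kind of validation the abstract mentions.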
Keywords :
"Optimization","Noise measurement","Compressed sensing","Linear programming","Robustness","Information theory","Standards"
Conference_Titel :
2015 IEEE International Symposium on Information Theory (ISIT)
Electronic_ISBN :
2157-8117
DOI :
10.1109/ISIT.2015.7282810