Title :
Greedy dirty models: A new algorithm for multiple sparse regression
Author :
Jalali, Ali ; Sanghavi, Sujay
Author_Institution :
Electr. & Comput. Eng. Dept., Univ. of Texas at Austin, Austin, TX, USA
Abstract :
This paper considers the recovery of multiple sparse vectors, with partially shared supports, from a small number of noisy linear measurements of each. It has recently been shown that the sample complexity of recovery can be lowered, for all levels of support sharing, by using a “dirty model”: a superposition of sparse and group-sparse modeling approaches, based on convex optimization. In this paper, we provide a new forward-backward greedy procedure for the dirty model approach. Each forward step adds either a shared feature common to all vectors or a feature unique to one of the vectors, chosen in a natural greedy fashion. Each backward step greedily removes at most one feature of either type. Analytical and empirical evidence shows that this method outperforms all convex approaches in both sample and computational complexity.
Keywords :
computational complexity; greedy algorithms; regression analysis; signal processing; Greedy dirty models; computational complexity; convex optimization; forward-backward greedy procedure; greedy removal; group-sparse modeling superposition; multiple sparse regression; multiple sparse vectors recovery; noisy linear measurements; sample complexity; support sharing; Complexity theory; Greedy algorithms; Noise; Noise measurement; Sparse matrices; Standards; Vectors; Block-Sparsity; Dirty Model; Forward-Backward Greedy Algorithm; High-dimensional Statistics; Multi-task Learning; Sparsity;
Conference_Titel :
Statistical Signal Processing Workshop (SSP), 2012 IEEE
Conference_Location :
Ann Arbor, MI
Print_ISBN :
978-1-4673-0182-4
Electronic_ISBN :
pending
DOI :
10.1109/SSP.2012.6319719
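As a reading aid, the following is a hedged Python/NumPy sketch of the kind of forward-backward greedy loop the abstract describes for the dirty model. Everything concrete in it is an assumption made for illustration: the function name dirty_greedy, the squared-residual-correlation gain scores, the least-squares refit, and the thresholds eps and nu are not taken from the paper; only the overall structure (forward addition of either a shared or a unique feature, followed by greedy removal of at most one feature of either type) mirrors the abstract.

```python
# Minimal illustrative sketch (not the authors' exact algorithm): a
# forward-backward greedy loop for the dirty model, where the coefficient
# matrix is supported on a set of shared features (used by every task) plus
# per-task unique features. The gain criterion, the least-squares refit, and
# the thresholds eps / nu are simplifying assumptions made for exposition.
import numpy as np


def dirty_greedy(X, Y, eps=1e-4, nu=0.5, max_iter=100):
    """X: (n, p) design shared by all tasks; Y: (n, r) responses, one task per column.
    Returns a (p, r) coefficient matrix supported on shared + unique features."""
    n, p = X.shape
    r = Y.shape[1]

    def refit(shared, unique):
        # Least-squares refit restricted to the currently selected supports.
        T = np.zeros((p, r))
        for k in range(r):
            cols = sorted(shared | {j for j, t in unique if t == k})
            if cols:
                T[cols, k], *_ = np.linalg.lstsq(X[:, cols], Y[:, k], rcond=None)
        return T

    def loss(T):
        return 0.5 * np.sum((Y - X @ T) ** 2) / n

    shared, unique = set(), set()           # shared feature ids; (feature, task) pairs
    Theta = refit(shared, unique)

    for _ in range(max_iter):
        base = loss(Theta)
        corr = (X.T @ (Y - X @ Theta)) / n  # (p, r) residual correlations

        # Forward step: compare the best shared feature (scored across all
        # tasks) against the best unique feature (scored for a single task).
        shared_score = np.sum(corr ** 2, axis=1)
        shared_score[list(shared)] = -np.inf
        unique_score = corr ** 2
        for j, t in unique:
            unique_score[j, t] = -np.inf
        j_s = int(np.argmax(shared_score))
        j_u, t_u = map(int, np.unravel_index(np.argmax(unique_score), unique_score.shape))

        if shared_score[j_s] >= unique_score[j_u, t_u]:
            s_new, u_new = shared | {j_s}, unique
        else:
            s_new, u_new = shared, unique | {(j_u, t_u)}
        Theta_new = refit(s_new, u_new)
        gain = base - loss(Theta_new)
        if gain < eps:                      # best addition barely helps: stop
            break
        shared, unique, Theta = s_new, u_new, Theta_new

        # Backward step: remove at most one feature (of either type) whose
        # removal raises the loss by less than a fraction nu of the last gain.
        best, best_inc = None, nu * gain
        candidates = [(shared - {j}, unique) for j in shared] + \
                     [(shared, unique - {ju}) for ju in unique]
        for s2, u2 in candidates:
            inc = loss(refit(s2, u2)) - loss(Theta)
            if inc < best_inc:
                best, best_inc = (s2, u2), inc
        if best is not None:
            shared, unique = best
            Theta = refit(shared, unique)
    return Theta
```

In this sketch a shared feature is scored by summing squared residual correlations over all tasks, while a unique feature is scored for a single task; a backward removal is accepted only when it undoes less than a fraction nu of the most recent forward gain, which keeps the loop from cycling.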