Title :
Shrinkage mappings and their induced penalty functions
Author_Institution :
Los Alamos Nat. Lab., Los Alamos, NM, USA
Abstract :
Many optimization problems that are designed to have sparse solutions employ the ℓ1 or ℓ0 penalty functions. Consequently, several algorithms for compressive sensing or sparse representations make use of soft or hard thresholding, both of which are examples of shrinkage mappings. Their usefulness comes from the fact that they are the proximal mappings of the ℓ1 and ℓ0 penalty functions, meaning that they provide the solution to the corresponding penalized least-squares problem. In this paper, we both generalize and reverse this process: we show that one can begin with any of a wide class of shrinkage mappings, and be guaranteed that it will be the proximal mapping of a penalty function with several desirable properties. Such a shrinkage-mapping/penalty-function pair comes ready-made for use in efficient algorithms. We give an example of such a shrinkage mapping, and use it to advance the state of the art in compressive sensing.
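To connect the abstract's statement to code, here is a minimal NumPy sketch (not taken from the paper) of the two classical shrinkage mappings it mentions, soft and hard thresholding, which are the proximal mappings of the ℓ1 and ℓ0 penalties, together with one generalized shrinkage of the kind the paper studies. The p_shrinkage form and its parameter p are illustrative assumptions; the paper's own shrinkage mapping and induced penalty may differ.

import numpy as np

def soft_threshold(x, lam):
    # Proximal mapping of lam*||.||_1: elementwise argmin_z 0.5*(z - x)^2 + lam*|z|.
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def hard_threshold(x, lam):
    # Proximal mapping of the l0 penalty lam*||.||_0: keep entries with |x| > sqrt(2*lam).
    return np.where(np.abs(x) > np.sqrt(2.0 * lam), x, 0.0)

def p_shrinkage(x, lam, p=0.5):
    # Illustrative generalized shrinkage ("p-shrinkage"), an assumption for this sketch;
    # it reduces to soft thresholding at p = 1 and shrinks large entries less for p < 1.
    mag = np.abs(x)
    safe = np.where(mag > 0, mag, 1.0)  # avoid 0**(p-1) at zero entries
    shrunk = np.maximum(mag - lam ** (2.0 - p) * safe ** (p - 1.0), 0.0)
    return np.sign(x) * np.where(mag > 0, shrunk, 0.0)

x = np.array([-2.0, -0.3, 0.0, 0.4, 3.0])
print(soft_threshold(x, 0.5))      # [-1.5  0.   0.   0.   2.5]
print(hard_threshold(x, 0.5))      # [-2.   0.   0.   0.   3. ]  (threshold sqrt(2*0.5) = 1)
print(p_shrinkage(x, 0.5, p=0.5))  # approx. [-1.75  0.  0.  0.  2.8]

At p = 1 the generalized mapping coincides with soft thresholding; for p < 1 it leaves large coefficients closer to their input values while still zeroing small ones, the kind of reduced-bias behavior exploited in nonconvex approaches to compressive sensing.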
Keywords :
compressed sensing; least squares approximations; optimisation; ℓ0 penalty functions; ℓ1 penalty functions; compressive sensing; hard thresholding; induced penalty functions; optimization problems; penalized least-squares problem; proximal mappings; shrinkage mappings; soft thresholding; sparse representations; sparse solutions; image reconstruction; magnetic resonance imaging; signal processing algorithms; speech; alternating direction method of multipliers; nonconvex optimization; shrinkage
Conference_Titel :
2014 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Conference_Location :
Florence
DOI :
10.1109/ICASSP.2014.6853752