Title :
On the derivation of RIP for random Gaussian matrices and binary sparse signals
Author :
Park, Sangjun ; Lee, Heung-No
Author_Institution :
Dept. of Inf. & Commun., Gwangju Inst. of Sci. & Technol., Gwangju, South Korea
Abstract :
The number of measurements M sufficient for successful recovery via L1 minimization is well known to be M = O(Klog(N/K)) [7] when Gaussian measurement matrices are used for sensing K-sparse signals of ambient dimension N. We aim to shed light on the source of the log(N) factor, and to see whether the bound can be improved by considering the simplest possible K-sparse signals, namely 0/1 binary K-sparse signals. Previous work suggests it is reasonable to expect a reduction in the number of measurements when the signal has fewer degrees of freedom. We derive an upper bound on the probability that any set of K randomly selected Gaussian column vectors is mutually independent, and we use this to obtain an upper bound on the probability that a Gaussian sensing matrix satisfies the restricted isometry condition. From this result, a sufficient condition for good signal recovery is found. Surprisingly, the result remains the same, i.e., M = O(Klog(N/K)), which may suggest that the log(N) factor is generic for Gaussian measurements.
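The setting described in the abstract (a 0/1 binary K-sparse signal, a Gaussian sensing matrix, and recovery by L1 minimization with M on the order of K log(N/K)) can be sketched in a few lines of Python. The sketch below is illustrative only and is not the authors' code; the dimensions N and K and the constant multiplying K log(N/K) are assumed values chosen for the example, and basis pursuit is solved as a linear program via scipy.optimize.linprog.

```python
# Minimal sketch of the abstract's setting (assumed parameters, not from the paper):
# recover a 0/1 binary K-sparse signal from M ~ K log(N/K) Gaussian measurements
# by L1 minimization (basis pursuit), cast as a linear program.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)

N, K = 256, 8                                 # ambient dimension, sparsity (illustrative)
M = int(np.ceil(2 * K * np.log(N / K)))       # M = O(K log(N/K)); constant 2 is an assumption

# 0/1 binary K-sparse signal and Gaussian sensing matrix
x_true = np.zeros(N)
x_true[rng.choice(N, K, replace=False)] = 1.0
A = rng.standard_normal((M, N)) / np.sqrt(M)  # columns have unit expected squared norm
y = A @ x_true

# Basis pursuit:  min ||x||_1  subject to  A x = y
# LP over z = [x; t]:  min sum(t)  subject to  -t <= x <= t,  A x = y
c = np.concatenate([np.zeros(N), np.ones(N)])
A_ub = np.block([[np.eye(N), -np.eye(N)],     #  x - t <= 0
                 [-np.eye(N), -np.eye(N)]])   # -x - t <= 0
b_ub = np.zeros(2 * N)
A_eq = np.hstack([A, np.zeros((M, N))])
res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=y,
              bounds=[(None, None)] * (2 * N), method="highs")

x_hat = res.x[:N]
print("recovery error:", np.linalg.norm(x_hat - x_true))
```

For the chosen N and K this gives M around 56 measurements; with high probability the LP returns the binary support exactly, consistent with the M = O(Klog(N/K)) scaling discussed in the abstract.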
Keywords :
Gaussian processes; minimisation; probability; signal reconstruction; sparse matrices; vectors; Gaussian measurement matrices; Gaussian sensing matrix; K randomly selected Gaussian column vector; K sparse signal sensing; L1 minimization; RIP derivation; binary sparse signal; compressive sensing; restricted isometry property derivation; signal recovery; upper bound probability; Compressed sensing; Minimization; Random variables; Sensors; Size measurement; Sparse matrices; Upper bound; Binary Sparse Signal; Compressive Sensing; Restricted Isometry Property;
Conference_Title :
ICT Convergence (ICTC), 2011 International Conference on
Conference_Location :
Seoul
Print_ISBN :
978-1-4577-1267-8
DOI :
10.1109/ICTC.2011.6082562