• DocumentCode
    20414
  • Title
    Minimum Variance Estimation of a Sparse Vector Within the Linear Gaussian Model: An RKHS Approach

  • Author
    Jung, Alexander ; Schmutzhard, Sebastian ; Hlawatsch, Franz ; Ben-Haim, Zvika ; Eldar, Yonina C.

  • Author_Institution
    Inst. of Telecommun., Vienna Univ. of Technol., Vienna, Austria
  • Volume
    60
  • Issue
    10
  • fYear
    2014
  • fDate
    Oct. 2014
  • Firstpage
    6555
  • Lastpage
    6575
  • Abstract
    We consider minimum variance estimation within the sparse linear Gaussian model (SLGM). A sparse vector is to be estimated from a linearly transformed version embedded in Gaussian noise. Our analysis is based on the theory of reproducing kernel Hilbert spaces (RKHS). After a characterization of the RKHS associated with the SLGM, we derive a lower bound on the minimum variance achievable by estimators with a prescribed bias function, including the important special case of unbiased estimation. This bound is obtained via an orthogonal projection of the prescribed mean function onto a subspace of the RKHS associated with the SLGM. It provides an approximation to the minimum achievable variance (Barankin bound) that is tighter than any known bound. Our bound holds for an arbitrary system matrix, including the overdetermined and underdetermined cases. We specialize it to compressed sensing measurement matrices and express it in terms of the restricted isometry constant. For the special case of the SLGM given by the sparse signal in noise model, we derive closed-form expressions of the Barankin bound and of the corresponding locally minimum variance estimator. Finally, we compare our bound with the variance of several well-known estimators, namely, the maximum-likelihood estimator, the hard-thresholding estimator, and compressive reconstruction using orthogonal matching pursuit and approximate message passing.
  • Keywords
    Gaussian noise; Hilbert spaces; compressed sensing; iterative methods; maximum likelihood estimation; message passing; signal reconstruction; sparse matrices; time-frequency analysis; Barankin bound; RKHS subspace; RKHS theory; SLGM; approximate message passing; arbitrary system matrix; closed-form expressions; compressed sensing measurement matrices; compressive reconstruction; hard thresholding estimator; maximum likelihood estimator; minimum variance estimator; orthogonal matching pursuit; orthogonal projection; reproducing kernel Hilbert space theory; restricted isometry constant; sparse Gaussian signal noise model; sparse linear Gaussian model; sparse vector minimum variance estimation; Electronic mail; Estimation; Hilbert space; Kernel; Noise; Sparse matrices; Vectors; Barankin bound; Cramér–Rao bound; Hammersley–Chapman–Robbins bound; RKHS; Sparsity; compressed sensing; denoising; locally minimum variance unbiased estimator; unbiased estimation;
  • fLanguage
    English
  • Journal_Title
    Information Theory, IEEE Transactions on
  • Publisher
    ieee
  • ISSN
    0018-9448
  • Type
    jour

  • DOI
    10.1109/TIT.2014.2346508
  • Filename
    6874571