  • DocumentCode
    3065310
  • Title
    “Compressed” compressed sensing
  • Author
    Reeves, G.; Gastpar, Michael

  • Author_Institution
    Dept. of Electr. Eng. & Comput. Sci., Univ. of California, Berkeley, CA, USA
  • fYear
    2010
  • fDate
    13-18 June 2010
  • Firstpage
    1548
  • Lastpage
    1552
  • Abstract
    The field of compressed sensing has shown that a sparse but otherwise arbitrary vector can be recovered exactly from a small number of randomly constructed linear projections (or samples). The question addressed in this paper is whether an even smaller number of samples suffices when there is prior knowledge about the distribution of the unknown vector, or when only partial recovery is needed. An information-theoretic lower bound with connections to free probability theory and an upper bound corresponding to a computationally simple thresholding estimator are derived. It is shown that in certain cases (e.g., discrete-valued vectors or large distortions) the number of samples can be decreased. Interestingly, though, it is also shown that in many cases no reduction is possible.
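    The "computationally simple thresholding estimator" mentioned above can be illustrated with a minimal sketch: recover the support of a sparse vector from random Gaussian projections by correlating the samples with each measurement column and keeping the largest entries. This is an illustrative matched-filter/hard-thresholding demo, not the paper's exact estimator, and all dimensions and parameter values are assumptions.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    n, k, m = 1000, 10, 400          # ambient dimension, sparsity, number of samples
    x = np.zeros(n)
    support = rng.choice(n, size=k, replace=False)
    x[support] = rng.choice([-1.0, 1.0], size=k)   # discrete-valued nonzeros

    A = rng.standard_normal((m, n)) / np.sqrt(m)   # random linear projections
    y = A @ x                                      # noiseless samples

    # Matched-filter statistic: for Gaussian A, (A^T y)_i concentrates
    # around x_i on the support and around 0 off the support, so hard
    # thresholding (keeping the k largest magnitudes) finds the support
    # with high probability when m is large enough.
    stats = A.T @ y
    est_support = np.argsort(np.abs(stats))[-k:]

    print("support recovered:", set(est_support) == set(support))
    ```

    With far fewer samples (e.g., `m` close to `k`), the off-support statistics no longer separate from the on-support ones and this estimator fails, which is the regime where the paper's bounds on the required number of samples become relevant.
    
    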
  • Keywords
    information theory; probability; compressed sensing; free probability theory; information-theoretic lower bound; Compressed sensing; Computer science; Information theory; Mathematics; Sampling methods; Signal to noise ratio; Sparse matrices; Sufficient conditions; Upper bound; Vectors;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    2010 IEEE International Symposium on Information Theory Proceedings (ISIT)
  • Conference_Location
    Austin, TX
  • Print_ISBN
    978-1-4244-7890-3
  • Electronic_ISBN
    978-1-4244-7891-0
  • Type
    conf

  • DOI
    10.1109/ISIT.2010.5513517
  • Filename
    5513517