• DocumentCode
    3128604
  • Title
    Pointwise relations between information and estimation in Gaussian noise
  • Author
    Venkat, Kartik ; Weissman, Tsachy
  • Author_Institution
    Dept. of Electr. Eng., Stanford Univ., Stanford, CA, USA
  • fYear
    2012
  • fDate
    1-6 July 2012
  • Firstpage
    701
  • Lastpage
    705
  • Abstract
    Many of the classical and recent relations between information and estimation in the presence of Gaussian noise can be viewed as identities between expectations of random quantities. These include the I-MMSE relationship of Guo et al.; the relative entropy and mismatched estimation relationship of Verdú; the relationship between causal estimation and mutual information of Duncan, and its extension to the presence of feedback by Kadota et al.; and the relationship between causal and non-causal estimation of Guo et al., together with its mismatched version due to Weissman. We dispense with the expectations and explore the nature of the pointwise relations between the respective random quantities. The pointwise relations that we find are as succinctly stated as - and give considerable insight into - the original expectation identities. As an illustration of our results, consider Duncan's 1970 discovery that the mutual information is equal to the causal MMSE in the AWGN channel, which can equivalently be expressed by saying that the difference between the input-output information density and half the causal estimation error is a zero-mean random variable (regardless of the distribution of the channel input). We characterize this random variable explicitly, rather than merely its expectation. Classical estimation and information theoretic quantities emerge with new and surprising roles. For example, the variance of this random variable turns out to be given by the causal MMSE (which, in turn, is equal to the mutual information by Duncan's result).
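    For orientation, Duncan's 1970 identity referenced in the abstract can be stated as follows - a standard formulation under the usual continuous-time AWGN model (the notation below is conventional and not taken from this record):

    ```latex
    % Continuous-time AWGN channel: dY_t = X_t\,dt + dW_t, with W a standard
    % Brownian motion independent of the input process X.
    % Duncan (1970): the input-output mutual information equals half the
    % time-integrated causal MMSE.
    I\bigl(X_0^T;\, Y_0^T\bigr)
      \;=\; \frac{1}{2} \int_0^T \mathbb{E}\!\left[\bigl(X_t - \mathbb{E}[\,X_t \mid Y_0^t\,]\bigr)^{2}\right] \mathrm{d}t
    ```

    The paper's pointwise program drops the outer expectation on the right-hand side and studies the resulting random difference between the information density and half the accumulated causal squared error.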
  • Keywords
    AWGN channels; Gaussian noise; information theory; AWGN channel; I-MMSE relationship; causal estimation; information theoretic quantities; input-output information density; mutual information; pointwise relations; respective random quantities; zero mean random variable; Channel estimation; Couplings; Entropy; Estimation error; Mutual information; Random variables
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Information Theory Proceedings (ISIT), 2012 IEEE International Symposium on
  • Conference_Location
    Cambridge, MA
  • ISSN
    2157-8095
  • Print_ISBN
    978-1-4673-2580-6
  • Electronic_ISBN
    2157-8095
  • Type
    conf
  • DOI
    10.1109/ISIT.2012.6284310
  • Filename
    6284310