• DocumentCode
    995663
  • Title
    Representation of Mutual Information Via Input Estimates
  • Author
    Palomar, Daniel P.; Verdu, Sergio
  • Author_Institution
    Dept. of Electr. Eng., Princeton Univ., NJ
  • Volume
    53
  • Issue
    2
  • fYear
    2007
  • Firstpage
    453
  • Lastpage
    470
  • Abstract
    A relationship between information theory and estimation theory was recently shown for the Gaussian channel, relating the derivative of mutual information to the minimum mean-square error (MMSE). This paper generalizes the link between information theory and estimation theory to arbitrary channels, giving representations of the derivative of mutual information as a function of the conditional marginal input distributions given the outputs. We illustrate the use of this representation in the efficient numerical computation of the mutual information achieved by inputs such as specific codes or natural language.
  • Keywords
    Gaussian channels; estimation theory; information theory; mean square error methods; Gaussian channel; estimation theory; information theory; marginal input distribution; minimum mean-square error; mutual information representation; numerical computation; Collaborative work; Covariance matrix; Estimation theory; Gaussian channels; Government; Information theory; Mutual information; Parity check codes; Signal to noise ratio; Vectors; Computation of mutual information; extrinsic information; input estimation; low-density parity-check (LDPC) codes; minimum mean square error (MMSE); mutual information; soft channel decoding
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Information Theory
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/TIT.2006.889728
  • Filename
    4069154
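
The Gaussian-channel result the abstract builds on (the I-MMSE relation of Guo, Shamai, and Verdu) states that the derivative of mutual information with respect to SNR equals half the MMSE. A minimal numerical check of that relation, using the closed-form Gaussian-input special case (a standard textbook instance, not code from the paper itself):

```python
import math

# I-MMSE relation for the scalar Gaussian channel Y = sqrt(snr)*X + N,
# with N ~ N(0, 1):  d/dsnr I(X; Y) = (1/2) * mmse(snr).
# For a Gaussian input X ~ N(0, 1) both sides are known in closed form:
#   I(snr)    = (1/2) * ln(1 + snr)   (nats)
#   mmse(snr) = 1 / (1 + snr)

def mutual_info(snr: float) -> float:
    """Mutual information of the Gaussian-input Gaussian channel, in nats."""
    return 0.5 * math.log1p(snr)

def mmse(snr: float) -> float:
    """Minimum mean-square error of estimating X from Y."""
    return 1.0 / (1.0 + snr)

# Verify the relation numerically with a central difference.
snr, h = 2.0, 1e-6
deriv = (mutual_info(snr + h) - mutual_info(snr - h)) / (2.0 * h)
print(deriv, 0.5 * mmse(snr))  # the two values agree closely
```

For non-Gaussian inputs (e.g. the specific codes mentioned in the abstract), neither side has a closed form, which is where the paper's representation via conditional marginal input distributions becomes useful.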