• DocumentCode
    1106784
  • Title
    Estimating Mutual Information Via Kolmogorov Distance
  • Author
    Zhang, Zhengmin
  • Author_Institution
    Carleton Univ., Ottawa
  • Volume
    53
  • Issue
    9
  • fYear
    2007
  • Firstpage
    3280
  • Lastpage
    3282
  • Abstract
    Using a coupling technique, two inequalities are established that upper-bound the mutual information of finite discrete random variables in terms of the Kolmogorov (variational) distance.
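The abstract relates the mutual information of finite discrete random variables to the Kolmogorov (variational) distance. A minimal sketch of both quantities follows; the function names are hypothetical, and the distance is computed with the common (1/2)·L1 convention, which may differ from the paper's normalization by a factor of 2:

```python
import math

def variational_distance(p, q):
    """Kolmogorov (variational) distance between two finite pmfs,
    given as dicts mapping outcomes to probabilities.
    Uses the (1/2) * sum |p - q| convention."""
    support = set(p) | set(q)
    return 0.5 * sum(abs(p.get(x, 0.0) - q.get(x, 0.0)) for x in support)

def mutual_information(joint):
    """Mutual information (in nats) of a joint pmf given as
    a dict {(x, y): probability}."""
    px, py = {}, {}
    for (x, y), pr in joint.items():
        px[x] = px.get(x, 0.0) + pr
        py[y] = py.get(y, 0.0) + pr
    return sum(pr * math.log(pr / (px[x] * py[y]))
               for (x, y), pr in joint.items() if pr > 0)
```

For example, a perfectly correlated fair bit, `{(0, 0): 0.5, (1, 1): 0.5}`, has mutual information log 2 nats, while its joint distribution is at variational distance 1/2 from the product of its marginals.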
  • Keywords
    entropy; random processes; coupling technique; finite discrete random variables; information theory; mathematics; mutual coupling; probability distribution; random variables; statistical distributions; testing; upper bound; Kolmogorov distance; Shannon entropy; mutual information
  • fLanguage
    English
  • Journal_Title
    Information Theory, IEEE Transactions on
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/TIT.2007.903122
  • Filename
    4294175