• DocumentCode
    909920
  • Title

    The amount of information that y gives about X

  • Author

Blachman, Nelson M.

  • Volume
    14
  • Issue
    1
  • fYear
    1968
  • fDate
    1/1/1968 12:00:00 AM
  • Firstpage
    27
  • Lastpage
    31
  • Abstract
No single measure M(X;y) of the amount of information that a specific value y of a random variable Y gives about another random variable X has all of the desirable properties possessed by Shannon's measure I(X;Y) = E{M(X;y)} of the average mutual information of X and Y. It is shown that one of these properties (additivity) determines one particular form for M(X;y), while others (non-negativity or coordinate independence) determine a different form. The latter, which is the more useful and accepted information measure, is thus seen to be unique.
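    The two candidate forms of M(X;y) discussed in the abstract can be sketched numerically. The sketch below assumes the standard identifications (not spelled out in the abstract): the non-negative, coordinate-independent form is the Kullback-Leibler divergence of p(x|y) from p(x), while the additive form is the entropy reduction H(X) - H(X|Y=y), which may be negative for a particular y. The joint distribution p_xy is a hypothetical example; both forms average over p(y) to the same Shannon mutual information I(X;Y).

    ```python
    import math

    # Hypothetical joint distribution p(x, y) over small finite alphabets.
    p_xy = {
        (0, 0): 0.3, (0, 1): 0.1,
        (1, 0): 0.1, (1, 1): 0.5,
    }
    xs = {0, 1}
    ys = {0, 1}

    # Marginals p(x) and p(y).
    p_x = {x: sum(p_xy[(x, y)] for y in ys) for x in xs}
    p_y = {y: sum(p_xy[(x, y)] for x in xs) for y in ys}

    def p_x_given_y(x, y):
        return p_xy[(x, y)] / p_y[y]

    def m_kl(y):
        """Specific information as KL(p(x|y) || p(x)): always non-negative."""
        return sum(p_x_given_y(x, y) * math.log2(p_x_given_y(x, y) / p_x[x])
                   for x in xs)

    def m_entropy(y):
        """Specific information as H(X) - H(X|Y=y): additive, may be negative."""
        h_x = -sum(p_x[x] * math.log2(p_x[x]) for x in xs)
        h_x_given_y = -sum(p_x_given_y(x, y) * math.log2(p_x_given_y(x, y))
                           for x in xs)
        return h_x - h_x_given_y

    # Both forms average (over p(y)) to Shannon's mutual information I(X;Y).
    i_kl = sum(p_y[y] * m_kl(y) for y in ys)
    i_ent = sum(p_y[y] * m_entropy(y) for y in ys)
    ```

    Averaging either measure recovers E{M(X;y)} = I(X;Y), which is exactly why the disagreement between the two forms appears only at the level of individual values of y.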
  • Keywords
    Information theory; Random variables; Coordinate measuring machines; Entropy; Helium; Marine vehicles; Mutual information; Particle measurements; Q measurement;
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Information Theory
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type

    jour

  • DOI
    10.1109/TIT.1968.1054094
  • Filename
    1054094