• DocumentCode
    922966
  • Title

    The common information of two dependent random variables

  • Author

    Wyner, Aaron D.

  • Volume
    21
  • Issue
    2
  • fYear
    1975
  • fDate
    3/1/1975
  • Firstpage
    163
  • Lastpage
    179
  • Abstract
    The problem of finding a meaningful measure of the "common information" or "common randomness" of two discrete dependent random variables X, Y is studied. The quantity C(X; Y) is defined as the minimum possible value of I(X, Y; W), where the minimum is taken over all distributions defining an auxiliary random variable W \in \mathcal{W}, a finite set, such that X, Y are conditionally independent given W. The main result of the paper is contained in two theorems which show that C(X; Y) is i) the minimum rate R_0 such that a sequence of independent copies of (X, Y) can be efficiently encoded into three binary streams W_0, W_1, W_2 with rates R_0, R_1, R_2, respectively, with \sum R_i = H(X, Y), such that X is recovered from (W_0, W_1) and Y is recovered from (W_0, W_2), i.e., W_0 is the common stream; and ii) the minimum binary rate R of the common input to independent processors that generate an approximation to X, Y.
  • Keywords
    Information rates; Random variables; Source coding; Binary sequences; Entropy; Equations; Probability distribution; Random sequences
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Information Theory
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type

    jour

  • DOI
    10.1109/TIT.1975.1055346
  • Filename
    1055346
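  • Note

    The abstract defines C(X; Y) as the minimum of I(X, Y; W) over auxiliary variables W that make X and Y conditionally independent. As a minimal numerical sketch (not from the paper), the Python snippet below evaluates I(X, Y; W) and checks the conditional-independence constraint for the illustrative case X = Y ~ Bernoulli(1/2) with W = X, where the minimum is achieved and C(X; Y) = H(X) = 1 bit. The pmf array and the entropy helper are assumptions made for this example only.

        import numpy as np

        def entropy(p):
            """Shannon entropy in bits of a pmf given as a flat array."""
            p = p[p > 0]
            return -np.sum(p * np.log2(p))

        # Joint pmf p(w, x, y), indexed [w, x, y]. Illustrative choice:
        # W ~ Bernoulli(1/2) and X = Y = W, so the choice W = X attains
        # I(X,Y; W) = H(X) = 1 bit.
        p = np.zeros((2, 2, 2))
        p[0, 0, 0] = 0.5
        p[1, 1, 1] = 0.5

        p_w = p.sum(axis=(1, 2))   # marginal pmf of W
        p_xy = p.sum(axis=0)       # joint pmf of (X, Y)

        # I(X,Y; W) = H(X,Y) + H(W) - H(W,X,Y)
        mi = entropy(p_xy.ravel()) + entropy(p_w) - entropy(p.ravel())
        print(f"I(X,Y; W) = {mi:.3f} bits")  # prints 1.000

        # Constraint: given W = w, p(x,y|w) must factor as p(x|w) * p(y|w).
        for w in range(2):
            cond = p[w] / p_w[w]                             # p(x,y|w)
            factored = np.outer(cond.sum(axis=1), cond.sum(axis=0))
            assert np.allclose(cond, factored)
        print("X and Y are conditionally independent given W")

    Since the paper shows C(X; Y) >= I(X; Y) in general, and here I(X; Y) = H(X) = 1 bit, the 1-bit value computed above is indeed the minimum for this example.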