DocumentCode
922966
Title
The common information of two dependent random variables
Author
Wyner, Aaron D.
Volume
21
Issue
2
fYear
1975
fDate
3/1/1975
Firstpage
163
Lastpage
179
Abstract
The problem of finding a meaningful measure of the "common information" or "common randomness" of two discrete dependent random variables $X, Y$ is studied. The quantity $C(X;Y)$ is defined as the minimum possible value of $I(X,Y;W)$, where the minimum is taken over all distributions defining an auxiliary random variable $W \in \mathcal{W}$, a finite set, such that $X, Y$ are conditionally independent given $W$. The main result of the paper is contained in two theorems which show that $C(X;Y)$ is i) the minimum $R_0$ such that a sequence of independent copies of $(X,Y)$ can be efficiently encoded into three binary streams $W_0, W_1, W_2$ with rates $R_0, R_1, R_2$, respectively, with $X$ recovered from $(W_0, W_1)$ and $Y$ recovered from $(W_0, W_2)$, i.e., $W_0$ is the common stream; ii) the minimum binary rate $R$ of the common input to independent processors that generate an approximation to $X, Y$.
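The quantity $C(X;Y)$ defined in the abstract has no simple closed form, but it is bracketed by two elementary information quantities: $I(X;Y) \le C(X;Y) \le \min(H(X), H(Y))$ (the lower bound follows from the Markov chain $X - W - Y$; the upper bound from choosing $W = X$). A minimal Python sketch, illustrative only and not from the paper, computes these bounds from a joint pmf:

```python
import math

def entropy(p):
    """Shannon entropy in bits of a probability vector."""
    return -sum(x * math.log2(x) for x in p if x > 0)

def mutual_information(joint):
    """I(X;Y) in bits, from a joint pmf given as a 2-D list (rows = X)."""
    px = [sum(row) for row in joint]                  # marginal of X
    py = [sum(col) for col in zip(*joint)]            # marginal of Y
    h_joint = entropy([p for row in joint for p in row])
    return entropy(px) + entropy(py) - h_joint

def common_information_bounds(joint):
    """Elementary bounds I(X;Y) <= C(X;Y) <= min(H(X), H(Y))."""
    px = [sum(row) for row in joint]
    py = [sum(col) for col in zip(*joint)]
    return mutual_information(joint), min(entropy(px), entropy(py))

# X = Y uniform on {0, 1}: both bounds meet at 1 bit, so C(X;Y) = 1.
lo, hi = common_information_bounds([[0.5, 0.0], [0.0, 0.5]])
```

When $X = Y$ the two bounds coincide at $H(X)$; for genuinely noisy dependence they are strict in general, and the paper's theorems characterize the exact value between them.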
Keywords
Information rates; Random variables; Source coding; Binary sequences; Entropy; Equations; Probability distribution; Random sequences
fLanguage
English
Journal_Title
IEEE Transactions on Information Theory
Publisher
IEEE
ISSN
0018-9448
Type
jour
DOI
10.1109/TIT.1975.1055346
Filename
1055346
Link To Document