DocumentCode
923910
Title
A conditional entropy bound for a pair of discrete random variables
Author
Witsenhausen, Hans S. ; Wyner, Aaron D.
Volume
21
Issue
5
fYear
1975
fDate
September 1, 1975
Firstpage
493
Lastpage
501
Abstract
Let $(X, Y)$ be a pair of discrete random variables with a given joint probability distribution. For $0 \le h \le H(X)$, the entropy of $X$, define the function $F(h)$ as the infimum of $H(Y \mid W)$, the conditional entropy of $Y$ given $W$, with respect to all discrete random variables $W$ such that a) $H(X \mid W) = h$, and b) $W$ and $Y$ are conditionally independent given $X$. This paper concerns the function $F$, its properties, its calculation, and its applications to several problems in information theory.
Keywords
Entropy functions; Random variables; Entropy; Information theory; Probability distribution; Stochastic processes; Writing
fLanguage
English
Journal_Title
IEEE Transactions on Information Theory
Publisher
IEEE
ISSN
0018-9448
Type
jour
DOI
10.1109/TIT.1975.1055437
Filename
1055437
Link To Document