DocumentCode :
923910
Title :
A conditional entropy bound for a pair of discrete random variables
Author :
Witsenhausen, Hans S. ; Wyner, Aaron D.
Volume :
21
Issue :
5
fYear :
1975
fDate :
9/1/1975
Firstpage :
493
Lastpage :
501
Abstract :
Let $X, Y$ be a pair of discrete random variables with a given joint probability distribution. For $0 \leq x \leq H(X)$, the entropy of $X$, define the function $F(x)$ as the infimum of $H(Y \mid W)$, the conditional entropy of $Y$ given $W$, with respect to all discrete random variables $W$ such that a) $H(X \mid W) = x$, and b) $W$ and $Y$ are conditionally independent given $X$. This paper concerns the function $F$, its properties, its calculation, and its applications to several problems in information theory.
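In standard notation, the quantity described in the abstract can be restated as the following constrained infimum (a restatement for clarity; the condition that $W$ and $Y$ are conditionally independent given $X$ is equivalent to $W \to X \to Y$ forming a Markov chain):
\[
F(x) \;=\; \inf \bigl\{\, H(Y \mid W) \;:\; H(X \mid W) = x,\ \ W \to X \to Y \,\bigr\},
\qquad 0 \leq x \leq H(X),
\]
where the infimum is taken over all discrete random variables $W$ jointly distributed with $(X, Y)$.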
Keywords :
Entropy functions; Random variables; Entropy; Information theory; Probability distribution; Stochastic processes; Writing;
fLanguage :
English
Journal_Title :
Information Theory, IEEE Transactions on
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/TIT.1975.1055437
Filename :
1055437