DocumentCode :
1386933
Title :
On characterization of entropy function via information inequalities
Author :
Zhang, Zhen; Yeung, Raymond W.
Author_Institution :
Dept. of Electr. Eng. Syst., Univ. of Southern California, Los Angeles, CA, USA
Volume :
44
Issue :
4
fYear :
1998
fDate :
7/1/1998
Firstpage :
1440
Lastpage :
1452
Abstract :
Given $n$ discrete random variables $\Omega=\{X_1,\ldots,X_n\}$, associated with any subset $\alpha$ of $\{1,2,\ldots,n\}$ there is a joint entropy $H(X_\alpha)$, where $X_\alpha=\{X_i : i\in\alpha\}$. This can be viewed as a function defined on $2^{\{1,2,\ldots,n\}}$ taking values in $[0,+\infty)$. We call this function the entropy function of $\Omega$. The nonnegativity of the joint entropies implies that this function is nonnegative; the nonnegativity of the conditional joint entropies implies that it is nondecreasing; and the nonnegativity of the conditional mutual informations implies that it has the following property: for any two subsets $\alpha$ and $\beta$ of $\{1,2,\ldots,n\}$, $H_\Omega(\alpha)+H_\Omega(\beta)\ge H_\Omega(\alpha\cup\beta)+H_\Omega(\alpha\cap\beta)$. These properties are the so-called basic information inequalities of Shannon's information measures. Do these properties fully characterize the entropy function? To make this question precise, we view an entropy function as a $(2^n-1)$-dimensional vector whose coordinates are indexed by the nonempty subsets of the ground set $\{1,2,\ldots,n\}$. Let $\Gamma_n$ be the cone in $\mathbb{R}^{2^n-1}$ consisting of all vectors that have these three properties when viewed as functions defined on $2^{\{1,2,\ldots,n\}}$. Let $\Gamma_n^*$ be the set of all $(2^n-1)$-dimensional vectors that correspond to the entropy functions of some set of $n$ discrete random variables. The question can be restated as: is it true that for any $n$, $\bar{\Gamma}_n^*=\Gamma_n$? Here $\bar{\Gamma}_n^*$ stands for the closure of the set $\Gamma_n^*$. The answer is “yes” when $n=2$ and $3$, as proved in our previous work. Based on intuition, one may tend to believe that the answer should be “yes” for any $n$. The main discovery of this paper is a new information-theoretic inequality involving four discrete random variables which gives a negative answer to this fundamental problem in information theory: $\bar{\Gamma}_n^*$ is strictly smaller than $\Gamma_n$ whenever $n>3$. While this new inequality gives a nontrivial outer bound on the cone $\bar{\Gamma}_4^*$, an inner bound for $\bar{\Gamma}_4^*$ is also given. The inequality is also extended to any number of random variables.
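The three basic properties above are straightforward to verify numerically. Below is a minimal sketch (not from the paper; the helper names entropy_function and satisfies_basic_inequalities are illustrative) that computes the entropy function of a given joint pmf as a vector indexed by the nonempty subsets of $\{1,\ldots,n\}$ and checks nonnegativity, monotonicity, and submodularity, i.e., membership in the cone $\Gamma_n$:
```python
# A minimal sketch, assuming the joint distribution is given as a dict
# mapping n-tuples of outcomes to probabilities. Entropies are in bits.
from itertools import combinations
from math import log2

def entropy_function(pmf, n):
    """Return {alpha: H(X_alpha)} for every nonempty subset alpha of {0,...,n-1}."""
    h = {}
    for r in range(1, n + 1):
        for alpha in combinations(range(n), r):
            # Marginalize the joint pmf onto the coordinates in alpha.
            marginal = {}
            for outcome, p in pmf.items():
                key = tuple(outcome[i] for i in alpha)
                marginal[key] = marginal.get(key, 0.0) + p
            h[alpha] = -sum(p * log2(p) for p in marginal.values() if p > 0)
    return h

def satisfies_basic_inequalities(h, tol=1e-12):
    """Check the three basic inequalities defining the cone Gamma_n."""
    subsets = list(h)
    # 1. Nonnegativity: H(alpha) >= 0.
    if any(h[a] < -tol for a in subsets):
        return False
    for a in subsets:
        for b in subsets:
            # 2. Monotonicity: alpha a subset of beta implies H(alpha) <= H(beta).
            if set(a) <= set(b) and h[a] > h[b] + tol:
                return False
            # 3. Submodularity: H(a) + H(b) >= H(a union b) + H(a intersect b),
            #    with H of the empty set taken to be 0.
            union = tuple(sorted(set(a) | set(b)))
            inter = tuple(sorted(set(a) & set(b)))
            if h[a] + h[b] + tol < h[union] + (h[inter] if inter else 0.0):
                return False
    return True

# Example: two independent fair bits.
pmf = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
h = entropy_function(pmf, 2)
print(h)                                  # {(0,): 1.0, (1,): 1.0, (0, 1): 2.0}
print(satisfies_basic_inequalities(h))    # True: every entropic vector lies in Gamma_n
```
Note that such a test only certifies membership in $\Gamma_n$, not entropicness: the paper's four-variable inequality, commonly cited in the form $2I(X_3;X_4)\le I(X_1;X_2)+I(X_1;X_3,X_4)+3I(X_3;X_4\,|\,X_1)+I(X_3;X_4\,|\,X_2)$, excludes points of $\Gamma_4$ that pass all of the checks above.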
Keywords :
entropy; functional analysis; random processes; set theory; Shannon's information measures; conditional joint entropies; conditional mutual information; cone; discrete random variables; entropy function; ground set; information-theoretic inequality; joint entropy; nondecreasing function; nonempty subsets; nonnegative function; nontrivial outer bound; Information theory; Mutual information; Random variables
fLanguage :
English
Journal_Title :
Information Theory, IEEE Transactions on
Publisher :
ieee
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/18.681320
Filename :
681320