DocumentCode :
2522037
Title :
The entropy region for three Gaussian random variables
Author :
Hassibi, Babak ; Shadbakht, Sormeh
Author_Institution :
Department of Electr. Eng., California Inst. of Technol., Pasadena, CA
fYear :
2008
fDate :
6-11 July 2008
Firstpage :
2634
Lastpage :
2638
Abstract :
Given n (discrete or continuous) random variables X_i, the (2^n − 1)-dimensional vector obtained by evaluating the joint entropy of all non-empty subsets of {X_1, …, X_n} is called an entropic vector. Determining the region of entropic vectors is an important open problem in information theory. Recently, Chan has shown that the entropy regions for discrete and continuous random variables, though different, can be determined from one another. An important class of continuous random variables are those that are vector-valued and jointly Gaussian. It is known that Gaussian random variables violate the Ingleton bound, which many random variables, such as those obtained from linear codes over finite fields, do satisfy, and that they also achieve certain non-Shannon inequalities. In this paper we give a full characterization of the entropy region for three jointly Gaussian vector-valued random variables and, rather surprisingly, show that this region is strictly smaller than the entropy region for three arbitrary random variables. However, we also show the following result: for any given entropic vector h ∈ R^7, there exists a θ* > 0 such that for all θ ≥ θ*, the vector (1/θ)h can be generated by three vector-valued jointly Gaussian random variables. This implies that for three random variables the region of entropic vectors can be obtained by considering the cone generated by the space of Gaussian entropic vectors. It also suggests that studying Gaussian random variables for n ≥ 4 may be a fruitful approach to studying the space of entropic vectors for arbitrary n.
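Illustrative_Code :
To make the entropic-vector definition above concrete, the Python sketch below computes the (2^3 − 1) = 7 joint differential entropies of three scalar jointly Gaussian random variables from their covariance matrix, using the standard formula h(X_S) = (1/2) log((2πe)^|S| det Σ_S). This is a minimal illustration only, not code from the paper: the paper treats vector-valued Gaussians, and the function name and the example covariance matrix are assumptions chosen for demonstration.

import numpy as np
from itertools import combinations

def gaussian_entropic_vector(Sigma):
    """Entropic vector of jointly Gaussian scalars with covariance Sigma.

    For a non-empty subset S, the differential entropy is
        h(X_S) = 0.5 * log((2*pi*e)^|S| * det(Sigma_S)),
    where Sigma_S is the principal submatrix of Sigma indexed by S.
    Returns a dict mapping each non-empty subset (as an index tuple)
    to its joint entropy, i.e. the 2^n - 1 entries of the entropic vector.
    """
    n = Sigma.shape[0]
    h = {}
    for k in range(1, n + 1):
        for S in combinations(range(n), k):
            sub = Sigma[np.ix_(S, S)]  # principal submatrix for subset S
            h[S] = 0.5 * np.log((2 * np.pi * np.e) ** k * np.linalg.det(sub))
    return h

# Hypothetical example: three unit-variance Gaussians, pairwise correlation 0.5.
Sigma = np.array([[1.0, 0.5, 0.5],
                  [0.5, 1.0, 0.5],
                  [0.5, 0.5, 1.0]])
for S, val in gaussian_entropic_vector(Sigma).items():
    print(S, round(val, 4))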
Keywords :
Gaussian processes; entropy; set theory; Ingleton bound; continuous random variables; discrete random variables; entropic vectors; entropy region; information theory; jointly Gaussian vector-valued random variables; non-Shannon inequalities; non-empty subsets; Cramer-Rao bounds; Entropy; Galois fields; Information theory; Linear code; Mutual information; Probability density function; Random variables; Vectors;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
2008 IEEE International Symposium on Information Theory (ISIT 2008)
Conference_Location :
Toronto, ON
Print_ISBN :
978-1-4244-2256-2
Electronic_ISBN :
978-1-4244-2257-9
Type :
conf
DOI :
10.1109/ISIT.2008.4595469
Filename :
4595469