DocumentCode :
1015966
Title :
Generalized Entropy Power Inequalities and Monotonicity Properties of Information
Author :
Madiman, Mokshay ; Barron, Andrew
Author_Institution :
Yale Univ., New Haven
Volume :
53
Issue :
7
fYear :
2007
fDate :
1 July 2007
Firstpage :
2317
Lastpage :
2329
Abstract :
New families of Fisher information and entropy power inequalities for sums of independent random variables are presented. These inequalities relate the information in the sum of n independent random variables to the information contained in sums over subsets of the random variables, for an arbitrary collection of subsets. As a consequence, a simple proof of the monotonicity of information in central limit theorems is obtained, both for independent and identically distributed (i.i.d.) summands and, more generally, for independent summands with variance-standardized sums.
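The monotonicity of information mentioned in the abstract can be stated, for the i.i.d. case, in the following standard form (a sketch for orientation; the notation is illustrative and not taken from the paper itself):

```latex
% For i.i.d. random variables X_1, X_2, ... with finite variance,
% the differential entropy H of the variance-standardized sum is
% non-decreasing in n:
\[
  H\!\left(\frac{X_1 + \cdots + X_n}{\sqrt{n}}\right)
  \;\le\;
  H\!\left(\frac{X_1 + \cdots + X_{n+1}}{\sqrt{n+1}}\right).
\]
% By the central limit theorem, the standardized sums converge to a
% Gaussian, which maximizes entropy under a variance constraint, so
% the entropy increases monotonically toward its Gaussian limit.
```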
Keywords :
entropy; information theory; random functions; set theory; Fisher information; arbitrary collection; central limit theorems; entropy power inequalities; independent random variables; monotonicity properties; subsets; variance-standardized sums; Conferences; Convergence; Cramer-Rao bounds; Entropy; Gaussian distribution; Information theory; Probability density function; Random variables; Statistics; Central limit theorem; entropy power; information inequalities;
fLanguage :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/TIT.2007.899484
Filename :
4252338