DocumentCode :
1410602
Title :
Information Theoretic Proofs of Entropy Power Inequalities
Author :
Rioul, Olivier
Author_Institution :
Inst. Telecom, Telecom ParisTech, Paris, France
Volume :
57
Issue :
1
fYear :
2011
Firstpage :
33
Lastpage :
55
Abstract :
While most useful information theoretic inequalities can be deduced from the basic properties of entropy or mutual information, Shannon's entropy power inequality (EPI) has so far been an exception: existing information theoretic proofs of the EPI hinge on representations of differential entropy using either Fisher information or minimum mean-square error (MMSE), which are derived from de Bruijn's identity. In this paper, we first present a unified view of these proofs, showing that they share two essential ingredients: 1) a data processing argument applied to a covariance-preserving linear transformation; 2) an integration over a path of a continuous Gaussian perturbation. Using these ingredients, we develop a new and brief proof of the EPI through a mutual information inequality, which replaces Stam and Blachman's Fisher information inequality (FII) and an inequality for MMSE by Guo, Shamai, and Verdú used in earlier proofs. The resulting proof has the advantage of being very simple in that it relies only on the basic properties of mutual information. These ideas are then generalized to various extended versions of the EPI: Zamir and Feder's generalized EPI for linear transformations of the random variables, Takano and Johnson's EPI for dependent variables, Liu and Viswanath's covariance-constrained EPI, and Costa's concavity inequality for the entropy power.
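For context, the two classical statements the abstract refers to, rendered in LaTeX; these are the standard textbook forms, added here as a sketch and not quoted from the paper itself. The EPI states that for independent random vectors $X, Y \in \mathbb{R}^n$ with densities,
\[
  N(X + Y) \;\ge\; N(X) + N(Y),
  \qquad
  N(X) = \frac{1}{2\pi e}\, e^{2h(X)/n},
\]
where $h(\cdot)$ denotes differential entropy and $N(\cdot)$ the entropy power. De Bruijn's identity, from which the Fisher information and MMSE representations of entropy are derived, reads in the scalar case
\[
  \frac{d}{dt}\, h\bigl(X + \sqrt{t}\, Z\bigr) \;=\; \tfrac{1}{2}\, J\bigl(X + \sqrt{t}\, Z\bigr),
\]
where $Z$ is a standard Gaussian independent of $X$ and $J(\cdot)$ is Fisher information; integrating such identities along the Gaussian perturbation path $t \mapsto X + \sqrt{t}\,Z$ is the second ingredient the abstract describes.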
Keywords :
Gaussian processes; covariance analysis; entropy; integration; least mean squares methods; perturbation techniques; Fisher information inequality; MMSE; concavity inequality; continuous Gaussian perturbation; covariance-constrained EPI; covariance-preserving linear transformation; data processing argument; de Bruijn's identity; differential entropy; entropy power inequality; information theoretic inequality; information theoretic proofs; integration; minimum mean-square error; random variables; Covariance matrix; Data processing; Entropy; Estimation; Markov processes; Random variables; Symmetric matrices; Data processing inequality; Fisher information; Fisher information inequality (FII); de Bruijn's identity; differential entropy; divergence; entropy power inequality (EPI); minimum mean-square error (MMSE); mutual information; relative entropy
fLanguage :
English
Journal_Title :
Information Theory, IEEE Transactions on
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/TIT.2010.2090193
Filename :
5673809