Title :
A generalization of the entropy power inequality with applications
Author :
Zamir, Ram; Feder, Meir
Author_Institution :
Dept. of Electr. Eng., Tel Aviv Univ., Israel
fDate :
9/1/1993
Abstract :
The authors prove the following generalization of the entropy power inequality: h(Ax) ⩾ h(Ax̃), where h(·) denotes (joint) differential entropy, x = (x₁, ..., xₙ) is a random vector with independent components, x̃ = (x̃₁, ..., x̃ₙ) is a Gaussian vector with independent components such that h(x̃ᵢ) = h(xᵢ), i = 1, ..., n, and A is any matrix. This generalization of the entropy power inequality is applied to show that a non-Gaussian vector with independent components becomes "closer" to Gaussianity after a linear transformation, where the distance to Gaussianity is measured by the information divergence. Another application is a lower bound, greater than zero, on the mutual information between nonoverlapping spectral components of a non-Gaussian white process. The authors also describe a dual generalization of the Fisher information inequality.
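As a brief illustrative sketch (not part of the original record), the classical entropy power inequality can be recovered as a special case of the stated result by taking A to be the 1×n matrix of all ones, so that Ax is the sum of the components:

```latex
% Sketch: recovering the classical EPI from h(Ax) >= h(A x~).
% Assumptions: x = (x_1,...,x_n) has independent components, and
% x~ is Gaussian with independent components, h(x~_i) = h(x_i).
% Take A = (1, 1, ..., 1), so Ax = x_1 + ... + x_n.
\[
  h\!\left(\sum_{i=1}^{n} x_i\right)
  \;\ge\;
  h\!\left(\sum_{i=1}^{n} \tilde{x}_i\right)
  = \frac{1}{2}\log\!\Big(2\pi e \sum_{i=1}^{n} \sigma_i^2\Big),
\]
% where \sigma_i^2 is the variance of the Gaussian \tilde{x}_i that
% matches the entropy of x_i:
\[
  \sigma_i^2 = \frac{1}{2\pi e}\, e^{2 h(x_i)} .
\]
% Substituting and exponentiating both sides gives
\[
  e^{2 h\left(\sum_i x_i\right)} \;\ge\; \sum_{i=1}^{n} e^{2 h(x_i)},
\]
% which is the classical entropy power inequality
% N(x_1 + ... + x_n) >= N(x_1) + ... + N(x_n).
```

The reduction uses only the fact that a sum of independent Gaussians is Gaussian, so its differential entropy is determined by the sum of the component variances.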
Keywords :
information theory; matrix algebra; spectral analysis; Fisher information inequality; Gaussian vector; differential entropy; entropy power inequality; independent components; information divergence; linear transformation; lower bound; matrix; mutual information; non-Gaussian vector; non-Gaussian white process; nonoverlapping spectral components; random vector; Covariance matrix; Cramer-Rao bounds; Entropy; Gaussian distribution; Gaussian processes; Information theory; Linear matrix inequalities; Power measurement; Upper bound; Vectors;
Journal_Title :
IEEE Transactions on Information Theory