Title :
A conditional entropy power inequality for dependent variables
Author_Institution :
Centre for Mathematical Sciences, University of Cambridge, UK
Abstract :
We provide a condition under which a version of Shannon's entropy power inequality holds for dependent variables. We first establish a Fisher information inequality extending the one known in the independent case. The key ingredients are a conditional expectation representation for the score function of a sum, and the de Bruijn identity, which relates entropy and Fisher information.
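For context, the classical independent-case statements that the abstract refers to can be written as follows (with $h$ denoting differential entropy and $J$ Fisher information); this is a sketch of the standard results only, not the paper's dependent-variable condition:

```latex
% Entropy power and Shannon's entropy power inequality (X, Y independent):
\[
  N(X) := \frac{1}{2\pi e}\, e^{2h(X)}, \qquad
  N(X+Y) \;\ge\; N(X) + N(Y).
\]
% Fisher information inequality (X, Y independent):
\[
  \frac{1}{J(X+Y)} \;\ge\; \frac{1}{J(X)} + \frac{1}{J(Y)}.
\]
% De Bruijn identity, relating entropy and Fisher information along
% a Gaussian perturbation (Z standard normal, independent of X):
\[
  \frac{\partial}{\partial t}\, h\!\left(X + \sqrt{t}\,Z\right)
  \;=\; \tfrac{1}{2}\, J\!\left(X + \sqrt{t}\,Z\right).
\]
```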
Keywords :
entropy; information theory; Fisher information inequality; Shannon's entropy; conditional entropy power inequality; entropy power inequality; de Bruijn identity; dependent variables; score function; convolution; Cramér-Rao bounds; integral equations; random variables
Journal_Title :
IEEE Transactions on Information Theory
DOI :
10.1109/TIT.2004.831790