DocumentCode :
1205962
Title :
On the maximum entropy of the sum of two dependent random variables
Author :
Cover, Thomas M. ; Zhang, Zhen
Author_Institution :
Dept. of Electr. Eng., Stanford Univ., CA, USA
Volume :
40
Issue :
4
fYear :
1994
fDate :
7/1/1994
Firstpage :
1244
Lastpage :
1246
Abstract :
We investigate the maximization of the differential entropy h(X+Y) of two arbitrarily dependent random variables X and Y under the constraint that X and Y have the same fixed marginal density f. We show that max[h(X+Y)]=h(2X) if and only if f is log-concave, and that the maximum is achieved when X=Y. If f is not log-concave, the maximum is strictly greater than h(2X). For example, identically distributed Gaussian random variables have log-concave densities and therefore satisfy max[h(X+Y)]=h(2X), with the maximum attained at X=Y. More general inequalities in this direction should lead to capacity bounds for additive noise channels with feedback.
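Illustration (not part of the original record): a minimal numerical sketch of the Gaussian case mentioned in the abstract. For X, Y jointly Gaussian with equal N(0,1) marginals and correlation rho, the sum X+Y is N(0, 2+2*rho), and the differential entropy of a zero-mean Gaussian with variance s^2 is 0.5*ln(2*pi*e*s^2); since the Gaussian density is log-concave, h(X+Y) is maximized at rho=1, i.e. X=Y, where it equals h(2X). The joint-Gaussian coupling and all names below are illustrative assumptions, not taken from the paper.

# Sketch: entropy of X+Y over jointly Gaussian couplings with N(0,1) marginals.
import math

def gaussian_entropy(variance):
    # Differential entropy (in nats) of a zero-mean Gaussian with the given variance.
    return 0.5 * math.log(2 * math.pi * math.e * variance)

for rho in (-0.5, 0.0, 0.5, 1.0):
    h_sum = gaussian_entropy(2 + 2 * rho)  # Var(X+Y) = 2 + 2*rho for correlation rho
    print(f"rho = {rho:+.1f}   h(X+Y) = {h_sum:.4f} nats")

# h(2X): 2X ~ N(0, 4) when X ~ N(0, 1); matches the rho = 1 line above.
print(f"h(2X)      = {gaussian_entropy(4.0):.4f} nats")

Running this shows h(X+Y) increasing in rho and coinciding with h(2X) at rho=1, consistent with the paper's result that X=Y attains the maximum for a log-concave marginal.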
Keywords :
entropy; information theory; optimisation; additive noise channels; capacity bounds; dependent random variables; differential entropy; entropy power inequality; feedback; fixed equal marginal densities; identically distributed Gaussian random variables; log-concave density; maximum entropy; Additive noise; Cramer-Rao bounds; Density measurement; Entropy; Feedback; Noise measurement; Random variables; Statistics; Sufficient conditions; Upper bound;
fLanguage :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/18.335945
Filename :
335945