Author_Institution :
Institute of Network Coding, The Chinese University of Hong Kong, Hong Kong, China
Abstract :
Let X be an arbitrary continuous random variable and let Z be an independent Gaussian random variable with zero mean and unit variance. For t > 0, Costa proved that e^{2h(X + √t Z)} is concave in t, where the proof hinged on the first and second order derivatives of h(X + √t Z). In particular, these two derivatives are signed: (∂/∂t)h(X + √t Z) ≥ 0 and (∂²/∂t²)h(X + √t Z) ≤ 0. In this paper, we show that the third order derivative of h(X + √t Z) is nonnegative, which implies that the Fisher information J(X + √t Z) is convex in t. We further show that the fourth order derivative of h(X + √t Z) is nonpositive. Motivated by the first four derivatives, we make two conjectures on h(X + √t Z): the first is that (∂ⁿ/∂tⁿ)h(X + √t Z) is nonnegative in t if n is odd and nonpositive otherwise; the second is that log J(X + √t Z) is convex in t. The first conjecture can be rephrased in the language of completely monotone functions: J(X + √t Z) is completely monotone in t. The first conjecture dates back to a problem in mathematical physics studied by McKean in 1966. Apart from these results, we give a geometric interpretation of the covariance-preserving transformation and study the concavity of h(√t X + √(1 − t) Z), revealing its connection with Costa's entropy power inequality.
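
As an illustrative sanity check (ours, not from the paper), the Gaussian case admits closed forms: if X ~ N(0, s), then h(X + √t Z) = ½ log(2πe(s + t)) and, by de Bruijn's identity, J(X + √t Z) = 1/(s + t). The sign pattern of the derivatives and the complete monotonicity of J can then be verified symbolically; the sympy sketch below does exactly this, with the variable names s and t chosen by us.

# Sanity check in the Gaussian case X ~ N(0, s), where closed forms exist:
#   h(X + sqrt(t) Z) = (1/2) log(2*pi*e*(s + t)),  J(X + sqrt(t) Z) = 1/(s + t).
# Illustrative only; not taken from the paper.
import sympy as sp

t, s = sp.symbols('t s', positive=True)
h = sp.Rational(1, 2) * sp.log(2 * sp.pi * sp.E * (s + t))  # differential entropy
J = 2 * sp.diff(h, t)                                       # de Bruijn: dh/dt = J/2

# Conjectured sign pattern: odd-order derivatives of h are >= 0, even-order <= 0.
for n in range(1, 5):
    d = sp.simplify(sp.diff(h, t, n))
    print(f"d^{n}h/dt^{n} = {d}")  # 1/(2(s+t)), -1/(2(s+t)^2), 1/(s+t)^3, -3/(s+t)^4

# Complete monotonicity of J: (-1)^n d^n J/dt^n >= 0 for every n >= 0.
for n in range(5):
    d = sp.simplify((-1)**n * sp.diff(J, t, n))
    assert d.subs({s: 1, t: 1}) > 0  # equals n!/(s+t)^(n+1), strictly positive
    print(f"(-1)^{n} d^{n}J/dt^{n} = {d}")

In this Gaussian case dⁿh/dtⁿ = (−1)^{n−1}(n−1)!/(2(s + t)ⁿ) and log J = −log(s + t) is convex, so both conjectures hold trivially; the substance of the paper lies in the non-Gaussian case.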
Keywords :
covariance analysis; entropy; Costa's entropy power inequality; Fisher information; arbitrary continuous random variable; covariance-preserving transformation; higher order derivative; independent Gaussian random variable; mathematical physics; completely monotone function; Costa's EPI; differential entropy; heat equation; McKean's problem; network coding; random variables