Title :
On mutual information, likelihood ratios, and estimation error for the additive Gaussian channel
Author_Institution :
Department of Electrical Engineering, Technion - Israel Institute of Technology, Haifa
Abstract :
This paper considers the model of an arbitrarily distributed signal x observed through additive independent white Gaussian noise w, i.e., y = x + w. New relations between the minimal mean-square error of the noncausal estimator and the likelihood ratio between y and w are derived. This is followed by an extended version of a recently derived relation between the mutual information I(x;y) and the minimal mean-square error. These results are applied to derive infinite-dimensional versions of the Fisher information and the de Bruijn identity. A comparison between the causal and noncausal estimation errors yields a restricted form of the logarithmic Sobolev inequality. The derivation of the results is based on the Malliavin calculus.
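For orientation, the following is a minimal sketch of the scalar (finite-dimensional) prototype of the mutual information-MMSE relation referred to above, written under the assumption that the signal x has finite variance and is observed at signal-to-noise ratio snr through standard Gaussian noise; the infinite-dimensional, continuous-time versions developed in the paper are not reproduced here.
% Scalar-channel sketch (assumed prototype; the paper treats the infinite-dimensional case).
\[
  y_{\mathrm{snr}} = \sqrt{\mathrm{snr}}\, x + w, \qquad w \sim \mathcal{N}(0,1) \ \text{independent of } x,
\]
\[
  \frac{\mathrm{d}}{\mathrm{d}\,\mathrm{snr}}\, I\bigl(x; y_{\mathrm{snr}}\bigr)
  = \tfrac{1}{2}\,\mathrm{mmse}(\mathrm{snr}),
  \qquad
  \mathrm{mmse}(\mathrm{snr}) = \mathbb{E}\!\left[\bigl(x - \mathbb{E}[\,x \mid y_{\mathrm{snr}}\,]\bigr)^{2}\right].
\]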
Keywords :
AWGN channels; Gaussian channels; Gaussian noise; white noise; calculus; Malliavin calculus; entropy; relative entropy; mutual information; Fisher information; mean-square error methods; estimation error; minimal mean-square estimation error; nonlinear filtering; nonlinear filters; signal processing; noise measurement; arbitrarily distributed signal; independent white Gaussian noise; logarithmic Sobolev inequality; noncausal estimator
Journal_Title :
IEEE Transactions on Information Theory
DOI :
10.1109/TIT.2005.853297