DocumentCode :
2984104
Title :
Mismatched estimation and relative entropy
Author :
Verdú, Sergio
Author_Institution :
Dept. of Electr. Eng., Princeton Univ., Princeton, NJ, USA
fYear :
2009
fDate :
June 28 - July 3, 2009
Firstpage :
809
Lastpage :
813
Abstract :
A random variable with distribution P is observed in Gaussian noise and estimated by a minimum mean-square estimator that assumes the distribution is Q. This paper shows that the integral over all signal-to-noise ratios of the excess mean-square estimation error incurred by the mismatched estimator is twice the relative entropy D(P‖Q). This representation of relative entropy generalizes to non-real-valued random variables, and particularizes to a new general representation of mutual information in terms of conditional means. Inspired by the new representation, we also propose a definition of free relative entropy, which fills a gap in, and is consistent with, the literature on free probability.
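Code_Example :
A minimal numerical sketch (not part of the paper) illustrating the stated identity, that the integral over all SNRs of the excess mean-square error equals 2 D(P‖Q), in the one setting where everything has a closed form: a true Gaussian prior P = N(0, vP), an assumed Gaussian prior Q = N(0, vQ), and observation Y = sqrt(snr) X + N with N ~ N(0, 1). The variance values and the use of scipy.integrate.quad are illustrative assumptions, not the paper's method.

import numpy as np
from scipy.integrate import quad

# Assumed for illustration: true prior P = N(0, vP); the mismatched
# estimator instead assumes Q = N(0, vQ).
vP, vQ = 1.0, 2.0

def mse_mismatched(snr):
    # MSE, under P, of the conditional-mean estimator derived from Q.
    # With a Gaussian assumed prior this estimator is linear, so the
    # error evaluates in closed form.
    return (vP + snr * vQ**2) / (1.0 + snr * vQ)**2

def mmse(snr):
    # Minimum MSE under the true Gaussian prior P.
    return vP / (1.0 + snr * vP)

# Integrate the excess mean-square error over all signal-to-noise ratios.
integral, _ = quad(lambda g: mse_mismatched(g) - mmse(g), 0, np.inf)

# Closed-form relative entropy D(P||Q) between the two Gaussian priors.
D = 0.5 * (vP / vQ - 1.0 - np.log(vP / vQ))

print(integral, 2.0 * D)  # both print ~0.1931, agreeing up to quadrature error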
Keywords :
Gaussian noise; entropy codes; information theory; least mean squares methods; Gaussian noise; minimum mean-square estimator; mismatched estimation; relative entropy; Entropy; Estimation error; Estimation theory; Gaussian distribution; Gaussian noise; Information theory; Mutual information; Random variables; Signal to noise ratio; Uncertainty;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
2009 IEEE International Symposium on Information Theory (ISIT 2009)
Conference_Location :
Seoul, South Korea
Print_ISBN :
978-1-4244-4312-3
Electronic_ISBN :
978-1-4244-4313-0
Type :
conf
DOI :
10.1109/ISIT.2009.5205651
Filename :
5205651