DocumentCode :
109399
Title :
Information-Estimation Relationships Over Binomial and Negative Binomial Models
Author :
Taborda, Camilo G.; Guo, Dongning; Perez-Cruz, Fernando
Author_Institution :
Department of Signal Theory and Communications, Universidad Carlos III de Madrid, Leganés, Spain
Volume :
60
Issue :
5
fYear :
2014
fDate :
May 2014
Firstpage :
2630
Lastpage :
2646
Abstract :
In recent years, a number of new connections between information measures and estimation have been found under various models, most prominently the Gaussian and Poisson models. This paper develops analogous results for the binomial and negative binomial models. In particular, it is shown that the derivative of the relative entropy and the derivative of the mutual information under the binomial and negative binomial models can each be expressed as the expectation of a closed-form expression whose main argument is a conditional estimate. Under mild conditions, these derivatives take the form of an expected Bregman divergence.
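For context, a brief sketch of the two classical prototypes that the abstract alludes to, together with the Bregman-divergence form it invokes. The binomial and negative binomial formulas of the paper itself are not reproduced here; the symbols snr, α, \hat{X}, and φ follow standard usage in the information-estimation literature rather than this paper's notation.

```latex
% Context sketch: classical Gaussian and Poisson prototypes of the
% information-estimation relationships that the paper extends to binomial
% and negative binomial models. Notation (snr, alpha, \hat X, \varphi) is
% standard in this literature, not necessarily the paper's own.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

% Gaussian model (I-MMSE relation of Guo, Shamai, and Verdu): the derivative
% of the mutual information in snr equals half the minimum mean-square error.
\[
  \frac{\mathrm{d}}{\mathrm{d}\,\mathrm{snr}}\,
  I\bigl(X;\sqrt{\mathrm{snr}}\,X+N\bigr)
  \;=\; \tfrac{1}{2}\,\mathbb{E}\bigl[(X-\hat X)^{2}\bigr],
  \qquad \hat X = \mathbb{E}[X\mid Y],\quad N\sim\mathcal{N}(0,1).
\]

% Poisson model: with Y_alpha ~ Poisson(alpha X), the derivative is expressed
% through the conditional estimate via the function x -> x log x.
\[
  \frac{\mathrm{d}}{\mathrm{d}\alpha}\, I(X;Y_\alpha)
  \;=\; \mathbb{E}\bigl[X\log X - \hat X\log \hat X\bigr],
  \qquad \hat X = \mathbb{E}[X\mid Y_\alpha].
\]

% Both right-hand sides are expected Bregman divergences
% E[D_phi(X, \hat X)], with phi(x) = x^2/2 and phi(x) = x log x respectively,
% because the cross term E[(X - \hat X) phi'(\hat X)] vanishes by the
% orthogonality of the estimation error to functions of the observation.
\[
  D_\varphi(x,y) \;=\; \varphi(x)-\varphi(y)-\varphi'(y)\,(x-y).
\]

\end{document}
```

The paper's contribution, per the abstract, is that derivatives of the same kind under binomial and negative binomial observation models admit analogous closed-form expressions in the conditional estimate and, under mild conditions, the same expected-Bregman-divergence structure.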
Keywords :
Gaussian processes; stochastic processes; Gaussian models; Poisson models; closed-form expressions; expected Bregman divergence; information estimation relationships; mutual information; negative binomial models; Entropy; Estimation; Loss measurement; Mutual information; Random variables; Signal to noise ratio; Binomial model; Bregman divergence; mutual information; negative binomial model; relative entropy;
fLanguage :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/TIT.2014.2307070
Filename :
6746122