DocumentCode :
771470
Title :
On Divergences and Informations in Statistics and Information Theory
Author :
Liese, Friedrich; Vajda, Igor
Author_Institution :
Dept. of Math., Rostock Univ.
Volume :
52
Issue :
10
fYear :
2006
Firstpage :
4394
Lastpage :
4412
Abstract :
The paper deals with the f-divergences of Csiszár, which generalize the discrimination information of Kullback, the total variation distance, the Hellinger divergence, and the Pearson divergence. All basic properties of f-divergences, including their relations to decision errors, are proved in a new manner, replacing the classical Jensen inequality by a new generalized Taylor expansion of convex functions. Some new properties, e.g., relations to statistical sufficiency and deficiency, are also proved. The generalized Taylor expansion shows very easily that all f-divergences are average statistical informations (differences between prior and posterior Bayes errors), mutually differing only in the weights imposed on the various prior distributions. The statistical information introduced by De Groot and the classical information of Shannon are shown to be the extremal cases corresponding to α = 0 and α = 1 in the class of the so-called Arimoto α-informations, introduced in this paper for 0 < α < 1 by means of the Arimoto α-entropies. Some new examples of f-divergences are introduced as well, namely, the Shannon divergences and the Arimoto α-divergences, which lead to the Shannon divergences as α ↑ 1. The square roots of all these divergences are shown to be metrics satisfying the triangle inequality. The last section introduces statistical tests and estimators based on the minimal f-divergence from the empirical distribution achieved over families of hypothetical distributions. For the Kullback divergence this leads to the classical likelihood ratio test and estimator.
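For orientation, the Csiszár f-divergence referred to in the abstract has the following standard textbook form; this is a minimal sketch, and the paper itself works with a more general measure-theoretic formulation:
\[
  D_f(P, Q) \;=\; \int f\!\left(\frac{dP}{dQ}\right) dQ,
  \qquad f \text{ convex on } (0,\infty),\ f(1) = 0,
\]
with the special cases named in the abstract recovered by
\[
  f(t) = t \log t \ \text{(Kullback)}, \qquad
  f(t) = |t - 1| \ \text{(total variation)},
\]
\[
  f(t) = (\sqrt{t} - 1)^2 \ \text{(squared Hellinger)}, \qquad
  f(t) = (t - 1)^2 \ \text{(Pearson)}.
\]
The link to likelihood methods in the abstract's last sentence can be seen in a minimal worked instance: taking \( \hat P_n \) as the empirical distribution of a sample \( x_1, \dots, x_n \) and \( \{Q_\theta\} \) as the hypothetical family, the Kullback case \( f(t) = t \log t \) gives
\[
  \min_{\theta} D_f(\hat P_n, Q_\theta)
  \;\iff\;
  \max_{\theta} \frac{1}{n} \sum_{i=1}^{n} \log q_\theta(x_i),
\]
since \( D_f(\hat P_n, Q_\theta) \) differs from \( -\frac{1}{n}\sum_i \log q_\theta(x_i) \) only by the θ-independent entropy of \( \hat P_n \); minimum Kullback divergence estimation thus coincides with maximum likelihood estimation.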
Keywords :
decision theory; entropy; maximum likelihood estimation; statistical testing; Arimoto entropy; Kullback divergence; Shannon divergence; Taylor expansion; classical likelihood ratio estimator; convex function; decision error; discrimination information; information theory; statistical information; Automation; Electronic mail; Mathematics; Probability; Random variables; Statistics; Taylor series; Testing; Arimoto divergence; Arimoto information; Shannon information; deficiency; minimum; sufficiency;
fLanguage :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/TIT.2006.881731
Filename :
1705001