DocumentCode :
1779807
Title :
Justification of logarithmic loss via the benefit of side information
Author :
Jiao, Jiantao ; Courtade, Thomas ; Venkat, Kartik ; Weissman, Tsachy
Author_Institution :
Dept. of Electr. Eng., Stanford Univ., Stanford, CA, USA
fYear :
2014
fDate :
June 29 2014-July 4 2014
Firstpage :
946
Lastpage :
950
Abstract :
We consider a natural measure of the benefit of side information: the reduction in optimal estimation risk when side information is available to the estimator. When such a measure satisfies a natural data processing property, and the source alphabet has cardinality greater than two, we show that it is uniquely characterized by the optimal estimation risk under logarithmic loss, and the corresponding measure is equal to mutual information. Further, when the source alphabet is binary, we characterize the only admissible forms the measure of predictive benefit can assume. These results unify many causality measures in the literature as instantiations of directed information, and present a natural axiomatic characterization of mutual information without requiring the sum or recursivity property.
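The abstract's central claim can be checked numerically on a toy example: under logarithmic loss, the optimal predictor without side information is the marginal distribution of the source, with expected risk H(X); with side information Y it is the conditional distribution, with expected risk H(X|Y); the risk reduction is then exactly the mutual information I(X;Y). The sketch below uses an arbitrary illustrative joint distribution (the specific numbers are not from the paper).

```python
import math

# Illustrative joint distribution p(x, y) over binary alphabets
# (arbitrary values chosen for the demonstration).
p_xy = [[0.3, 0.1],
        [0.1, 0.5]]

p_x = [sum(row) for row in p_xy]                              # marginal of X
p_y = [sum(p_xy[x][y] for x in range(2)) for y in range(2)]   # marginal of Y

def entropy(dist):
    """Shannon entropy in bits."""
    return -sum(p * math.log2(p) for p in dist if p > 0)

# Optimal log-loss risk without side information: predict the marginal
# p(x); the expected logarithmic loss equals the entropy H(X).
risk_no_side = entropy(p_x)

# With side information Y, the optimal predictor is p(x | y); the
# expected logarithmic loss equals the conditional entropy H(X | Y).
risk_side = sum(p_y[y] * entropy([p_xy[x][y] / p_y[y] for x in range(2)])
                for y in range(2))

# Benefit of side information = risk reduction = I(X; Y).
benefit = risk_no_side - risk_side
mutual_info = sum(p_xy[x][y] * math.log2(p_xy[x][y] / (p_x[x] * p_y[y]))
                  for x in range(2) for y in range(2))
```

Here `benefit` and `mutual_info` coincide up to floating-point error, illustrating the equality H(X) - H(X|Y) = I(X;Y) that underlies the paper's characterization.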
Keywords :
decision theory; information theory; causality measure; logarithmic loss; natural axiomatic characterization; natural data processing property; optimal estimation risk reduction; predictive benefit measure; recursivity property; side information benefit measure; source alphabet; statistical decision theory; Data processing; Entropy; Estimation; Loss measurement; Mutual information
fLanguage :
English
Publisher :
IEEE
Conference_Title :
2014 IEEE International Symposium on Information Theory (ISIT)
Conference_Location :
Honolulu, HI
Type :
conf
DOI :
10.1109/ISIT.2014.6874972
Filename :
6874972