Title :
Game theory, maximum generalized entropy, minimum discrepancy, robust Bayes and Pythagoras
Author :
Grünwald, P.D. ; Dawid, A.P.
Author_Institution :
CWI, Amsterdam, Netherlands
Abstract :
Suppose that, for purposes of inductive inference or choosing an optimal decision, we wish to select a single distribution P* to act as representative of a class Γ of such distributions. The maximum entropy principle ("MaxEnt") (Jaynes 1989; Csiszár 1991) is widely applied for this purpose, but its rationale has often been controversial (Shimony 1985; Seidenfeld 1986). Here we emphasize and generalize a reinterpretation of the maximum entropy principle (Topsøe 1979; Walley 1991; Grünwald 1998): the distribution P* that maximizes the entropy over Γ also minimizes the worst-case expected logarithmic score (log loss). In the terminology of decision theory (Berger 1985), P* is a robust Bayes, or Γ-minimax, act when loss is measured by the log loss. This gives a decision-theoretic justification for maximum entropy.
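The minimax property stated above can be checked numerically in a toy setting. The following sketch (not from the paper; the support {0, 1, 2} and the moment constraint E[X] = 1.5 are illustrative assumptions) computes the MaxEnt distribution P* in Γ = {P : E_P[X] = 1.5} as a Gibbs distribution, then verifies that no other distribution Q achieves a lower worst-case expected log loss over Γ:

```python
import math
import random

# Support {0, 1, 2}; Γ = all pmfs with mean 1.5 (illustrative moment constraint).
# The MaxEnt member of Γ is a Gibbs distribution p_i ∝ exp(lam * i).

def gibbs(lam):
    w = [math.exp(lam * i) for i in range(3)]
    z = sum(w)
    return [wi / z for wi in w]

def mean(p):
    return sum(i * pi for i, pi in enumerate(p))

# Solve for the Lagrange multiplier lam by bisection (mean is increasing in lam).
lo, hi = -10.0, 10.0
for _ in range(100):
    mid = (lo + hi) / 2
    if mean(gibbs(mid)) < 1.5:
        lo = mid
    else:
        hi = mid
p_star = gibbs((lo + hi) / 2)

def entropy(p):
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# Expected log loss is linear in P, so the worst case over Γ is attained at
# the extreme points of Γ: here the two-point pmfs with mean 1.5.
extremes = [[0.0, 0.5, 0.5], [0.25, 0.0, 0.75]]

def worst_log_loss(q):
    eps = 1e-12
    return max(
        sum(pi * -math.log(max(qi, eps)) for pi, qi in zip(p, q))
        for p in extremes
    )

# P*'s worst-case log loss equals its entropy, and random alternatives Q
# never beat it (the Γ-minimax / robust Bayes property).
random.seed(0)
best = worst_log_loss(p_star)
for _ in range(2000):
    w = [random.random() for _ in range(3)]
    s = sum(w)
    q = [wi / s for wi in w]
    assert worst_log_loss(q) >= best - 1e-9
```

Because the log loss of P* is affine in x, every P ∈ Γ with the same mean incurs the same expected loss against P*, which is exactly why the MaxEnt act equalizes, and hence minimizes, the worst case.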
Keywords :
Bayes methods; decision theory; exponential distributions; game theory; information theory; maximum entropy methods; probability; Pythagoras; Pythagorean inequality; generalized relative entropy; inductive inference; maximum entropy principle; maximum generalized entropy; minimum discrepancy; probability mass function; robust Bayes; worst-case expected logarithmic score; constraint theory; delta modulation; entropy; loss measurement; minimax techniques; robustness; stochastic systems; terminology
Conference_Title :
Proceedings of the 2002 IEEE Information Theory Workshop
Print_ISBN :
0-7803-7629-3
DOI :
10.1109/ITW.2002.1115425