• DocumentCode
    51647
  • Title
    Rényi Divergence and Kullback-Leibler Divergence
  • Author
    van Erven, Tim ; Harremoës, Peter
  • Author_Institution
    Dept. of Math., Univ. Paris-Sud, Orsay, France
  • Volume
    60
  • Issue
    7
  • fYear
    2014
  • fDate
    July 2014
  • Firstpage
    3797
  • Lastpage
    3820
  • Abstract
    Rényi divergence is related to Rényi entropy much like Kullback-Leibler divergence is related to Shannon's entropy, and comes up in many settings. It was introduced by Rényi as a measure of information that satisfies almost the same axioms as Kullback-Leibler divergence, and depends on a parameter that is called its order. In particular, the Rényi divergence of order 1 equals the Kullback-Leibler divergence. We review and extend the most important properties of Rényi divergence and Kullback-Leibler divergence, including convexity, continuity, limits of σ-algebras, and the relation of the special order 0 to the Gaussian dichotomy and contiguity. We also show how to generalize the Pythagorean inequality to orders different from 1, and we extend the known equivalence between channel capacity and minimax redundancy to continuous channel inputs (for all orders) and present several other minimax results.
  • Keywords
    algebra; entropy; minimax techniques; σ-algebras; Gaussian contiguity; Gaussian dichotomy; Kullback-Leibler divergence; Pythagorean inequality; Rényi divergence; Rényi entropy; Shannon entropy; channel capacity; continuity; continuous channel inputs; convexity; minimax redundancy; Convergence; Data processing; Markov processes; Q measurement; Testing; α-divergence; Bhattacharyya distance; information divergence
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Information Theory
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/TIT.2014.2320500
  • Filename
    6832827