Title of article :
Rényi divergence measures for commonly used univariate continuous distributions
Author/Authors :
M. Gil, F. Alajaji, T. Linder
Issue Information :
Journal issue, 2013
Abstract :
Probabilistic ‘distances’ (also called divergences), which in some sense assess how ‘close’ two probability distributions are to one another, have been widely employed in probability, statistics, information theory, and related fields. Of particular importance, due to their generality and applicability, are the Rényi divergence measures. This paper presents closed-form expressions for the Rényi and Kullback–Leibler divergences for nineteen commonly used univariate continuous distributions, as well as for the multivariate Gaussian and Dirichlet distributions. In addition, a table summarizing four of the most important information measure rates for zero-mean stationary Gaussian processes, namely Rényi entropy, differential Shannon entropy, Rényi divergence, and Kullback–Leibler divergence, is presented. Lastly, a connection between the Rényi divergence and the variance of the log-likelihood ratio of two distributions is established, thereby extending a previous result by Song [J. Stat. Plan. Infer. 93 (2001)] on the relation between the Rényi entropy and the log-likelihood function. A table with the corresponding variance expressions for the univariate distributions considered here is also included.
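For concreteness, the Rényi divergence of order \alpha (\alpha > 0, \alpha \neq 1) between densities p and q is

D_\alpha(p \,\|\, q) = \frac{1}{\alpha - 1} \log \int p(x)^{\alpha} \, q(x)^{1-\alpha} \, dx,

with the Kullback–Leibler divergence recovered in the limit \alpha \to 1. As an illustration of the kind of closed-form expression the paper tabulates, the standard result for two univariate Gaussians can be checked against direct numerical integration. The sketch below is illustrative only: the function names are not from the paper, and the closed form shown is the widely cited Gaussian case, valid whenever the mixed variance \alpha\sigma_2^2 + (1-\alpha)\sigma_1^2 is positive.

import numpy as np
from scipy.integrate import quad

def renyi_gaussian(mu1, s1, mu2, s2, alpha):
    """Closed-form Renyi divergence D_alpha(N(mu1, s1^2) || N(mu2, s2^2)).

    Requires alpha > 0 and alpha != 1. Finite for 0 < alpha < 1; for
    alpha > 1 the mixed variance alpha*s2^2 + (1 - alpha)*s1^2 must be
    positive, otherwise the divergence is infinite.
    """
    var_a = alpha * s2**2 + (1 - alpha) * s1**2  # mixed variance
    if var_a <= 0:
        return np.inf
    return (alpha * (mu1 - mu2)**2 / (2 * var_a)
            + np.log(var_a / (s1**(2 * (1 - alpha)) * s2**(2 * alpha)))
              / (2 * (1 - alpha)))

def renyi_gaussian_numeric(mu1, s1, mu2, s2, alpha):
    """Direct evaluation of (1/(alpha - 1)) * log of the integral of
    p(x)^alpha * q(x)^(1 - alpha) by adaptive quadrature."""
    p = lambda x: np.exp(-(x - mu1)**2 / (2 * s1**2)) / (s1 * np.sqrt(2 * np.pi))
    q = lambda x: np.exp(-(x - mu2)**2 / (2 * s2**2)) / (s2 * np.sqrt(2 * np.pi))
    val, _ = quad(lambda x: p(x)**alpha * q(x)**(1 - alpha), -np.inf, np.inf)
    return np.log(val) / (alpha - 1)

if __name__ == "__main__":
    args = (0.0, 1.0, 1.0, 2.0, 0.5)       # mu1, s1, mu2, s2, alpha
    print(renyi_gaussian(*args))            # closed form
    print(renyi_gaussian_numeric(*args))    # quadrature check; should agree

For the parameters above both evaluations give approximately 0.3231, and letting \alpha \to 1 recovers the familiar Gaussian Kullback–Leibler divergence.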
Keywords :
Log-likelihood ratio , Kullback–Leibler divergence , Continuous distributions , Rényi divergence , Probabilistic distances , Rényi divergence rate
Journal title :
Information Sciences