• DocumentCode
    595051
  • Title
    Closed-form information-theoretic divergences for statistical mixtures
  • Author
    Nielsen, Frank
  • fYear
    2012
  • fDate
    11-15 Nov. 2012
  • Firstpage
    1723
  • Lastpage
    1726
  • Abstract
    Statistical mixtures such as Rayleigh, Wishart or Gaussian mixture models are commonly used in pattern recognition and signal processing tasks. Since the Kullback-Leibler divergence between any two such mixture models does not admit an analytical expression, the relative entropy can only be approximated numerically using time-consuming Monte-Carlo stochastic sampling. This drawback has motivated the quest for alternative information-theoretic divergences such as the recent Jensen-Rényi, Cauchy-Schwarz, or total square loss divergences that bypass the numerical approximations by providing exact analytic expressions. In this paper, we state sufficient conditions on the mixture distribution family so that these novel non-KL statistical divergences between any two such mixtures can be expressed in generic closed-form formulas.
  • Keywords
    Monte Carlo methods; approximation theory; entropy; pattern recognition; sampling methods; statistical distributions; stochastic processes; Kullback-Leibler divergence; Monte Carlo stochastic sampling; information-theoretic divergence; non-KL statistical divergence; numerical approximation; relative entropy; signal processing; statistical mixture distribution; sufficient conditions; closed-form solutions; Gaussian mixture model; Laplace equations; shape
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Title
    2012 21st International Conference on Pattern Recognition (ICPR)
  • Conference_Location
    Tsukuba
  • ISSN
    1051-4651
  • Print_ISBN
    978-1-4673-2216-4
  • Type
    conf
  • Filename
    6460482
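
As an illustration of the contrast drawn in the abstract above: the Kullback-Leibler divergence between two Gaussian mixtures has no analytic expression and is typically estimated by Monte Carlo sampling, whereas the Cauchy-Schwarz divergence between Gaussian mixtures does admit a closed form. The sketch below is not taken from the paper; all function names and parameter values are illustrative, and it assumes univariate Gaussian mixtures. The closed-form cross term uses the standard Gaussian identity ∫ N(x; μ1, v1) N(x; μ2, v2) dx = N(μ1; μ2, v1 + v2).

    import numpy as np

    def gauss_pdf(x, mu, var):
        # Univariate Gaussian density N(x; mu, var).
        return np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2.0 * np.pi * var)

    def mixture_pdf(x, weights, means, variances):
        # Density of a univariate Gaussian mixture evaluated at x.
        return sum(w * gauss_pdf(x, m, v)
                   for w, m, v in zip(weights, means, variances))

    def kl_monte_carlo(p, q, rng, n=200_000):
        # Monte Carlo estimate of KL(p || q): sample from p, average log(p/q).
        weights, means, variances = (np.asarray(a, dtype=float) for a in p)
        comp = rng.choice(len(weights), size=n, p=weights)
        x = rng.normal(means[comp], np.sqrt(variances[comp]))
        return float(np.mean(np.log(mixture_pdf(x, *p) / mixture_pdf(x, *q))))

    def cauchy_schwarz_closed_form(p, q):
        # Closed-form Cauchy-Schwarz divergence between two Gaussian mixtures:
        # D_CS(p, q) = -log( <p, q> / sqrt(<p, p> <q, q>) ), where
        # <p, q> = sum_ij w_i v_j N(mu_i; m_j, var_i + var_j).
        def cross(a, b):
            total = 0.0
            for wi, mi, vi in zip(*a):
                for wj, mj, vj in zip(*b):
                    total += wi * wj * gauss_pdf(mi, mj, vi + vj)
            return total
        return -np.log(cross(p, q) / np.sqrt(cross(p, p) * cross(q, q)))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Each mixture is given as (weights, means, variances); values are illustrative.
        p = ([0.3, 0.7], [-1.0, 2.0], [0.5, 1.0])
        q = ([0.5, 0.5], [0.0, 3.0], [1.0, 0.8])
        print("Monte Carlo estimate of KL(p || q):", kl_monte_carlo(p, q, rng))
        print("Closed-form Cauchy-Schwarz D_CS(p, q):", cauchy_schwarz_closed_form(p, q))

The Monte Carlo estimate requires many samples and is stochastic, while the Cauchy-Schwarz value is computed exactly from the mixture parameters in a double loop over components, which is the kind of closed-form evaluation the paper generalizes to other mixture families and divergences.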