• DocumentCode
    2564446
  • Title
    Kullback-Leibler divergence rate between probability distributions on sets of different cardinalities
  • Author
    Vidyasagar, M.
  • Author_Institution
    Dept. of Bioeng., Univ. of Texas at Dallas, Richardson, TX, USA
  • fYear
    2010
  • fDate
    15-17 Dec. 2010
  • Firstpage
    948
  • Lastpage
    953
  • Abstract
    In this paper we generalize the familiar notion of the Kullback-Leibler divergence between two probability distributions on a finite set to the case where the two probability distributions are defined on sets of different cardinalities. This is achieved by 'dilating' the distribution on the smaller set to one on the larger set. Then, using the log sum inequality, we give a formula for this divergence that permits all computations to be carried out on the set of smaller cardinality. However, computing the divergence is still in general an intractable problem. Hence we give a greedy algorithm for quickly obtaining an upper bound for this divergence. The ideas given here can be used to study the concept of an optimally aggregated model of a Markov or hidden Markov process; this is done in a companion paper.
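    A minimal sketch of the reduction described in the abstract, under my reading of it: the optimal dilation spreads each mass of the smaller-set distribution over its block in proportion to the larger-set distribution, so that by the log sum inequality the divergence collapses to an ordinary Kullback-Leibler divergence computed entirely on the smaller set. The function names, the aggregation map phi, and the toy numbers are hypothetical illustrations, not the paper's notation.

    import numpy as np

    def kl_divergence(p: np.ndarray, q: np.ndarray) -> float:
        # Ordinary D(p || q) for two distributions on a common finite set.
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))

    def divergence_via_aggregation(p: np.ndarray, q: np.ndarray, phi: np.ndarray) -> float:
        # p lives on a set of size n, q on a smaller set of size m, and
        # phi: {0,...,n-1} -> {0,...,m-1} is one candidate aggregation map.
        # Dilating q proportionally to p within each block phi^{-1}(k) and applying
        # the log sum inequality reduces the divergence to D(P || q), where P
        # aggregates p over the blocks, so all arithmetic stays on the m-point set.
        P = np.zeros(q.shape[0])
        np.add.at(P, phi, p)  # P[k] = sum of p[i] over all i with phi[i] == k
        return kl_divergence(P, q)

    # Toy usage: p on a 4-point set, q on a 2-point set, one candidate map phi.
    p = np.array([0.1, 0.2, 0.3, 0.4])
    q = np.array([0.5, 0.5])
    phi = np.array([0, 0, 1, 1])
    print(divergence_via_aggregation(p, q, phi))

    The divergence studied in the paper also involves a minimization over the choice of aggregation, which is what makes exact computation intractable in general and motivates the paper's greedy upper-bound algorithm; the sketch above evaluates only a single fixed map.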
  • Keywords
    Markov processes; entropy; set theory; statistical distributions; Kullback-Leibler divergence rate; Shannon entropy; cardinality; greedy algorithm; hidden Markov process; log sum inequality; probability distribution; Complexity theory; Entropy; Greedy algorithms; Markov processes; Mutual information; Probability distribution
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    49th IEEE Conference on Decision and Control (CDC), 2010
  • Conference_Location
    Atlanta, GA
  • ISSN
    0743-1546
  • Print_ISBN
    978-1-4244-7745-6
  • Type
    conf
  • DOI
    10.1109/CDC.2010.5716982
  • Filename
    5716982