• DocumentCode
    970319
  • Title
    The Kullback-Leibler divergence rate between Markov sources
  • Author
    Rached, Ziad; Alajaji, Fady; Campbell, L. Lorne
  • Author_Institution
    Dept. of Math. & Stat., Queen's Univ., Kingston, Ont., Canada
  • Volume
    50
  • Issue
    5
  • fYear
    2004
  • fDate
    5/1/2004 12:00:00 AM
  • Firstpage
    917
  • Lastpage
    921
  • Abstract
    In this work, we provide a computable expression for the Kullback-Leibler divergence rate lim_{n→∞} (1/n) D(p^(n) || q^(n)) between two time-invariant finite-alphabet Markov sources of arbitrary order and arbitrary initial distributions described by the probability distributions p^(n) and q^(n), respectively. We illustrate it numerically and examine its rate of convergence. The main tools used to obtain the Kullback-Leibler divergence rate and its rate of convergence are the theory of nonnegative matrices and Perron-Frobenius theory. Similarly, we provide a formula for the Shannon entropy rate lim_{n→∞} (1/n) H(p^(n)) of Markov sources and examine its rate of convergence.
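    For illustration only (not part of the record): in the well-known first-order, irreducible special case, the divergence rate reduces to Σ_i π_i Σ_j P_ij log(P_ij / Q_ij), where π is the stationary distribution of P, and the entropy rate to -Σ_i π_i Σ_j P_ij log P_ij. A minimal Python sketch of that special case follows; the paper's general result for arbitrary order and arbitrary initial distributions (obtained via Perron-Frobenius theory) is not reproduced here.

    import numpy as np

    def stationary_distribution(P):
        # Left Perron (stationary) vector of an irreducible row-stochastic matrix P.
        w, v = np.linalg.eig(P.T)
        pi = np.real(v[:, np.argmax(np.real(w))])
        pi = np.abs(pi)
        return pi / pi.sum()

    def kl_divergence_rate(P, Q):
        # KL divergence rate (nats/symbol) between first-order chains with
        # transition matrices P and Q; assumes Q[i, j] > 0 wherever P[i, j] > 0.
        pi = stationary_distribution(P)
        rate = 0.0
        for i in range(P.shape[0]):
            for j in range(P.shape[1]):
                if P[i, j] > 0:
                    rate += pi[i] * P[i, j] * np.log(P[i, j] / Q[i, j])
        return rate

    def entropy_rate(P):
        # Shannon entropy rate (nats/symbol) of a first-order chain P.
        pi = stationary_distribution(P)
        h = 0.0
        for i in range(P.shape[0]):
            for j in range(P.shape[1]):
                if P[i, j] > 0:
                    h -= pi[i] * P[i, j] * np.log(P[i, j])
        return h

    # Example: two binary Markov sources.
    P = np.array([[0.9, 0.1], [0.2, 0.8]])
    Q = np.array([[0.7, 0.3], [0.4, 0.6]])
    print(kl_divergence_rate(P, Q))  # divergence rate of P relative to Q
    print(entropy_rate(P))           # entropy rate of P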
  • Keywords
    Markov processes; convergence of numerical methods; entropy; matrix algebra; Kullback-Leibler divergence rate; Perron-Frobenius theory; Shannon entropy rate; arbitrary initial distributions; convergence rate; decision theory; nonnegative matrices theory; pattern recognition; time-invariant finite-alphabet Markov sources; Convergence of numerical methods; Councils; Decision theory; Distributed computing; Entropy; Mathematics; Pattern recognition; Probability distribution; Statistical distributions; Statistics;
  • fLanguage
    English
  • Journal_Title
    Information Theory, IEEE Transactions on
  • Publisher
    ieee
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/TIT.2004.826687
  • Filename
    1291741