• DocumentCode
    1374477
  • Title
    Fast Computation of the Kullback–Leibler Divergence and Exact Fisher Information for the First-Order Moving Average Model
  • Author
    Makalic, Enes; Schmidt, Daniel F.
  • Author_Institution
    Centre for MEGA Epidemiology, Univ. of Melbourne, Carlton, VIC, Australia
  • Volume
    17
  • Issue
    4
  • fYear
    2010
  • fDate
    April 1, 2010
  • Firstpage
    391
  • Lastpage
    393
  • Abstract
    In this note, expressions are derived that allow computation of the Kullback–Leibler (K-L) divergence between two first-order Gaussian moving average models in O_n(1) time as the sample size n → ∞. These expressions can also be used to evaluate the exact Fisher information matrix in O_n(1) time, and provide a basis for an asymptotic expression of the K-L divergence.
  • Keywords
    Gaussian processes; information theory; moving average processes; Kullback–Leibler divergence; first-order Gaussian moving average models; Fisher information; moving average models
  • fLanguage
    English
  • Journal_Title
    Signal Processing Letters, IEEE
  • Publisher
    IEEE
  • ISSN
    1070-9908
  • Type
    jour
  • DOI
    10.1109/LSP.2009.2039659
  • Filename
    5371931