• DocumentCode
    28704
  • Title
    Universal Estimation of Directed Information
  • Author
    Jiantao Jiao; Haim H. Permuter; Lei Zhao; Young-Han Kim; Tsachy Weissman
  • Author_Institution
    Dept. of Electr. Eng., Stanford Univ., Stanford, CA, USA
  • Volume
    59
  • Issue
    10
  • fYear
    2013
  • fDate
    Oct. 2013
  • Firstpage
    6220
  • Lastpage
    6242
  • Abstract
    Four estimators of the directed information rate between a pair of jointly stationary ergodic finite-alphabet processes are proposed, based on universal probability assignments. The first one is a Shannon-McMillan-Breiman-type estimator, similar to those used by Verdú in 2005 and Cai in 2006 for estimation of other information measures. We show the almost sure and L1 convergence properties of the estimator for any underlying universal probability assignment. The other three estimators map universal probability assignments to different functionals, each exhibiting relative merits such as smoothness, nonnegativity, and boundedness. We establish the consistency of these estimators in almost sure and L1 senses, and derive near-optimal rates of convergence in the minimax sense under mild conditions. These estimators carry over directly to estimating other information measures of stationary ergodic finite-alphabet processes, such as entropy rate and mutual information rate, with near-optimal performance and provide alternatives to classical approaches in the existing literature. Guided by these theoretical results, the proposed estimators are implemented using the context-tree weighting algorithm as the universal probability assignment. Experiments on synthetic and real data are presented, demonstrating the potential of the proposed schemes in practice and the utility of directed information estimation in detecting and measuring causal influence and delay.
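The first estimator described in the abstract can be sketched in a few lines. This is a minimal illustration only, not the paper's implementation: it replaces the context-tree weighting probability assignment with simple Krichevsky-Trofimov sequential estimators over fixed-order contexts, assumes binary alphabets, and all function names are hypothetical.

```python
import math
from collections import defaultdict

def kt_prob(counts, symbol):
    """Krichevsky-Trofimov sequential probability of `symbol` (0 or 1)
    given the counts of symbols seen so far in this context."""
    total = counts[0] + counts[1]
    return (counts[symbol] + 0.5) / (total + 1.0)

def directed_info_rate(x, y, order=1):
    """Shannon-McMillan-Breiman-type estimate of the directed information
    rate from x to y:
        (1/n) * sum_i [ log2 q(y_i | y^{i-1}, x^i) - log2 q(y_i | y^{i-1}) ],
    where q is a sequential probability assignment. Here a KT estimator
    over fixed-order Markov contexts stands in for the CTW assignment
    used in the paper."""
    n = len(y)
    cond_counts = defaultdict(lambda: [0, 0])  # (recent y's, current x) -> symbol counts
    marg_counts = defaultdict(lambda: [0, 0])  # (recent y's,)          -> symbol counts
    total = 0.0
    for i in range(n):
        yctx = tuple(y[max(0, i - order):i])
        cctx = (yctx, x[i])
        # Predict y_i before observing it, with and without causal access to x^i.
        total += math.log2(kt_prob(cond_counts[cctx], y[i]))
        total -= math.log2(kt_prob(marg_counts[yctx], y[i]))
        # Then update both models with the observed symbol.
        cond_counts[cctx][y[i]] += 1
        marg_counts[yctx][y[i]] += 1
    return total / n
```

When y is a copy of an unpredictable x, the causally conditioned model predicts y almost perfectly while the marginal model cannot, so the estimate approaches 1 bit; for independent sequences it hovers near 0.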
  • Keywords
    causality; delays; entropy; estimation theory; probability; Shannon-McMillan-Breiman-type estimator; causal influence measurement; context-tree weighting algorithm; delay; directed information rate estimation; entropy rate; mutual information rate; near-optimal performance; stationary ergodic finite-alphabet process; universal probability assignment; causal influence; context-tree weighting (CTW); directed information; rate of convergence
  • fLanguage
    English
  • Journal_Title
    IEEE Transactions on Information Theory
  • Publisher
    IEEE
  • ISSN
    0018-9448
  • Type
    jour
  • DOI
    10.1109/TIT.2013.2267934
  • Filename
    6555871