• DocumentCode
    3594830
  • Title
    On complexity and randomness of Markov-chain prediction
  • Author
    Ratsaby, J.
  • Author_Institution
    Dept. of Electr. & Electron. Eng., Ariel Univ. of Samaria, Ariel, Israel
  • fYear
    2015
  • Firstpage
    1
  • Lastpage
    5
  • Abstract
    Let {Xt : t ∈ ℤ} be a sequence of binary random variables generated by a stationary Markov source of order k*. Let β be the probability of the event “Xt = 1”. Consider a learner based on a Markov model of order k, where k may differ from k*, that trains on a sample sequence X(m) randomly drawn by the source. Test the learner's performance by asking it to predict the bits of a test sequence X(n) (generated by the source). An error occurs at time t if the prediction Yt differs from the true bit value Xt, 1 ≤ t ≤ n. Denote by Ξ(n) the sequence of errors, where the error bit Ξt at time t equals 1 or 0 according to whether an error occurred or not, respectively. Consider the subsequence Ξ(ν) of Ξ(n) that corresponds to the errors made when predicting a 0, that is, Ξ(ν) consists of those bits Ξt at times t where Yt = 0. In this paper we compute an upper bound on the absolute deviation between the frequency of 1 in Ξ(ν) and β. The bound has an explicit dependence on k, k*, m, ν, n. It shows that the larger k, or the larger the difference k - k*, the less random Ξ(ν) can become.
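    The following is a minimal simulation sketch of the setup described in the abstract, assuming a toy order-k* source (a parity-based transition rule chosen only for illustration) and a simple frequency-count order-k predictor. All function names, parameters, and the particular transition probabilities are hypothetical and not taken from the paper; the sketch only mirrors the construction of Ξ(ν) and the deviation |freq(1 in Ξ(ν)) - β|.

        import random
        from collections import defaultdict

        def sample_markov_source(length, kstar, p_one, rng):
            # Toy stationary order-k* binary source (hypothetical, not the paper's):
            # the chance of emitting a 1 depends on the parity of the last k* bits.
            bits = [rng.randint(0, 1) for _ in range(kstar)]
            for _ in range(length):
                context = tuple(bits[-kstar:])
                p = p_one if sum(context) % 2 == 0 else 1.0 - p_one
                bits.append(1 if rng.random() < p else 0)
            return bits[kstar:]

        def train_order_k_predictor(train_bits, k):
            # Count, for each length-k context, how often the next bit is 0 or 1.
            counts = defaultdict(lambda: [0, 0])
            for t in range(k, len(train_bits)):
                counts[tuple(train_bits[t - k:t])][train_bits[t]] += 1
            return counts

        def predict(counts, context):
            # Majority-vote prediction for this context; default to 0 if unseen.
            zeros, ones = counts.get(tuple(context), [1, 0])
            return 1 if ones > zeros else 0

        rng = random.Random(0)
        kstar, k, m, n, p_one = 2, 4, 5000, 5000, 0.7

        x_train = sample_markov_source(m, kstar, p_one, rng)     # training sequence X(m)
        x_test = sample_markov_source(n + k, kstar, p_one, rng)  # test sequence X(n) plus warm-up context

        counts = train_order_k_predictor(x_train, k)

        # Error sequence Xi(n): Xi_t = 1 iff the prediction Y_t differs from X_t.
        # Xi(nu) keeps only those error bits at times t where Y_t = 0.
        xi_nu = []
        for t in range(k, len(x_test)):
            y_t = predict(counts, x_test[t - k:t])
            if y_t == 0:
                xi_nu.append(int(y_t != x_test[t]))

        # Empirical stand-in for beta = P(X_t = 1) of the source.
        beta = sum(x_test) / len(x_test)
        freq_ones = sum(xi_nu) / max(1, len(xi_nu))
        print(f"nu = {len(xi_nu)}, |freq(1 in Xi(nu)) - beta| = {abs(freq_ones - beta):.4f}")

    Running the sketch with k larger than k* (as above) gives one concrete instance of the quantity the paper bounds; the paper's result controls how far this deviation can be, as a function of k, k*, m, ν, n.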
  • Keywords
    Markov processes; binary sequences; Markov-chain prediction; binary random variables; stationary Markov source; Complexity theory; Random variables; Stability analysis; Testing; Training
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Title
    Information Theory Workshop (ITW), 2015 IEEE
  • Print_ISBN
    978-1-4799-5524-4
  • Type
    conf
  • DOI
    10.1109/ITW.2015.7133078
  • Filename
    7133078