• DocumentCode
    1629779
  • Title
    Universal algorithms: Building a case for pointwise convergence
  • Author
    Santhanam, Narayana; Anantharam, Venkat

  • Author_Institution
    Dept. of ECE, Univ. of Hawaii, Honolulu, HI, USA
  • fYear
    2012
  • Firstpage
    489
  • Lastpage
    493
  • Abstract
    We consider algorithms for prediction, compression, and entropy estimation in a universal setup. In each case, we estimate some function of an unknown distribution p over the set of natural numbers, using only n observations generated i.i.d. from p. While p is unknown, it belongs to a known collection P of possible models. When the supports of the distributions in P are uniformly bounded, consistent algorithms exist for each of these problems; namely, the convergence of the estimate to the true value can be bounded by a function of the sample size n alone, not of the underlying distribution p. However, when the supports of the distributions in P are not uniformly bounded, a more natural approach involves algorithms that are pointwise consistent, namely, algorithms whose convergence to the true value occurs at a rate that depends on both n and the underlying (unknown) distribution p. The practical difficulty with pointwise convergence is obvious: since the underlying distribution is unknown, the asymptotic consistency of the algorithm may indicate nothing about its performance at any fixed sample size. In this paper, we first note that for many complex model classes P, we can still circumvent this practical difficulty with pointwise convergence. Second, we take a preliminary step toward characterizing a broad framework that establishes a hierarchy of difficulty among problems involving pointwise convergence. We look for connections among the following problems, each defined in the pointwise convergence scenario: (i) predicting good upper bounds on the next unseen sample, (ii) weak universal compression, and (iii) entropy estimation. We construct counterexamples showing that no two of these properties imply the third.
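    The entropy-estimation setting above can be illustrated with the standard plug-in estimator (not a construction from this paper): compute the empirical distribution of the n i.i.d. samples and evaluate its entropy. Over a class P whose supports are not uniformly bounded, this estimator is pointwise consistent at a rate that depends on the unknown p, which is exactly the regime the abstract discusses. A minimal sketch:

    ```python
    from collections import Counter
    import math

    def plugin_entropy(samples):
        """Plug-in entropy estimate (in nats) from i.i.d. samples.

        Builds the empirical distribution p_hat and returns
        H(p_hat) = -sum_x p_hat(x) * log p_hat(x).
        """
        n = len(samples)
        counts = Counter(samples)
        return -sum((c / n) * math.log(c / n) for c in counts.values())

    # Example: a fair coin (entropy log 2 nats); with many samples the
    # estimate is close, but the rate of convergence depends on the
    # underlying distribution, not on n alone.
    estimate = plugin_entropy([0, 1] * 5000)
    ```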
  • Keywords
    computational complexity; entropy; estimation theory; asymptotic consistency; complex model classes; entropy estimation; pointwise convergence; universal algorithms; universal compression; accuracy; convergence; estimation; insurance; length measurement; prediction algorithms; non-parametric approaches; prediction; strong and weak universal compression
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    2012 50th Annual Allerton Conference on Communication, Control, and Computing (Allerton)
  • Conference_Location
    Monticello, IL
  • Print_ISBN
    978-1-4673-4537-8
  • Type
    conf
  • DOI
    10.1109/Allerton.2012.6483258
  • Filename
    6483258