DocumentCode :
3663396
Title :
Minimax estimation of discrete distributions
Author :
Yanjun Han;Jiantao Jiao;Tsachy Weissman
Author_Institution :
Tsinghua University, China
fYear :
2015
fDate :
6/1/2015
Firstpage :
2291
Lastpage :
2295
Abstract :
We analyze the problem of discrete distribution estimation under ℓ1 loss. We provide non-asymptotic upper and lower bounds on the maximum risk of the empirical distribution (the maximum likelihood estimator), and the minimax risk in regimes where the alphabet size S may grow with the number of observations n. We show that among distributions with bounded entropy H, the asymptotic maximum risk for the empirical distribution is 2H / ln n, while the asymptotic minimax risk is H / ln n. Moreover, a hard-thresholding estimator, whose threshold does not depend on the unknown upper bound H, is asymptotically minimax. We draw connections between our work and the literature on density estimation, entropy estimation, total variation distance (ℓ1 divergence) estimation, joint distribution estimation in stochastic processes, normal mean estimation, and adaptive estimation.
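The two estimators compared in the abstract can be sketched in a small simulation: the empirical distribution (the maximum likelihood estimator) versus a hard-thresholding variant that zeroes out small mass estimates and renormalizes. The Zipf-like source, the alphabet size, the sample size, and the threshold value below are illustrative assumptions, and the thresholding rule is a simplification rather than the paper's exact estimator.

```python
import random

def l1_loss(p, q):
    """Sum of absolute differences between two distributions (the l1 loss)."""
    return sum(abs(pi - qi) for pi, qi in zip(p, q))

def empirical(samples, S):
    """Empirical distribution (MLE): normalized symbol counts over alphabet size S."""
    n = len(samples)
    counts = [0] * S
    for x in samples:
        counts[x] += 1
    return [c / n for c in counts]

def hard_threshold(p_hat, t):
    """Illustrative hard thresholding: zero out estimates below t, renormalize.
    The threshold choice is a stand-in, not the paper's prescription."""
    q = [pi if pi >= t else 0.0 for pi in p_hat]
    z = sum(q)
    return [qi / z for qi in q] if z > 0 else p_hat

# Illustrative setup (assumed, not from the paper): a Zipf-like truth
# over a large alphabet, with fewer samples than symbols (S grows with n).
random.seed(0)
S, n = 2000, 500
weights = [1.0 / (i + 1) for i in range(S)]
Z = sum(weights)
p = [w / Z for w in weights]

samples = random.choices(range(S), weights=p, k=n)
p_mle = empirical(samples, S)
p_thr = hard_threshold(p_mle, 1.5 / n)  # discards singleton counts (illustrative)

print(f"l1 risk, empirical distribution: {l1_loss(p, p_mle):.3f}")
print(f"l1 risk, hard-thresholded:       {l1_loss(p, p_thr):.3f}")
```

Both estimates remain valid probability vectors after renormalization; the ℓ1 loss between two distributions is bounded by 2, matching the total-variation connection the abstract draws.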
Publisher :
ieee
Conference_Titel :
2015 IEEE International Symposium on Information Theory (ISIT)
Electronic_ISSN :
2157-8117
Type :
conf
DOI :
10.1109/ISIT.2015.7282864
Filename :
7282864