DocumentCode :
890275
Title :
Information-theoretic upper and lower bounds for statistical estimation
Author :
Zhang, Tong
Author_Institution :
Yahoo Inc., New York, NY
Volume :
52
Issue :
4
fYear :
2006
fDate :
4/1/2006 12:00:00 AM
Firstpage :
1307
Lastpage :
1321
Abstract :
In this paper, we establish upper and lower bounds for some statistical estimation problems through concise information-theoretic arguments. Our upper bound analysis is based on a simple yet general inequality which we call the information exponential inequality. We show that this inequality naturally leads to a general randomized estimation method, for which performance upper bounds can be obtained. The lower bounds, applicable to all statistical estimators, are obtained by original applications of some well-known information-theoretic inequalities, and approximately match the obtained upper bounds for various important problems. Moreover, our framework can be regarded as a natural generalization of the standard minimax framework, in that we allow the performance of the estimator to vary for different possible underlying distributions according to a predefined prior.
Keywords :
information theory; minimax techniques; random processes; statistical analysis; general randomized estimation method; information-theoretic inequality; lower bound analysis; standard minimax framework; statistical estimation; upper bound analysis; Additives; Bayesian methods; Helium; Information analysis; Minimax techniques; Pattern recognition; Probability; Random variables; Statistical learning; Upper bound; Gibbs algorithm; PAC-Bayes; lower bound; minimax; randomized estimation; statistical estimation;
fLanguage :
English
Journal_Title :
Information Theory, IEEE Transactions on
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/TIT.2005.864439
Filename :
1614067