DocumentCode :
1451891
Title :
Universal and Composite Hypothesis Testing via Mismatched Divergence
Author :
Unnikrishnan, Jayakrishnan ; Huang, Dayu ; Meyn, Sean P. ; Surana, Amit ; Veeravalli, Venugopal V.
Author_Institution :
Dept. of Electr. & Comput. Eng., Univ. of Illinois at Urbana-Champaign, Urbana, IL, USA
Volume :
57
Issue :
3
fYear :
2011
fDate :
3/1/2011
Firstpage :
1587
Lastpage :
1603
Abstract :
For the universal hypothesis testing problem, where the goal is to decide between the known null hypothesis distribution and some other unknown distribution, Hoeffding proposed a universal test in the 1960s. Hoeffding's universal test statistic can be written in terms of the Kullback-Leibler (K-L) divergence between the empirical distribution of the observations and the null hypothesis distribution. In this paper, a modification of Hoeffding's test is considered, based on a relaxation of the K-L divergence referred to as the mismatched divergence. The resulting mismatched test is shown to be a generalized likelihood-ratio test (GLRT) for the case where the alternate distribution lies in a parametric family of distributions characterized by a finite-dimensional parameter, i.e., it is a solution to the corresponding composite hypothesis testing problem. For certain choices of the alternate distribution, it is shown that both the Hoeffding test and the mismatched test have the same asymptotic performance in terms of error exponents. A consequence of this result is that the GLRT is optimal in differentiating a particular distribution from others in an exponential family. It is also shown that the mismatched test has a significant advantage over the Hoeffding test in terms of finite-sample-size performance in applications involving large-alphabet distributions. This advantage is due to the difference in the asymptotic variances of the two test statistics under the null hypothesis.
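Example_Sketch :
As a minimal illustration of the two statistics described in the abstract (not taken from the paper; the basis functions psi, the threshold eta, and all names below are assumptions for the sketch), the Hoeffding statistic is the K-L divergence between the empirical distribution and the null, while the mismatched statistic restricts the supremum in the variational representation D(mu||pi) = sup_f { mu(f) - log pi(e^f) } to a finite-dimensional linear family of functions, which is the sense in which it is a relaxation.

import numpy as np
from scipy.optimize import minimize

def kl_divergence(mu, pi):
    # D(mu || pi) for pmfs on a finite alphabet; with mu the empirical
    # distribution this is the Hoeffding universal test statistic.
    mask = mu > 0
    return float(np.sum(mu[mask] * np.log(mu[mask] / pi[mask])))

def mismatched_divergence(mu, pi, psi):
    # Mismatched divergence: the variational supremum over all functions f
    # is replaced by a maximum over f = r . psi, where the rows of psi
    # (shape (d, alphabet_size)) span a finite-dimensional function class.
    def neg_obj(r):
        f = r @ psi                                  # f(a) = sum_i r_i psi_i(a)
        return -(mu @ f - np.log(pi @ np.exp(f)))
    res = minimize(neg_obj, np.zeros(psi.shape[0]))  # objective is concave in r
    return float(-res.fun)

# Toy usage: uniform null on an alphabet of size 8, two assumed basis functions.
rng = np.random.default_rng(0)
A = 8
pi = np.full(A, 1.0 / A)                             # null hypothesis distribution
x = rng.integers(0, A, size=200)                     # i.i.d. observations
emp = np.bincount(x, minlength=A) / x.size           # empirical distribution
psi = np.vstack([np.arange(A), np.arange(A) ** 2]).astype(float)
eta = 0.1                                            # illustrative threshold
print(kl_divergence(emp, pi) >= eta)                 # Hoeffding test: reject H0 if True
print(mismatched_divergence(emp, pi, psi) >= eta)    # mismatched test: reject H0 if True

Because the maximization runs over a smaller function class, the mismatched statistic never exceeds the K-L statistic, consistent with its role as a relaxation of Hoeffding's test.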
Keywords :
statistical distributions; statistical testing; Hoeffding test; Kullback-Leibler divergence; composite hypothesis testing; finite-dimensional parameter; generalized likelihood-ratio test; mismatched divergence; null hypothesis distribution; universal hypothesis testing; Approximation methods; Convergence; Entropy; Robustness; Source coding; Testing; Training; Generalized likelihood-ratio test; Kullback–Leibler (K-L) information; hypothesis testing; online detection;
fLanguage :
English
Journal_Title :
IEEE Transactions on Information Theory
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/TIT.2011.2104670
Filename :
5714276