Title :
k-nearest neighbor estimation of entropies with confidence
Author :
Sricharan, Kumar ; Raich, Raviv ; Hero, Alfred O., III
Author_Institution :
Dept. of EECS, Univ. of Michigan, Ann Arbor, MI, USA
Date :
July 31 - Aug. 5, 2011
Abstract :
We analyze a k-nearest neighbor (k-NN) class of plug-in estimators of Shannon entropy and Rényi entropy. Based on the statistical properties of k-NN balls, we derive explicit rates for the bias and variance of these plug-in estimators in terms of the sample size, the dimension of the samples, and the underlying probability distribution. In addition, we establish a central limit theorem for the plug-in estimator that allows us to specify confidence intervals on the entropy functionals. As an application, we use our theory in anomaly detection problems to specify thresholds for achieving desired false alarm rates.
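Example :
A minimal Python sketch of a k-NN plug-in entropy estimator of the kind the abstract describes: the density at each sample is estimated from the volume of its leave-one-out k-NN ball, and the Shannon or Rényi functional is obtained by plugging this density estimate into a sample average. This is an illustrative assumption of the general construction, not the authors' exact estimator or bias correction; the function names, the choice of k, and the leave-one-out convention are our own.

import numpy as np
from scipy.special import gammaln, logsumexp
from scipy.spatial import cKDTree

def _knn_log_density(X, k):
    """Leave-one-out k-NN log-density estimate at each sample point."""
    X = np.asarray(X, dtype=float)
    n, d = X.shape
    dists, _ = cKDTree(X).query(X, k=k + 1)   # column 0 is the point itself
    r_k = dists[:, k]                         # distance to the k-th neighbor
    # log volume of the unit ball in d dimensions: c_d = pi^(d/2) / Gamma(d/2 + 1)
    log_c_d = (d / 2.0) * np.log(np.pi) - gammaln(d / 2.0 + 1.0)
    # k-NN density estimate: f_hat(X_i) = k / ((n - 1) * c_d * r_k(X_i)^d)
    return np.log(k) - np.log(n - 1) - log_c_d - d * np.log(r_k)

def shannon_entropy_knn(X, k=10):
    """Plug-in Shannon entropy estimate in nats: -(1/n) sum_i log f_hat(X_i)."""
    return -np.mean(_knn_log_density(X, k))

def renyi_entropy_knn(X, alpha, k=10):
    """Plug-in Renyi entropy of order alpha != 1:
    (1/(1-alpha)) * log((1/n) sum_i f_hat(X_i)^(alpha-1))."""
    log_f = _knn_log_density(X, k)
    n = log_f.shape[0]
    return (logsumexp((alpha - 1.0) * log_f) - np.log(n)) / (1.0 - alpha)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.standard_normal((5000, 2))                     # 2-D standard Gaussian sample
    print("Shannon estimate :", shannon_entropy_knn(X))
    print("True Shannon     :", np.log(2 * np.pi * np.e))  # (d/2) log(2*pi*e) with d = 2
    print("Renyi (alpha = 2):", renyi_entropy_knn(X, alpha=2.0))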
Keywords :
entropy; estimation theory; learning (artificial intelligence); pattern classification; probability; Rényi entropy; Shannon entropy estimation; anomaly detection problem; central limit theorem; entropy functional; false alarm rate; k-nearest neighbor estimation; plug-in estimator; statistical property; underlying probability distribution; Convergence; Entropy; Estimation; Information theory; Kernel; Random variables; Wireless sensor networks; central limit theorem; confidence intervals; entropy estimation; k-NN density estimation; plug-in estimation;
Conference_Title :
Information Theory Proceedings (ISIT), 2011 IEEE International Symposium on
Conference_Location :
St. Petersburg, Russia
Print_ISBN :
978-1-4577-0596-0
ISSN :
2157-8095
DOI :
10.1109/ISIT.2011.6033726