Title :
Divergence Estimation for Multidimensional Densities Via k-Nearest-Neighbor Distances
Author :
Wang, Qing ; Kulkarni, Sanjeev R. ; Verdú, Sergio
Author_Institution :
Credit Suisse Group, New York, NY
Date :
5/1/2009
Abstract :
A new universal estimator of divergence is presented for multidimensional continuous densities based on k-nearest-neighbor (k-NN) distances. Assuming independent and identically distributed (i.i.d.) samples, the new estimator is proved to be asymptotically unbiased and mean-square consistent. In experiments with high-dimensional data, the k-NN approach generally exhibits faster convergence than previous algorithms. It is also shown that the speed of convergence of the k-NN method can be further improved by an adaptive choice of k.
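The abstract's k-NN approach can be illustrated with a short sketch. The snippet below is a minimal, brute-force illustration of a k-NN-based Kullback–Leibler divergence estimate between two sample sets, assuming the commonly cited form d/n · Σ log(ν_k/ρ_k) + log(m/(n−1)), where ρ_k is the k-th nearest-neighbor distance within the first sample and ν_k the k-th nearest-neighbor distance to the second sample; the function name and implementation details are illustrative, not the paper's code.

```python
import numpy as np

def knn_kl_divergence(x, y, k=1):
    """Sketch of a k-NN estimate of D(P || Q) from i.i.d. samples
    x ~ P (n x d) and y ~ Q (m x d), using Euclidean distances."""
    n, d = x.shape
    m, _ = y.shape
    # rho: distance from each x_i to its k-th nearest neighbor
    # among the other x samples (the point itself is excluded).
    dx = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    np.fill_diagonal(dx, np.inf)
    rho = np.sort(dx, axis=1)[:, k - 1]
    # nu: distance from each x_i to its k-th nearest neighbor in y.
    dy = np.linalg.norm(x[:, None, :] - y[None, :, :], axis=-1)
    nu = np.sort(dy, axis=1)[:, k - 1]
    # Estimator: (d/n) * sum(log(nu/rho)) + log(m/(n-1)).
    return d * np.mean(np.log(nu / rho)) + np.log(m / (n - 1.0))
```

For large sample sets, the O(n·m) brute-force distance matrices above would be replaced by a k-d tree or similar spatial index; the brute-force form is kept here only for clarity.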
Keywords :
mean square error methods; multidimensional signal processing; signal sampling; divergence estimation; independent-identically distributed sample; k-nearest-neighbor distance; mean-square consistent; multidimensional continuous density; signal processing; Convergence; Density measurement; Frequency estimation; Information theory; Laboratories; Multidimensional systems; Mutual information; Neuroscience; Partitioning algorithms; Probability distribution; Divergence; Kullback–Leibler; information measure; nearest-neighbor; partition; random vector; universal estimation;
Journal_Title :
IEEE Transactions on Information Theory
DOI :
10.1109/TIT.2009.2016060