Title :
On the Estimation of Differential Entropy From Data Located on Embedded Manifolds
Author :
Nilsson, Mattias ; Kleijn, W. Bastiaan
Author_Institution :
Skype, Stockholm
Date :
1 July 2007
Abstract :
Estimation of the differential entropy from observations of a random variable is of great importance for a wide range of signal processing applications such as source coding, pattern recognition, hypothesis testing, and blind source separation. In this paper, we present a method for estimating the Shannon differential entropy that accounts for embedded manifolds. The method is based on high-rate quantization theory and forms an extension of the classical nearest-neighbor entropy estimator. The estimator is consistent in the mean-square sense, and an upper bound on its rate of convergence is given. Because of the close connection between compression and Shannon entropy, the proposed method has an advantage over methods that estimate the Rényi entropy. Through experiments on uniformly distributed data on known manifolds and on real-world speech data, we demonstrate the accuracy and usefulness of the proposed method.
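The classical nearest-neighbor entropy estimator that the paper extends is commonly attributed to Kozachenko and Leonenko: it converts the log of each sample's nearest-neighbor distance into a local density estimate and averages. Below is a minimal, brute-force sketch of that baseline estimator (not the manifold-aware extension proposed in the paper); the function name and the uniform-data demo are illustrative choices, not from the source.

```python
import math
import random

def knn_entropy(points, d):
    """Kozachenko-Leonenko nearest-neighbor estimate of differential entropy
    (the classical baseline, in nats). points: list of d-dimensional tuples.
    Brute-force O(N^2) neighbor search, so keep N modest."""
    n = len(points)
    log_eps = []
    for i, p in enumerate(points):
        # distance from p to its nearest neighbor among the other samples
        eps = min(math.dist(p, q) for j, q in enumerate(points) if j != i)
        log_eps.append(math.log(eps))
    # log-volume of the d-dimensional unit ball: pi^(d/2) / Gamma(d/2 + 1)
    log_cd = (d / 2) * math.log(math.pi) - math.lgamma(d / 2 + 1)
    gamma = 0.5772156649015329   # Euler-Mascheroni constant; psi(1) = -gamma
    psi_n = math.log(n) - 1 / (2 * n)  # digamma(n) approximation, good for large n
    return (d / n) * sum(log_eps) + log_cd + psi_n + gamma

# Demo: samples uniform on [0, 1], whose true differential entropy is 0 nats.
random.seed(0)
pts = [(random.random(),) for _ in range(2000)]
h = knn_entropy(pts, d=1)
```

Note that this baseline assumes the data occupy the full ambient dimension `d`; when the samples instead lie on a lower-dimensional embedded manifold, the estimate diverges, which is precisely the situation the paper's extension addresses.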
Keywords :
entropy; signal processing; Shannon differential entropy; classical nearest-neighbor entropy estimator; convergence rate; differential entropy estimation; embedded manifolds; high-rate quantization theory; blind source separation; convergence; pattern recognition; quantization; random variables; source coding; hypothesis testing; upper bound; manifolds; nearest-neighbor distance
Journal_Title :
IEEE Transactions on Information Theory
DOI :
10.1109/TIT.2007.899533