DocumentCode :
1502357
Title :
Best asymptotic normality of the kernel density entropy estimator for smooth densities
Author :
Eggermont, Paul P B ; LaRiccia, Vincent N.
Author_Institution :
Dept. of Math. Sci., Delaware Univ., Newark, DE, USA
Volume :
45
Issue :
4
fYear :
1999
fDate :
5/1/1999 12:00:00 AM
Firstpage :
1321
Lastpage :
1326
Abstract :
In the random sampling setting we estimate the entropy of a probability density by the entropy of a kernel density estimator using the double exponential kernel. Under mild smoothness and moment conditions we show that the entropy of the kernel density estimator equals a sum of independent and identically distributed (i.i.d.) random variables plus a perturbation which is asymptotically negligible compared to the parametric rate n^(-1/2). An essential part of the proof is obtained by exhibiting almost sure bounds for the Kullback-Leibler divergence between the kernel density estimator and its expected value. The basic technical tools are Doob's submartingale inequality and convexity (Jensen's inequality).
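The plug-in estimator described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' code: the rule-of-thumb bandwidth, the integration grid, and the test density are our assumptions, and the paper's asymptotic analysis is not reproduced here.

```python
import numpy as np

def laplace_kernel(u):
    # double exponential (Laplace) kernel: K(u) = 0.5 * exp(-|u|)
    return 0.5 * np.exp(-np.abs(u))

def kde(x, sample, h):
    # kernel density estimator f_n(x) = (1 / (n h)) * sum_i K((x - X_i) / h)
    u = (x[:, None] - sample[None, :]) / h
    return laplace_kernel(u).mean(axis=1) / h

def plugin_entropy(sample, h, grid_size=1000):
    # entropy of the kernel density estimator, H(f_n) = -int f_n log f_n,
    # approximated by a Riemann sum on a grid covering the sample range
    x = np.linspace(sample.min() - 8 * h, sample.max() + 8 * h, grid_size)
    f = kde(x, sample, h)
    dx = x[1] - x[0]
    f = np.clip(f, 1e-300, None)  # guard against log(0)
    return -np.sum(f * np.log(f)) * dx

rng = np.random.default_rng(0)
sample = rng.normal(size=2000)             # i.i.d. standard normal sample
h = sample.std() * len(sample) ** (-0.2)   # rule-of-thumb bandwidth (assumption)
H = plugin_entropy(sample, h)
print(round(H, 3))  # should be near the true N(0,1) entropy, 0.5*log(2*pi*e) ~ 1.419
```

For a smooth density such as the standard normal, the estimate lands close to the true entropy at this sample size, consistent with the n^(-1/2)-rate behavior the paper establishes.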
Keywords :
entropy; parameter estimation; probability; random processes; signal sampling; smoothing methods; Doob's submartingale inequality; Jensen's inequality; Kullback-Leibler divergence; best asymptotic normality; bounds; convexity; double exponential kernel; i.i.d. random variables; independent identically distributed variables; kernel density entropy estimator; moment conditions; parametric rate; perturbation; probability density distribution; random sampling; smooth densities; Deconvolution; Distribution functions; Entropy; Kernel; Probability density function; Random variables; Sampling methods; Stochastic processes;
fLanguage :
English
Journal_Title :
Information Theory, IEEE Transactions on
Publisher :
IEEE
ISSN :
0018-9448
Type :
jour
DOI :
10.1109/18.761291
Filename :
761291