Title :
Asymptotic properties of sample-based entropy, information divergence, and related metrics
Author :
Reznik, Yuriy A.
Author_Institution :
RealNetworks Inc., Seattle, WA, USA
Abstract :
Summary form only given. Given a sample produced by an unknown memoryless source, we show how to estimate its entropy with much higher precision by adding a simple correction term to a commonly used expression. We also derive asymptotic expansions for the sample-based entropy of mixtures and for sample-based mutual information. These results are obtained using a technique discussed by P. Flajolet (see Theoretical Computer Science, vol. 215, pp. 371-381, 1999).
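The abstract does not state which correction term is used; a minimal sketch of the general idea, assuming the classical Miller-Madow bias correction (adding (k-1)/(2n) to the plug-in estimate, where k is the number of distinct symbols observed and n the sample size), is:

```python
import math
from collections import Counter

def plugin_entropy(sample):
    """Naive (plug-in) entropy estimate in nats, computed from
    the empirical symbol frequencies of the sample."""
    n = len(sample)
    counts = Counter(sample)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def corrected_entropy(sample):
    """Plug-in estimate plus the Miller-Madow correction (k-1)/(2n),
    which offsets the downward bias of the plug-in estimator.
    (Illustrative only; not necessarily the correction of this paper.)"""
    n = len(sample)
    k = len(set(sample))
    return plugin_entropy(sample) + (k - 1) / (2 * n)
```

For a balanced two-symbol sample such as "aabb", `plugin_entropy` returns ln 2, and the correction adds 1/(2·4) = 0.125, nudging the estimate toward the true source entropy.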
Keywords :
entropy; parameter estimation; signal sampling; entropy estimation; information divergence; memoryless source; mixtures; sample-based entropy; sample-based mutual information; Computer science; Data compression; Electronic mail; Entropy; Error correction; Mutual information;
Conference_Title :
Data Compression Conference, 2005. Proceedings. DCC 2005
Print_ISBN :
0-7695-2309-9
DOI :
10.1109/DCC.2005.17