Title :
Classifying distributions via symbolic entropy estimation
Author_Institution :
Department of Computer Science, The University of Auckland, Private Bag 92019, Auckland, New Zealand
Abstract :
Shannon observed that the normal distribution has maximal entropy among distributions with a density function and a given variance. This observation sparked a significant body of research in statistics, broadly concerned with goodness-of-fit estimators based on Shannon entropy for a variety of distributions and, in particular, with normality testing. The present paper proposes to use compression algorithms and other parsing-based entropy estimators to match samples, taken in sampling order, to one of a set of candidate distributions with the observed μ and, where applicable, σ, using the distributions' quantile functions to convert the samples into a string of symbols for entropy estimation. The paper demonstrates with a series of Monte Carlo simulations that the proposed technique may be able to distinguish between a number of common distributions even if the samples themselves are not i.i.d.
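The sketch below illustrates, in Python, one plausible reading of the procedure outlined in the abstract: each observation is mapped to a symbol by locating it between equiprobable quantiles of a hypothesised distribution fitted to the observed μ and σ, and the resulting symbol string is scored with a crude compression-based entropy estimate. The candidate set, the eight-symbol alphabet, the zlib-based estimator, and the maximum-entropy decision rule are illustrative assumptions, not the paper's implementation.

import math
import zlib

import numpy as np
from scipy import stats


def symbolise(samples, dist, n_symbols=8):
    """Map samples (in sampling order) to symbols 0..n_symbols-1 by locating each
    observation between equiprobable quantiles of the hypothesised distribution."""
    # Interior quantile (inverse CDF) cut points splitting the support into
    # n_symbols equiprobable bins under the hypothesis.
    cuts = dist.ppf(np.linspace(0, 1, n_symbols + 1)[1:-1])
    return np.searchsorted(cuts, samples)


def compression_entropy(symbols):
    """Crude entropy estimate in bits per symbol from the deflate-compressed size."""
    data = bytes(int(s) for s in symbols)
    return 8 * len(zlib.compress(data, 9)) / len(symbols)


def classify(samples, n_symbols=8):
    """Fit location and scale to the sample, symbolise it under each candidate,
    and pick the candidate whose symbol string is hardest to compress, i.e.
    closest to the uniform maximum of log2(n_symbols) bits per symbol."""
    mu, sigma = np.mean(samples), np.std(samples, ddof=1)
    candidates = {
        "normal": stats.norm(loc=mu, scale=sigma),
        "laplace": stats.laplace(loc=mu, scale=sigma / math.sqrt(2)),
        "uniform": stats.uniform(loc=mu - math.sqrt(3) * sigma,
                                 scale=2 * math.sqrt(3) * sigma),
    }
    scores = {name: compression_entropy(symbolise(samples, d, n_symbols))
              for name, d in candidates.items()}
    return max(scores, key=scores.get), scores


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    sample = rng.normal(loc=2.0, scale=1.5, size=5000)
    best, scores = classify(sample)
    print(best, scores)

Under the correct hypothesis the symbol string is (approximately) uniform over the alphabet, so its estimated entropy approaches log2(n_symbols); a mismatched quantile function yields a skewed, more compressible string, which motivates the argmax rule used in this sketch.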
Keywords :
"Entropy","Estimation","Complexity theory","Compressors","Gaussian distribution","Probability density function","Standards"
Conference_Title :
2015 10th International Conference on Information, Communications and Signal Processing (ICICS)
DOI :
10.1109/ICICS.2015.7459844