DocumentCode :
445951
Title :
An optimal entropy estimator for discrete random variables
Author :
Shiga, Motoki ; Yokota, Yasunari
Author_Institution :
Graduate Sch. of Eng., Gifu Univ., Japan
Volume :
2
fYear :
2005
fDate :
31 July-4 Aug. 2005
Firstpage :
1280
Abstract :
This paper presents analytical formulations of the two most important estimation errors, the averaged squared bias error and the mean squared error, for the class of entropy estimators expressed as a sum of single-variable functions. This class includes almost all important entropy estimators proposed to date. Furthermore, this paper presents an optimal entropy estimator that minimizes the mean squared error of the estimate under the condition that the averaged squared bias error is constrained to lie below an arbitrary value. A numerical experiment demonstrates that the proposed entropy estimator attains a lower mean squared error than conventional entropy estimators when entropy is estimated as an ensemble mean over multiple entropy estimates obtained from independent data sets. Such estimation is often used for biological signals, e.g., neural signals, because of biological fatigue and adaptation effects.
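To make the estimator class concrete: any estimator of the form Ĥ = Σᵢ g(nᵢ), where nᵢ is the observed count of symbol i, belongs to the class the abstract describes. The sketch below shows two classical members of this class, the plug-in (maximum-likelihood) estimator and the Miller-Madow bias-corrected variant; these are illustrative standard estimators, not the optimal estimator derived in the paper.

```python
import math
from collections import Counter

def plugin_entropy(samples):
    """Maximum-likelihood (plug-in) entropy estimate in nats.

    A member of the "sum of single-variable functions" class:
    H_hat = sum_i g(n_i) with g(n) = -(n/N) * log(n/N).
    """
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

def miller_madow_entropy(samples):
    """Plug-in estimate plus the Miller-Madow first-order bias
    correction (m - 1) / (2N), where m is the number of observed
    symbols. A classical correction, not the paper's optimal one."""
    n = len(samples)
    m = len(set(samples))
    return plugin_entropy(samples) + (m - 1) / (2 * n)
```

The plug-in estimator is known to be negatively biased for finite samples; the paper's contribution is to characterize this bias and the mean squared error analytically for the whole class, then choose the function g to trade them off optimally.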
Keywords :
maximum likelihood estimation; mean square error methods; signal processing; biological signals; discrete random variables; mean squared error; neural signals; optimal entropy estimator; single variable functions; Entropy; Error correction; Estimation error; Least squares approximation; Maximum likelihood estimation; Minimax techniques; Neurons; Neuroscience; Random variables; Signal analysis;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Proceedings of the 2005 IEEE International Joint Conference on Neural Networks (IJCNN '05)
Print_ISBN :
0-7803-9048-2
Type :
conf
DOI :
10.1109/IJCNN.2005.1556038
Filename :
1556038