DocumentCode :
2485975
Title :
Entropy and information rates for hidden Markov models
Author :
Ko, Hanseok ; Baran, R.H.
Author_Institution :
Sch. of Electr. Eng., Korea Univ., Seoul, South Korea
fYear :
1998
fDate :
16-21 Aug 1998
Firstpage :
374
Abstract :
A practical approach to statistical inference for hidden Markov models (HMMs) requires expressions for the mean and variance of the log-probability of an observed T-long sequence given the model parameters. From the viewpoint of Shannon theory, in the limit of large T, the expected value of the per-step log-probability is minus one times the mean entropy rate at the output of a noisy channel driven by the Markov source. A novel procedure for finding the entropy rate is presented. The rate-distortion function of the Markov source, subject to the requirement of instantaneous coding, is a by-product of the derivation.
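The entropy-rate characterization in the abstract can be checked numerically: for a stationary HMM, minus the per-step log-probability of a long simulated output sequence, computed with the scaled forward algorithm, converges to the entropy rate of the channel output (Shannon-McMillan-Breiman). The Python sketch below is a generic Monte Carlo check of that limit, not the paper's novel procedure; the transition matrix A, emission matrix B, and stationary distribution pi are hypothetical illustrative values.

import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 2-state, 2-symbol HMM (illustrative parameters only).
A  = np.array([[0.9, 0.1],
               [0.2, 0.8]])        # Markov transition matrix, A[i, j] = P(j | i)
B  = np.array([[0.95, 0.05],
               [0.10, 0.90]])      # emission probabilities ("noisy channel")
pi = np.array([2.0/3.0, 1.0/3.0])  # stationary distribution of A

def sample_observations(A, B, pi, T, rng):
    # Simulate the Markov source and emit one observed symbol per step.
    s = rng.choice(len(pi), p=pi)
    obs = np.empty(T, dtype=int)
    for t in range(T):
        obs[t] = rng.choice(B.shape[1], p=B[s])
        s = rng.choice(A.shape[0], p=A[s])
    return obs

def log_probability(obs, A, B, pi):
    # Scaled forward algorithm: returns log P(obs | model) in nats,
    # accumulating the log of the per-step normalizing constants.
    alpha = pi * B[:, obs[0]]
    logp = np.log(alpha.sum())
    alpha /= alpha.sum()
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]
        c = alpha.sum()
        logp += np.log(c)
        alpha /= c
    return logp

T = 100_000
obs = sample_observations(A, B, pi, T, rng)
rate = -log_probability(obs, A, B, pi) / T  # -> entropy rate for large T
print(f"entropy-rate estimate: {rate:.4f} nats/step, "
      f"{rate / np.log(2):.4f} bits/step")

Increasing T tightens the estimate, consistent with the abstract's large-T limit; the variance of the per-step log-probability, the paper's other quantity of interest, can be gauged by repeating the run with different seeds.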
Keywords :
encoding; entropy; hidden Markov models; probability; rate distortion theory; sequences; telecommunication channels; HMM; Markov source; Shannon theory; hidden Markov models; information rate; instantaneous coding; log-probability; mean entropy rate; model parameters; noisy channel; per step log-probability; rate distortion function; sequence; statistical inference; variance; Distortion measurement; Entropy; Error analysis; Frequency; Information filtering; Information filters; Information rates; Mutual information; Nonlinear distortion; Nonlinear equations
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the 1998 IEEE International Symposium on Information Theory
Conference_Location :
Cambridge, MA
Print_ISBN :
0-7803-5000-6
Type :
conf
DOI :
10.1109/ISIT.1998.708979
Filename :
708979