DocumentCode :
2948724
Title :
Upper Bound Kullback-Leibler Divergence for Hidden Markov Models with Application as Discrimination Measure for Speech Recognition
Author :
Silva, Jorge ; Narayanan, Shrikanth
Author_Institution :
Dept. of Electr. Eng., Univ. of Southern California, Los Angeles, CA
fYear :
2006
fDate :
9-14 July 2006
Firstpage :
2299
Lastpage :
2303
Abstract :
This paper presents a criterion for defining an upper bound on the Kullback-Leibler divergence (UB-KLD) for Gaussian mixture models (GMMs). An information-theoretic interpretation of this indicator and an algorithm for calculating it, based on a similarity alignment between the mixture components of the models, are proposed. This bound is used to derive a closed-form upper bound expression for the Kullback-Leibler divergence (KLD) between left-to-right transient hidden Markov models (HMMs). Experiments based on real speech data show that this indicator precisely follows the discrimination tendency of the actual KLD.
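The abstract does not spell out the bound, but a standard construction consistent with its description follows from the log-sum inequality: for any one-to-one alignment m between the components of two GMMs f = Σ_a w_a f_a and g = Σ_b v_b g_b, KL(f||g) ≤ Σ_a w_a [log(w_a / v_m(a)) + KL(f_a || g_m(a))], where the component KLDs have a closed form for Gaussians. The sketch below is a minimal illustration under that assumption: equal component counts, full covariances, and an optimal alignment found by the Hungarian method. The names kl_gauss and ub_kld_gmm and the use of SciPy's linear_sum_assignment are illustrative choices, not the paper's exact algorithm.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def kl_gauss(mu0, cov0, mu1, cov1):
    """Closed-form KL divergence KL(N(mu0, cov0) || N(mu1, cov1))."""
    d = mu0.shape[0]
    cov1_inv = np.linalg.inv(cov1)
    diff = mu1 - mu0
    return 0.5 * (np.trace(cov1_inv @ cov0)
                  + diff @ cov1_inv @ diff
                  - d
                  + np.log(np.linalg.det(cov1) / np.linalg.det(cov0)))

def ub_kld_gmm(w_f, mus_f, covs_f, w_g, mus_g, covs_g):
    """Upper bound on KL(f || g) for two GMMs with equal component counts.

    For any one-to-one alignment m, the log-sum inequality gives
      KL(f||g) <= sum_a w_f[a] * (log(w_f[a]/w_g[m(a)]) + KL(f_a||g_m(a))).
    The alignment that minimizes the bound is found with the Hungarian method.
    """
    K = len(w_f)
    cost = np.empty((K, K))
    for a in range(K):
        for b in range(K):
            cost[a, b] = w_f[a] * (np.log(w_f[a] / w_g[b])
                                   + kl_gauss(mus_f[a], covs_f[a],
                                              mus_g[b], covs_g[b]))
    rows, cols = linear_sum_assignment(cost)  # optimal one-to-one alignment
    return cost[rows, cols].sum()

# Example with illustrative values: two 2-component GMMs in 2-D.
if __name__ == "__main__":
    mus_f = [np.zeros(2), np.ones(2)]
    covs_f = [np.eye(2), 0.5 * np.eye(2)]
    mus_g = [0.1 * np.ones(2), 1.2 * np.ones(2)]
    covs_g = [np.eye(2), 0.6 * np.eye(2)]
    print(ub_kld_gmm(np.array([0.4, 0.6]), mus_f, covs_f,
                     np.array([0.5, 0.5]), mus_g, covs_g))
```

For HMMs, the paper extends such a bound to a closed-form expression over left-to-right transient models; the per-state observation densities are GMMs, so a component-level bound like the one above is the natural building block.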
Keywords :
Gaussian processes; hidden Markov models; speech recognition; Gaussian mixture models; information theoretic interpretation; upper bound Kullback-Leibler divergence; Automatic speech recognition; Closed-form solution; Context modeling; Probability density function; Speech analysis; Upper bound;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
2006 IEEE International Symposium on Information Theory
Conference_Location :
Seattle, WA
Print_ISBN :
1-4244-0505-X
Electronic_ISBN :
1-4244-0504-1
Type :
conf
DOI :
10.1109/ISIT.2006.261977
Filename :
4036380