DocumentCode
1900774
Title
Comparing Distance Measures for Hidden Markov Models
Author
Mohammad, Maruf; Tranter, W.H.
Author_Institution
Mobile & Portable Radio Research Group, Virginia Tech, VA
fYear
2006
fDate
March 31, 2006 - April 2, 2006
Firstpage
256
Lastpage
260
Abstract
In this paper, several distance measures for hidden Markov models (HMMs) are compared. The most commonly used distance measure between two HMMs is the Kullback-Leibler divergence (KLD). Since no closed-form solution exists, the Monte-Carlo method is usually applied to calculate the KLD. However, the computational complexity of Monte-Carlo estimation may be prohibitive in practical applications, which has motivated researchers to propose new distance measures for HMMs. Numerical examples are presented comparing three such distance measures against the Monte-Carlo method. Results show that it is possible to approximate the KLD while reducing the computational complexity by a factor of several hundred.
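The Monte-Carlo KLD estimate referred to in the abstract can be illustrated with a short sketch (not taken from the paper): observation sequences are drawn from the first model and the average per-symbol log-likelihood ratio against the second model is computed. Discrete-observation HMMs, the parameter names (pi, A, B), the sequence length T, and the per-symbol normalization are assumptions made here for brevity; the paper itself may use continuous-observation models.

    # Sketch of a Monte-Carlo estimate of D(lambda1 || lambda2) for discrete HMMs.
    import numpy as np

    def sample_hmm(pi, A, B, T, rng):
        """Draw one length-T observation sequence from HMM (pi, A, B)."""
        obs = np.empty(T, dtype=int)
        state = rng.choice(len(pi), p=pi)
        for t in range(T):
            obs[t] = rng.choice(B.shape[1], p=B[state])  # emit symbol
            state = rng.choice(A.shape[1], p=A[state])   # transition
        return obs

    def log_likelihood(obs, pi, A, B):
        """log P(obs | lambda) via the scaled forward algorithm."""
        alpha = pi * B[:, obs[0]]
        log_p = np.log(alpha.sum())
        alpha /= alpha.sum()
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]
            log_p += np.log(alpha.sum())
            alpha /= alpha.sum()
        return log_p

    def mc_kld(lam1, lam2, n_seq=100, T=100, seed=0):
        """Monte-Carlo KLD estimate, normalized per observation symbol."""
        rng = np.random.default_rng(seed)
        total = 0.0
        for _ in range(n_seq):
            obs = sample_hmm(*lam1, T, rng)
            total += log_likelihood(obs, *lam1) - log_likelihood(obs, *lam2)
        return total / (n_seq * T)

The cost of this estimator grows with the number and length of the sampled sequences, which is the computational burden the alternative distance measures discussed in the paper aim to avoid.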
Keywords
computational complexity; hidden Markov models; speech recognition; HMM; Kullback-Leibler divergence; automatic speech recognition; closed-form solution; digital signal processing; probability distribution; signal processing algorithms; statistics; vocabulary
fLanguage
English
Publisher
ieee
Conference_Title
SoutheastCon, 2006. Proceedings of the IEEE
Conference_Location
Memphis, TN
Print_ISBN
1-4244-0168-2
Type
conf
DOI
10.1109/second.2006.1629360
Filename
1629360