Title :
Variational Kullback-Leibler divergence for hidden Markov models
Author :
Hershey, John R. ; Olsen, Peder A. ; Rennie, Steven J.
Author_Institution :
IBM Thomas J. Watson Research Center, Yorktown Heights
Abstract :
Divergence measures are widely used tools in statistics and pattern recognition. The Kullback-Leibler (KL) divergence between two hidden Markov models (HMMs) would be particularly useful in the fields of speech and image recognition. Whereas the KL divergence is tractable for many distributions, including Gaussians, it is not in general tractable for mixture models or HMMs. Recently, variational approximations have been introduced to efficiently compute the KL divergence and Bhattacharyya divergence between two mixture models, by reducing them to the divergences between the mixture components. Here we generalize these techniques to approximate the divergence between HMMs using a recursive backward algorithm. Two such methods are introduced, one of which yields an upper bound on the KL divergence, the other of which yields a recursive closed-form solution. The KL and Bhattacharyya divergences, as well as a weighted edit-distance technique, are evaluated on the task of predicting the confusability of pairs of words.
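The mixture-model building block the abstract refers to, the variational approximation that reduces the KL divergence between two mixtures to closed-form divergences between their components, can be sketched as follows. This is a minimal 1-D Gaussian-mixture illustration under assumed notation (weights `w`, means `m`, variances `v`), not the paper's HMM recursion itself; for single-component mixtures it recovers the exact Gaussian KL divergence.

```python
import numpy as np

def kl_gauss(m1, v1, m2, v2):
    # Closed-form KL divergence between 1-D Gaussians N(m1, v1) and N(m2, v2).
    return 0.5 * (np.log(v2 / v1) + (v1 + (m1 - m2) ** 2) / v2 - 1.0)

def variational_kl_gmm(wf, mf, vf, wg, mg, vg):
    # Variational approximation to KL(f || g) for two Gaussian mixtures
    # f and g: each component divergence is computed in closed form, and
    # the mixture-level value is assembled via log-sum-exp weights.
    total = 0.0
    for wa, ma, va in zip(wf, mf, vf):
        # Similarity of component a of f to all components of f ...
        num = sum(wap * np.exp(-kl_gauss(ma, va, map_, vap))
                  for wap, map_, vap in zip(wf, mf, vf))
        # ... and to all components of g.
        den = sum(wb * np.exp(-kl_gauss(ma, va, mb, vb))
                  for wb, mb, vb in zip(wg, mg, vg))
        total += wa * np.log(num / den)
    return total

# Example: two-component mixtures; identical mixtures give divergence 0.
w = [0.4, 0.6]; m = [0.0, 3.0]; v = [1.0, 2.0]
print(variational_kl_gmm(w, m, v, w, m, v))          # → 0.0 (up to rounding)
print(variational_kl_gmm(w, m, v, w, [1.0, 4.0], v))  # positive
```

The paper's contribution is to extend this kind of component-level reduction from mixtures to full HMMs via a recursive backward algorithm over states.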
Keywords :
Gaussian distribution; Gaussian processes; approximation theory; hidden Markov models (HMMs); mixture models; variational methods; Kullback-Leibler divergence; Bhattacharyya divergence; variational Kullback-Leibler divergence approximation; recursive backward algorithm; closed-form solution; upper bound; entropy; statistical distributions; speech recognition; image recognition; pattern recognition; weighted edit distance;
Conference_Title :
2007 IEEE Workshop on Automatic Speech Recognition & Understanding (ASRU)
Conference_Location :
Kyoto
Print_ISBN :
978-1-4244-1746-9
Electronic_ISBN :
978-1-4244-1746-9
DOI :
10.1109/ASRU.2007.4430132