DocumentCode :
2701506
Title :
Approximating the Kullback Leibler Divergence Between Gaussian Mixture Models
Author :
Hershey, John R. ; Olsen, Peder A.
Author_Institution :
IBM Thomas J. Watson Res. Center, NY, USA
Volume :
4
fYear :
2007
fDate :
15-20 April 2007
Abstract :
The Kullback Leibler (KL) divergence is a widely used tool in statistics and pattern recognition. The KL divergence between two Gaussian mixture models (GMMs) is frequently needed in the fields of speech and image recognition. Unfortunately, the KL divergence between two GMMs is not analytically tractable, nor does any efficient computational algorithm exist. Some techniques cope with this problem by replacing the KL divergence with other functions that can be computed efficiently. We introduce two new methods, the variational approximation and the variational upper bound, and compare them to existing methods. We discuss seven different techniques in total and weigh the benefits of each one against the others. To conclude, we evaluate the performance of each one through numerical experiments.
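For reference, a minimal sketch (in Python with NumPy/SciPy; not code from the paper) of two of the techniques the abstract refers to: the Monte Carlo baseline, which estimates KL(f || g) as the sample mean of log f(x) - log g(x) over draws x ~ f, and the variational approximation in its commonly cited form, which matches each component of f against weighted exp(-KL) similarities to the components of f and of g. The function names and the representation of a mixture as a (weights, means, covariances) tuple are illustrative choices, not anything specified in this record.

```python
import numpy as np
from scipy.special import logsumexp
from scipy.stats import multivariate_normal

def gauss_kl(m0, c0, m1, c1):
    """Closed-form KL divergence between Gaussians N(m0, c0) and N(m1, c1)."""
    d = len(m0)
    c1_inv = np.linalg.inv(c1)
    diff = m1 - m0
    return 0.5 * (np.trace(c1_inv @ c0) + diff @ c1_inv @ diff - d
                  + np.log(np.linalg.det(c1) / np.linalg.det(c0)))

def gmm_logpdf(x, weights, means, covs):
    """Log-density of a Gaussian mixture evaluated at each row of x."""
    per_comp = np.stack([np.log(w) + multivariate_normal.logpdf(x, mean=m, cov=c)
                         for w, m, c in zip(weights, means, covs)])
    return logsumexp(per_comp, axis=0)

def kl_monte_carlo(f, g, n=10_000, seed=0):
    """Monte Carlo estimate of KL(f || g); f and g are (weights, means, covs) tuples."""
    rng = np.random.default_rng(seed)
    w, means, covs = f
    idx = rng.choice(len(w), size=n, p=w)  # sample mixture components, then Gaussians
    x = np.array([rng.multivariate_normal(means[k], covs[k]) for k in idx])
    return float(np.mean(gmm_logpdf(x, *f) - gmm_logpdf(x, *g)))

def kl_variational(f, g):
    """Variational approximation: each component f_a is compared to the weighted
    exp(-KL) similarities of the components of f (numerator) and of g (denominator)."""
    wf, mf, cf = f
    wg, mg, cg = g
    total = 0.0
    for pa, ma, ca in zip(wf, mf, cf):
        num = logsumexp([np.log(wf[a]) - gauss_kl(ma, ca, mf[a], cf[a])
                         for a in range(len(wf))])
        den = logsumexp([np.log(wg[b]) - gauss_kl(ma, ca, mg[b], cg[b])
                         for b in range(len(wg))])
        total += pa * (num - den)
    return total

# Example usage: two 2-component GMMs in two dimensions.
f = ([0.6, 0.4], [np.zeros(2), np.ones(2)], [np.eye(2), np.eye(2)])
g = ([0.5, 0.5], [0.5 * np.ones(2), 2.0 * np.ones(2)], [np.eye(2), 2 * np.eye(2)])
print(kl_monte_carlo(f, g), kl_variational(f, g))
```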
Keywords :
Monte Carlo methods; image recognition; speech recognition; Gaussian mixture models; Kullback Leibler divergence; Monte Carlo sampling; pattern recognition; Algorithm design and analysis; Entropy; Probability density function; Speech; Statistical distributions; Statistics; Upper bound; unscented transformation; variational methods;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Acoustics, Speech and Signal Processing, 2007. ICASSP 2007. IEEE International Conference on
Conference_Location :
Honolulu, HI
ISSN :
1520-6149
Print_ISBN :
1-4244-0727-3
Type :
conf
DOI :
10.1109/ICASSP.2007.366913
Filename :
4218101