DocumentCode :
3165789
Title :
Lower and upper bounds for approximation of the Kullback-Leibler divergence between Gaussian Mixture Models
Author :
Durrieu, J.-L.; Thiran, J.-Ph.; Kelly, F.
Author_Institution :
Signal Process. Lab. (LTS5), Ecole Polytech. Fed. de Lausanne (EPFL), Lausanne, Switzerland
fYear :
2012
fDate :
25-30 March 2012
Firstpage :
4833
Lastpage :
4836
Abstract :
Many speech technology systems rely on Gaussian Mixture Models (GMMs). The need to compare two GMMs arises in applications such as speaker verification, model selection, and parameter estimation. For this purpose, the Kullback-Leibler (KL) divergence is often used. However, since it has no closed-form expression for GMMs, it can only be approximated. We propose lower and upper bounds for the KL divergence, which lead to a new approximation and to interesting insights into previously proposed approximations. An application to the comparison of speaker models also shows how such approximations can be used to validate assumptions on the models.
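As the abstract notes, the KL divergence between two GMMs has no closed form and must be approximated. A standard baseline (not the bounds proposed in this paper) is the Monte Carlo estimate D(f||g) ≈ (1/N) Σᵢ log f(xᵢ)/g(xᵢ) with xᵢ drawn from f. The sketch below illustrates it for hypothetical 1-D mixtures; all parameter values are made up for illustration.

```python
import numpy as np

def gmm_pdf(x, weights, means, stds):
    """Density of a 1-D Gaussian mixture evaluated at points x."""
    x = np.asarray(x)[:, None]
    comp = np.exp(-0.5 * ((x - means) / stds) ** 2) / (stds * np.sqrt(2 * np.pi))
    return comp @ weights

def gmm_sample(rng, n, weights, means, stds):
    """Draw n samples from a 1-D Gaussian mixture."""
    idx = rng.choice(len(weights), size=n, p=weights)
    return rng.normal(means[idx], stds[idx])

def kl_monte_carlo(f, g, n=100_000, seed=0):
    """Monte Carlo estimate of D(f || g) between two 1-D GMMs.

    Each of f, g is a (weights, means, stds) tuple; samples are drawn
    from f and the mean log-density ratio is returned.
    """
    rng = np.random.default_rng(seed)
    x = gmm_sample(rng, n, *f)
    return float(np.mean(np.log(gmm_pdf(x, *f)) - np.log(gmm_pdf(x, *g))))

# Two toy "speaker models" (hypothetical parameters)
f = (np.array([0.6, 0.4]), np.array([0.0, 3.0]), np.array([1.0, 0.5]))
g = (np.array([0.5, 0.5]), np.array([0.5, 2.5]), np.array([1.0, 1.0]))

kl_fg = kl_monte_carlo(f, g)  # positive, since f and g differ
kl_ff = kl_monte_carlo(f, f)  # exactly 0: the log-ratio vanishes pointwise
```

Such Monte Carlo estimates converge slowly, which is precisely why deterministic bounds and approximations like those studied in the paper are of practical interest.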
Keywords :
Gaussian processes; parameter estimation; speaker recognition; Gaussian mixture models; Kullback Leibler divergence; model selection; parameter estimation; speaker models; speaker verification; speech technology systems; Approximation methods; Closed-form solutions; Estimation; Hidden Markov models; Speech processing; Upper bound; Gaussian Mixture Model (GMM); Kullback-Leibler Divergence; speaker comparison; speech processing;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2012 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Conference_Location :
Kyoto
ISSN :
1520-6149
Print_ISBN :
978-1-4673-0045-2
Electronic_ISBN :
1520-6149
Type :
conf
DOI :
10.1109/ICASSP.2012.6289001
Filename :
6289001