DocumentCode :
2506339
Title :
Combining the Likelihood and the Kullback-Leibler Distance in Estimating the Universal Background Model for Speaker Verification Using SVM
Author :
Lei, Zhenchun
Author_Institution :
Sch. of Comput. & Inf. Eng., Jiangxi Normal Univ., Nanchang, China
fYear :
2010
fDate :
23-26 Aug. 2010
Firstpage :
4553
Lastpage :
4556
Abstract :
State-of-the-art methods for speaker verification are based on the support vector machine. The Gaussian supervector SVM is a typical example: it uses a Gaussian mixture model to create "feature vectors" for the discriminative SVM. All speaker GMMs are adapted from the same universal background model (UBM), which is obtained by maximum likelihood estimation on large data sets, so the UBM should cover the feature space as widely as possible. We propose a new method for estimating the UBM parameters that combines the likelihood with the Kullback-Leibler distances between the UBM's Gaussian components. Its aim is to find model parameters that achieve a high likelihood while keeping the Gaussian components dispersed enough to cover a large portion of the feature space. Experiments on the NIST 2001 task show that our method clearly improves performance.
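The sketch below illustrates the general idea described in the abstract, not the paper's exact formulation: a UBM training objective that adds a dispersion term, built from pairwise Kullback-Leibler distances between the UBM's Gaussian components, to the usual average log-likelihood. Diagonal covariances and the balancing weight `lam` are assumptions introduced here for illustration.

```python
# Minimal sketch, assuming a diagonal-covariance GMM and a hypothetical
# trade-off weight `lam`; the paper's actual objective may differ.
import numpy as np
from scipy.special import logsumexp

def kl_diag_gauss(mu0, var0, mu1, var1):
    """Closed-form KL(N0 || N1) for diagonal-covariance Gaussians."""
    return 0.5 * np.sum(var0 / var1 + (mu1 - mu0) ** 2 / var1
                        - 1.0 + np.log(var1) - np.log(var0))

def gmm_avg_log_likelihood(X, weights, means, variances):
    """Average per-frame log-likelihood of frames X (N x d) under the GMM."""
    N, d = X.shape
    log_probs = np.empty((N, len(weights)))
    for k, (w, mu, var) in enumerate(zip(weights, means, variances)):
        log_norm = -0.5 * (d * np.log(2 * np.pi) + np.sum(np.log(var)))
        log_probs[:, k] = (np.log(w) + log_norm
                           - 0.5 * np.sum((X - mu) ** 2 / var, axis=1))
    return np.mean(logsumexp(log_probs, axis=1))

def combined_objective(X, weights, means, variances, lam=0.1):
    """Likelihood term plus a dispersion term: mean pairwise KL distance
    between components, encouraging them to spread over the feature space."""
    ll = gmm_avg_log_likelihood(X, weights, means, variances)
    M = len(weights)
    dispersion = sum(kl_diag_gauss(means[i], variances[i],
                                   means[j], variances[j])
                     for i in range(M) for j in range(M) if i != j)
    return ll + lam * dispersion / (M * (M - 1))
```

A UBM estimator following this idea would maximize `combined_objective` over the GMM parameters (for example with an EM-like or gradient-based update) instead of the plain likelihood, so that components with high data likelihood but strongly overlapping Gaussians are penalized relative to well-separated ones.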
Keywords :
Gaussian distribution; maximum likelihood estimation; speaker recognition; support vector machines; Gaussian distributions; Gaussian mixture model; Gaussian supervector; Kullback-Leibler distance; SVM; feature vectors; maximum likelihood estimation; speaker verification; support vector machine; universal background model; Adaptation model; Kernel; Maximum likelihood estimation; NIST; Speaker recognition; Support vector machines; Training; Kullback-Leibler distance; speaker verification; universal background model;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Pattern Recognition (ICPR), 2010 20th International Conference on
Conference_Location :
Istanbul
ISSN :
1051-4651
Print_ISBN :
978-1-4244-7542-1
Type :
conf
DOI :
10.1109/ICPR.2010.1106
Filename :
5597370