Title :
I-smooth for improved minimum classification error training
Author :
Li, Haozheng ; Munteanu, Cosmin
Abstract :
Increasing the generalization capability of Discriminative Training (DT) of Hidden Markov Models (HMMs) has recently gained increased interest within the speech recognition field. In particular, achieving such gains with only minor modifications to existing DT methods is of significant practical importance. In this paper, we propose a solution for increasing the generalization capability of a widely used training method, Minimum Classification Error (MCE) training of HMMs, with limited changes to its original framework. To this end, we define boundary data, obtained by applying a large steepness parameter, and confusion data, obtained by applying a small steepness parameter to the training samples, and then perform a soft interpolation between the two according to the occupancy counts of the boundary data and the ratio between the boundary and confusion occupancy counts. The final HMM parameters are then tuned in the same manner as in MCE, using the interpolated boundary data. We show that the proposed method achieves lower error rates than a standard HMM training framework on a phoneme classification task on the TIMIT speech corpus.
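The interpolation scheme sketched in the abstract resembles classic I-smoothing of discriminative statistics. The snippet below is a minimal illustrative sketch, not the authors' implementation: the function names, the smoothing constant `tau`, and the exact count-dependent weighting are all assumptions made for clarity.

```python
import math

def mce_sigmoid(d, steepness):
    # MCE loss smoothing function over the misclassification measure d.
    # A large steepness concentrates weight on samples near the decision
    # boundary ("boundary data"); a small steepness spreads weight over
    # more confusable samples ("confusion data").
    return 1.0 / (1.0 + math.exp(-steepness * d))

def i_smooth(boundary_count, boundary_stat, confusion_count,
             confusion_stat, tau=20.0):
    # Count-dependent interpolation in the spirit of I-smoothing:
    # when the boundary occupancy count is large, the boundary
    # statistics dominate; otherwise the estimate is pulled toward
    # the confusion statistics. `tau` is an assumed smoothing constant.
    w = boundary_count / (boundary_count + tau)
    return w * boundary_stat + (1.0 - w) * confusion_stat
```

The smoothed statistics would then replace the raw boundary statistics in the usual MCE parameter update, leaving the rest of the MCE framework unchanged.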
Keywords :
hidden Markov models; interpolation; speech processing; speech recognition; TIMIT speech corpus; discriminative training; interpolated boundary data; minimum classification error training; phoneme classification; Error analysis; Maximum likelihood estimation; Mutual information; Testing; Hidden Markov Model; Minimum Classification Errors; Speech Recognition;
Conference_Title :
2010 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Conference_Location :
Dallas, TX
Print_ISBN :
978-1-4244-4295-9
Electronic_ISBN :
1520-6149
DOI :
10.1109/ICASSP.2010.5495109