DocumentCode
417289
Title
Minimum classification error training of landmark models for real-time continuous speech recognition
Author
McDermott, Erik; Hazen, Timothy J.
Author_Institution
NTT Commun. Sci. Labs., NTT Corp., Kyoto, Japan
Volume
1
fYear
2004
fDate
17-21 May 2004
Abstract
Though many studies have shown the effectiveness of the minimum classification error (MCE) approach to discriminative training of HMMs for speech recognition, few, if any, have reported MCE results for large (> 100 hours) training sets in the context of real-world, continuous speech recognition. Here we report large gains in performance for the MIT JUPITER weather information task as a result of MCE-based batch optimization of acoustic models. Investigation of word error rate versus computation time showed that small MCE models significantly outperform the maximum likelihood (ML) baseline at all points of equal computation time, resulting in up to 20% word error rate reduction for in-vocabulary utterances. The overall MCE loss function was minimized using Quickprop, a simple but effective second-order optimization method suited to parallelization over large training sets.
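The abstract names Quickprop as the batch optimizer used to minimize the MCE loss. As a rough illustration of the general idea, the sketch below implements a minimal Fahlman-style Quickprop update: each parameter takes a secant step toward the minimum of a local parabola fit from the current and previous gradients, with the step size capped relative to the previous step. The function name, hyperparameters (`lr`, `mu`), and safeguards here are our own illustrative assumptions, not the paper's actual settings.

```python
import numpy as np

def quickprop(grad_fn, w, lr=0.1, mu=1.75, steps=50):
    """Minimal Quickprop sketch (illustrative, not the paper's implementation).

    grad_fn: returns the gradient of the loss at w.
    lr: learning rate for the fallback gradient-descent step.
    mu: maximum growth factor for successive step sizes.
    """
    w = np.asarray(w, dtype=float)
    prev_grad = grad_fn(w)
    prev_step = -lr * prev_grad          # first step: plain gradient descent
    w = w + prev_step
    for _ in range(steps):
        g = grad_fn(w)
        denom = prev_grad - g
        # Secant step toward the minimum of a parabola fit through the
        # previous and current gradients; fall back to gradient descent
        # where the denominator is too small to be reliable.
        safe_denom = np.where(np.abs(denom) > 1e-12, denom, 1.0)
        step = np.where(np.abs(denom) > 1e-12,
                        prev_step * g / safe_denom,
                        -lr * g)
        # Cap step growth at mu times the previous step to keep updates stable.
        cap = mu * np.abs(prev_step) + 1e-12
        step = np.clip(step, -cap, cap)
        w = w + step
        prev_step, prev_grad = step, g
    return w
```

On a quadratic loss the secant step is exact, so convergence is very fast; batch methods like this parallelize naturally because the full-dataset gradient can be accumulated across machines before each update, which is the property the paper exploits for its large training set.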
Keywords
error statistics; gradient methods; hidden Markov models; maximum likelihood estimation; minimisation; speech recognition; HMM; MCE loss function minimization; MCE-based batch optimization; MIT JUPITER weather information task; Quickprop; acoustic models; computation time; in-vocabulary utterances; landmark models; large training sets; minimum classification error training; performance; real-time continuous speech recognition; second-order optimization method; small MCE models; word error rate; word error rate reduction; Computer errors; Computer science; Error analysis; Jupiter; Laboratories; Lattices; Management training; Maximum likelihood estimation; Optimization methods; Speech recognition;
fLanguage
English
Publisher
ieee
Conference_Titel
Acoustics, Speech, and Signal Processing, 2004. Proceedings. (ICASSP '04). IEEE International Conference on
ISSN
1520-6149
Print_ISBN
0-7803-8484-9
Type
conf
DOI
10.1109/ICASSP.2004.1326141
Filename
1326141
Link To Document