Title :
A New Minimum Divergence Approach to Discriminative Training
Author :
Jun Du; Peng Liu; Hui Jiang; Frank K. Soong; Ren-Hua Wang
Author_Institution :
Univ. of Sci. & Technol. of China, Hefei, China
Abstract :
We propose minimum divergence (MD) as a criterion for discriminative training, where acoustic similarity between hidden Markov models (HMMs) is characterized by Kullback-Leibler divergence (KLD). The MD objective function is defined as a posterior-weighted divergence measured over the whole training set. Unlike our earlier work, in which the KLD-based acoustic similarity was pre-computed from the initial models and kept fixed during optimization, here the full MD objective is treated as a function of the HMM parameters and is optimized jointly as those parameters are adjusted. An extended Baum-Welch (EBW) method is derived to minimize the full MD objective function. The new MD formulation is evaluated on the TIDIGITS and Switchboard databases. Experimental results show that the new MD yields relative word error rate reductions of 62.1% on TIDIGITS and 8.8% on Switchboard compared with the best maximum-likelihood (ML) trained systems. It is also shown that the new MD criterion consistently outperforms other discriminative training criteria, such as minimum phone error (MPE).
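For clarity, a sketch of the posterior-weighted MD objective described in the abstract, in hypothetical notation ($\lambda$: HMM parameters, $O_r$: the $r$-th training utterance, $W_r$: its reference transcription, $W$: a competing hypothesis, $D(\cdot\,\|\,\cdot)$: the KLD-based acoustic dissimilarity between the HMM sequences of two word strings); the exact formulation is given in the paper:
\[
F_{\mathrm{MD}}(\lambda) \;=\; \sum_{r} \sum_{W} P\!\left(W \mid O_r; \lambda\right)\, D\!\left(\mathcal{H}_{W_r} \,\middle\|\, \mathcal{H}_{W}\right)
\]
Minimizing this expectation shifts posterior mass toward hypotheses that are acoustically close to the reference (for which $D$ is small); in the new formulation, the divergence term $D$ itself also depends on $\lambda$ and is updated during the EBW optimization rather than being pre-computed from the initial models.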
Keywords :
audio databases; hidden Markov models; speech recognition; EBW optimization method; HMM; Kullback-Leibler divergence; Switchboard databases; TIDIGITS databases; discriminative training; minimum divergence approach; speech recognition; word error rate reductions; Acoustic measurements; Asia; Automatic speech recognition; Databases; Error analysis; Hidden Markov models; Mutual information; Optimization methods; Speech recognition; Weight measurement; discriminative training;
Conference_Title :
IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP), 2007
Conference_Location :
Honolulu, HI
Print_ISBN :
1-4244-0727-3
DOI :
10.1109/ICASSP.2007.367003