Title :
Subvector-quantized high-density discrete hidden Markov model and its re-estimation
Author :
Ye, Guoli; Mak, Brian
Author_Institution :
Dept. of Comput. Sci. & Eng., Hong Kong Univ. of Sci. & Technol., Hong Kong, China
Date :
Nov. 29 - Dec. 3, 2010
Abstract :
We investigated two methods to improve the performance of the high-density discrete hidden Markov model (HDDHMM). An HDDHMM employs discrete densities over a very large codebook, consisting of thousands to tens of thousands of vector quantization (VQ) codewords, which is constructed as the product of per-dimension scalar quantization (SQ) codewords. Although the resulting HDDHMM is fast in decoding, its recognition accuracy is not good enough. In this paper, exploiting the fact that, for a fixed number of bits, VQ is more efficient than SQ, we investigated subvector quantization (SVQ) to improve quantization efficiency while keeping the time and space complexity of the quantizer sufficiently low. The model parameters of the resulting SVQ-HDDHMM were further re-estimated. On the Wall Street Journal 5K-vocabulary task, the proposed SVQ-HDDHMM was found to be a better model than the conventional continuous-density HMM for practical deployment, in terms of both recognition time and error rate.
Keywords :
decoding; hidden Markov models; speech recognition; vector quantisation; Wall Street Journal 5K-vocabulary task; codebook; codewords; per-dimension scalar quantization; subvector-quantized high-density discrete hidden Markov model; Acoustics; Approximation methods; Bit rate; Computational modeling; Hidden Markov models; Quantization; Training
Conference_Title :
2010 7th International Symposium on Chinese Spoken Language Processing (ISCSLP)
Conference_Location :
Tainan
Print_ISBN :
978-1-4244-6244-5
DOI :
10.1109/ISCSLP.2010.5684838