DocumentCode
591771
Title
Context dependant phone mapping for cross-lingual acoustic modeling
Author
Van Hai Do ; Xiong Xiao ; Eng Siong Chng ; Haizhou Li
Author_Institution
School of Computer Engineering, Nanyang Technological University, Singapore
fYear
2012
fDate
5-8 Dec. 2012
Firstpage
16
Lastpage
20
Abstract
This paper presents a novel method for acoustic modeling with limited training data. The idea is to leverage a well-trained acoustic model of a source language. In this paper, a conventional HMM/GMM triphone acoustic model of the source language is used to derive likelihood scores for each feature vector of the target language. These scores are then mapped to triphones of the target language using neural networks. We conduct a case study in which Malay is the source language and English (Aurora-4 task) is the target language. Experimental results on the Aurora-4 clean test set show that with only 7, 16, and 55 minutes of English training data, we achieve word error rates of 21.58%, 17.97%, and 12.93%, respectively. These results significantly outperform conventional HMM/GMM and hybrid systems.
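The abstract describes a two-stage pipeline: a source-language (Malay) HMM/GMM produces triphone likelihood scores for each target-language feature vector, and a neural network then maps those scores to target-language (English) triphone posteriors. The following is a minimal NumPy sketch of that mapping step only; the state counts, hidden-layer size, and single-hidden-layer architecture (N_SOURCE, N_TARGET, N_HIDDEN, map_frame) are illustrative assumptions, not the network used in the paper.

import numpy as np

rng = np.random.default_rng(0)

N_SOURCE = 1000   # assumed number of source-language (Malay) triphone states
N_TARGET = 500    # assumed number of target-language (English) triphone states
N_HIDDEN = 256    # assumed hidden-layer size of the mapping network

# One-hidden-layer MLP: source-language likelihood scores -> target triphone posteriors.
W1 = rng.normal(scale=0.01, size=(N_SOURCE, N_HIDDEN))
b1 = np.zeros(N_HIDDEN)
W2 = rng.normal(scale=0.01, size=(N_HIDDEN, N_TARGET))
b2 = np.zeros(N_TARGET)

def softmax(x):
    x = x - x.max(axis=-1, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=-1, keepdims=True)

def map_frame(source_log_likelihoods):
    """Map one frame's source-language triphone log-likelihoods to
    target-language triphone posteriors."""
    h = np.tanh(source_log_likelihoods @ W1 + b1)   # hidden layer
    return softmax(h @ W2 + b2)                     # posteriors over target triphones

# Usage: pretend the source HMM/GMM produced these scores for one feature vector.
frame_scores = rng.normal(size=N_SOURCE)
target_posteriors = map_frame(frame_scores)
print(target_posteriors.shape, target_posteriors.sum())  # (500,) ~1.0

In practice, the mapping network would be trained on the limited target-language data (7 to 55 minutes in the reported experiments), with the source-language HMM/GMM kept fixed.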
Keywords
Gaussian processes; acoustic signal processing; error statistics; hidden Markov models; linguistics; natural language processing; neural nets; speech recognition; Aurora-4 task; English language; English training data; HMM/GMM triphone acoustic model; Malay language; context dependant phone mapping; cross-lingual acoustic modeling; feature vector; likelihood score; neural network; source language; word error rate; Acoustics; Data models; Hidden Markov models; Speech; Training; Training data; Vectors; context dependant; cross-lingual LVCSR; phone mapping; under-resourced language
fLanguage
English
Publisher
ieee
Conference_Titel
2012 8th International Symposium on Chinese Spoken Language Processing (ISCSLP)
Conference_Location
Kowloon, Hong Kong
Print_ISBN
978-1-4673-2506-6
Electronic_ISBN
978-1-4673-2505-9
Type
conf
DOI
10.1109/ISCSLP.2012.6423496
Filename
6423496