DocumentCode :
2254082
Title :
Language modeling with stochastic automata
Author :
Hu, Jianying ; Turin, William ; Brown, Michael K.
Author_Institution :
AT&T Bell Labs., Murray Hill, NJ, USA
Volume :
1
fYear :
1996
fDate :
3-6 Oct 1996
Firstpage :
406
Abstract :
It is well known that language models are effective for increasing the accuracy of speech and handwriting recognizers, but large language models are often required to achieve low model perplexity (or entropy) while still providing adequate language coverage. We study three efficient methods for stochastic language modeling in the context of the stochastic pattern recognition problem (variable-length Markov models, variable n-gram stochastic automata, and refined probabilistic finite automata), and we give the results of a comparative performance analysis. In addition, we show that a method combining two of these language modeling techniques yields even better performance than the best single technique tested.
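The abstract evaluates language models by their perplexity. As a minimal illustration of that metric (not a reproduction of the paper's variable-length Markov models or stochastic automata), the sketch below trains an additively smoothed bigram model on a toy corpus and computes per-token perplexity; the smoothing parameter `alpha` and the sentence markers are illustrative choices, not taken from the paper.

```python
import math
from collections import defaultdict

def train_bigram(corpus):
    """Count unigrams and bigrams over tokenized sentences,
    padding each sentence with <s> and </s> markers."""
    uni = defaultdict(int)
    bi = defaultdict(int)
    for sent in corpus:
        toks = ["<s>"] + sent + ["</s>"]
        for t in toks:
            uni[t] += 1
        for a, b in zip(toks, toks[1:]):
            bi[(a, b)] += 1
    return uni, bi

def perplexity(uni, bi, sent, alpha=0.1):
    """Per-token perplexity of a sentence under an additively
    smoothed bigram model: 2 ** (-mean log2 probability)."""
    vocab = len(uni)  # crude vocabulary size estimate for smoothing
    toks = ["<s>"] + sent + ["</s>"]
    log_prob = 0.0
    n = 0
    for a, b in zip(toks, toks[1:]):
        # Additive (Laplace-style) smoothing of the conditional P(b | a)
        p = (bi[(a, b)] + alpha) / (uni[a] + alpha * vocab)
        log_prob += math.log2(p)
        n += 1
    return 2 ** (-log_prob / n)

corpus = [["the", "cat", "sat"], ["the", "dog", "sat"]]
uni, bi = train_bigram(corpus)
# A sentence seen in training scores lower perplexity than a shuffled one.
pp_seen = perplexity(uni, bi, ["the", "cat", "sat"])
pp_shuffled = perplexity(uni, bi, ["dog", "the", "cat"])
```

A lower perplexity means the model assigns higher probability to the held-out text; the paper's point is that variable-length and refined-automaton models reach low perplexity with far fewer parameters than a fixed-order n-gram table.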
Keywords :
Markov processes; computational linguistics; entropy; finite automata; natural languages; nomograms; pattern recognition; performance index; probabilistic automata; stochastic automata; accuracy; entropy; handwriting recognition; language coverage; model perplexity; performance analysis; refined probabilistic finite automata; speech recognition; stochastic automata; stochastic language modeling; stochastic pattern recognition; variable n-gram stochastic automata; variable-length Markov models; Automata; Automatic speech recognition; Context modeling; Entropy; Handwriting recognition; Natural languages; Pattern recognition; Performance analysis; Speech recognition; Stochastic processes;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the Fourth International Conference on Spoken Language Processing (ICSLP 96), 1996
Conference_Location :
Philadelphia, PA
Print_ISBN :
0-7803-3555-4
Type :
conf
DOI :
10.1109/ICSLP.1996.607140
Filename :
607140