Title :
Exact training of a neural syntactic language model
Author :
Emami, Ahmad ; Jelinek, Frederick
Author_Institution :
Center for Language & Speech Process., Johns Hopkins Univ., Baltimore, MD, USA
Abstract :
The structured language model (SLM) aims at predicting the next word in a given word string by making a syntactic analysis of the preceding words. However, it faces the data sparseness problem because of the large dimensionality and diversity of the information available in the syntactic parse. Previously, we proposed using neural network models for the SLM (Emami, A. et al., Proc. ICASSP, 2003; Emami, Proc. EUROSPEECH '03, 2003). The neural network model is better suited to tackling the data sparseness problem, and its use gave significant improvements in perplexity and word error rate over the baseline SLM. We present a new method of training the neural-net-based SLM. This procedure makes use of the partial parses hypothesized by the SLM itself, and is computationally more expensive than the approximate training method used previously. Experiments with the new training method on the UPenn and WSJ corpora show significant reductions in perplexity and word error rate, achieving the lowest published results for the given corpora.
Keywords :
learning (artificial intelligence); natural languages; neural nets; speech recognition; text analysis; data sparseness problem; exact training; neural network models; neural syntactic language model; perplexity; structured language model; syntactic parsing; syntactic analysis; word error rate; Error analysis; History; Natural languages; Neural networks; Predictive models; Probability; Speech analysis; Speech processing
Conference_Title :
IEEE International Conference on Acoustics, Speech, and Signal Processing (ICASSP '04), 2004. Proceedings.
Print_ISBN :
0-7803-8484-9
DOI :
10.1109/ICASSP.2004.1325968