DocumentCode :
1696142
Title :
Comparison of feedforward and recurrent neural network language models
Author :
Sundermeyer, Martin ; Oparin, Ilya ; Gauvain, Jean-Luc ; Freiberg, Ben ; Schlüter, Ralf ; Ney, Hermann
Author_Institution :
Comput. Sci. Dept., RWTH Aachen Univ., Aachen, Germany
fYear :
2013
Firstpage :
8430
Lastpage :
8434
Abstract :
Research on language modeling for speech recognition has increasingly focused on the application of neural networks. Two competing concepts have been developed: on the one hand, feedforward neural networks that implement an n-gram approach; on the other hand, recurrent neural networks that can learn context dependencies spanning more than a fixed number of predecessor words. To the best of our knowledge, no comparison has been carried out between feedforward and state-of-the-art recurrent networks when applied to speech recognition. This paper analyzes this aspect in detail on a well-tuned French speech recognition task. In addition, we propose a simple and efficient method for normalizing language model probabilities across different vocabularies, and we show how to speed up the training of recurrent neural networks through parallelization.
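The two model classes compared in the abstract can be made concrete with a brief sketch. The following PyTorch code is illustrative only, not the authors' implementation; the vocabulary size, context length, and layer widths are placeholder assumptions. It contrasts a feedforward n-gram-style language model, which conditions on a fixed window of predecessor words, with a recurrent language model, whose hidden state can carry context beyond any fixed window.

```python
# Illustrative sketch only -- not the paper's implementation. Vocabulary
# size, context length, and layer widths are placeholder assumptions.
import torch
import torch.nn as nn

class FeedforwardLM(nn.Module):
    """n-gram-style model: the next word is predicted from a fixed
    window of (n-1) predecessor words."""
    def __init__(self, vocab_size=10000, context=4, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.mlp = nn.Sequential(
            nn.Linear(context * embed_dim, hidden_dim),
            nn.Tanh(),
            nn.Linear(hidden_dim, vocab_size),
        )

    def forward(self, context_ids):            # (batch, context)
        e = self.embed(context_ids)            # (batch, context, embed_dim)
        return self.mlp(e.flatten(1))          # next-word logits

class RecurrentLM(nn.Module):
    """Recurrent model: the hidden state summarizes the full history,
    so the usable context is not limited to a fixed window."""
    def __init__(self, vocab_size=10000, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.RNN(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, word_ids, hidden=None):  # (batch, seq_len)
        states, hidden = self.rnn(self.embed(word_ids), hidden)
        return self.out(states), hidden        # logits at every position
```

In both cases a softmax over the output logits yields next-word probabilities, and training with cross-entropy on these logits is standard; the practical difference lies in whether the conditioning context is truncated to a fixed window or carried forward through the recurrent state.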
Keywords :
feedforward neural nets; natural language processing; probability; recurrent neural nets; speech recognition; French speech recognition task; context dependency; feedforward neural network language models; recurrent neural network language models; language modeling; n-gram approach; language model probability normalization; vocabulary; Artificial neural networks; Computational modeling; Feedforward neural networks; Speech recognition; Training; Vocabulary; Automatic speech recognition; feedforward neural networks; recurrent neural networks
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2013 IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)
Conference_Location :
Vancouver, BC, Canada
ISSN :
1520-6149
Type :
conf
DOI :
10.1109/ICASSP.2013.6639310
Filename :
6639310