DocumentCode :
1547667
Title :
LSTM recurrent networks learn simple context-free and context-sensitive languages
Author :
Gers, Felix A. ; Schmidhuber, Jürgen
Author_Institution :
IDSIA, Manno, Switzerland
Volume :
12
Issue :
6
fYear :
2001
fDate :
1 November 2001
Firstpage :
1333
Lastpage :
1340
Abstract :
Previous work on learning regular languages from exemplary training sequences showed that long short-term memory (LSTM) outperforms traditional recurrent neural networks (RNNs). We demonstrate LSTM's superior performance on context-free language benchmarks for RNNs, and show that it works even better than previous hardwired or highly specialized architectures. To the best of our knowledge, LSTM variants are also the first RNNs to learn a simple context-sensitive language, namely a^n b^n c^n.
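The a^n b^n c^n result refers to a benchmark usually posed as next-symbol prediction: the network reads a string one symbol at a time and must predict, at each step, the set of symbols that may legally come next. The sketch below illustrates that task setup only; the start/end markers S and T, the target-set encoding, and the helper names are assumptions for illustration, not the paper's exact protocol.

def make_string(n):
    """One a^n b^n c^n string, framed by start (S) and end (T) markers."""
    return "S" + "a" * n + "b" * n + "c" * n + "T"

def next_symbol_targets(s):
    """For each position, the set of symbols that may legally follow.

    While the a's are being read, n is still unknown, so both 'a' and
    'b' are valid predictions. Once the first 'b' appears, n is fixed
    and the remainder of the string is fully determined.
    """
    targets = []
    for i in range(len(s) - 1):
        if s[i] == "S":
            targets.append({"a"})
        elif s[i] == "a":
            targets.append({"a", "b"})   # length n not yet observable
        else:
            targets.append({s[i + 1]})   # continuation is forced
    return targets

if __name__ == "__main__":
    s = make_string(3)
    print(s)  # SaaabbbcccT
    for ch, tgt in zip(s, next_symbol_targets(s)):
        print(ch, "->", sorted(tgt))

A prediction counts as correct only if the network activates exactly the legal next-symbol set, which is what makes the task require counting rather than pattern matching.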
Keywords :
context-free languages; context-sensitive languages; learning (artificial intelligence); recurrent neural nets; context-free language; context-sensitive language; long short-term memory; recurrent neural networks; regular languages; Backpropagation algorithms; Bridges; Computational complexity; Delay effects; Hidden Markov models; Learning automata; Neural networks; Recurrent neural networks; Resonance light scattering; State-space methods;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.963769
Filename :
963769