DocumentCode :
3494170
Title :
Learning to forget: continual prediction with LSTM
Author :
Gers, Felix A. ; Schmidhuber, Jürgen ; Cummins, Fred
Author_Institution :
IDSIA, Lugano, Switzerland
Volume :
2
fYear :
1999
fDate :
1999
Firstpage :
850
Abstract :
Long short-term memory (LSTM) can solve many tasks not solvable by previous learning algorithms for recurrent neural networks (RNNs). We identify a weakness of LSTM networks processing continual input streams without explicitly marked sequence ends. Without resets, the internal state values may grow indefinitely and eventually cause the network to break down. Our remedy is an adaptive “forget gate” that enables an LSTM cell to learn to reset itself at appropriate times, thus releasing internal resources. We review an illustrative benchmark problem on which standard LSTM outperforms other RNN algorithms. All algorithms (including LSTM) fail to solve a continual version of that problem. LSTM with forget gates, however, easily solves it in an elegant way.
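The forget-gate mechanism the abstract describes can be sketched for a single scalar LSTM cell as follows (a minimal illustration under assumed weight names, not the paper's original implementation; in standard LSTM the forget gate is effectively fixed at 1, so the cell state `c` can only accumulate):

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_cell_step(x, h_prev, c_prev, w):
    # Gates: forget (f), input (i), output (o), plus candidate value (g).
    # Each uses its own weights on the input x and the recurrent input h_prev.
    f = sigmoid(w["wf"] * x + w["uf"] * h_prev + w["bf"])   # forget gate
    i = sigmoid(w["wi"] * x + w["ui"] * h_prev + w["bi"])   # input gate
    o = sigmoid(w["wo"] * x + w["uo"] * h_prev + w["bo"])   # output gate
    g = math.tanh(w["wg"] * x + w["ug"] * h_prev + w["bg"])  # candidate

    # With f fixed at 1 (standard LSTM), c_prev is carried over in full and
    # the internal state can grow without bound on a continual stream.
    # A learned f < 1 scales c_prev down, letting the cell reset itself.
    c = f * c_prev + i * g
    h = o * math.tanh(c)
    return h, c
```

Driving the gate toward 0 discards the old state regardless of its magnitude, which is the "learned reset" that lets the network handle continual input without explicit sequence boundaries.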
Keywords :
recurrent neural nets; adaptive forget gate; learning; long short-term memory; recurrent neural networks; resource allocation;
fLanguage :
English
Publisher :
IET
Conference_Titel :
Artificial Neural Networks, 1999. ICANN 99. Ninth International Conference on (Conf. Publ. No. 470)
Conference_Location :
Edinburgh
ISSN :
0537-9989
Print_ISBN :
0-85296-721-7
Type :
conf
DOI :
10.1049/cp:19991218
Filename :
818041