Title :
Learning long-term dependencies with gradient descent is difficult
Author :
Bengio, Yoshua ; Simard, Patrice ; Frasconi, Paolo
Author_Institution :
Dept. d'Inf. et de Recherche Oper., Montreal Univ., Que., Canada
Date :
3/1/1994
Abstract :
Recurrent neural networks can be used to map input sequences to output sequences, such as for recognition, production, or prediction problems. However, practical difficulties have been reported in training recurrent neural networks to perform tasks in which the temporal contingencies present in the input/output sequences span long intervals. We show why gradient-based learning algorithms face an increasingly difficult problem as the duration of the dependencies to be captured increases. These results expose a trade-off between efficient learning by gradient descent and latching on information for long periods. Based on an understanding of this problem, alternatives to standard gradient descent are considered.
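A minimal sketch, not taken from the paper, of the effect the abstract describes: in a simple tanh recurrent network with contractive random weights (all names and parameter values below are illustrative assumptions), the Jacobian of the hidden state after d steps with respect to the initial state shrinks roughly exponentially in d, so the gradient signal carrying a long-range dependency becomes vanishingly small.

import numpy as np

rng = np.random.default_rng(0)
n = 20                                              # hidden units (arbitrary choice)
W = 0.9 * rng.standard_normal((n, n)) / np.sqrt(n)  # recurrent weights, scaled to be contractive

h = rng.standard_normal(n)       # hidden state
jac = np.eye(n)                  # running product: d h_d / d h_0
for d in range(1, 51):
    h = np.tanh(W @ h)
    # one-step Jacobian of h_t w.r.t. h_{t-1}: diag(1 - tanh^2) @ W
    step_jac = (1.0 - h**2)[:, None] * W
    jac = step_jac @ jac
    if d % 10 == 0:
        print(f"d = {d:2d}   ||d h_d / d h_0||_2 = {np.linalg.norm(jac, 2):.3e}")

Running this prints a spectral norm that decays by several orders of magnitude between d = 10 and d = 50, which is the "increasingly difficult problem" the abstract refers to: the contraction that lets the network latch onto information robustly is the same mechanism that wipes out the gradients needed to learn dependencies spanning many time steps.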
Keywords :
learning (artificial intelligence); numerical analysis; recurrent neural nets; efficient learning; gradient descent; input/output sequence mapping; long-term dependencies; prediction problems; production problems; recognition; recurrent neural network training; temporal contingencies; Computer networks; Cost function; Delay effects; Discrete transforms; Displays; Intelligent networks; Neural networks; Neurofeedback; Production; Recurrent neural networks
Journal_Title :
IEEE Transactions on Neural Networks