Title :
Efficient gradient computation for continuous and discrete time-dependent neural networks
Author :
Miesbach, Stefan
Author_Institution :
Math. Inst., Tech. Univ. München, Germany
Abstract :
The author provides calculus-of-variations techniques for constructing backpropagation-through-time (BTT) algorithms for arbitrary time-dependent recurrent neural networks with both continuous and discrete dynamics. The backpropagated error signals are essentially Lagrange multipliers. The techniques are easy to handle because they can be embedded into the Hamiltonian formalism widely used in optimal control theory. Three important extensions of the standard BTT algorithm demonstrate the power of the method. An implementation of the BTT algorithms that overcomes their storage drawbacks is also suggested.
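As a rough illustration of the adjoint viewpoint the abstract alludes to, the standard continuous-time costate equations from optimal control can be sketched as follows; the symbols $x$, $w$, $\lambda$, $\ell$, and $f$ are notation assumed here for illustration and need not match the paper's.
$$\dot{x}(t) = f(x(t), w, t), \qquad J(w) = \int_{0}^{T} \ell(x(t), t)\, dt,$$
$$H(x, \lambda, w, t) = \ell(x, t) + \lambda^{\top} f(x, w, t),$$
$$\dot{\lambda}(t) = -\frac{\partial H}{\partial x}, \qquad \lambda(T) = 0, \qquad \nabla_{w} J = \int_{0}^{T} \frac{\partial H}{\partial w}\, dt.$$
In this sketch $\lambda$ plays the role of the backpropagated error signal (a Lagrange multiplier), and integrating the costate equation backward in time from $t = T$ corresponds to backpropagation through time.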
Keywords :
neural nets; variational techniques; Hamiltonian formalism; backpropagated error signals; backpropagation-through-time algorithms; calculus-of-variations techniques; continuous neural nets; discrete time-dependent neural networks; recurrent neural networks; Adaptive control; Backpropagation algorithms; Computer networks; High performance computing; Neural networks; Neurons; Performance analysis; Planning; Programmable control; Recurrent neural networks;
Conference_Title :
1991 IEEE International Joint Conference on Neural Networks (IJCNN)
Print_ISBN :
0-7803-0227-3
DOI :
10.1109/IJCNN.1991.170737