Title :
Backpropagation Algorithms for a Broad Class of Dynamic Networks
Author :
De Jesús, Orlando; Hagan, Martin T.
Author_Institution :
Res. Dept., Halliburton Energy Services, Dallas, TX
Abstract :
This paper introduces a general framework for describing dynamic neural networks: the layered digital dynamic network (LDDN). This framework allows the development of two general algorithms for computing the gradients and Jacobians of these dynamic networks: backpropagation-through-time (BPTT) and real-time recurrent learning (RTRL). The structure of the LDDN framework enables an efficient implementation of both algorithms for arbitrary dynamic networks. This paper demonstrates that the BPTT algorithm is more efficient for gradient calculations, but the RTRL algorithm is more efficient for Jacobian calculations.
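Note (illustrative sketch, not part of the original record): the toy Python/NumPy code below is an assumption-laden example, not the paper's LDDN framework or its general algorithms. It computes the gradient of a squared-error loss for a single scalar recurrent neuron a(t) = tanh(w*a(t-1) + u*x(t)) in the two ways the abstract contrasts: a backward pass over stored states (BPTT) and a forward sensitivity recursion (RTRL). All function names and the toy network are hypothetical; the point is only that both routines return the same dL/dw.

# Illustrative sketch (assumption, not the paper's LDDN implementation):
# BPTT and RTRL gradients for a scalar recurrent neuron with loss
# L = sum_t 0.5*(a(t) - d(t))^2. Both routines return dL/dw.
import numpy as np

def forward(w, u, x, a0=0.0):
    # Run the recurrence a(t) = tanh(w*a(t-1) + u*x(t)) and store all states.
    a = np.empty(len(x))
    prev = a0
    for t, xt in enumerate(x):
        prev = np.tanh(w * prev + u * xt)
        a[t] = prev
    return a

def grad_bptt(w, u, x, d, a0=0.0):
    # Backpropagation through time: forward pass storing states, then a
    # backward pass accumulating the total derivative dL/da(t).
    a = forward(w, u, x, a0)
    dLdw = 0.0
    lam = 0.0                              # contribution flowing back from a(t+1)
    for t in range(len(x) - 1, -1, -1):
        lam += a[t] - d[t]                 # total dL/da(t)
        a_prev = a[t - 1] if t > 0 else a0
        dLdw += lam * (1.0 - a[t] ** 2) * a_prev   # explicit dependence of a(t) on w
        lam *= (1.0 - a[t] ** 2) * w               # pass sensitivity back to a(t-1)
    return dLdw

def grad_rtrl(w, u, x, d, a0=0.0):
    # Real-time recurrent learning: carry the sensitivity s(t) = da(t)/dw
    # forward in time and accumulate the gradient online.
    dLdw = 0.0
    s = 0.0                                # da(t)/dw; zero for the fixed initial state
    prev = a0
    for t in range(len(x)):
        a = np.tanh(w * prev + u * x[t])
        s = (1.0 - a ** 2) * (prev + w * s)
        dLdw += (a - d[t]) * s
        prev = a
    return dLdw

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.standard_normal(20)
    d = np.sin(np.arange(20) / 3.0)        # arbitrary targets for the example
    w, u = 0.5, 0.8
    print(grad_bptt(w, u, x, d), grad_rtrl(w, u, x, d))  # the two values agree

The two routines differ only in how they order the same chain-rule terms (backward accumulation versus forward sensitivity propagation), which is why their relative efficiency depends on whether a gradient or a full Jacobian is needed.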
Keywords :
Jacobian matrices; backpropagation; gradient methods; recurrent neural networks; Jacobian calculations; backpropagation through time (BPTT); dynamic neural networks; gradient calculations; layered digital dynamic network (LDDN); real-time recurrent learning (RTRL); Backpropagation algorithms; Computer networks; Delay lines; Heuristic algorithms; Neural networks; Neurofeedback; Output feedback; Power engineering and energy; Algorithms; Artificial Intelligence; Cluster Analysis; Computing Methodologies; Neural Networks (Computer); Pattern Recognition, Automated; Signal Processing, Computer-Assisted
Journal_Title :
IEEE Transactions on Neural Networks
DOI :
10.1109/TNN.2006.882371