DocumentCode :
1557488
Title :
Simple and Fast Calculation of the Second-Order Gradients for Globalized Dual Heuristic Dynamic Programming in Neural Networks
Author :
Fairbank, Michael ; Alonso, E. ; Prokhorov, Danil
Author_Institution :
Dept. of Comput., City Univ. London, London, UK
Volume :
23
Issue :
10
fYear :
2012
Firstpage :
1671
Lastpage :
1676
Abstract :
We derive an algorithm to exactly calculate the mixed second-order derivatives of a neural network's output with respect to its input vector and weight vector. This is necessary for the adaptive dynamic programming (ADP) algorithms globalized dual heuristic programming (GDHP) and value-gradient learning. The algorithm calculates the inner product of this second-order matrix with a given fixed vector in a time that is linear in the number of weights in the neural network. We use a “forward accumulation” of the derivative calculations which produces a much more elegant and easy-to-implement solution than has previously been published for this task. In doing so, the algorithm makes GDHP simple to implement and efficient, bridging the gap between the widely used DHP and GDHP ADP methods.
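The quantity the abstract describes is, for network output y(w, x), the contraction r_i = Σ_j (∂²y/∂w_i ∂x_j) v_j for a fixed vector v, computed in time linear in the number of weights. As an illustrative sketch only (not the paper's own derivation or code), the same quantity can be obtained with modern automatic differentiation as a forward pass over the inputs followed by a reverse pass over the weights; the toy network and the names `net` and `mixed_second_order_product` are assumptions for this example:

```python
import jax
import jax.numpy as jnp

def net(w, x):
    """Toy two-layer network with scalar output (illustrative only)."""
    W1, b1, W2 = w
    h = jnp.tanh(W1 @ x + b1)
    return W2 @ h

def mixed_second_order_product(w, x, v):
    """Return sum_j (d^2 y / dw dx_j) * v_j, in time linear in the weights.

    Forward pass: directional derivative of the output along v in input
    space. Reverse pass: gradient of that scalar w.r.t. the weights.
    """
    # (dy/dx) . v via a single forward-mode sweep over the input
    dir_deriv = lambda w_: jax.jvp(lambda x_: net(w_, x_), (x,), (v,))[1]
    # gradient of the scalar directional derivative w.r.t. all weights
    return jax.grad(dir_deriv)(w)
```

The cost is one forward-mode and one reverse-mode sweep, so it scales linearly with the weight count, matching the complexity the abstract claims; the paper itself achieves this with a pure forward accumulation rather than this forward-over-reverse composition.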
Keywords :
dynamic programming; matrix algebra; neural nets; vectors; adaptive dynamic programming algorithm; derivative calculation; forward accumulation; globalized dual heuristic dynamic programming; globalized dual heuristic programming; input vector; mixed second-order derivative; neural network; second-order gradient; second-order matrix; value-gradient learning; weight vector; Backpropagation; Equations; Heuristic algorithms; Neural networks; Training; Trajectory; Vectors; Adaptive dynamic programming; dual heuristic programming; neural networks; value-gradient learning;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks and Learning Systems
Publisher :
IEEE
ISSN :
2162-237X
Type :
jour
DOI :
10.1109/TNNLS.2012.2205268
Filename :
6239600