DocumentCode :
839207
Title :
Computing Gradient Vector and Jacobian Matrix in Arbitrarily Connected Neural Networks
Author :
Wilamowski, Bogdan M. ; Cotton, Nicholas J. ; Kaynak, Okyay ; Dundar, Gunhan
Author_Institution :
Dept. of Electr. & Comput. Eng., Auburn Univ., Auburn, AL
Volume :
55
Issue :
10
fYear :
2008
Firstpage :
3784
Lastpage :
3790
Abstract :
This paper describes a new algorithm, based on neuron-by-neuron computation, for evaluating the gradient vector and the Jacobian matrix in networks with arbitrarily connected neurons. Its training speed is comparable to that of the Levenberg-Marquardt algorithm, widely regarded as the fastest algorithm for neural network training. More importantly, it is shown that computing the Jacobian, which is required by second-order algorithms, has a computational complexity similar to that of computing the gradient for first-order learning methods. The new algorithm is implemented in the newly developed software, Neural Network Trainer, which has the unique capability of handling arbitrarily connected networks. Such networks, with connections across layers, can be more efficient than commonly used multilayer perceptron networks.
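The neuron-by-neuron idea can be illustrated with a short sketch. The following Python code is an illustrative assumption based on the abstract, not the authors' NBN implementation: it builds a small cross-layer-connected network, accumulates one Jacobian row per training pattern by backpropagating through the neurons in reverse topological order, and applies one Levenberg-Marquardt weight update. All names (Neuron, jacobian_row, the XOR data) are hypothetical.

```python
# Minimal sketch (assumed, not the paper's NBN code) of Jacobian
# computation for an arbitrarily connected feedforward network.
import numpy as np

class Neuron:
    """One tanh neuron; `sources` indexes either network inputs
    (0..n_in-1) or earlier neurons (n_in + j), so connections
    across layers are allowed."""
    def __init__(self, sources):
        self.sources = sources
        self.w = np.random.randn(len(sources) + 1) * 0.5  # w[0] is the bias

def forward(neurons, x):
    vals = list(x)  # signal list: network inputs, then neuron outputs
    for nrn in neurons:
        net = nrn.w[0] + sum(w * vals[s] for w, s in zip(nrn.w[1:], nrn.sources))
        vals.append(np.tanh(net))
    return vals

def jacobian_row(neurons, x, n_in):
    """d(output)/d(weights) for one pattern, computed neuron by neuron
    in reverse topological order; the last neuron is the output."""
    vals = forward(neurons, x)
    delta = np.zeros(len(neurons))  # delta[k] = d(output)/d(o_k)
    delta[-1] = 1.0
    row = []
    for k in reversed(range(len(neurons))):
        nrn = neurons[k]
        d_net = delta[k] * (1.0 - vals[n_in + k] ** 2)  # through tanh
        for w, s in zip(nrn.w[1:], nrn.sources):
            if s >= n_in:                    # source is an earlier neuron
                delta[s - n_in] += d_net * w
        row = [d_net] + [d_net * vals[s] for s in nrn.sources] + row
    return np.array(row), vals[-1]

# One Levenberg-Marquardt step on XOR with a 2-neuron network in which
# the output neuron sees both inputs and the hidden neuron (cross-layer).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([-1, 1, 1, -1], dtype=float)  # bipolar targets for tanh
net = [Neuron([0, 1]), Neuron([0, 1, 2])]
rows, errs = [], []
for x, t in zip(X, T):
    row, y = jacobian_row(net, x, n_in=2)
    rows.append(row)
    errs.append(t - y)
J, e = np.array(rows), np.array(errs)
mu = 0.01
dw = np.linalg.solve(J.T @ J + mu * np.eye(J.shape[1]), J.T @ e)
i = 0
for nrn in net:  # scatter the update back into each neuron's weights
    nrn.w += dw[i:i + nrn.w.size]
    i += nrn.w.size
```

Under these assumptions, one Jacobian row costs essentially one forward and one backward sweep over the neurons, the same order of work as a gradient evaluation, which is the complexity claim made in the abstract.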
Keywords :
Jacobian matrices; computational complexity; gradient methods; learning (artificial intelligence); mathematics computing; neural nets; Jacobian matrix computation; Levenberg-Marquardt neural network training algorithm; Neural Network Trainer software; arbitrarily connected neural network; gradient vector computation; multilayer perceptron network; neuron-by-neuron computation method; learning; neural network
fLanguage :
English
Journal_Title :
IEEE Transactions on Industrial Electronics
Publisher :
IEEE
ISSN :
0278-0046
Type :
jour
DOI :
10.1109/TIE.2008.2003319
Filename :
4602720