DocumentCode :
3569395
Title :
Exponential convergence of a gradient descent algorithm for a class of recurrent neural networks
Author :
Bartlett, Peter ; Dasgupta, Soura
Author_Institution :
Dept. of Syst. Eng., Australian Nat. Univ., Canberra, ACT, Australia
Volume :
1
fYear :
1995
Firstpage :
497
Abstract :
This paper considers the convergence of an approximate gradient descent backpropagation algorithm for a one-hidden-layer neural network whose output is an affine combination of certain nonlinear functions of the outputs of biased infinite impulse response (IIR) affine systems. We give a persistent excitation condition that guarantees local convergence of the algorithm, and we show that this condition holds for generic parameter values whenever generic periodic inputs of period at least N are applied, N being the number of parameters.
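The abstract describes the architecture and update rule only at a high level. The following is a minimal NumPy sketch of a network of that form, not the authors' algorithm: each hidden unit is restricted here to a first-order biased IIR affine system (an assumption; the paper allows general biased IIR systems), and the "approximate" gradient ignores the dependence of past filter states on the parameters, a common simplification in recursive schemes. All names and hyperparameters are illustrative.

# Sketch: one-hidden-layer recurrent network whose output is an affine
# combination of nonlinear functions of biased IIR affine-system outputs,
# trained by approximate gradient descent on squared error.
import numpy as np

rng = np.random.default_rng(0)

h = 4                      # number of hidden units (assumed)
a = rng.normal(0, 0.1, h)  # IIR feedback coefficients (|a| < 1 for stability)
b = rng.normal(0, 0.5, h)  # input coefficients
c = rng.normal(0, 0.1, h)  # biases of the affine systems
w = rng.normal(0, 0.5, h)  # output weights
w0 = 0.0                   # output bias (affine combination)

def forward(u):
    """Run the network over an input sequence u; return outputs and states."""
    T = len(u)
    x = np.zeros((T + 1, h))           # filter states
    y = np.zeros(T)
    for t in range(T):
        x[t + 1] = a * x[t] + b * u[t] + c   # biased first-order IIR system
        y[t] = w0 + w @ np.tanh(x[t + 1])    # affine combination of nonlinearities
    return y, x

def train_step(u, d, lr=1e-2):
    """One pass of approximate gradient descent against a target sequence d."""
    global a, b, c, w, w0
    y, x = forward(u)
    for t in range(len(u)):
        e = y[t] - d[t]
        z = np.tanh(x[t + 1])
        g = e * w * (1.0 - z ** 2)     # backprop through the nonlinearity only
        w -= lr * e * z
        w0 -= lr * e
        # Approximate gradient: treat the past state x[t] as independent
        # of (a, b, c), so no backpropagation through time is performed.
        a -= lr * g * x[t]
        b -= lr * g * u[t]
        c -= lr * g

# Example: a periodic input, echoing the persistent-excitation setting of
# generic periodic inputs with period at least the number of parameters.
u = np.sin(2 * np.pi * np.arange(200) / 25)
d = 0.5 * np.roll(u, 1)
for _ in range(200):
    train_step(u, d)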
Keywords :
approximation theory; backpropagation; convergence of numerical methods; parameter estimation; recurrent neural nets; affine combination; approximate gradient descent back propagation algorithm; biased infinite impulse response affine systems; exponential convergence; generic periodic inputs; gradient descent algorithm; local convergence guarantee; nonlinear functions; one hidden layer neural network; parameter estimates; persistent excitation condition; recurrent neural networks; Computer architecture; Computer networks; Convergence; Feedforward systems; Finite impulse response filter; IIR filters; Neural networks; Parameter estimation; Recurrent neural networks; Systems engineering and theory;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the 38th Midwest Symposium on Circuits and Systems, 1995
Print_ISBN :
0-7803-2972-4
Type :
conf
DOI :
10.1109/MWSCAS.1995.504485
Filename :
504485