DocumentCode :
3392375
Title :
Stochastic analysis of chaos dynamics in recurrent neural networks
Author :
Homma, Noriyasu ; Sakai, Masao ; Gupta, Madan M. ; Abe, Ken-ichi
Author_Institution :
Graduate Sch. of Eng., Tohoku Univ., Sendai, Japan
Volume :
1
fYear :
2001
fDate :
25-28 July 2001
Firstpage :
298
Abstract :
The paper demonstrates that the largest Lyapunov exponent λ of recurrent neural networks can be controlled efficiently by a stochastic gradient method. The essential core of the proposed method is a novel stochastic approximate formulation of the Lyapunov exponent λ as a function of network parameters such as connection weights and the thresholds of the neural activation functions. In a gradient method, directly minimizing the squared error (λ - λ_obj)², where λ_obj is the desired exponent value, requires collecting gradients through time, and these are given by a recursive calculation from past to present values. This collection is computationally expensive and, because of chaotic instability, leads to unstable control of the exponent for networks with chaotic dynamics. The stochastic formulation derived in the paper approximates the gradient collection without the recursive calculation. The approximation realizes not only a faster calculation of the gradients, requiring only O(N²) run time where the direct calculation needs O(N⁵T) run time for networks with N neurons and T evolution steps, but also stable control for chaotic dynamics. Simulation studies further show that the approximation is robust with respect to network size and that the proposed method can control chaotic dynamics in recurrent neural networks effectively.
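As an illustration of the kind of exponent control the abstract describes, the following is a minimal Python sketch, not the authors' formulation (which is not reproduced in this record): it estimates the largest Lyapunov exponent of a small recurrent network x_{t+1} = tanh(W x_t + b) by the standard tangent-vector (Benettin-style) method and nudges the weights to minimize (λ - λ_obj)². An SPSA-style simultaneous-perturbation gradient is substituted for the paper's stochastic approximation; like that approximation, it avoids recursive gradient collection through time. All function names and parameter values are hypothetical.

import numpy as np

def largest_lyapunov(W, b, T=2000, burn=200, seed=1):
    """Benettin-style estimate of the largest Lyapunov exponent of
    x_{t+1} = tanh(W x_t + b): evolve a tangent vector alongside the
    state and average the log growth of its norm.  A fixed seed gives
    common random numbers, reducing noise in the finite differences."""
    rng = np.random.default_rng(seed)
    n = W.shape[0]
    x = 0.1 * rng.standard_normal(n)
    v = rng.standard_normal(n)
    v /= np.linalg.norm(v)
    acc = 0.0
    for t in range(T + burn):
        x = np.tanh(W @ x + b)
        J = (1.0 - x**2)[:, None] * W          # Jacobian diag(1 - x²) W
        v = J @ v
        growth = np.linalg.norm(v)
        v /= growth
        if t >= burn:
            acc += np.log(growth)
    return acc / T

def control_step(W, b, lam_obj, rng, eps=0.05, lr=0.05):
    """One stochastic-gradient step on (λ - λ_obj)², using an SPSA-style
    simultaneous perturbation of all weights at once (perturbation
    direction delta has ±1 entries, so its elementwise inverse is itself)."""
    delta = rng.choice([-1.0, 1.0], size=W.shape)
    err = lambda Wp: (largest_lyapunov(Wp, b) - lam_obj) ** 2
    grad = (err(W + eps * delta) - err(W - eps * delta)) / (2.0 * eps) * delta
    return W - lr * grad

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 8
    W = 1.8 / np.sqrt(n) * rng.standard_normal((n, n))  # start in a chaotic regime
    b = np.zeros(n)
    lam_obj = 0.0                 # drive λ toward the edge of chaos
    for k in range(40):
        W = control_step(W, b, lam_obj, rng)
        if k % 10 == 0:
            print(f"step {k:2d}  lambda = {largest_lyapunov(W, b):+.4f}")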
Keywords :
Lyapunov methods; chaos; computational complexity; minimisation; nonlinear dynamical systems; recurrent neural nets; stochastic processes; chaotic dynamics; chaotic instability; computationally expensive; connection weights; direct calculation; gradient method; largest Lyapunov exponent; network parameters; neural activation functions; recurrent neural networks; recursive calculation; robust formulation; square error minimization; stable control; stochastic approximate formulation; stochastic formulation; stochastic gradient method; unstable control; Chaos; Chaotic communication; Educational institutions; Gradient methods; Intelligent networks; Laboratories; Learning systems; Optimization methods; Recurrent neural networks; Stochastic processes;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Joint 9th IFSA World Congress and 20th NAFIPS International Conference, 2001
Conference_Location :
Vancouver, BC
Print_ISBN :
0-7803-7078-3
Type :
conf
DOI :
10.1109/NAFIPS.2001.944268
Filename :
944268