Title :
Complexity control method for recurrent neural networks
Author :
Sakai, Masao ; Honma, Noriyasu ; Abe, Kenichi
Author_Institution :
Graduate School of Engineering, Tohoku University, Sendai, Japan
Abstract :
This paper demonstrates that the Lyapunov exponents of recurrent neural networks can be controlled by our proposed methods. One of the control methods minimizes a squared error e_λ = (λ − λ_obj)^2 / 2 by a gradient method, where λ is the largest Lyapunov exponent of the network and λ_obj is the desired exponent. The exponent λ, which characterizes the dynamical complexity, is calculated by observing the state transition over a long period. This method is, however, computationally expensive for large-scale recurrent networks, and the control is unstable for networks with chaotic dynamics, since the gradient correction through time diverges due to the chaotic instability. We also propose an approximation method that reduces the computational cost and realizes a “stable” control for chaotic networks. The new method is based on a stochastic relation which allows the correction through time to be calculated without explicit time evolution. Simulation results show that the approximation method can control the exponent of recurrent networks with chaotic dynamics under a restriction.
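Illustrative_Sketch :
A minimal Python sketch of the gradient-based control idea described in the abstract: the largest Lyapunov exponent λ of a small recurrent network x_{t+1} = tanh(g·W x_t) is estimated by iterating a tangent vector along the state trajectory, and a scalar gain g is then adjusted by a finite-difference gradient step on e_λ = (λ − λ_obj)^2 / 2. The network model, the gain g, λ_obj, and all constants are assumptions chosen for illustration; the paper's stochastic approximation method is not reproduced here.

# Sketch only (assumed model and parameters, not the authors' algorithm).
import numpy as np

N = 20                                                 # network size (assumed)
W = np.random.default_rng(0).normal(0.0, 1.0 / np.sqrt(N), size=(N, N))

def largest_lyapunov(g, T=2000, T_skip=200):
    """Estimate the largest Lyapunov exponent of x_{t+1} = tanh(g * W x_t)."""
    rng = np.random.default_rng(1)                     # fixed seed: deterministic estimate
    x = rng.normal(size=N)
    v = rng.normal(size=N)
    v /= np.linalg.norm(v)
    log_sum = 0.0
    for t in range(T):
        x = np.tanh(g * W @ x)
        J = (1.0 - x**2)[:, None] * (g * W)            # Jacobian of the one-step map
        v = J @ v
        nrm = np.linalg.norm(v)
        v /= nrm                                       # renormalize the tangent vector
        if t >= T_skip:                                # discard the initial transient
            log_sum += np.log(nrm)
    return log_sum / (T - T_skip)

lam_obj = 0.0                                          # desired exponent (assumed target)
g, eta, eps = 2.0, 0.5, 1e-2                           # gain, learning rate, FD step (assumed)
for it in range(30):
    lam = largest_lyapunov(g)
    dlam_dg = (largest_lyapunov(g + eps) - lam) / eps  # crude finite-difference gradient
    g -= eta * (lam - lam_obj) * dlam_dg               # gradient step on (lam - lam_obj)^2 / 2
    print(f"iter {it:2d}  gain {g:.3f}  lambda {lam:+.4f}")

This brute-force scheme illustrates why the exact gradient approach is costly and fragile for chaotic networks: each exponent estimate requires a long state-transition observation, which motivates the approximation method proposed in the paper.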
Keywords :
Lyapunov methods; computational complexity; recurrent neural nets; stability; Lyapunov exponents; approximation method; chaotic dynamics; chaotic instability; complexity control method; recurrent neural networks; squared error; Approximation methods; Chaos; Computational efficiency; Computational modeling; Computer networks; Error correction; Gradient methods; Large-scale systems; Recurrent neural networks; Stochastic processes;
Conference_Title :
1999 IEEE International Conference on Systems, Man, and Cybernetics (IEEE SMC '99 Conference Proceedings)
Conference_Location :
Tokyo, Japan
Print_ISBN :
0-7803-5731-0
DOI :
10.1109/ICSMC.1999.814139