DocumentCode :
3414371
Title :
Global stability analysis of discrete-time recurrent neural networks
Author :
Barabanov, Nikita E. ; Prokhorov, Danil V.
Author_Institution :
St. Petersburg State Electrotechnical Univ., Russia
Volume :
6
fYear :
2001
fDate :
2001
Firstpage :
4550
Abstract :
We address the problem of Lyapunov stability of discrete-time recurrent neural networks (RNN). We assume that the network weights are fixed. Based on classical results from the theory of absolute stability, we propose a new approach to stability analysis of RNN with sector-type monotone nonlinearities. We devise a simple state space transformation to convert the original RNN equations into a form suitable for our stability analysis. We then write appropriate linear matrix inequalities (LMI) whose solution determines whether the RNN is globally exponentially stable. Unlike previous treatments, our approach naturally accounts for the nonzero biases usually present in RNN for improved approximation capabilities. We illustrate the use of our approach with an example.
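As a rough illustration of the LMI flavor of such absolute-stability tests, the sketch below checks a generic sufficient condition for global stability of a small discrete-time RNN x[k+1] = W·phi(x[k]) with a sector-type monotone nonlinearity (tanh, sector [0, 1]), using a diagonal Lyapunov function. The weight matrix W, the tolerance eps, and the diagonal-Lyapunov formulation are illustrative assumptions; this is not the paper's exact LMI, and it omits the authors' state space transformation and bias handling.

```python
import numpy as np
import cvxpy as cp

# Hypothetical fixed weights of a small discrete-time RNN
#   x[k+1] = W @ phi(x[k]),   phi = tanh (monotone, sector [0, 1])
W = np.array([[ 0.4, -0.3,  0.1],
              [ 0.2,  0.5, -0.2],
              [-0.1,  0.3,  0.4]])
n = W.shape[0]

# Sufficient LMI test with a diagonal Lyapunov function V(x) = x' P x:
# find diagonal P with entries >= 1 such that W' P W - P <= -eps * I.
p = cp.Variable(n)          # diagonal entries of P
P = cp.diag(p)
eps = 1e-3

M = W.T @ P @ W - P
# M is symmetric by construction; symmetrizing explicitly keeps
# CVXPY's PSD constraint check satisfied.
constraints = [p >= 1,
               0.5 * (M + M.T) << -eps * np.eye(n)]

prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve(solver=cp.SCS)

if prob.status in ("optimal", "optimal_inaccurate"):
    print("LMI feasible: origin of the RNN is globally stable.")
    print("diag(P) =", np.round(p.value, 3))
else:
    print("LMI infeasible: this simple test is inconclusive.")
```

Feasibility of this kind of LMI certifies stability for every nonlinearity in the assumed sector, which is what makes the absolute-stability framework attractive for RNN with fixed weights.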
Keywords :
Lyapunov methods; recurrent neural nets; stability; Lyapunov stability; RNN; absolute stability; discrete-time recurrent neural networks; linear matrix inequalities; network weights; stability analysis; Control systems; Feedforward neural networks; Laboratories; Linear matrix inequalities; Lyapunov method; Neural networks; Neurofeedback; Recurrent neural networks; Stability analysis; State-space methods;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the 2001 American Control Conference
Conference_Location :
Arlington, VA
ISSN :
0743-1619
Print_ISBN :
0-7803-6495-3
Type :
conf
DOI :
10.1109/ACC.2001.945696
Filename :
945696