DocumentCode :
1133915
Title :
Absolute exponential stability of a class of continuous-time recurrent neural networks
Author :
Hu, Sanqing ; Wang, Jun
Author_Institution :
Dept. of Autom. & Comput.-Aided Eng., Chinese Univ. of Hong Kong, China
Volume :
14
Issue :
1
fYear :
2003
fDate :
1/1/2003
Firstpage :
35
Lastpage :
45
Abstract :
This paper presents a new result on absolute exponential stability (AEST) for a class of continuous-time recurrent neural networks with locally Lipschitz continuous and monotone nondecreasing activation functions. Connection weight matrices that are additively diagonally stable are proven to guarantee AEST of the neural networks. This AEST result extends and improves the existing absolute stability and AEST results in the literature.
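For reference, a minimal sketch of the network model that absolute-stability results of this kind are typically stated for; the exact notation below is an assumption and may differ from the paper:

% Sketch only: standard continuous-time recurrent neural network model (assumed notation)
% x(t) in R^n  : neuron state vector
% D            : positive diagonal matrix of self-inhibition (decay) rates
% W            : connection weight matrix (the object assumed to be additively diagonally stable)
% g(.)         : activation applied componentwise, locally Lipschitz and monotone nondecreasing
% u            : constant external input
\[
  \dot{x}(t) = -D\,x(t) + W\,g\bigl(x(t)\bigr) + u .
\]
% AEST means that for every activation g in this class and every input u,
% the network has a unique equilibrium that is globally exponentially stable.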
Keywords :
asymptotic stability; matrix algebra; recurrent neural nets; H-matrix; absolute exponential stability; additively diagonally stable connection weight matrices; continuous-time recurrent neural networks; diagonal semistability; locally Lipschitz continuous functions; monotone nondecreasing activation functions; Automation; Councils; Eigenvalues and eigenfunctions; Neural networks; Neurons; Recurrent neural networks; Stability analysis; Sufficient conditions; Symmetric matrices;
fLanguage :
English
Journal_Title :
Neural Networks, IEEE Transactions on
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/TNN.2002.806954
Filename :
1176125