DocumentCode :
962773
Title :
Global output convergence of a class of continuous-time recurrent neural networks with time-varying thresholds
Author :
Liu, Derong ; Hu, Sanqing ; Wang, Jun
Author_Institution :
Dept. of Electr. & Comput. Eng., Univ. of Illinois, Chicago, IL, USA
Volume :
51
Issue :
4
fYear :
2004
fDate :
4/1/2004
Firstpage :
161
Lastpage :
167
Abstract :
This paper discusses the global output convergence of a class of continuous-time recurrent neural networks (RNNs) with globally Lipschitz continuous, monotone nondecreasing activation functions and locally Lipschitz continuous time-varying thresholds. We establish a sufficient condition that guarantees the global output convergence of this class of neural networks. The result does not require symmetry of the connection weight matrix. The convergence result is useful in the design of recurrent neural networks with time-varying thresholds.
Keywords :
continuous time systems; convergence; recurrent neural nets; time-varying systems; transfer functions; Lipschitz continuity; Lyapunov diagonal semistability; connection weight matrix; continuous-time RNN; global output convergence; globally Lipschitz continuous function; locally Lipschitz continuous time-varying thresholds; monotone nondecreasing activation function; neural network class; recurrent neural networks; time-varying threshold; Asymptotic stability; CADCAM; Computer aided manufacturing; Convergence; Hopfield neural networks; Linear programming; Neural networks; Recurrent neural networks; Sufficient conditions; Vectors;
fLanguage :
English
Journal_Title :
IEEE Transactions on Circuits and Systems II: Express Briefs
Publisher :
IEEE
ISSN :
1549-7747
Type :
jour
DOI :
10.1109/TCSII.2004.824041
Filename :
1288419