DocumentCode :
1391653
Title :
Absolute exponential stability of neural networks with a general class of activation functions
Author :
Liang, Xue-Bin ; Wang, Jun
Author_Institution :
Dept. of Electr. & Comput. Eng., Delaware Univ., Newark, DE, USA
Volume :
47
Issue :
8
fYear :
2000
fDate :
8/1/2000
Firstpage :
1258
Lastpage :
1263
Abstract :
The authors investigate the absolute exponential stability (AEST) of neural networks with a general class of partially Lipschitz continuous (defined in Section II) and monotone increasing activation functions. The main result is that if the interconnection matrix T of the network satisfies the condition that -T is an H-matrix with nonnegative diagonal elements, then the neural network is absolutely exponentially stable (AEST); i.e., the network is globally exponentially stable (GES) for any activation function in the above class, any constant input vector, and any other network parameters. This AEST result extends existing absolute stability (ABST) results for neural networks with special classes of activation functions in the literature.
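The stability condition in the abstract can be tested numerically. The following is a minimal sketch (not from the paper) that checks whether -T is a nonsingular H-matrix with nonnegative diagonal entries, using the standard characterization that A is an H-matrix iff its comparison matrix M(A) is an M-matrix; the function names are illustrative, and the paper's exact H-matrix definition (Section II) may admit singular cases not covered here.

```python
import numpy as np

def comparison_matrix(A):
    """M(A): |a_ii| on the diagonal, -|a_ij| off the diagonal."""
    M = -np.abs(A)
    np.fill_diagonal(M, np.abs(np.diag(A)))
    return M

def is_nonsingular_m_matrix(M, tol=1e-10):
    """Write M = s*I - B with B >= 0 entrywise;
    M is a nonsingular M-matrix iff rho(B) < s."""
    s = float(np.max(np.diag(M)))
    B = s * np.eye(M.shape[0]) - M
    return bool(np.max(np.abs(np.linalg.eigvals(B))) < s - tol)

def satisfies_aest_condition(T):
    """Sufficient condition from the abstract: -T is an H-matrix
    (here: nonsingular) with nonnegative diagonal entries."""
    A = -np.asarray(T, dtype=float)
    return bool(np.all(np.diag(A) >= 0) and
                is_nonsingular_m_matrix(comparison_matrix(A)))
```

For example, T = [[-2, 1], [1, -2]] satisfies the condition (here -T is strictly diagonally dominant with positive diagonal), while T = [[-1, 2], [2, -1]] does not, since the comparison matrix of -T fails the M-matrix test.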
Keywords :
absolute stability; asymptotic stability; matrix algebra; neural nets; transfer functions; H-matrix; absolute exponential stability; activation functions; constant input vectors; globally exponentially stable; interconnection matrix; monotone increasing activation functions; neural networks; nonnegative diagonal elements; partially Lipschitz continuous; Automation; Computer science; Councils; Integrated circuit interconnections; Neural networks; Neurons; Quadratic programming; Stability analysis;
fLanguage :
English
Journal_Title :
IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications
Publisher :
IEEE
ISSN :
1057-7122
Type :
jour
DOI :
10.1109/81.873882
Filename :
873882