Title :
Global Lagrange stability of recurrent neural networks with infinity distributed delays
Author :
Xiaohong Wang ; Xuhui Zhao ; Jiexin Pu ; Wenshao Bu ; Xingjun Chen
Author_Institution :
Coll. of Inf. Eng., Henan Univ. of Sci. & Technol., Luoyang, China
Abstract :
This paper is concerned with global exponential Lagrange stability for recurrent neural networks (RNNs) with general activation functions and infinity distributed delays. By employing a new differential inequality and dropping the requirement that the activation functions be bounded, monotonic, and differentiable, several sufficient conditions are derived in terms of linear matrix inequalities (LMIs), which can be easily checked with the LMI Control Toolbox in MATLAB. Moreover, detailed estimates of the globally exponentially attractive sets are given. Compared with previous methods, the results obtained are independent of the time-varying delays and do not require differentiability of the delay functions; they thus extend and improve earlier publications. Finally, a numerical example is provided to demonstrate the effectiveness of the proposed results.
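The abstract states that the sufficient conditions take the form of LMIs that can be verified numerically. As a minimal illustrative sketch only (the matrix `A` below is hypothetical and the Lyapunov-type inequality shown is not the paper's actual condition), feasibility of a simple stability LMI of the form A^T P + P A < 0, P > 0 can be certified in Python via the continuous Lyapunov equation:

```python
# Hedged sketch: certify feasibility of the LMI  A^T P + P A < 0  with P > 0
# by solving the continuous Lyapunov equation  A^T P + P A = -I  and checking
# that the solution P is positive definite. The system matrix A is a made-up
# example; the paper's own LMI conditions are more involved.
import numpy as np
from scipy.linalg import solve_continuous_lyapunov, eigvalsh

A = np.array([[-2.0, 0.5],
              [ 0.3, -3.0]])  # hypothetical Hurwitz matrix (linear part of an RNN)

# solve_continuous_lyapunov(a, q) solves a X + X a^T = q,
# so passing A.T solves  A^T P + P A = -I.
P = solve_continuous_lyapunov(A.T, -np.eye(2))

# P positive definite  <=>  the LMI is strictly feasible for this A.
feasible = bool(np.all(eigvalsh(P) > 0))
print(feasible)  # prints True for this stable example
```

In MATLAB, the same kind of check is what the LMI Control Toolbox (`feasp`, `lmivar`, `lmiterm`) automates for general LMI systems.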
Keywords :
asymptotic stability; delays; linear matrix inequalities; recurrent neural nets; LMI Control Toolbox; MATLAB; RNN; activation functions; delay functions; differential inequality; global exponential Lagrange stability; global exponential stability in Lagrange sense; globally exponentially attractive set; infinity distributed delays; sufficient conditions; numerical stability; stability criteria;
Conference_Titel :
2014 IEEE International Conference on Information and Automation (ICIA)
Conference_Location :
Hailar
DOI :
10.1109/ICInfA.2014.6932729