Title :
Global exponential stability of neural networks with globally Lipschitz continuous activations and its application to linear variational inequality problem
Author :
Liang, Xue-Bin ; Si, Jennie
Author_Institution :
Dept. of Electr. & Comput. Eng., Delaware Univ., Newark, DE, USA
fDate :
3/1/2001 12:00:00 AM
Abstract :
This paper investigates the existence, uniqueness, and global exponential stability (GES) of the equilibrium point for a large class of neural networks with globally Lipschitz continuous activations, including the widely used sigmoidal activations and the piecewise linear activations. The sufficient condition provided for GES is mild, and some conditions that are easily verified in practice are also presented. The GES of neural networks with locally Lipschitz continuous activations is also obtained under an appropriate condition. The analysis results given in the paper substantially extend the existing relevant stability results in the literature, and therefore significantly expand the range of neural-network applications to optimization problems. As a demonstration, we apply the obtained analysis results to the design of a recurrent neural network (RNN) for solving the linear variational inequality problem (VIP) defined on any nonempty closed box set, which includes box-constrained quadratic programming and the linear complementarity problem as special cases. It can be inferred that the linear VIP has a unique solution for the class of Lyapunov diagonally stable matrices, and that the synthesized RNN converges globally exponentially to the unique solution. Some illustrative simulation examples are also given.
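To illustrate the kind of network the abstract describes, below is a minimal sketch of a projection-type RNN for the linear VIP on a box, integrated by forward Euler. The specific dynamics dx/dt = -x + P(x - alpha*(Mx + q)), the gain `alpha`, and the step size are illustrative assumptions, not necessarily the exact model synthesized in the paper; the box projection P plays the role of the piecewise linear activation.

```python
import numpy as np

def box_projection(x, lo, hi):
    """Componentwise projection onto the box [lo, hi] (a piecewise linear activation)."""
    return np.clip(x, lo, hi)

def solve_linear_vip(M, q, lo, hi, alpha=0.1, step=0.05, iters=20000, tol=1e-10):
    """Sketch of a projection RNN for the linear VIP:
    find x* in the box with (x - x*)^T (M x* + q) >= 0 for all x in the box.
    Dynamics (assumed form): dx/dt = -x + P(x - alpha*(M x + q)),
    integrated here by forward Euler until a fixed point is reached.
    """
    x = np.zeros_like(q, dtype=float)
    for _ in range(iters):
        x_new = x + step * (-x + box_projection(x - alpha * (M @ x + q), lo, hi))
        if np.max(np.abs(x_new - x)) < tol:
            return x_new
        x = x_new
    return x

# Example: box-constrained QP min 0.5 x^T M x + q^T x over [0,1]^2,
# a special case of the linear VIP. A symmetric positive definite M is
# Lyapunov diagonally stable, so a unique solution exists.
M = np.array([[2.0, 0.5], [0.5, 1.0]])
q = np.array([-1.0, -1.0])
x_star = solve_linear_vip(M, q, lo=0.0, hi=1.0)
```

Here the solution lies in the interior of the box, so the fixed point satisfies M x* + q = 0, i.e. x* = (2/7, 6/7); with an active box constraint the iteration would instead settle on the corresponding face.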
Keywords :
Lyapunov matrix equations; asymptotic stability; linear programming; neural nets; piecewise linear techniques; quadratic programming; variational techniques; GES; Lyapunov diagonally stable matrices; box constrained quadratic programming; equilibrium point; global exponential stability; globally Lipschitz continuous activations; linear VIP; linear complementarity problem; linear variational inequality problem; locally Lipschitz continuous activations; nonempty closed box set; optimization problems; piecewise linear activations; recurrent neural network design; sigmoidal activations; stability results; synthesized RNN; Associative memory; Asymptotic stability; Linear matrix inequalities; Network synthesis; Neural networks; Piecewise linear techniques; Quadratic programming; Recurrent neural networks; Stability analysis; Sufficient conditions;
Journal_Title :
Neural Networks, IEEE Transactions on