Abstract:
The paper introduces a general class of neural networks in which the neuron activations are modeled by discontinuous functions. The neural networks have an additive interconnecting structure and include, as particular cases, the Hopfield neural networks (HNNs) and the standard cellular neural networks (CNNs) in the limiting situation where the HNN and CNN neurons have infinite gain. Conditions are derived that ensure the existence of a unique equilibrium point and a unique output equilibrium point, which are globally attractive for the state and output trajectories of the neural network, respectively. These conditions, which apply to general nonsymmetric neural networks, are based on the concept of Lyapunov diagonally stable neuron interconnection matrices, and can be viewed as a generalization to the discontinuous case of previous results established for neural networks with smooth neuron activations. Moreover, by suitably exploiting the presence of sliding modes, entirely new conditions are obtained that ensure global convergence in finite time, where the convergence time can be easily estimated from the relevant neural-network parameters. The analysis employs results from the theory of differential equations with discontinuous right-hand side as introduced by Filippov. In particular, global convergence is addressed via a Lyapunov-like approach based on the concept of monotone trajectories of a differential inclusion.
Keywords:
Hopfield neural networks (HNNs); cellular neural networks (CNNs); neural networks; nonsymmetric neural networks; additive interconnecting structure; interconnected systems; discontinuous neuron activations; differential equations with discontinuous right-hand side; differential inclusions; Filippov solutions; generalized gradient; Lyapunov matrix equations; Lyapunov diagonally stable neuron interconnection matrices; global attractivity; global convergence; convergence time; monotone trajectories; sliding modes; state trajectories; output trajectories; unique equilibrium point; unique output equilibrium point; diodes; eigenvalues and eigenfunctions; kernel; neurons; symmetric matrices