Author_Institution :
Dept. of Electr. Eng., Santa Clara Univ., CA, USA
Abstract :
Stability of the Hopfield neural network $\mathcal{N}$: $\dot{x}_i = -x_i + \sum_{j=1}^{n} e_{ij} w_{ij}\, g_j(x_j) + I_i$, $y_i = g_i(x_i)$, $i = 1, 2, \ldots, n$, with a learning rule $\mathcal{L}$: $\dot{e}_{ij} = \mu\, h_{ij}(e, y, y^*)$, $i, j = 1, 2, \ldots, n$, is analyzed using the concept of an equilibrium manifold. We show that this concept provides an ideal setting for the formulation of a learning rule $\mathcal{L}$ that adaptively teaches the network $\mathcal{N}$ to acquire $y^*$ as one of its asymptotically stable equilibria. Connective stability of a moving equilibrium in the composite two-time-scale system $\mathcal{N}\&\mathcal{L}$ is established for bounded interconnection weights $e_{ij} w_{ij}$ and a sufficiently small learning rate $\mu$. The conditions for connective stability, which are derived using M-matrices and the concept of vector Lyapunov functions, are suitable for studying the design trade-offs between the bounds on the nominal weights $w_{ij}$, the shape of the sigmoid functions $g_i(x_i)$, the learning rate $\mu$, and the size of the stability region containing $y^*$.
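The two-time-scale structure of the coupled system $\mathcal{N}\&\mathcal{L}$ can be illustrated with a minimal numerical sketch: the fast neuron states $x_i$ and the slow adaptive gains $e_{ij}$ are integrated together by forward Euler, with the learning rate $\mu$ scaling the slow dynamics. The abstract leaves the rule $h_{ij}(e, y, y^*)$ general, so the error-driven choice $h_{ij} = (y_i^* - y_i)\,y_j$ below is a hypothetical stand-in for illustration only, as are all parameter values.

```python
import numpy as np

def simulate(n=3, steps=5000, dt=0.01, mu=0.05, seed=0):
    """Euler integration of the Hopfield network N with adaptive
    weight gains e_ij evolving under a slow learning rule L.
    The specific h_ij used here is a hypothetical example; the
    paper's analysis holds for a general h_ij(e, y, y*)."""
    rng = np.random.default_rng(seed)
    w = rng.standard_normal((n, n))           # nominal weights w_ij (fixed)
    e = np.ones((n, n))                       # adaptive gains e_ij
    x = rng.standard_normal(n)                # neuron states x_i
    I = np.zeros(n)                           # external inputs I_i
    y_star = np.tanh(rng.standard_normal(n))  # target output y*
    g = np.tanh                               # sigmoid g_i

    for _ in range(steps):
        y = g(x)
        # Fast subsystem N:  x_dot_i = -x_i + sum_j e_ij w_ij g_j(x_j) + I_i
        x_dot = -x + (e * w) @ y + I
        # Slow subsystem L (rate mu):  e_dot_ij = mu * h_ij(e, y, y*)
        e_dot = mu * np.outer(y_star - y, y)  # hypothetical h_ij
        x = x + dt * x_dot
        e = e + dt * e_dot
    return g(x), y_star
```

Because $g_i = \tanh$ is bounded and the $-x_i$ term is contractive, the trajectories in this sketch remain bounded; whether the output actually settles near $y^*$ depends on the weights, the rule $h_{ij}$, and $\mu$ being small enough, which is precisely the trade-off the connective-stability conditions quantify.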
Keywords :
Hopfield neural nets; Lyapunov methods; learning (artificial intelligence); matrix algebra; stability; Hopfield neural network; M-matrices; asymptotically stable equilibria; bounded interconnection weights; connective stability; dynamic neural networks; equilibrium manifold; learning rate; learning scheme; recurrent neural networks; sigmoid function; two-time-scale system; vector Lyapunov functions; Interconnected systems; Manifolds; Neural networks; Neurons; Servomechanisms; Stability analysis; Stability criteria; Symmetric matrices