Title :
Structure-based neural network learning
Author :
Peterfreund, N. ; Guez, A.
Author_Institution :
Center for Eng. Syst. Adv. Res., Oak Ridge Nat. Lab., TN, USA
Date :
12/1/1997
Abstract :
We present a new learning algorithm for the structure of recurrent neural networks. It is shown that any m linearly independent n-dimensional vectors can be stored in an at most (n+m-2)-dimensional symmetric network, and a storage procedure which satisfies this bound is presented. We propose a new learning procedure for the domain of attraction which preserves both the equilibrium set and the stability properties of the original system. It is shown that previously learned attraction regions remain invariant under the proposed learning rule. Our emphasis throughout this brief is on the design of associative memories and classifiers.
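The abstract concerns storing patterns as stable equilibria of a symmetric recurrent network. As a hedged illustration only (the paper's structure-based procedure and its (n+m-2)-dimensional bound are not reproduced here), the following sketch shows the classical Hopfield-style outer-product rule, a simpler scheme that likewise stores vectors in a symmetric network and recalls them associatively:

```python
import numpy as np

# Illustrative sketch only: the classical outer-product (Hebbian) storage rule
# for a symmetric-weight associative memory. The paper's structure-based
# learning algorithm is different; this merely demonstrates the generic idea
# of symmetric-network associative recall.

def store(patterns):
    """Build a symmetric weight matrix from rows of +/-1 patterns."""
    n = patterns.shape[1]
    W = patterns.T @ patterns / n
    np.fill_diagonal(W, 0.0)  # zero self-connections; W remains symmetric
    return W

def recall(W, x, steps=10):
    """Iterate sign-activation updates until a fixed point (or step limit)."""
    for _ in range(steps):
        x_new = np.sign(W @ x)
        x_new[x_new == 0] = 1  # break ties toward +1
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x

# Two orthogonal 8-dimensional patterns (hypothetical example data).
patterns = np.array([[1, -1, 1, -1, 1, -1, 1, -1],
                     [1, 1, -1, -1, 1, 1, -1, -1]])
W = store(patterns)

noisy = patterns[0].copy()
noisy[0] = -noisy[0]  # corrupt one component
recovered = recall(W, noisy)  # converges back to patterns[0]
```

The stored patterns are fixed points of the update map, and small perturbations lie inside their domains of attraction, which is the property the paper's learning procedure is designed to shape and preserve.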
Keywords :
content-addressable storage; learning (artificial intelligence); nonlinear dynamical systems; recurrent neural nets; associative memories; attraction regions; domain of attraction; equilibrium set; linearly independent n-dimensional vectors; multidimensional symmetric network; recurrent neural networks; stability property; storage procedure; structure-based neural network learning; Associative memory; Control system synthesis; Learning systems; Network synthesis; Network topology; Neural networks; Nonlinear dynamical systems; Recurrent neural networks; Stability; Steady-state;
Journal_Title :
IEEE Transactions on Circuits and Systems I: Fundamental Theory and Applications