Title :
Feedback stability and unsupervised learning
Author_Institution :
Dept. of Electr. Eng., Univ. of Southern California, Los Angeles, CA, USA
Abstract :
Global stability is examined for nonlinear feedback dynamical systems subject to unsupervised learning. Only differentiable neural models are discussed. The unconditional stability of Hebbian learning systems is summarized in the adaptive bidirectional associative memory (ABAM) theorem. When no learning occurs, the resulting BAM models include Cohen-Grossberg autoassociators, Hopfield circuits, brain-state-in-a-box models, and masking field models. The ABAM theorem is extended to arbitrary higher-order Hebbian learning. Conditions for exponential convergence are discussed. Sufficient conditions for global stability are established for dynamical systems that adapt according to competitive and differential Hebbian learning laws.
Keywords :
content-addressable storage; feedback; learning systems; neural nets; nonlinear systems; stability; Cohen-Grossberg autoassociators; Hebbian learning systems; Hopfield circuits; adaptive bidirectional associative memory; brain-state-in-a-box models; differentiable neural models; exponential convergence; feedback dynamical systems; global stability; masking field models; unconditional stability; unsupervised learning; associative memories; neural networks;
Conference_Title :
IEEE International Conference on Neural Networks, 1988
Conference_Location :
San Diego, CA, USA
DOI :
10.1109/ICNN.1988.23842