Title :
A bifurcation theory approach to vector field programming for periodic attractors
Author_Institution :
Dept. of Biophysics, University of California, Berkeley, CA, USA
Abstract :
Analytic methods of bifurcation theory are used to design algorithms for determining synaptic weights in recurrent analog neural network architectures. Nonnumerical formulas, as well as numerical learning algorithms that use hidden units, are introduced for the storage of static and periodic attractors. These algorithms allow programming of the network vector field independently of the patterns to be stored. Stability of patterns and cycles, basin geometry, and rates of convergence may be controlled. For a network of n nodes, n static or n/2 periodic attractors can be stored by a projection algorithm. For orthogonal patterns, this learning operation reduces to a kind of periodic outer product rule that allows local, additive, and commutative incremental learning. Standing- or traveling-wave cycles may be stored to mimic the kind of oscillating spatial patterns that appear in the neural activity of the olfactory bulb and prepyriform cortex during inspiration, and which, in the bulb, suffice to predict the pattern recognition behavior of rabbits in classical conditioning experiments. These attractors arise, during simulated inspiration, through a multiple Hopf bifurcation, which can act as a critical decision point for their selection by very small input patterns.
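The sketch below illustrates only the linear eigenstructure-programming idea behind the projection rule described in the abstract: each stored cycle is assigned a complex-conjugate eigenvalue pair alpha +/- i*omega of the weight matrix, and for orthonormal pattern pairs the rule reduces to a sum of "periodic outer products." The random pattern pairs, the shared rate alpha and frequency omega, and the omission of the stabilizing nonlinear (normal form) terms that make the cycles attracting are all simplifying assumptions, not the paper's full algorithm.

```python
# Minimal sketch of projecting desired oscillation planes onto the eigenstructure
# of a weight matrix W (linear part only; nonlinear stabilizing terms omitted).
import numpy as np

rng = np.random.default_rng(0)
n = 6                      # network size; stores up to n/2 = 3 periodic attractors
alpha, omega = 0.5, 2.0    # assumed growth rate and angular frequency per cycle

# Orthonormal pattern pairs (x_k, y_k) spanning the plane of each oscillation.
P, _ = np.linalg.qr(rng.standard_normal((n, n)))

# Block-diagonal eigenvalue matrix: one 2x2 block per stored cycle.
D = np.zeros((n, n))
for k in range(n // 2):
    D[2*k:2*k+2, 2*k:2*k+2] = [[alpha, -omega], [omega, alpha]]

# Projection rule: W = P D P^{-1}.  For orthonormal patterns P^{-1} = P^T, so
# W reduces to the outer-product form
#   W = sum_k  alpha (x_k x_k^T + y_k y_k^T) + omega (y_k x_k^T - x_k y_k^T)
W = P @ D @ P.T

W_outer = np.zeros((n, n))
for k in range(n // 2):
    x, y = P[:, 2*k], P[:, 2*k + 1]
    W_outer += alpha * (np.outer(x, x) + np.outer(y, y)) \
             + omega * (np.outer(y, x) - np.outer(x, y))
assert np.allclose(W, W_outer)

# Eigenvalues come in the programmed conjugate pairs alpha +/- i*omega.
print(np.sort_complex(np.linalg.eigvals(W)))
```

In the full method, additional competitive nonlinear terms would bound these linearly unstable modes so that each programmed eigenplane carries a stable limit cycle; the sketch stops at the linear weight construction.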
Keywords :
eigenvalues and eigenfunctions; learning systems; matrix algebra; neural nets; basin geometry; bifurcation theory; conditioning experiments; hidden units; inspiration; multiple Hopf bifurcation; neural activity; numerical learning algorithms; olfactory bulb; orthogonal patterns; oscillating spatial patterns; pattern recognition behavior; periodic attractors; prepyriform cortex; projection algorithm; rabbits; rates of convergence; recurrent analog neural network architectures; standing-wave cycles; static attractors; synaptic weights; traveling-wave cycles; vector field programming
Conference_Titel :
International Joint Conference on Neural Networks (IJCNN), 1989
Conference_Location :
Washington, DC, USA
DOI :
10.1109/IJCNN.1989.118612