Title :
Network capacity for latent attractor computation
Author :
Doboli, Simona ; Minai, Ali A.
Author_Institution :
Cincinnati Univ., OH, USA
Abstract :
We have previously (1999) proposed a paradigm called “latent attractors”, in which attractors embedded in a recurrent network via Hebbian learning are used to channel the network's response to external input rather than becoming manifest themselves. This allows the network to generate context-sensitive internal codes in complex situations. Latent attractors are particularly helpful in explaining computations within the hippocampus, a brain region of fundamental significance for memory and spatial learning. The performance of latent attractor networks depends on the number of such attractors that a network can sustain. Following methods developed for associative memory networks, we present analytical and computational results on the capacity of latent attractor networks.
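The latent attractor model itself is not reproduced in this record, but the standard ingredient the abstract builds on, embedding attractors in a recurrent network via the Hebbian outer-product rule and measuring how many such patterns the network can sustain, can be sketched in a Hopfield-style network. This is a minimal illustrative sketch, not the authors' method; the network size, pattern count, and noise level are assumptions chosen to stay well below the classical ~0.14N capacity limit.

```python
import numpy as np

def hebbian_weights(patterns):
    """Embed binary (+/-1) patterns as attractors via the Hebbian outer-product rule."""
    n = patterns.shape[1]
    w = patterns.T @ patterns / n   # sum of outer products, scaled by network size
    np.fill_diagonal(w, 0.0)        # no self-connections
    return w

def recall(w, state, steps=20):
    """Synchronously update the state until it settles into an attractor."""
    for _ in range(steps):
        new = np.sign(w @ state)
        new[new == 0] = 1           # break ties consistently
        if np.array_equal(new, state):
            break
        state = new
    return state

rng = np.random.default_rng(0)
n, p = 200, 10                      # 10 patterns in 200 units: far below ~0.14*n
patterns = rng.choice([-1, 1], size=(p, n))
w = hebbian_weights(patterns)

# Corrupt 10% of the bits of one stored pattern, then let the network relax.
noisy = patterns[0].copy()
flip = rng.choice(n, size=20, replace=False)
noisy[flip] *= -1
restored = recall(w, noisy)
overlap = np.mean(restored == patterns[0])
print(overlap)
```

Pushing `p` toward and past the capacity limit makes recall like this fail with increasing probability, which is the kind of loading-dependent breakdown the capacity analysis in the paper quantifies for latent (rather than manifest) attractors.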
Keywords :
Hebbian learning; content-addressable storage; recurrent neural nets; associative memory networks; context-sensitive internal codes; hippocampus; latent attractor; recurrent neural network; spatial learning; Adaptive systems; Associative memory; Computational modeling; Computer networks; Hebbian theory; Hippocampus; Laboratories; Metastasis; Nervous system; Neurons
Conference_Titel :
Proceedings of the IEEE-INNS-ENNS International Joint Conference on Neural Networks (IJCNN 2000)
Conference_Location :
Como, Italy
Print_ISBN :
0-7695-0619-4
DOI :
10.1109/IJCNN.2000.857840