Title :
Computational Implications of Lognormally Distributed Synaptic Weights
Author :
Teramae, Jun-nosuke ; Fukai, Tomoki
Author_Institution :
Grad. Sch. of Inf. Sci. & Technol., Osaka Univ., Suita, Japan
Abstract :
The connectivity structure of neural networks has significant implications for neural information processing, and much experimental effort has been made to clarify the structure of neural networks in the brain, i.e., both the graph structure and the weight structure of synaptic connections. A traditional view of neural information processing suggests that neurons compute in a highly parallel and distributed manner, in which the cooperation of many weak synaptic inputs is necessary to activate a single neuron. Recent experiments, however, have shown that not all synapses in cortical circuits are weak; some are extremely strong (several tens of times larger than the average weight). In fact, the weights of excitatory synapses between cortical excitatory neurons often obey a lognormal distribution with a long tail of strong synapses. Here, we review some of our recent work on computation with sparsely distributed synaptic weights and discuss the possible implications of this synaptic principle for neural computation by spiking neurons. We demonstrate that internal noise emerges from long-tailed distributions of synaptic weights to produce a stochastic resonance effect in the reverberating synaptic pathways constituted by strong synapses. We show a spike-timing-dependent plasticity rule and other mechanisms that produce such weight distributions. A possible hardware realization of lognormally connected networks is also shown.
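The long-tailed weight distribution described above can be illustrated with a minimal numerical sketch. The snippet below samples synaptic weights from a lognormal distribution and checks that a small fraction of synapses is many times stronger than the average, as the abstract reports for cortical excitatory synapses. The parameters `mu` and `sigma` are assumed values chosen only to exhibit the long tail; they are not fitted to experimental data.

```python
import numpy as np

# Hypothetical illustration: sample excitatory synaptic weights (e.g.,
# EPSP amplitudes in mV) from a lognormal distribution. The log-scale
# parameters mu and sigma are assumptions for demonstration only.
rng = np.random.default_rng(0)
mu, sigma = np.log(0.2), 1.0
weights = rng.lognormal(mean=mu, sigma=sigma, size=100_000)

mean_w = weights.mean()
print(f"mean weight: {mean_w:.3f}")
print(f"strongest weight: {weights.max():.2f} "
      f"(about {weights.max() / mean_w:.0f}x the mean)")

# Most synapses are weak, but a small fraction lies far out in the tail,
# several tens of times larger than the average weight.
frac_strong = (weights > 10 * mean_w).mean()
print(f"fraction of synapses stronger than 10x the mean: {frac_strong:.4f}")
```

With these assumed parameters, the strongest sampled synapse is several tens of times larger than the mean while only a tiny fraction of synapses exceeds ten times the mean, matching the qualitative picture of a lognormal long tail.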
Keywords :
bioelectric potentials; brain; log normal distribution; medical computing; neural nets; neurophysiology; stochastic processes; computational implications; connectivity structure; cortical circuits; cortical excitatory neurons; distributed computation; excitatory synapse weights; graph structure; hardware realization; internal noise; lognormal distribution; long-tailed distributions; neural computation; neural information processing; neural network structure; parallel computation; reverberating synaptic pathways; single neuron; sparsely distributed synaptic weights; spike-timing-dependent plasticity rule; spiking neurons; stochastic resonance effect; strong synapses; synaptic connection; synaptic principle; weak synaptic inputs; weight distribution; weight structure; Biological neural networks; Integrated circuit modeling; Memory; Neurons; Noise measurement; Principal component analysis; Stability analysis; Stochastic resonance; Stochastic systems; Associative memory (AM); feedforward networks; network connectivity; neural dynamics; neuromorphic engineering; principal component analysis (PCA); recurrent networks; sparse coding; spike sequence; spike-timing-dependent plasticity (STDP); stochastic resonance
Journal_Title :
Proceedings of the IEEE
DOI :
10.1109/JPROC.2014.2306254