Title of article :
Hebbian errors in learning: An analysis using the Oja model
Author/Authors :
Rădulescu, Anca and Cox, Kingsley and Adams, Paul
Issue Information :
Journal issue; serial year 2009
Pages :
13
From page :
489
To page :
501
Abstract :
Background: Recent work on long-term potentiation in brain slices shows that Hebb's rule is not completely synapse-specific, probably due to intersynapse diffusion of calcium or other factors. We previously suggested that such errors in Hebbian learning might be analogous to mutations in evolution. Methods and findings: We examine this proposal quantitatively, extending the classical Oja unsupervised model of learning by a single linear neuron to include Hebbian inspecificity. We introduce an error matrix E, which expresses possible crosstalk between updating at different connections. When there is no inspecificity, this gives the classical result of convergence to the first principal component of the input distribution (PC1). We show the modified algorithm converges to the leading eigenvector of the matrix EC, where C is the input covariance matrix. In the most biologically plausible case, when there are no intrinsically privileged connections, E has diagonal elements Q and off-diagonal elements (1 - Q)/(n - 1), where Q, the quality, is expected to decrease with the number of inputs n and with a synaptic parameter b that reflects synapse density, calcium diffusion, etc. We study the dependence of the learning accuracy on b, n and the amount of input activity or correlation (analytically and computationally). We find that accuracy decreases (learning becomes gradually less useful) with increases in b, particularly for intermediate (i.e., biologically realistic) correlation strength, although some useful learning always occurs up to the trivial limit Q = 1/n. Conclusions and significance: We discuss the relation of our results to Hebbian unsupervised learning in the brain. When the mechanism lacks specificity, the network fails to learn the expected, and typically most useful, result, especially when the input correlation is weak. Hebbian crosstalk would reflect the very high density of synapses along dendrites, and inevitably degrades learning.
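The crosstalk model described in the abstract can be illustrated with a short numerical sketch. The parameter values, the diagonal input covariance C, and the update form below are illustrative assumptions, not the paper's actual simulations; the sketch only demonstrates the stated claim that the Oja rule with error matrix E converges (in direction) to the leading eigenvector of EC rather than to PC1 of C.

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative parameters (assumed for this sketch, not from the paper)
n = 5        # number of inputs
Q = 0.9      # quality: fraction of each Hebbian update reaching the correct synapse
eta = 0.01   # learning rate
steps = 20000

# Error matrix E: diagonal elements Q, off-diagonal elements (1 - Q)/(n - 1)
E = np.full((n, n), (1.0 - Q) / (n - 1))
np.fill_diagonal(E, Q)

# An assumed input covariance C with distinct eigenvalues
C = np.diag([2.0, 1.0, 0.7, 0.5, 0.3])
Lchol = np.linalg.cholesky(C)

w = rng.normal(size=n)
w /= np.linalg.norm(w)

for _ in range(steps):
    x = Lchol @ rng.normal(size=n)   # zero-mean input sample with covariance C
    y = w @ x                        # linear neuron output
    # Oja update with crosstalk: the Hebbian term y*x is redistributed by E
    w += eta * (E @ (y * x) - (y ** 2) * w)

# The analysis predicts convergence to the leading eigenvector of EC
vals, vecs = np.linalg.eig(E @ C)
v1 = np.real(vecs[:, np.argmax(np.real(vals))])
alignment = abs(w @ v1) / (np.linalg.norm(w) * np.linalg.norm(v1))
print(f"alignment with leading eigenvector of EC: {alignment:.3f}")
```

Setting Q = 1 makes E the identity and recovers the classical Oja result (convergence to PC1 of C); lowering Q toward the trivial limit 1/n mixes the updates across synapses and tilts the learned direction away from PC1.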
Keywords :
synaptic plasticity , Information transfer , Neural computation , Learning algorithm , dynamical system
Journal title :
Journal of Theoretical Biology
Serial Year :
2009
Record number :
1539712