DocumentCode :
3587749
Title :
A Hebbian/Anti-Hebbian network for online sparse dictionary learning derived from symmetric matrix factorization
Author :
Hu, Tao ; Pehlevan, Cengiz ; Chklovskii, Dmitri B.
Author_Institution :
Texas A&M Univ., College Station, TX, USA
fYear :
2014
Firstpage :
613
Lastpage :
619
Abstract :
Olshausen and Field (OF) proposed that neural computations in the primary visual cortex (V1) can be partially modelled by sparse dictionary learning. By minimizing the regularized representation error, they derived an online algorithm that learns Gabor-filter receptive fields from a natural image ensemble, in agreement with physiological experiments. Whereas the OF algorithm can be mapped onto the dynamics and synaptic plasticity of a single-layer neural network, the derived learning rule is nonlocal - the synaptic weight update depends on the activity of neurons other than just the pre- and postsynaptic ones - and hence biologically implausible. Here, to overcome this problem, we derive sparse dictionary learning from a novel cost function: a regularized error of the symmetric factorization of the input's similarity matrix. Our algorithm maps onto a neural network with the same architecture as OF's but uses only biologically plausible local learning rules. When trained on natural images, our network learns Gabor-filter receptive fields and reproduces the correlation among synaptic weights that is hard-wired into the OF network. Therefore, online symmetric matrix factorization may serve as an algorithmic theory of neural computation.
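To illustrate the general idea in the abstract, the following is a minimal Python/NumPy sketch of an online single-layer network with Hebbian feedforward and anti-Hebbian lateral plasticity, where each weight update depends only on its own pre- and postsynaptic activities. The soft-threshold activation, weight-decay term, learning rate, and iteration counts are assumptions for illustration; they are not taken from the paper and this is not the authors' exact algorithm.

import numpy as np

def soft_threshold(u, lam):
    # Elementwise soft-thresholding, a common sparsifying activation (assumed here).
    return np.sign(u) * np.maximum(np.abs(u) - lam, 0.0)

def online_hebbian_antihebbian(X, n_out, lam=0.1, eta=1e-2, n_dyn=50):
    # X: array of shape (n_samples, n_in); inputs are presented one at a time (online).
    n_in = X.shape[1]
    rng = np.random.default_rng(0)
    W = rng.normal(scale=1.0 / np.sqrt(n_in), size=(n_out, n_in))  # feedforward weights
    M = np.zeros((n_out, n_out))                                   # lateral (inhibitory) weights
    for x in X:
        # Recurrent dynamics: feedforward drive minus lateral inhibition,
        # iterated toward a sparse fixed point.
        y = np.zeros(n_out)
        for _ in range(n_dyn):
            y = soft_threshold(W @ x - M @ y, lam)
        # Local plasticity: Hebbian update for feedforward weights (with a
        # postsynaptic decay term), anti-Hebbian update for lateral weights.
        W += eta * (np.outer(y, x) - (y ** 2)[:, None] * W)
        M += eta * np.outer(y, y)
        np.fill_diagonal(M, 0.0)  # no self-inhibition
    return W, M

For example, calling online_hebbian_antihebbian(patches, n_out=100) on whitened natural image patches would be the kind of experiment the abstract describes, with W playing the role of the learned receptive fields.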
Keywords :
Gabor filters; Hebbian learning; image processing; matrix decomposition; neurophysiology; Gabor-filter receptive field learning; Hebbian network; OF algorithm; V1; algorithmic theory; anti-Hebbian network; biologically plausible local learning rules; cost function; input similarity matrix; natural image ensemble; neural computation; neural network architecture; neural network mapping; neuron activity; nonlocal learning rule; online algorithm; online sparse dictionary learning; online symmetric matrix factorization; physiological experiments; postsynaptic; presynaptic; primary visual cortex; regularized error; regularized representation error minimization; single-layer neural network; symmetric factorization; symmetric matrix factorization; synaptic plasticity; synaptic weight update; synaptic weights; Biological neural networks; Computational modeling; Cost function; Feedforward neural networks; Neurons; Sparse matrices; Symmetric matrices; matrix factorization; neuromorphic computing; neuron; online algorithm; sparse dictionary learning;
fLanguage :
English
Publisher :
ieee
Conference_Title :
2014 48th Asilomar Conference on Signals, Systems and Computers
Print_ISBN :
978-1-4799-8295-0
Type :
conf
DOI :
10.1109/ACSSC.2014.7094519
Filename :
7094519