DocumentCode :
2366683
Title :
Unsupervised Hebbian learning by recurrent multilayer neural networks for temporal hierarchical pattern recognition
Author :
Lo, James Ting-Ho
Author_Institution :
Dept. of Math. & Stat., Univ. of Maryland Baltimore County, Baltimore, MD, USA
fYear :
2010
fDate :
17-19 March 2010
Firstpage :
1
Lastpage :
6
Abstract :
Recurrent multilayer network structures and Hebbian learning are two essential features of biological neural networks. An artificial recurrent multilayer neural network that performs supervised Hebbian learning, called probabilistic associative memory (PAM), was recently proposed. PAM is a recurrent multilayer network of processing units (PUs), each comprising a group of novel artificial neurons that generate spike trains. PUs are detectors and recognizers of the feature subvectors appearing in their receptive fields. In supervised learning by a PU, the label of the feature subvector is provided from outside PAM. Since a feature subvector may be shared by many causes and may contain parts from many causes, its label is sometimes difficult or costly to obtain, especially if there are many hidden layers and feedback connections. This paper presents an unsupervised learning scheme, which is Hebbian in the following sense: the strength of a synapse increases if the outputs of the presynaptic and postsynaptic neurons are identical and decreases otherwise. This unsupervised Hebbian learning capability makes PAM a good functional model of neuronal networks as well as a good learning machine for temporal hierarchical pattern recognition.
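The abstract states the unsupervised rule only verbally: a synaptic strength increases when the presynaptic and postsynaptic outputs are identical and decreases otherwise. The sketch below illustrates that agreement/disagreement update for binary spike outputs; the 0/1 spike coding, the matrix weight layout, and the learning rate are illustrative assumptions, not details taken from the paper.

```python
import numpy as np

def hebbian_update(weights, pre_spikes, post_spikes, lr=0.05):
    """Unsupervised Hebbian-style update as described in the abstract:
    a weight grows when the presynaptic and postsynaptic outputs agree
    and shrinks otherwise.

    weights:     array of shape (n_post, n_pre), synaptic strengths
    pre_spikes:  binary 0/1 vector of presynaptic outputs (assumed coding)
    post_spikes: binary 0/1 vector of postsynaptic outputs (assumed coding)
    lr:          illustrative learning rate, not specified in the paper
    """
    # agreement[i, j] = +1 if postsynaptic neuron i and presynaptic neuron j
    # emit the same value at this time step, -1 otherwise
    agreement = np.where(post_spikes[:, None] == pre_spikes[None, :], 1.0, -1.0)
    return weights + lr * agreement

# Tiny usage example with made-up spike vectors for one time step
rng = np.random.default_rng(0)
W = rng.normal(scale=0.1, size=(3, 4))   # 3 postsynaptic, 4 presynaptic neurons
pre = np.array([1, 0, 1, 1])
post = np.array([1, 1, 0])
W = hebbian_update(W, pre, post)
```

Applied over a spike train, repeated updates of this form strengthen synapses between neurons whose outputs tend to coincide, which is the sense in which the paper's scheme is Hebbian; PAM's actual PU-level mechanics (orthogonal expansions, probability estimates) are not modeled here.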
Keywords :
Hebbian learning; pattern classification; recurrent neural nets; unsupervised learning; artificial neurons; biological neural networks; feature subvectors; learning machine; probabilistic associative memory; processing unit; recurrent multilayer neural networks; spike trains; supervised learning; temporal hierarchical pattern recognition; unsupervised Hebbian learning; Artificial neural networks; Associative memory; Biological neural networks; Detectors; Hebbian theory; Multi-layer neural network; Neural networks; Neurons; Pattern recognition; Recurrent neural networks; Hebbian learning; learning machine; maximal generalization; multilayer neural network; orthogonal expansion; probability distribution; recurrent neural network; spike trains; unsupervised learning;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
2010 44th Annual Conference on Information Sciences and Systems (CISS)
Conference_Location :
Princeton, NJ, USA
Print_ISBN :
978-1-4244-7416-5
Electronic_ISBN :
978-1-4244-7417-2
Type :
conf
DOI :
10.1109/CISS.2010.5464925
Filename :
5464925