DocumentCode
508138
Title
Word Learning by an Extended BAM Network
Author
Chen, Qinghua ; Liu, Kai ; Fang, Fukang
Author_Institution
Dept. of Syst. Sci., Beijing Normal Univ., Beijing, China
Volume
1
fYear
2009
fDate
14-16 Aug. 2009
Firstpage
387
Lastpage
391
Abstract
Word learning has been a central issue in cognitive science for many years. Two main theories have been proposed to explain it, hypothesis elimination and associative learning, yet neither accounts satisfactorily for the established experimental findings. A Bayesian inference framework that integrates the advantages of both approaches was proposed recently and fits several important experiments much better, although its algorithm is rather complicated. Here we propose an extended BAM model that requires only simple computation yet is consistent with experimental data on how the brain learns a word's meaning from just one or a few positive examples, and responds appropriately to different numbers of samples as well as to samples drawn from different spans. This model may offer a new and promising approach for researchers studying word learning.
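The record does not include the model's equations; for orientation only, the sketch below shows a classical Kosko-style bidirectional associative memory (BAM), the base architecture the abstract says is extended. The Hebbian outer-product training and bipolar recall rule are standard textbook choices, not the authors' specific extension, and all function names and example patterns are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of a classical bidirectional associative memory (BAM).
# This illustrates only the base model; the paper's extension is not
# described in this record and is not reproduced here.

def train_bam(x_patterns, y_patterns):
    """Build the weight matrix as a Hebbian sum of outer products.

    x_patterns, y_patterns: bipolar (+1/-1) arrays of shape
    (n_pairs, n_x) and (n_pairs, n_y).
    """
    W = np.zeros((x_patterns.shape[1], y_patterns.shape[1]))
    for x, y in zip(x_patterns, y_patterns):
        W += np.outer(x, y)
    return W

def recall(W, x, steps=10):
    """Bidirectional recall: alternate x -> y and y -> x updates
    with a sign threshold until the pair stabilizes."""
    x = x.copy()
    for _ in range(steps):
        y = np.sign(x @ W)
        x_new = np.sign(W @ y)
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x, y

if __name__ == "__main__":
    # Hypothetical example: two word/meaning pattern pairs.
    X = np.array([[1, -1, 1, -1], [1, 1, -1, -1]])
    Y = np.array([[1, 1, -1], [-1, 1, 1]])
    W = train_bam(X, Y)
    x_rec, y_rec = recall(W, X[0])
    print(y_rec)  # recovers Y[0] for these well-separated patterns
```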
Keywords
belief networks; cognitive systems; learning (artificial intelligence); neural nets; BAM network; Bayesian inference framework; bidirectional associative memory; cognitive science; word learning; Bayesian methods; Cognitive science; Computer networks; Humans; Inference algorithms; Neurons; Neuroscience; bidirectional associative memory model; word learning
fLanguage
English
Publisher
IEEE
Conference_Title
Natural Computation, 2009. ICNC '09. Fifth International Conference on
Conference_Location
Tianjin
Print_ISBN
978-0-7695-3736-8
Type
conf
DOI
10.1109/ICNC.2009.373
Filename
5365627