DocumentCode :
445981
Title :
Evolutionary supervision of a dynamical neural network allows learning with on-going weights
Author :
Meunier, David ; Paugam-Moisy, Hélène
Author_Institution :
Inst. des Sci. Cognitives, UMR CNRS, Bron, France
Volume :
3
fYear :
2005
fDate :
31 July-4 Aug. 2005
Firstpage :
1493
Abstract :
Recent electrophysiological data show that synaptic weights are strongly influenced by the electrical activity displayed by neurons. Weights are not stable, as assumed in classical neural network models. What is the nature of engrams, if they are not stored in synaptic weights? Adopting the theory of dynamical systems, which allows an implicit form of memory, we propose a new framework for learning in which synaptic weights are continuously adapted. Evolutionary computation is applied to a population of dynamical neural networks evolving in a prey-predator environment. Each individual develops complex dynamic patterns of neuronal activity, underlain by multiple recurrent connections. We show that this method allows learning capability to emerge across generations as a by-product of evolution, since the behavioural performance on which selection is based does not explicitly depend on this property.
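To give a concrete picture of the kind of setup the abstract describes, the sketch below is a minimal, hypothetical illustration and not the authors' code: a population of small recurrent networks whose weights keep drifting during each individual's lifetime under a simple Hebbian-style rule, while evolutionary selection acts only on behavioural fitness in a toy prey-pursuit task. The network size, plasticity rule, mutation scheme, and task details are all assumptions made for illustration.

```python
# Hypothetical sketch: evolution selects on behaviour while weights change
# continuously ("on-going weights") during each individual's lifetime.
import numpy as np

rng = np.random.default_rng(0)
N_NEURONS, POP_SIZE, LIFETIME, GENERATIONS = 20, 30, 200, 50
ETA = 0.01  # assumed lifetime (ongoing) plasticity rate

def lifetime_fitness(genome):
    """Run one individual: weights start from the genome but drift with activity."""
    w = genome.copy()
    x = np.zeros(N_NEURONS)                      # neuron activations
    prey, agent = rng.uniform(-1, 1, 2), np.zeros(2)
    fitness = 0.0
    for _ in range(LIFETIME):
        sensory = np.zeros(N_NEURONS)
        sensory[:2] = prey - agent               # crude sensory input: vector to prey
        x = np.tanh(w @ x + sensory)             # recurrent network update
        agent += 0.1 * x[-2:]                    # last two neurons drive movement
        fitness -= np.linalg.norm(prey - agent)  # reward closing the distance
        w += ETA * np.outer(x, x)                # Hebbian drift: weights never frozen
        np.clip(w, -2.0, 2.0, out=w)
    return fitness

def evolve():
    pop = [rng.normal(0, 0.5, (N_NEURONS, N_NEURONS)) for _ in range(POP_SIZE)]
    for gen in range(GENERATIONS):
        scores = np.array([lifetime_fitness(g) for g in pop])
        elite = [pop[i] for i in np.argsort(scores)[-POP_SIZE // 2:]]
        # Next generation: mutated copies of the elite. Only the initial weights
        # are inherited; lifetime weight changes are not written back.
        pop = [e + rng.normal(0, 0.05, e.shape) for e in elite for _ in range(2)]
        print(f"gen {gen:3d}  best fitness {scores.max():.2f}")

if __name__ == "__main__":
    evolve()
```

Note that fitness here rewards only pursuit behaviour, not learning itself; any learning capability that appears would do so as a by-product of selection, which is the point the abstract makes.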
Keywords :
evolutionary computation; learning (artificial intelligence); neural nets; adapted synaptic weight; complex dynamic pattern; dynamical neural network model; dynamical system theory; evolutionary computation; learning framework; multiple recurrent connection; prey-predator environment; Biological neural networks; Brain modeling; Computer networks; Electronic mail; Electrophysiology; Evolution (biology); Evolutionary computation; Network topology; Neural networks; Neurons;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Proceedings of the 2005 IEEE International Joint Conference on Neural Networks (IJCNN '05)
Print_ISBN :
0-7803-9048-2
Type :
conf
DOI :
10.1109/IJCNN.2005.1556097
Filename :
1556097