Title :
Quantifying the Effect of Learning on Recurrent Spiking Neurons
Author_Institution :
Dept. of Comput. Sci. & Software Eng., Concordia Univ., Montreal, QC
Abstract :
This work is concerned with measuring the response of recurrent spiking neurons when a learning rule is applied to them, in a liquid state machine context. Two indicators are considered for monitoring the effect of learning on-line: the separation property, which has been studied in previous works, and an incremental version of the statistical complexity measure, introduced expressly for this purpose. It is found that while separation increases, a neuron's average statistical complexity decreases when a learning rule is applied. This means that neurons become more predictable and their behavior is simplified as an effect of learning. A key contribution of this work is a quantification of this phenomenon.
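For illustration, a minimal sketch of how the separation property of a liquid can be monitored, assuming the commonly used centroid-based definition (pairwise distances between class centroids of the liquid's state vectors); the function and array names are hypothetical and not taken from the paper itself:

```python
import numpy as np

def separation(states_by_class):
    """Centroid-based separation of liquid states.

    states_by_class: list of 2-D arrays, one per input class,
    each of shape (num_samples, num_liquid_units).
    Returns the mean pairwise distance between class centroids.
    """
    centroids = [cls_states.mean(axis=0) for cls_states in states_by_class]
    n = len(centroids)
    total = 0.0
    for i in range(n):
        for j in range(n):
            total += np.linalg.norm(centroids[i] - centroids[j])
    return total / (n * n)

# Toy usage: two artificial input classes driving a 50-unit liquid.
rng = np.random.default_rng(0)
class_a = rng.normal(0.0, 1.0, size=(20, 50))
class_b = rng.normal(0.5, 1.0, size=(20, 50))
print(separation([class_a, class_b]))
```

Tracking this quantity over training epochs is one way to observe the increase in separation that the abstract reports alongside the decrease in statistical complexity.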
Keywords :
learning (artificial intelligence); recurrent neural nets; statistical analysis; incremental version; liquid state machine context; machine learning; recurrent spiking neuron; statistical complexity measure; Neural networks; Neurons;
Conference_Titel :
2007 International Joint Conference on Neural Networks (IJCNN 2007)
Conference_Location :
Orlando, FL
Print_ISBN :
978-1-4244-1379-9
Electronic_ISSN :
1098-7576
DOI :
10.1109/IJCNN.2007.4371009