DocumentCode :
2707092
Title :
Improving the separability of a reservoir facilitates learning transfer
Author :
Norton, David ; Ventura, Dan
Author_Institution :
Comput. Sci. Dept., Brigham Young Univ., Provo, UT, USA
fYear :
2009
fDate :
14-19 June 2009
Firstpage :
2288
Lastpage :
2293
Abstract :
We use a type of reservoir computing called the liquid state machine (LSM) to explore learning transfer. The LSM is a neural network model that uses a reservoir of recurrent spiking neurons as a filter for a readout function. We develop a method of training the reservoir, or liquid, that is not driven by residual error. Instead, the liquid is evaluated on its ability to separate different classes of input into different spatial patterns of neural activity. Using this method, we train liquids on two qualitatively different types of artificial problems. The resulting liquids are shown to substantially improve performance on either problem regardless of which problem was used to train the liquid, thus demonstrating a significant level of learning transfer.
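The abstract describes scoring the liquid by how well it maps different input classes to distinct spatial patterns of neural activity, rather than by residual error. The sketch below is a minimal, hypothetical illustration of one such separability score: it treats each liquid state as a vector of per-neuron spike counts and compares the distance between class centroids to the spread within each class. The function name `separation` and the exact ratio used here are assumptions for illustration, not the metric defined in the paper.

```python
import numpy as np

def separation(states, labels):
    """Rough class-separability score for a set of liquid state vectors.

    states: (n_samples, n_neurons) array of liquid states, e.g. per-neuron
            spike counts sampled over a time window.
    labels: (n_samples,) array of class labels.

    Returns the mean inter-class centroid distance divided by the mean
    intra-class spread; larger values indicate the liquid pushes classes
    into more distinct activity patterns.
    """
    classes = np.unique(labels)
    centroids = np.array([states[labels == c].mean(axis=0) for c in classes])

    # Mean pairwise distance between class centroids (inter-class separation).
    inter, pairs = 0.0, 0
    for i in range(len(classes)):
        for j in range(i + 1, len(classes)):
            inter += np.linalg.norm(centroids[i] - centroids[j])
            pairs += 1
    inter /= max(pairs, 1)

    # Mean distance of samples to their own class centroid (intra-class spread).
    intra = np.mean([
        np.linalg.norm(states[labels == c] - centroids[k], axis=1).mean()
        for k, c in enumerate(classes)
    ])

    return inter / (1.0 + intra)
```

Under this kind of objective, candidate liquids (or perturbations of a liquid's synaptic parameters) can be compared and selected by their separability score on labeled inputs, with no gradient of a task-specific error required.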
Keywords :
learning (artificial intelligence); natural sciences computing; neural nets; artificial problems; liquid state machine; neural activity; neural network model; reservoir facilitates learning transfer; residual error; Computer networks; Filters; Kernel; Liquids; Machine learning; Neural networks; Neurons; Recurrent neural networks; Reservoirs; Support vector machines;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Neural Networks, 2009. IJCNN 2009. International Joint Conference on
Conference_Location :
Atlanta, GA
ISSN :
1098-7576
Print_ISBN :
978-1-4244-3548-7
Electronic_ISBN :
1098-7576
Type :
conf
DOI :
10.1109/IJCNN.2009.5178656
Filename :
5178656