DocumentCode :
2496923
Title :
Memory in reservoirs for high dimensional input
Author :
Hermans, Michiel ; Schrauwen, Benjamin
Author_Institution :
Ghent Univ., Ghent, Belgium
fYear :
2010
fDate :
18-23 July 2010
Firstpage :
1
Lastpage :
7
Abstract :
Reservoir Computing (RC) is a recently introduced scheme for employing recurrent neural networks while circumventing the difficulties that typically arise when training the recurrent weights. The 'reservoir' is a fixed, randomly initialized recurrent network which receives input via a random mapping. Only an instantaneous linear mapping from the network to the output is trained, which can be done with linear regression. In this paper we study the dynamical properties of reservoirs receiving a large number of inputs. More specifically, we investigate how the internal state of the network retains fading memory of its input signal. Memory properties of random recurrent networks have been thoroughly examined in past research, but only for one-dimensional input. Here we take into account statistics that typically occur in high-dimensional signals. We find useful empirical data expressing how memory in recurrent networks is distributed over the individual principal components of the input.
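The abstract's description of the RC scheme (fixed random recurrent network, random input mapping, only a linear readout trained by regression) can be illustrated with a minimal sketch. All sizes, scalings, and the delayed-recall task below are hypothetical choices for illustration, not the authors' experimental setup:

```python
# Minimal reservoir-computing sketch (illustrative parameters, not the
# paper's setup): a fixed random recurrent network driven by a
# high-dimensional input, with only a linear readout trained by
# least-squares regression.
import numpy as np

rng = np.random.default_rng(0)

n_in, n_res, n_steps = 20, 100, 500   # input dim, reservoir size, signal length

# Fixed random input and recurrent weights (never trained).
W_in = rng.normal(0.0, 0.1, size=(n_res, n_in))
W = rng.normal(size=(n_res, n_res))
W *= 0.9 / max(abs(np.linalg.eigvals(W)))  # scale spectral radius below 1

u = rng.normal(size=(n_steps, n_in))       # high-dimensional input signal

# Run the reservoir: x_{t+1} = tanh(W x_t + W_in u_t)
x = np.zeros(n_res)
states = np.empty((n_steps, n_res))
for t in range(n_steps):
    x = np.tanh(W @ x + W_in @ u[t])
    states[t] = x

# Train only the instantaneous linear readout, here to reproduce one
# input channel delayed by 5 steps -- a simple fading-memory task.
delay = 5
target = u[:-delay, 0]
X = states[delay:]
w_out, *_ = np.linalg.lstsq(X, target, rcond=None)
prediction = X @ w_out
```

The recurrent and input weights stay fixed throughout; the only trained quantity is `w_out`, obtained by a single least-squares solve, which is what makes the scheme cheap compared with training the recurrent weights directly.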
Keywords :
learning (artificial intelligence); recurrent neural nets; regression analysis; reservoirs; dynamical properties; linear regression; random mapping; recurrent neural networks; reservoir computing; Equations; Mathematical model; Memory management; Neurons; Noise; Recurrent neural networks; Reservoirs;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
The 2010 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
Barcelona
ISSN :
1098-7576
Print_ISBN :
978-1-4244-6916-1
Type :
conf
DOI :
10.1109/IJCNN.2010.5596884
Filename :
5596884