Title :
Augmented Echo State Networks with a feature layer and a nonlinear readout
Author :
Rachez, Arnaud ; Hagiwara, Masafumi
Author_Institution :
Dept. of Inf. & Comput. Sci., Keio Univ., Yokohama, Japan
Abstract :
Echo State Networks (ESNs) are an alternative to fully trained Recurrent Neural Networks (RNNs), showing state-of-the-art performance when applied to time series prediction. However, they have seldom been applied to abstract tasks, and in the case of language modeling they require far more units than traditional RNNs to achieve similar performance. In this paper we propose a novel architecture that extends a conventional Echo State Network with a pre-recurrent feature layer and a nonlinear readout. The features are learned in a supervised way using a computationally cheap version of gradient descent and automatically capture grammatical similarity between words. They modify the dynamics of the network in a way that allows it to significantly outperform an ESN alone. The addition of a nonlinear readout is also investigated, making the overall system similar to a feedforward network with a memory layer.
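The abstract only outlines the architecture, so the following is a minimal sketch of the general idea, not the authors' implementation: a one-hot word is mapped through a learned feature layer, fed into a fixed random reservoir, and decoded by a nonlinear readout. All dimensions, the softmax readout, and the readout-only gradient step are illustrative assumptions.

```python
# Sketch of an ESN augmented with a pre-recurrent feature layer and a
# nonlinear readout (illustrative assumptions, not the paper's code).
import numpy as np

rng = np.random.default_rng(0)

V, F, R = 50, 20, 200                      # vocabulary, feature, reservoir sizes (assumed)
W_feat = rng.normal(0, 0.1, (F, V))        # learned feature layer (word features)
W_in   = rng.uniform(-0.5, 0.5, (R, F))    # fixed input weights
W_res  = rng.normal(0, 1.0, (R, R))
W_res *= 0.9 / max(abs(np.linalg.eigvals(W_res)))   # scale spectral radius below 1
W_out  = rng.normal(0, 0.1, (V, R))        # readout weights

def step(state, word_id):
    """One time step: one-hot word -> feature layer -> reservoir -> nonlinear readout."""
    x = np.zeros(V)
    x[word_id] = 1.0
    f = W_feat @ x                          # pre-recurrent feature layer
    state = np.tanh(W_in @ f + W_res @ state)
    logits = W_out @ state                  # nonlinear (softmax) readout over the vocabulary
    p = np.exp(logits - logits.max())
    return state, p / p.sum()

# Toy usage: next-symbol prediction on a random sequence, with a cheap
# gradient step on the readout only (the paper also learns the features).
state = np.zeros(R)
seq = rng.integers(0, V, size=10)
for t in range(len(seq) - 1):
    state, p = step(state, seq[t])
    err = p.copy()
    err[seq[t + 1]] -= 1.0                  # cross-entropy gradient at the readout
    W_out -= 0.1 * np.outer(err, state)
```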
Keywords :
recurrent neural nets; statistical analysis; ESN; RNN; augmented echo state networks; feature layer; feed forward network; gradient descent; language modeling; memory layer; nonlinear readout; prerecurrent feature layer; recurrent neural networks; statistical language models; time series prediction; Accuracy; Grammar; Reservoirs; Testing; Training; Vectors; Vocabulary;
Conference_Title :
The 2012 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
Brisbane, QLD
Print_ISBN :
978-1-4673-1488-6
Electronic_ISSN :
2161-4393
DOI :
10.1109/IJCNN.2012.6252505