DocumentCode :
2626210
Title :
Learning out of time series with an extended recurrent neural network
Author :
Schuster, Mike
Author_Institution :
ATR Interpreting Telecommun. Res. Labs., Kyoto, Japan
fYear :
1996
fDate :
4-6 Sep 1996
Firstpage :
170
Lastpage :
179
Abstract :
This paper presents an extension to the regular recurrent neural network, the extended recurrent neural network (ERNN). The ERNN can be trained without the limitation of using input information only up to a preset future frame: it is trained simultaneously in the positive and negative time directions, which in regression and classification experiments yields better results than merging the outputs of separate networks trained in each time direction alone. The network structure is designed so that it can be trained with any form of backpropagation through time. The structure and training procedure of the proposed network are explained. Results are reported for classification experiments with an ERNN trained as a classifier and for regression experiments with an ERNN trained to minimize the mean squared error on artificial data, and are compared with previous approaches that merge the outputs of regular RNNs. For real data, a classification experiment mapping speech feature vectors to phone classes is reported.
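The abstract's core idea, one network with a forward-time and a backward-time state path trained jointly so that each output frame can use the whole input sequence, can be sketched as below. This is a minimal illustrative reconstruction with a plain tanh recurrence; all layer sizes and variable names are assumptions for the example, not the paper's exact architecture.

```python
import numpy as np

rng = np.random.default_rng(0)
T, n_in, n_hid, n_out = 10, 4, 8, 3  # illustrative sizes, not from the paper

# One parameter set per time direction, plus a shared output layer.
W_f = rng.normal(0, 0.1, (n_hid, n_in))   # forward-time input weights
U_f = rng.normal(0, 0.1, (n_hid, n_hid))  # forward-time recurrent weights
W_b = rng.normal(0, 0.1, (n_hid, n_in))   # backward-time input weights
U_b = rng.normal(0, 0.1, (n_hid, n_hid))  # backward-time recurrent weights
V   = rng.normal(0, 0.1, (n_out, 2 * n_hid))  # output layer over both states

def forward(x):
    """x: (T, n_in) sequence -> (T, n_out) outputs.

    Each output frame t combines a state computed from x[0..t] (positive
    time direction) and a state computed from x[t..T-1] (negative time
    direction), so no preset future-frame limit is needed.
    """
    h_f = np.zeros((T, n_hid))
    h = np.zeros(n_hid)
    for t in range(T):                      # positive time direction
        h = np.tanh(W_f @ x[t] + U_f @ h)
        h_f[t] = h
    h_b = np.zeros((T, n_hid))
    h = np.zeros(n_hid)
    for t in reversed(range(T)):            # negative time direction
        h = np.tanh(W_b @ x[t] + U_b @ h)
        h_b[t] = h
    return np.concatenate([h_f, h_b], axis=1) @ V.T

y = forward(rng.normal(size=(T, n_in)))
print(y.shape)  # (10, 3)
```

Because both directions live in one network, the per-frame loss (mean squared error for regression, a classification loss for phone labeling) can be minimized jointly with backpropagation through time, rather than by merging the outputs of two separately trained unidirectional RNNs.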
Keywords :
backpropagation; recurrent neural nets; speech processing; time series; classification experiments; extended recurrent neural network; phone classes; regression; speech feature vectors; Backpropagation; Delay effects; Merging; Multi-layer neural network; Neural networks; Neurons; Parameter estimation; Recurrent neural networks; Speech; Training data;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Neural Networks for Signal Processing VI: Proceedings of the 1996 IEEE Signal Processing Society Workshop
Conference_Location :
Kyoto
ISSN :
1089-3555
Print_ISBN :
0-7803-3550-3
Type :
conf
DOI :
10.1109/NNSP.1996.548347
Filename :
548347