Title :
One step Backpropagation Through Time for learning input mapping in reservoir computing applied to speech recognition
Author :
Hermans, Michiel ; Schrauwen, Benjamin
Author_Institution :
ELIS Dept., Ghent Univ., Ghent, Belgium
Date :
May 30 - June 2, 2010
Abstract :
Recurrent neural networks are very powerful engines for processing information that is coded in time; however, many problems with common training algorithms, such as Backpropagation Through Time, remain. Because of this, another important learning setup known as Reservoir Computing has appeared in recent years, in which an essentially untrained network is used to perform computations. Though very successful in many applications, using a random network can be quite inefficient in terms of the required number of neurons and the associated computational cost. In this paper we introduce a highly simplified version of Backpropagation Through Time that truncates the error backpropagation to a single step back in time, and we combine it with the classic Reservoir Computing setup using an instantaneous linear readout. We apply this setup to a spoken digit recognition task and show that it gives very good results for small networks.
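The one-step truncation described in the abstract can be sketched as follows. This is a hypothetical minimal illustration, not the authors' implementation: all dimensions, initial scalings, and the learning rate are invented, and the readout is updated here by the same gradient rule for brevity, whereas a classic reservoir setup would typically fit the linear readout by (ridge) regression. The key point is that the input-weight gradient uses only the current state, with no backpropagation through earlier time steps.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions for a small reservoir
n_in, n_res, n_out = 3, 50, 2
W_in = rng.normal(scale=0.1, size=(n_res, n_in))            # trained input mapping
W = rng.normal(scale=1.0 / np.sqrt(n_res), size=(n_res, n_res))  # fixed reservoir
W_out = rng.normal(scale=0.1, size=(n_out, n_res))          # linear readout

def step(x, u):
    # Standard reservoir update with a tanh nonlinearity
    return np.tanh(W @ x + W_in @ u)

def train_sequence(U, Y, lr=0.01):
    """One pass over an input sequence U with targets Y."""
    global W_in, W_out
    x = np.zeros(n_res)
    for u, y_target in zip(U, Y):
        x = step(x, u)
        y = W_out @ x                    # instantaneous linear readout
        e = y - y_target                 # output error at this time step
        # Backpropagate the error only one step: through the readout and
        # the tanh of the current state, ignoring the dependence of the
        # previous state on W_in (the one-step truncation).
        delta = (W_out.T @ e) * (1.0 - x**2)
        W_out -= lr * np.outer(e, x)
        W_in -= lr * np.outer(delta, u)

# Toy data just to exercise the update rule
U = rng.normal(size=(20, n_in))
Y = rng.normal(size=(20, n_out))
train_sequence(U, Y)
```

Because the gradient never traverses the recurrent weights back in time, the per-step cost stays comparable to a plain feedforward update, which is what makes the scheme attractive for small networks.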
Keywords :
backpropagation; learning (artificial intelligence); recurrent neural nets; speech recognition; computational costs; error backpropagation truncation; information processing; learning input mapping; one step backpropagation through time; recurrent neural networks; reservoir computing; speech recognition; spoken digit recognition task; Backpropagation; Reservoirs; Speech recognition;
Conference_Titel :
Proceedings of the 2010 IEEE International Symposium on Circuits and Systems (ISCAS)
Conference_Location :
Paris
Print_ISBN :
978-1-4244-5308-5
Electronic_ISBN :
978-1-4244-5309-2
DOI :
10.1109/ISCAS.2010.5537568