Title :
Application of fully recurrent neural networks for speech recognition
Author :
Lee, Sung Jun ; Kim, Ki Chul ; Yoon, Hyunsoo ; Cho, Jung Wan
Author_Institution :
Korea Adv. Inst. of Sci. & Technol., Cheongryang, Seoul, South Korea
Abstract :
The authors describe an extended backpropagation algorithm for fully connected recurrent neural networks applied to speech recognition. To reduce computational complexity without degrading performance, the extended delta rule is approximated by excluding some of the past activities of the dynamic neurons. In speaker-dependent recognition of a confusable syllable set, the fully recurrent neural network trained with the approximated backpropagation algorithm outperformed both the multilayer perceptron and the self-recurrent network at comparable time complexity. In addition, it was found that most self-recurrent connections become excitatory, while most mutual recurrent connections become inhibitory.
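The approximation the abstract describes, dropping older past activities of the dynamic neurons to cheapen the extended delta rule, is in the spirit of truncated backpropagation through time. A minimal sketch under that assumption; the network sizes, truncation depth `K`, and all names are illustrative, not taken from the paper:

```python
import numpy as np

# Illustrative sketch only: a fully connected recurrent layer trained with
# truncated backpropagation through time. The truncation window K plays the
# role of excluding older past activities of the dynamic neurons; shapes and
# names are assumptions, not the authors' exact formulation.

rng = np.random.default_rng(0)

N_IN, N_REC, K = 4, 8, 3                     # inputs, recurrent neurons, truncation depth
W_in = rng.normal(0, 0.1, (N_REC, N_IN))     # input weights
W_rec = rng.normal(0, 0.1, (N_REC, N_REC))   # full recurrent matrix (self + mutual connections)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def forward(xs):
    """Run the recurrent layer over a sequence, keeping all activities."""
    h = np.zeros(N_REC)
    hs = [h]
    for x in xs:
        h = sigmoid(W_in @ x + W_rec @ h)
        hs.append(h)
    return hs

def truncated_grads(xs, hs, dL_dh_T, depth=K):
    """Backpropagate the final-step error at most `depth` steps into the past."""
    gW_in = np.zeros_like(W_in)
    gW_rec = np.zeros_like(W_rec)
    delta = dL_dh_T
    T = len(xs)
    for t in range(T - 1, max(T - 1 - depth, -1), -1):
        d_pre = delta * hs[t + 1] * (1.0 - hs[t + 1])  # sigmoid derivative
        gW_in += np.outer(d_pre, xs[t])
        gW_rec += np.outer(d_pre, hs[t])
        delta = W_rec.T @ d_pre                        # pass error one step further back
    return gW_in, gW_rec

xs = [rng.normal(size=N_IN) for _ in range(10)]
hs = forward(xs)
gW_in, gW_rec = truncated_grads(xs, hs, dL_dh_T=hs[-1] - 0.5)
print(gW_in.shape, gW_rec.shape)
```

Restricting the backward pass to the last `K` steps bounds the per-update cost independently of sequence length, which is the complexity saving the abstract refers to.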
Keywords :
computational complexity; neural nets; speech recognition; confusable syllable set; dynamic neurons; extended backpropagation algorithm; extended delta rule; fully connected recurrent neural networks; multilayer perceptron; self-recurrent network; speaker-dependent recognition; time complexity; Application software; Backpropagation algorithms; Computational complexity; Computer architecture; Computer science; Degradation; Neural networks; Neurons; Recurrent neural networks; Speech recognition
Conference_Titel :
ICASSP-91: 1991 International Conference on Acoustics, Speech, and Signal Processing
Conference_Location :
Toronto, Ont.
Print_ISBN :
0-7803-0003-3
DOI :
10.1109/ICASSP.1991.150282