DocumentCode :
3130933
Title :
Second-order recurrent neural network for word sequence learning
Author :
Kwan, H.K. ; Yan, J.
Author_Institution :
Dept. of Electr. & Comput. Eng., Univ. of Windsor, Ont., Canada
fYear :
2001
fDate :
2001
Firstpage :
405
Lastpage :
408
Abstract :
This paper presents a genetic algorithm (GA)-based second-order recurrent neural network (GRNN). Feedback connections in the structure enable the network to remember cues from the recent past of a word sequence. The GA is used to design an improved network by evolving weights and connections dynamically. Simulation results on learning 50 commands of up to 3 words and 24 phone numbers of 10 digits show that the GRNN achieves the best error performance and recall accuracy compared to other backpropagation-based recurrent and feedforward networks. The effects of population size, crossover probability and mutation rate on the performance of the GRNN are also presented.
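For context, a second-order recurrent unit forms products of state and input activations before weighting them, which is what lets the recurrent state track cues from recent input symbols. The sketch below illustrates that generic update rule (in the style of Giles et al.) in NumPy; the network sizes, sigmoid activation, and random weights are assumptions for illustration only and do not reproduce the paper's GRNN, in which the weights and connections would be evolved by the GA rather than drawn at random.

```python
import numpy as np

# Minimal sketch of a second-order recurrent state update, assuming
# one-hot input symbols (e.g. digits of a phone number). Sizes and
# initialization are hypothetical; this is not the paper's exact GRNN.
rng = np.random.default_rng(0)

n_states, n_symbols = 4, 10                                       # assumed sizes
W = rng.normal(scale=0.1, size=(n_states, n_states, n_symbols))   # second-order weights W[j, i, k]
s = np.zeros(n_states)
s[0] = 1.0                                                        # assumed initial state

def step(s, x_onehot, W):
    # s_j(t+1) = sigmoid( sum_{i,k} W[j, i, k] * s_i(t) * x_k(t) )
    pre = np.einsum('jik,i,k->j', W, s, x_onehot)
    return 1.0 / (1.0 + np.exp(-pre))

# Feed a short symbol sequence one symbol at a time.
for digit in [4, 0, 5]:
    x = np.zeros(n_symbols)
    x[digit] = 1.0
    s = step(s, x, W)
print(s)
```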
Keywords :
backpropagation; genetic algorithms; performance evaluation; recurrent neural nets; speech recognition; backpropagation; crossover probability; error performance; feedback; genetic algorithm; mutation rate; population size; recall accuracy; second-order recurrent neural network; simulation; word sequence learning; Backpropagation algorithms; Biological cells; Genetic algorithms; Genetic mutations; Limit-cycles; Neural networks; Neurofeedback; Neurons; Recurrent neural networks; Speech;
fLanguage :
English
Publisher :
ieee
Conference_Title :
Proceedings of the 2001 International Symposium on Intelligent Multimedia, Video and Speech Processing
Conference_Location :
Hong Kong
Print_ISBN :
962-85766-2-3
Type :
conf
DOI :
10.1109/ISIMP.2001.925419
Filename :
925419