Title : 
Recurrent neural network design for temporal sequence learning
         
        
            Author : 
Kwan, H.K. ; Yan, J.
         
        
            Author_Institution : 
Dept. of Electr. & Comput. Eng., Windsor Univ., Ont., Canada

            Abstract : 
This paper presents two designs of a recurrent neural network with first- and second-order self-feedback at the hidden layer. The first design is trained by a gradient-descent algorithm and the second by a genetic algorithm (GA). Simulation results of the single-hidden-layer recurrent network and of a single-hidden-layer feedforward neural network on learning 50 commands of up to three words and 24 phone numbers of ten digits are presented. The results indicate that the GA-based dynamic recurrent neural network is best in both convergence and error performance.
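The hidden-layer recurrence described in the abstract can be sketched as follows. This is an illustrative reading, not the authors' exact formulation: the per-neuron (diagonal) feedback structure, the `tanh` nonlinearity, and all variable names are assumptions.

```python
import numpy as np

def rnn_self_feedback_step(x_t, h_prev1, h_prev2, W_in, a1, a2, b):
    """One hidden-layer update with first- and second-order self-feedback.

    Each hidden neuron feeds back its own outputs from the previous two
    time steps, one reading of "1st and 2nd order self-feedback at the
    hidden layer". Names and the tanh nonlinearity are illustrative.
    """
    # Pre-activation: input drive plus per-neuron feedback terms and bias.
    s = W_in @ x_t + a1 * h_prev1 + a2 * h_prev2 + b
    return np.tanh(s)

# Tiny demo: 3 inputs, 4 hidden neurons, a few time steps.
rng = np.random.default_rng(0)
W_in = rng.normal(scale=0.5, size=(4, 3))
a1 = rng.normal(scale=0.5, size=4)   # 1st-order self-feedback gains
a2 = rng.normal(scale=0.5, size=4)   # 2nd-order self-feedback gains
b = np.zeros(4)

h1 = np.zeros(4)  # hidden state at t-1
h2 = np.zeros(4)  # hidden state at t-2
for t in range(5):
    x_t = rng.normal(size=3)
    h1, h2 = rnn_self_feedback_step(x_t, h1, h2, W_in, a1, a2, b), h1
```

In this sketch the feedback gains `a1` and `a2` (and `W_in`, `b`) would be the parameters tuned either by gradient descent or by the GA, per the two designs compared in the paper.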
         
        
            Keywords : 
feedforward neural nets; genetic algorithms; gradient methods; learning (artificial intelligence); recurrent neural nets; speech recognition; convergence; error performance; feedforward neural network; genetic algorithm; gradient descent algorithm; hidden layer; recurrent neural network; self-feedback; temporal sequence learning; Algorithm design and analysis; Computer networks; Convergence; Design engineering; Feedforward neural networks; Feedforward systems; Genetic algorithms; Neural networks; Neurons; Recurrent neural networks

            Conference_Title : 
Proceedings of the 43rd IEEE Midwest Symposium on Circuits and Systems, 2000
         
        
            Conference_Location : 
Lansing, MI
         
        
            Print_ISBN : 
0-7803-6475-9

            DOI : 
10.1109/MWSCAS.2000.952884