• DocumentCode
    375512
  • Title
    Recurrent neural network design for temporal sequence learning
  • Author
    Kwan, H.K.; Yan, J.

  • Author_Institution
    Dept. of Electr. & Comput. Eng., Windsor Univ., Ont., Canada
  • Volume
    2
  • fYear
    2000
  • fDate
    2000
  • Firstpage
    832
  • Abstract
    Presents two designs of a recurrent neural network with first- and second-order self-feedback at the hidden layer. The first design is based on a gradient descent algorithm and the second on a genetic algorithm (GA). Simulation results are presented for the single-hidden-layer recurrent network and for a single-hidden-layer feedforward neural network on the tasks of learning 50 commands of up to three words and 24 ten-digit phone numbers. The results indicate that the GA-based dynamic recurrent neural network performs best in both convergence and error.
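  • Note
    The hidden-layer recurrence described in the abstract (each hidden unit fed back on itself with first- and second-order delays) can be sketched as below. This is a minimal illustration assuming a diagonal self-feedback structure, a tanh nonlinearity, and arbitrary layer sizes; none of these specifics are taken from the paper.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Illustrative dimensions (assumptions, not from the paper).
    n_in, n_hidden, n_out = 4, 8, 3
    W_in = rng.normal(scale=0.1, size=(n_hidden, n_in))    # input -> hidden weights
    W_out = rng.normal(scale=0.1, size=(n_out, n_hidden))  # hidden -> output weights
    a1 = rng.normal(scale=0.1, size=n_hidden)  # 1st-order self-feedback gains
    a2 = rng.normal(scale=0.1, size=n_hidden)  # 2nd-order self-feedback gains
    b = np.zeros(n_hidden)                     # hidden biases

    def run(sequence):
        """Feed a temporal sequence through the network, one input vector per step."""
        h_prev = np.zeros(n_hidden)   # hidden state at t-1
        h_prev2 = np.zeros(n_hidden)  # hidden state at t-2
        outputs = []
        for x in sequence:
            # Each hidden unit feeds back only on its own past values
            # (elementwise products with a1 and a2 give diagonal recurrence).
            h = np.tanh(W_in @ x + a1 * h_prev + a2 * h_prev2 + b)
            outputs.append(W_out @ h)
            h_prev2, h_prev = h_prev, h
        return np.array(outputs)

    seq = rng.normal(size=(5, n_in))  # a 5-step input sequence
    print(run(seq).shape)             # one output vector per time step
    ```

    The paper's two designs differ only in how such parameters are trained (gradient descent versus a GA); the forward pass sketched here is common to both.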
  • Keywords
    feedforward neural nets; genetic algorithms; gradient methods; learning (artificial intelligence); recurrent neural nets; speech recognition; convergence; error performance; feedforward neural network; genetic algorithm; gradient descent algorithm; hidden layer; recurrent neural network; self-feedback; temporal sequence learning; Algorithm design and analysis; Computer networks; Convergence; Design engineering; Feedforward neural networks; Feedforward systems; Genetic algorithms; Neural networks; Neurons; Recurrent neural networks
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Proceedings of the 43rd IEEE Midwest Symposium on Circuits and Systems, 2000
  • Conference_Location
    Lansing, MI
  • Print_ISBN
    0-7803-6475-9
  • Type
    conf
  • DOI
    10.1109/MWSCAS.2000.952884
  • Filename
    952884