• DocumentCode
    394141
  • Title
    An experimental comparison of recurrent neural networks for natural language production
  • Author
    Nakagama, Hayato ; Tanaka, Shigeru
  • Author_Institution
    Lab. for Visual Neurocomputing, RIKEN, Saitama, Japan
  • Volume
    2
  • fYear
    2002
  • fDate
    18-22 Nov. 2002
  • Firstpage
    736
  • Abstract
    We study the performance of three types of recurrent neural networks (RNNs) for the production of natural language sentences: Simple Recurrent Networks (SRN), Back-Propagation Through Time (BPTT), and Sequential Recursive Auto-Associative Memory (SRAAM). We used simple and complex grammars to compare their learning ability and scalability. Among them, SRAAM is found to have the highest performance, training on and producing fairly complex and long sentences.
  • Keywords
    backpropagation; content-addressable storage; grammars; linguistics; natural languages; recurrent neural nets; BPTT; Back-Propagation Through Time; RNN; SRAAM; SRN; Sequential Recursive Auto-Associative Memory; Simple Recurrent Networks; complex grammars; natural language production; natural language sentences; recurrent neural network; Biological neural networks; Laboratories; Learning systems; Natural languages; Production; Recurrent neural networks;
  • fLanguage
    English
  • Publisher
    ieee
  • Conference_Titel
    Proceedings of the 9th International Conference on Neural Information Processing (ICONIP '02), 2002
  • Print_ISBN
    981-04-7524-1
  • Type
    conf
  • DOI
    10.1109/ICONIP.2002.1198155
  • Filename
    1198155
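
As a concrete illustration of the Simple Recurrent Network (SRN) architecture named in the abstract, a minimal Elman-style forward pass is sketched below. This is not code from the paper; the toy vocabulary, layer sizes, and example sentence are assumptions made purely for illustration.

```python
# Minimal sketch of an Elman-style Simple Recurrent Network (SRN) forward pass.
# Not from the paper: vocabulary, layer sizes, and sentence are illustrative only.
import numpy as np

rng = np.random.default_rng(0)
vocab = ["the", "dog", "chases", "cat", "."]   # toy vocabulary (assumption)
V, H = len(vocab), 8                           # vocabulary size, hidden units

# Weights: input->hidden, context(previous hidden)->hidden, hidden->output.
W_xh = rng.normal(0, 0.1, (H, V))
W_hh = rng.normal(0, 0.1, (H, H))
W_hy = rng.normal(0, 0.1, (V, H))

def one_hot(i):
    v = np.zeros(V)
    v[i] = 1.0
    return v

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

def srn_forward(word_ids):
    """Run the SRN over a word sequence; return a next-word distribution per step."""
    h = np.zeros(H)                            # context layer starts empty
    outputs = []
    for i in word_ids:
        x = one_hot(i)
        # Elman update: the hidden state depends on the current input and the
        # copied-back context (previous hidden state).
        h = np.tanh(W_xh @ x + W_hh @ h)
        outputs.append(softmax(W_hy @ h))
    return outputs

# Usage: predict the word following each word of a toy sentence (untrained weights).
sentence = [vocab.index(w) for w in ["the", "dog", "chases", "the", "cat"]]
for pos, dist in enumerate(srn_forward(sentence)):
    print(f"{vocab[sentence[pos]]!r} -> predicted next word: {vocab[int(np.argmax(dist))]!r}")
```

The defining SRN feature illustrated here is the copy-back context: the previous hidden state is fed in alongside the current input, giving the network a short-term memory of the sentence produced so far.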