• DocumentCode
    286262
  • Title
    Learning context-free grammar with enhanced neural network pushdown automaton
  • Author
    Sun, Guo-Zheng
  • Author_Institution
    Inst. for Adv. Comput. Studies, Maryland Univ., College Park, MD, USA
  • fYear
    1993
  • fDate
    22-23 Apr 1993
  • Abstract
    In previous work (C. Sreerupa Das et al., 1992, 1993; C.L. Giles et al., 1990; G.Z. Sun et al., 1990, 1991), a model of the neural network pushdown automaton (NNPDA) was developed. The NNPDA is a hybrid system that couples a neural network finite-state controller with an external continuous stack memory. It learns context-free grammars from examples by minimizing a suitably defined objective function. In the original version of the NNPDA, the controller was built from high-order connected recurrent neural networks. Owing to the complexity of the learning surface, however, that model could not learn several non-trivial grammars, among which the Palindrome grammar is considered especially difficult. The author analyzes the difficulties of the original model and introduces several modifications that enhance the learning power of the NNPDA. The major enhancement is the introduction of a linear 'full-order' recurrent neural network as the stack controller.
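    The abstract's central mechanism, a neural network controller coupled to an external continuous stack, can be sketched in a few lines. The Python below is not taken from the paper; it is a minimal illustration under the common NNPDA convention that the controller emits an action value in [-1, 1] at each step (positive values push a fractional "thickness" of the current symbol, negative values pop thickness from the top) and that a read returns the thickness-weighted blend of the symbols within depth 1 of the top. All names here (ContinuousStack, push_pop, read) are hypothetical, not identifiers from the paper.

```python
import numpy as np

class ContinuousStack:
    """Continuous ('analog') stack: entries are (symbol vector, thickness) pairs."""

    def __init__(self, symbol_dim):
        self.symbol_dim = symbol_dim
        self.items = []  # top of stack is the end of the list

    def push_pop(self, action, symbol):
        """action > 0: push |action| thickness of `symbol`; action < 0: pop |action| thickness."""
        if action > 0:
            self.items.append((np.asarray(symbol, dtype=float), float(action)))
        elif action < 0:
            remaining = -float(action)
            while remaining > 0 and self.items:
                sym, thick = self.items[-1]
                if thick <= remaining:
                    self.items.pop()          # consume this entry entirely
                    remaining -= thick
                else:
                    self.items[-1] = (sym, thick - remaining)  # shave the top entry
                    remaining = 0.0

    def read(self):
        """Thickness-weighted blend of the symbols within depth 1.0 of the top."""
        depth = 0.0
        blend = np.zeros(self.symbol_dim)
        for sym, thick in reversed(self.items):
            take = min(thick, 1.0 - depth)
            blend += take * sym
            depth += take
            if depth >= 1.0:
                break
        return blend

# Hypothetical usage: push 0.7 of symbol 'a', then 0.5 of symbol 'b', then pop 0.3.
stack = ContinuousStack(symbol_dim=2)
stack.push_pop(0.7, [1.0, 0.0])   # symbol 'a'
stack.push_pop(0.5, [0.0, 1.0])   # symbol 'b'
stack.push_pop(-0.3, None)        # pop; the symbol argument is ignored for pops
print(stack.read())               # 0.2 of 'b' plus 0.7 of 'a' -> [0.7, 0.2]
```

    In the actual NNPDA, the action value and the pushed symbol are produced by the recurrent controller, and the whole stack operation is kept differentiable so the objective function can be minimized by gradient descent.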
  • Keywords
    automata theory; context-free grammars; learning (artificial intelligence); recurrent neural nets; Palindrome grammar; external continuous stack memory; full order recurrent neural nets; high-order connected recurrent neural nets; hybrid system; learning power; learning surface; minimizing; neural network finite controller; neural network pushdown automata; non-trivial grammars; objective function; stack controller
  • fLanguage
    English
  • Publisher
    IET
  • Conference_Titel
    Grammatical Inference: Theory, Applications and Alternatives, IEE Colloquium on
  • Conference_Location
    Colchester
  • Type
    conf
  • Filename
    243123