• DocumentCode
    3590751
  • Title
    Learning regular languages via recurrent higher-order neural networks
  • Author
    Tanaka, Ken; Kumazawa, Itsuo
  • Author_Institution
    Fac. of Eng., Niigata Univ., Japan
  • Volume
    2
  • fYear
    1996
  • Firstpage
    1378
  • Abstract
    Learning regular languages amounts to acquiring finite state automata (FSAs). For a neural network to acquire an arbitrary FSA, the network must first have a representation for every FSA state and, furthermore, be able to realize an arbitrary FSA state transfer function. We show that if the network model of Giles et al. (1992) represents each FSA state by a local state representation, it can realize any FSA state transfer function. However, such a representation may be difficult to acquire by learning, which is one reason why the model is not always successful at learning some regular languages. To overcome this problem we propose the recurrent higher-order neural network (RHON). We show the order of connections sufficient to realize any FSA state transfer function regardless of the network's representation of states. After deriving the learning algorithm, we demonstrate the learning superiority of RHON over the model of Giles et al. through computer simulation.
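    The abstract's claim about the Giles et al. (1992) model can be illustrated concretely. The following is a minimal sketch (not the authors' code): a second-order recurrent network update s_i(t+1) = g(Σ_jk W[i][j][k]·s_j(t)·x_k(t)) with a one-hot ("local") state encoding, where weights are set by hand so that one network step reproduces one DFA transition exactly. The DFA, weight construction, and hard-threshold activation are illustrative assumptions.

    ```python
    # Sketch of a second-order recurrent network in the style of
    # Giles et al. (1992), using a one-hot ("local") state encoding.
    # With W[i][j][k] = 1 iff the DFA moves from state j to state i on
    # symbol k, one network step realizes one DFA transition exactly.

    def step(x):
        """Hard-threshold activation standing in for a steep sigmoid."""
        return 1.0 if x > 0.5 else 0.0

    def make_weights(delta, n_states, n_symbols):
        # W[i][j][k]: strength of the (state j, symbol k) -> state i connection.
        W = [[[0.0] * n_symbols for _ in range(n_states)] for _ in range(n_states)]
        for j in range(n_states):
            for k in range(n_symbols):
                W[delta[j][k]][j][k] = 1.0
        return W

    def run(W, n_states, string, start=0):
        s = [1.0 if i == start else 0.0 for i in range(n_states)]  # one-hot state
        for sym in string:
            x = [1.0 if k == sym else 0.0 for k in range(len(W[0][0]))]  # one-hot input
            # Second-order update: s_i(t+1) = g(sum_{j,k} W[i][j][k] s_j(t) x_k(t))
            s = [step(sum(W[i][j][k] * s[j] * x[k]
                          for j in range(n_states)
                          for k in range(len(x))))
                 for i in range(n_states)]
        return s.index(1.0)

    # Example DFA (an assumption for illustration): parity of 1s over {0, 1};
    # state 1 means an odd number of 1s has been read.
    delta = [[0, 1], [1, 0]]
    W = make_weights(delta, 2, 2)
    print(run(W, 2, [1, 0, 1, 1]))  # -> 1 (three 1s read, odd parity)
    ```

    The one-hot construction shows why local state representation suffices for second-order networks; RHON's higher-order connections are motivated by the case where the learned state representation is not local.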
  • Keywords
    finite automata; formal languages; learning (artificial intelligence); recurrent neural nets; finite state automata; local state representation; recurrent higher-order neural networks; regular languages; state transfer function; Gradient methods; Learning automata; NP-hard problem; Neural networks; Recurrent neural networks; Transfer functions
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    IEEE International Conference on Neural Networks, 1996
  • Print_ISBN
    0-7803-3210-5
  • Type
    conf
  • DOI
    10.1109/ICNN.1996.549100
  • Filename
    549100