Title :
Rule refinement with recurrent neural networks
Author :
Giles, C. Lee; Omlin, Christian W.
Author_Institution :
NEC Res. Inst., Princeton, NJ, USA
Abstract :
Recurrent neural networks can be trained to behave like deterministic finite-state automata (DFAs), and methods have been developed for extracting grammatical rules from trained networks. Using a simple method for inserting prior knowledge of a subset of the DFA state transitions into recurrent neural networks, it is shown that recurrent neural networks are able to perform rule refinement. The results from training a recurrent neural network to recognize a known, nontrivial, randomly generated regular grammar show that the networks not only preserve correct prior knowledge, but are also able to correct, through training, inserted prior knowledge that was wrong, i.e., inserted rules that were not part of the randomly generated grammar.
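The rule-insertion idea summarized in the abstract can be sketched concretely. The Python snippet below is an illustrative sketch only, not code from the paper: the class name SecondOrderRNN, the methods insert_transition and run, and the hint strength H are assumptions made for the example. It shows one common way a known DFA transition delta(q_from, symbol) = q_to can be programmed into the second-order weights of a recurrent network before training, so that gradient descent starts from the inserted rules and can later revise any that turn out to be wrong.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class SecondOrderRNN:
    """Minimal second-order recurrent network for DFA-style string processing.

    State update: S_i(t+1) = sigmoid(b_i + sum_{j,k} W[i, j, k] * S_j(t) * I_k(t)),
    where I(t) is a one-hot encoding of the input symbol at time t.
    (Illustrative sketch; names and constants are assumptions, not the paper's code.)
    """

    def __init__(self, n_state_neurons, n_input_symbols, seed=0):
        rng = np.random.default_rng(seed)
        # W[i, j, k]: weight from (state neuron j, input symbol k) to state neuron i.
        self.W = rng.uniform(-0.1, 0.1,
                             (n_state_neurons, n_state_neurons, n_input_symbols))
        self.b = np.zeros(n_state_neurons)
        self.n_state_neurons = n_state_neurons

    def insert_transition(self, q_from, symbol, q_to, H=4.0):
        """Insert prior knowledge of one DFA transition delta(q_from, symbol) = q_to
        by biasing the relevant second-order weights with hint strength H."""
        self.W[:, q_from, symbol] = -H       # discourage all successor states...
        self.W[q_to, q_from, symbol] = +H    # ...except the intended one

    def run(self, string):
        """Process a sequence of symbol indices, starting from a unary-coded state 0.
        A designated accept neuron (e.g. S[0] > 0.5 at the end) can classify the string."""
        S = np.zeros(self.n_state_neurons)
        S[0] = 1.0
        for k in string:
            S = sigmoid(self.b + self.W[:, :, k] @ S)
        return S

# Example: insert one (possibly incorrect) hypothetical rule, then inspect behavior.
net = SecondOrderRNN(n_state_neurons=5, n_input_symbols=2)
net.insert_transition(q_from=0, symbol=1, q_to=2)
print(net.run([1, 0, 1]))
```

In the refinement setting described by the abstract, such a partially programmed network would then be trained on labeled strings from the target regular grammar; correctly inserted transitions tend to be preserved, while weights encoding wrong transitions can be overwritten by the gradient updates.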
Keywords :
deterministic automata; finite automata; grammars; pattern recognition; recurrent neural nets; DFA state transitions; deterministic finite-state automata; grammar recognition; grammatical rule extraction; nontrivial randomly generated regular grammar; recurrent neural networks; rule refinement; Automata; Clustering algorithms; Neurons
Conference_Title :
1993 IEEE International Conference on Neural Networks (ICNN)
Conference_Location :
San Francisco, CA
Print_ISBN :
0-7803-0999-5
DOI :
10.1109/ICNN.1993.298658