Title :
Rule revision with recurrent neural networks
Author :
Omlin, Christian W. ; Giles, C. Lee
Author_Institution :
NEC Res. Inst., Princeton, NJ, USA
Date :
2/1/1996
Abstract :
Recurrent neural networks readily process, recognize, and generate temporal sequences. By encoding grammatical strings as temporal sequences, recurrent neural networks can be trained to behave like deterministic sequential finite-state automata. Algorithms have been developed for extracting grammatical rules from trained networks. Using a simple method for inserting prior knowledge (or rules) into recurrent neural networks, we show that recurrent neural networks are able to perform rule revision. Rule revision is performed by comparing the inserted rules with the rules in the finite-state automata extracted from trained networks. The results from training a recurrent neural network to recognize a known, non-trivial, randomly generated regular grammar show that the networks not only preserve correct rules but are also able to correct, through training, inserted rules that were initially incorrect (i.e., rules that were not part of the randomly generated grammar).
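The prior-knowledge insertion the abstract refers to can be illustrated by programming the weights of a small second-order recurrent network so that it mimics a known finite-state automaton before any training takes place. The sketch below is a minimal, hypothetical example: the three-state DFA, the weight strength H, and the acceptance threshold are illustrative assumptions, not the randomly generated grammar or exact parameters used in the paper.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Hypothetical 3-state DFA over {0, 1}: delta[(state, symbol)] -> next state.
# Accepting states: {0}. This grammar is illustrative, not the paper's.
delta = {(0, 0): 0, (0, 1): 1,
         (1, 0): 2, (1, 1): 0,
         (2, 0): 1, (2, 1): 2}
n_states, n_symbols = 3, 2
accept = {0}

# Insert the rules: second-order weights W[i, j, k] connect state unit j
# and input unit k to next-state unit i. Encoded transitions get +H,
# all others -H; the bias -H/2 keeps unprogrammed units switched off.
H = 8.0  # assumed weight strength, large enough for stable one-hot states
W = np.full((n_states, n_states, n_symbols), -H)
b = np.full(n_states, -H / 2.0)
for (j, k), i in delta.items():
    W[i, j, k] = +H

def run(string):
    """Process a binary string; return True if the network accepts it."""
    S = np.zeros(n_states)
    S[0] = 1.0                                     # start state, one-hot
    for ch in string:
        I = np.zeros(n_symbols)
        I[int(ch)] = 1.0                           # one-hot input symbol
        S = sigmoid(np.einsum('ijk,j,k->i', W, S, I) + b)
    return max(S[i] for i in accept) > 0.5

# Before training, the programmed network should already match the DFA:
for s in ["", "0", "1", "11", "10", "100"]:
    q = 0
    for ch in s:
        q = delta[(q, int(ch))]
    assert run(s) == (q in accept)
```

Because the inserted rules are ordinary weights, gradient training can subsequently strengthen correct rules or overwrite incorrect ones, and the revised rules can be read back by extracting the automaton from the trained network.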
Keywords :
deterministic automata; finite automata; formal languages; grammars; learning (artificial intelligence); recurrent neural nets; sequences; temporal reasoning; correct rule preservation; deterministic sequential finite-state automata; grammatical rules extraction; grammatical string encoding; non-trivial randomly-generated regular grammar; prior knowledge insertion; recurrent neural networks; rule revision; temporal sequences; trained networks; Automata; Control system synthesis; Control systems; Fault diagnosis; Intelligent robots; Neural networks; Real time systems; Recurrent neural networks; Sensor phenomena and characterization; Sensor systems and applications;
Journal_Title :
IEEE Transactions on Knowledge and Data Engineering