Title :
Pruning recurrent neural networks for improved generalization performance
Author :
Omlin, Christian W. ; Giles, C. Lee
Author_Institution :
NEC Res. Inst., Princeton, NJ, USA
Abstract :
The experimental results in this paper demonstrate that a simple pruning/retraining method effectively improves the generalization performance of recurrent neural networks trained to recognize regular languages. The technique also permits the extraction of symbolic knowledge in the form of deterministic finite-state automata (DFA) which are more consistent with the rules to be learned. Weight decay has also been shown to improve a network's generalization performance. Simulations with two small DFA (⩽10 states) and a large finite-memory machine (64 states) demonstrate that the performance improvement due to pruning/retraining is generally superior to the improvement due to training with weight decay. In addition, there is no need to guess a 'good' decay rate.
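A minimal sketch of the general idea described in the abstract: train a small recurrent network on a toy regular language, prune low-magnitude recurrent weights, then retrain with the pruned connections held at zero. The architecture, parity task, magnitude threshold, and training settings are illustrative assumptions, not the authors' exact procedure or values from the paper.

```python
import torch
import torch.nn as nn

torch.manual_seed(0)

def make_batch(n=64, length=10):
    """Random binary strings; label 1 iff the number of 1s is even."""
    x = torch.randint(0, 2, (n, length, 1)).float()
    y = (x.sum(dim=(1, 2)) % 2 == 0).float()
    return x, y

class TinyRNN(nn.Module):
    def __init__(self, hidden=8):
        super().__init__()
        self.rnn = nn.RNN(input_size=1, hidden_size=hidden, batch_first=True)
        self.out = nn.Linear(hidden, 1)

    def forward(self, x):
        _, h = self.rnn(x)                  # final hidden state
        return self.out(h[-1]).squeeze(-1)  # one logit per string

def train(model, steps=500, lr=1e-2):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.BCEWithLogitsLoss()
    for _ in range(steps):
        x, y = make_batch()
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        opt.step()

def prune_small_weights(model, threshold=0.1):
    """Zero recurrent/input weights below the magnitude threshold;
    return masks so retraining keeps pruned connections at zero."""
    masks = {}
    with torch.no_grad():
        for name, p in model.rnn.named_parameters():
            if "weight" in name:
                mask = (p.abs() >= threshold).float()
                p.mul_(mask)
                masks[name] = mask
    return masks

def retrain_with_masks(model, masks, steps=200, lr=1e-3):
    opt = torch.optim.Adam(model.parameters(), lr=lr)
    loss_fn = nn.BCEWithLogitsLoss()
    for _ in range(steps):
        x, y = make_batch()
        opt.zero_grad()
        loss_fn(model(x), y).backward()
        for name, p in model.rnn.named_parameters():
            if name in masks and p.grad is not None:
                p.grad.mul_(masks[name])   # block updates to pruned weights
        opt.step()
        with torch.no_grad():
            for name, p in model.rnn.named_parameters():
                if name in masks:
                    p.mul_(masks[name])    # re-zero in case optimizer state moved them

model = TinyRNN()
train(model)                        # initial training
masks = prune_small_weights(model)  # prune
retrain_with_masks(model, masks)    # retrain the pruned network
```

The generalization comparison against weight decay reported in the paper would correspond to training an unpruned copy of the same network with an L2 penalty and evaluating both on held-out strings; that comparison is omitted here for brevity.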
Keywords :
finite automata; generalisation (artificial intelligence); knowledge acquisition; learning (artificial intelligence); recurrent neural nets; decay rate; deterministic finite-state automata; finite-memory machine; generalization performance; network pruning; recurrent neural networks; symbolic knowledge extraction; weight decay; learning automata; neurons
Conference_Titel :
Neural Networks for Signal Processing IV: Proceedings of the 1994 IEEE Workshop
Conference_Location :
Ermioni, Greece
Print_ISBN :
0-7803-2026-3
DOI :
10.1109/NNSP.1994.365996