DocumentCode :
1902268
Title :
Self-clustering recurrent networks
Author :
Zeng, Zheng ; Goodman, Rodney M. ; Smyth, Padhraic
Author_Institution :
California Inst. of Technol., Pasadena, CA, USA
fYear :
1993
fDate :
1993
Firstpage :
33
Abstract :
It is shown, based on empirical analyses, that second-order recurrent neural networks which are trained to learn finite state automata (FSAs) tend to form discrete clusters as the state representation in the hidden unit activation space. This observation is used to define self-clustering networks which automatically extract discrete state machines from the learning network. To address the problem of instability, a network structure is introduced whereby the network uses quantization in the feedback path to force the learning of discrete states. Experimental results show that the method learns FSAs as well as existing methods in the literature, with the significant advantage of being stable on test strings of arbitrary length.
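A minimal sketch (not the authors' code) of the mechanism the abstract describes: a second-order recurrent step whose hidden state is quantized before being fed back, so that the network's state trajectory is confined to a finite set. The weight values, unit counts, and function names below are illustrative assumptions.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def quantize(state, levels=2):
    # Snap each hidden activation to the nearest of `levels` evenly
    # spaced values in [0, 1]; with levels=2 this forces states into {0, 1}.
    return [round(s * (levels - 1)) / (levels - 1) for s in state]

def second_order_step(W, state, symbol_onehot):
    # Second-order recurrence: each new activation combines every
    # (state unit, input symbol) pair through a weight tensor W[i][j][k].
    n = len(state)
    new_state = []
    for i in range(n):
        z = sum(W[i][j][k] * state[j] * symbol_onehot[k]
                for j in range(n) for k in range(len(symbol_onehot)))
        new_state.append(sigmoid(z))
    return new_state

def run(W, init_state, symbols, alphabet_size):
    # Quantization in the feedback path: the QUANTIZED state, not the raw
    # activations, is fed back at each step, forcing discrete states.
    state = quantize(init_state)
    visited = {tuple(state)}
    for s in symbols:
        onehot = [1.0 if k == s else 0.0 for k in range(alphabet_size)]
        state = quantize(second_order_step(W, state, onehot))
        visited.add(tuple(state))
    return state, visited
```

Because each fed-back state is drawn from at most levels^n discrete vectors, the set of reachable states is finite regardless of string length, which is the source of the stability on arbitrarily long test strings claimed in the abstract.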
Keywords :
finite automata; learning (artificial intelligence); recurrent neural nets; discrete clusters; discrete state machines; finite state automata; hidden unit activation space; learning network; quantization; second-order recurrent neural networks; self-clustering networks; state representation; test strings; Force feedback; Laboratories; Learning automata; Neural networks; Propulsion; Quantization; Space technology; Stability; State feedback; Testing;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
IEEE International Conference on Neural Networks, 1993
Conference_Location :
San Francisco, CA
Print_ISBN :
0-7803-0999-5
Type :
conf
DOI :
10.1109/ICNN.1993.298535
Filename :
298535