DocumentCode :
1092845
Title :
First-order versus second-order single-layer recurrent neural networks
Author :
Goudreau, Mark W. ; Giles, C. Lee ; Chakradhar, Srimat T. ; Chen, D.
Author_Institution :
Princeton Univ., NJ, USA
Volume :
5
Issue :
3
fYear :
1994
fDate :
1 May 1994
Firstpage :
511
Lastpage :
513
Abstract :
We examine the representational capabilities of first-order and second-order single-layer recurrent neural networks (SLRNNs) with hard-limiting neurons. We show that a second-order SLRNN is strictly more powerful than a first-order SLRNN. However, if the first-order SLRNN is augmented with output layers of feedforward neurons, it can implement any finite-state recognizer, but only if state-splitting is employed. When a state is split, it is divided into two equivalent states. The judicious use of state-splitting allows for efficient implementation of finite-state recognizers using augmented first-order SLRNNs.
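The state-splitting operation mentioned in the abstract can be illustrated concretely. The sketch below, which is an illustration and not the paper's construction, represents a finite-state recognizer as a transition table and splits one state into two language-equivalent copies: the copy inherits all outgoing transitions, while a chosen subset of incoming edges is rerouted to it. All names (`split_state`, `accepts`, the example machine) are hypothetical.

```python
def split_state(transitions, accepting, state, redirect):
    """Split `state` into two equivalent states.

    transitions: {(state, symbol): next_state}
    accepting:   set of accepting states
    redirect:    set of (src, symbol) edges to reroute to the new copy
    Returns (new_transitions, new_accepting); the language is unchanged.
    """
    copy = state + "'"
    new_trans = dict(transitions)
    # The copy inherits every outgoing transition of the original state.
    for (src, sym), dst in transitions.items():
        if src == state:
            new_trans[(copy, sym)] = dst
    # Reroute the selected incoming edges to the copy.
    for (src, sym) in redirect:
        if transitions.get((src, sym)) == state:
            new_trans[(src, sym)] = copy
    new_acc = set(accepting) | ({copy} if state in accepting else set())
    return new_trans, new_acc

def accepts(transitions, accepting, start, word):
    # Run the recognizer on `word`; reject on a missing transition.
    q = start
    for sym in word:
        q = transitions.get((q, sym))
        if q is None:
            return False
    return q in accepting

# Example: a two-state recognizer for strings with an odd number of 1s.
T = {("even", "0"): "even", ("even", "1"): "odd",
     ("odd", "0"): "odd",   ("odd", "1"): "even"}
A = {"odd"}

# Split "odd", rerouting the edge ("even", "1") to the new copy.
T2, A2 = split_state(T, A, "odd", {("even", "1")})

# The split machine accepts exactly the same strings.
for w in ["", "1", "10", "101", "111"]:
    assert accepts(T, A, "even", w) == accepts(T2, A2, "even", w)
```

The key invariant is that both copies of the split state have identical outgoing behavior, so routing an input string through either copy leads to the same acceptance decision.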
Keywords :
feedforward neural nets; pattern recognition; recurrent neural nets; feedforward neurons; finite-state recognizer; first-order single-layer recurrent neural networks; hard-limiting neurons; output layers; representational capabilities; second-order single-layer recurrent neural networks; state-splitting; Automata; Circuits; Computer science; Educational institutions; Latches; National electric code; Neural networks; Neurons; Recurrent neural networks; USA Councils;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
ieee
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.286928
Filename :
286928