DocumentCode :
1089888
Title :
Comments on "Constructive learning of recurrent neural networks: limitations of recurrent cascade correlation and a simple solution"
Author :
Kremer, S.C.
Author_Institution :
Communication Res. Centre, Ottawa, Ont., Canada
Volume :
7
Issue :
4
fYear :
1996
fDate :
7/1/1996
Firstpage :
1047
Lastpage :
1051
Abstract :
Giles et al. (1995) proved that Fahlman's recurrent cascade correlation (RCC) architecture cannot realize finite state automata that have state-cycles of length greater than two under a constant input signal. This paper extends the conclusions of Giles et al. by deriving a corollary to their original proof that identifies a second, large class of automata that is also unrepresentable by RCC.
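The limitation described above can be illustrated with a minimal sketch (not taken from the paper; state names and transition table are hypothetical): a deterministic finite automaton whose states form a cycle of length three under one constant input symbol. By the result of Giles et al. (1995), an RCC network cannot realize such an automaton, since under constant input each RCC unit's output eventually oscillates with period at most two.

```python
# Sketch of a finite state automaton with a state-cycle of length 3
# under a constant input symbol. The automaton itself is trivial to
# simulate; the point is that its period-3 state trajectory is exactly
# the kind of behavior RCC cannot reproduce under constant input.

def step(state, symbol, transitions):
    """Apply one deterministic transition: (state, symbol) -> next state."""
    return transitions[(state, symbol)]

# Hypothetical 3-state cycle q0 -> q1 -> q2 -> q0 driven by constant input 'a'.
transitions = {
    ("q0", "a"): "q1",
    ("q1", "a"): "q2",
    ("q2", "a"): "q0",
}

state = "q0"
trajectory = [state]
for _ in range(6):  # feed the constant input symbol six times
    state = step(state, "a", transitions)
    trajectory.append(state)

print(trajectory)  # state sequence repeats with period 3
```

Running this produces a state trajectory of period three, whereas any constant-input trajectory representable by RCC must settle into a cycle of length at most two.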
Keywords :
automata theory; learning (artificial intelligence); recurrent neural nets; constructive learning; finite state automata; recurrent cascade correlation; recurrent neural networks; state-cycles; labeling; learning automata; neural networks
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
ieee
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.508949
Filename :
508949