DocumentCode
800969
Title
Constructive learning of recurrent neural networks: limitations of recurrent cascade correlation and a simple solution
Author
Giles, C. Lee ; Chen, Dong ; Sun, Guo-Zheng ; Chen, Hsing-Hen ; Lee, Yee-Chung ; Goudreau, Mark W.
Author_Institution
Inst. for Adv. Comput. Studies, Maryland Univ., College Park, MD, USA
Volume
6
Issue
4
fYear
1995
fDate
July 1, 1995
Firstpage
829
Lastpage
836
Abstract
It is often difficult to predict the optimal neural network size for a particular application. Constructive or destructive methods that add or subtract neurons, layers, connections, etc. might offer a solution to this problem. We prove that one method, recurrent cascade correlation, has fundamental limitations in representation, and thus in its learning capabilities, due to its topology: with monotone (e.g., sigmoid) and hard-threshold activation functions it cannot represent certain finite-state automata. We give a "preliminary" approach to circumventing these limitations by devising a simple constructive training method that adds neurons during training while still preserving the powerful fully recurrent structure. We illustrate this approach with simulations that learn many examples of regular grammars that the recurrent cascade correlation method is unable to learn.
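Illustrative sketch (not from the paper): the Python/NumPy fragment below shows one plausible reading of the constructive idea in the abstract — grow a fully recurrent network by one neuron whenever training stalls, so that every unit still connects to every other unit, in contrast to RCC's frozen cascade. The toy target language (binary strings with an even number of 1s), the hill-climbing trainer standing in for the paper's actual training rule, and all names (forward, add_unit, hill_climb) are assumptions for illustration only.

import numpy as np

rng = np.random.default_rng(0)

# Assumed toy regular language: accept binary strings with an even number of 1s.
def make_data(n=120, max_len=8):
    X, y = [], []
    for _ in range(n):
        s = rng.integers(0, 2, size=rng.integers(1, max_len + 1))
        X.append(s)
        y.append(float(s.sum() % 2 == 0))
    return X, np.array(y)

def forward(params, symbols):
    # Fully recurrent update: every hidden unit feeds every hidden unit.
    W, V, w_out, b = params
    h = np.zeros(W.shape[0])
    for s in symbols:
        h = np.tanh(W @ h + V[:, s] + b)
    return 1.0 / (1.0 + np.exp(-(w_out @ h)))  # sigmoid accept/reject output

def loss(params, X, y):
    p = np.array([forward(params, s) for s in X])
    return float(np.mean((p - y) ** 2))

def init(n_hidden, n_symbols=2):
    return [0.5 * rng.standard_normal((n_hidden, n_hidden)),  # recurrent weights
            0.5 * rng.standard_normal((n_hidden, n_symbols)), # input weights
            0.5 * rng.standard_normal(n_hidden),               # output weights
            np.zeros(n_hidden)]                                # biases

def add_unit(params):
    # Grow by one neuron while keeping full recurrence: old weights are
    # preserved, and the new unit connects to and from all existing units
    # (unlike RCC, where earlier units are frozen into a cascade).
    W, V, w_out, b = params
    n = W.shape[0]
    W2 = 0.1 * rng.standard_normal((n + 1, n + 1))
    W2[:n, :n] = W
    V2 = np.vstack([V, 0.1 * rng.standard_normal((1, V.shape[1]))])
    return [W2, V2, np.append(w_out, 0.0), np.append(b, 0.0)]

def hill_climb(params, X, y, steps=300, sigma=0.1):
    # Crude stand-in for training: keep random perturbations that lower the loss.
    best = loss(params, X, y)
    for _ in range(steps):
        trial = [p + sigma * rng.standard_normal(p.shape) for p in params]
        l = loss(trial, X, y)
        if l < best:
            params, best = trial, l
    return params, best

X, y = make_data()
params = init(1)                      # start from a single recurrent unit
for _ in range(4):                    # constructive outer loop
    params, err = hill_climb(params, X, y)
    print(f"hidden units = {params[0].shape[0]}, mse = {err:.3f}")
    if err < 0.05:
        break
    params = add_unit(params)         # add a neuron when training stalls

The design point mirrored here is only the topology: each added unit participates in the full recurrent weight matrix rather than being appended as a frozen cascade stage, which is what the paper identifies as the source of RCC's representational limits.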
Keywords
correlation methods; learning (artificial intelligence); optimisation; recurrent neural nets; constructive learning; destructive methods; hard-threshold activation functions; monotone activation functions; optimal neural network size; recurrent cascade correlation; recurrent neural networks; regular grammars; sigmoid activation functions; Biological neural networks; Computational modeling; Correlation; Learning automata; Network topology; Neural networks; Neurons; Recurrent neural networks; Training data
fLanguage
English
Journal_Title
IEEE Transactions on Neural Networks
Publisher
IEEE
ISSN
1045-9227
Type
jour
DOI
10.1109/72.392247
Filename
392247