DocumentCode :
2705352
Title :
Self organizing modular neural networks
Author :
Smotroff, Ira G. ; Friedman, David H. ; Connolly, Dennis
Author_Institution :
Mitre Corp., Bedford, MA, USA
fYear :
1991
fDate :
8-14 Jul 1991
Firstpage :
187
Abstract :
An argument is made for the advantages of self-scaling neural network learning algorithms over fixed-topology algorithms such as backpropagation. Cascade correlation is shown to be a self-scaling learning algorithm of great promise that nevertheless suffers from some undesirable characteristics: learning speed and quality degrade as the network grows, and the resulting networks are deep, with high fan-in to hidden units. The iterative atrophy algorithm is introduced as an enhancement of cascade correlation. It preserves the good features of cascade correlation while eliminating its worst characteristics. It is also shown that the new algorithm extends naturally to modular learning, where it is named modular iterative atrophy.
Keywords :
iterative methods; learning systems; neural nets; self-adjusting systems; cascade correlation; fan-in rates; learning algorithms; machine learning; modular iterative atrophy; modular learning algorithms; self organising neural nets; self-scaling neural network; Atrophy; Backpropagation algorithms; Biological neural networks; Degradation; Iterative algorithms; Network topology; Neural networks; Neurons; Optical network units; Organizing;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
IJCNN-91-Seattle International Joint Conference on Neural Networks, 1991
Conference_Location :
Seattle, WA
Print_ISBN :
0-7803-0164-1
Type :
conf
DOI :
10.1109/IJCNN.1991.155336
Filename :
155336