Title :
A modular architecture for efficient learning
Abstract :
A modular architecture for supervised learning is presented. Three ways to modularize learning are investigated: (1) preprocessing the input with a self-organizing subnetwork that extracts strong features from the data, (2) supervised feature extraction, and (3) an iterative learning cycle in which only one layer learns at a time, with the output-layer weights learned by an exact method. With this modular architecture, only a small fraction of the connection weights is determined by gradient descent. A series of computational experiments demonstrates the superiority of the modular model in both learning quality and speed. The author begins by decomposing a feedforward network into feature-discovery and supervised-learning modules, then introduces supervised feature discovery, and finally describes a modular backpropagation algorithm.
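The core idea in the abstract (an unsupervised feature-extraction module feeding an output layer whose weights are solved exactly, rather than by gradient descent) can be sketched as follows. This is a minimal illustration under assumptions: the paper's actual self-organizing subnetwork is not specified here, so a random tanh projection stands in for the feature module, and the "exact method" is taken to be linear least squares; all names are illustrative, not the author's.

```python
# Hedged sketch of the modular scheme: a fixed feature-extraction module
# followed by an output layer solved exactly by least squares, so no
# gradient descent is needed for the output weights.
import numpy as np

rng = np.random.default_rng(0)

# Toy supervised task: XOR on two binary inputs.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

# Module 1: feature extraction (stand-in for the paper's self-organizing
# subnetwork): a random projection followed by a tanh nonlinearity.
W_hidden = rng.normal(size=(2, 8))
b_hidden = rng.normal(size=8)
H = np.tanh(X @ W_hidden + b_hidden)

# Module 2: output-layer weights found by an exact method -- here,
# ordinary least squares on the extracted features (with a bias column).
H_aug = np.hstack([H, np.ones((len(X), 1))])
w_out = np.linalg.lstsq(H_aug, y, rcond=None)[0]

pred = H_aug @ w_out
```

Only the hidden-module parameters would remain for iterative (gradient-based) training, which matches the abstract's claim that just a small fraction of the weights is tuned by gradient descent.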
Keywords :
learning systems; neural nets; backward inhibition model; computational experiments; connection weights; delta rule; self-organization; feedforward network; four-layer architecture; gradient-descent method; iterative learning cycle; learning quality; modular architecture; modular backpropagation algorithm; nonlinear hard learning core; output-layer weights; self-organizing subnetwork; supervised feature discovery; supervised feature extraction; supervised learning; thyroid data set; two-phase learning
Conference_Titel :
1990 IJCNN International Joint Conference on Neural Networks
Conference_Location :
San Diego, CA, USA
DOI :
10.1109/IJCNN.1990.137625