Title :
Simple addition to back-propagation learning for dynamic weight pruning, sparse network extraction and faster learning
Author :
Heywood, Malcolm ; Noakes, Peter
Author_Institution :
Dept. of Electron Syst. Eng., Essex Univ., Colchester, UK
Abstract :
The enhancement to the backpropagation algorithm presented here arose from the need to extract sparsely connected networks from networks employing product terms. The enhancement works in conjunction with the backpropagation weight-update process, so that the actions of weight zeroing and weight stimulation reinforce each other. It is shown that the error measure can also be interpreted as a rate of weight change (as opposed to ΔW_ij) and is consequently used to determine when weights have reached a stable state. Weights judged stable are then compared against a zero-weight threshold; should they fall below this threshold, the weight in question is zeroed. Simulation of such a system is shown to yield improved learning rates and reduced network connection requirements relative to the optimal network solution trained with the standard backpropagation algorithm, for multilayer perceptron (MLP), higher-order neural network (HONN), and sigma-pi networks.
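Illustrative sketch (not part of the original record): the stability-then-prune test the abstract describes can be sketched in a few lines of Python on top of an ordinary backpropagation update. This is a minimal interpretation under stated assumptions, not the authors' implementation; the smoothing constant BETA, the thresholds STABILITY_TOL and ZERO_TOL, and the function prune_step are illustrative choices, and the mutual reinforcement of weight zeroing and weight stimulation mentioned in the abstract is not modelled.

    import numpy as np

    # Assumed, illustrative constants -- not values from the paper.
    STABILITY_TOL = 1e-4  # weight judged stable once its smoothed rate of change falls below this
    ZERO_TOL = 0.05       # stable weights with magnitude below this are zeroed (pruned)
    BETA = 0.9            # exponential smoothing of the per-weight rate of change

    def prune_step(w, grad, lr, rate):
        """One backprop weight update followed by the dynamic-pruning test.

        w    : weight matrix for one layer
        grad : dE/dW for that layer
        lr   : learning rate
        rate : running per-weight estimate of the rate of weight change
        """
        delta_w = -lr * grad
        w = w + delta_w
        # Treat the magnitude of the update as a rate of weight change and smooth it.
        rate = BETA * rate + (1.0 - BETA) * np.abs(delta_w)
        # Weights whose rate of change has settled are judged stable.
        stable = rate < STABILITY_TOL
        # Stable weights below the zero-weight threshold are zeroed.
        w[stable & (np.abs(w) < ZERO_TOL)] = 0.0
        return w, rate

In use, prune_step would be called once per update inside the normal training loop, with rate initialised to zeros of the same shape as w; a practical variant would also mask zeroed weights so that subsequent updates cannot revive them, which is one way the zeroing and the ordinary weight updates could be made to cooperate as the abstract suggests.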
Keywords :
backpropagation; neural nets; backpropagation learning; dynamic weight pruning; higher order neural network; sigma-pi networks; sparse network extraction; weight stimulation; weight update; weight zeroing; zero weight threshold; Convergence; Modeling; Multi-layer neural network; Multilayer perceptrons; Network topology; Neural network hardware; Neural networks; Systems engineering and theory; Very large scale integration;
Conference_Titel :
IEEE International Conference on Neural Networks, 1993
Conference_Location :
San Francisco, CA
Print_ISBN :
0-7803-0999-5
DOI :
10.1109/ICNN.1993.298546