Title :
Structural simplification of a feed-forward, multilayer perceptron artificial neural network
Author :
Hu, Y.-H. ; Xue, Qiuzhen ; Tompkins, W.J.
Author_Institution :
Dept. of Electrical & Computer Engineering, University of Wisconsin, WI, USA
Abstract :
Several methods are presented for reducing the excessive number of neurons and synaptic weights in a feedforward, multilayer perceptron artificial neural network (ANN). To reduce the number of synaptic weights, the authors replace the original weight matrix with a product of two smaller matrices, lowering the number of multiplications required. To reduce the number of hidden units, they exploit the correlation among the outputs of hidden neurons in the same layer: a method is proposed to identify and remove redundant hidden units and to update the weights of the remaining neurons. This approach offers potentially good performance without retraining. When retraining is applied to fine-tune the reduced network, the updated weights serve as very good initial conditions, enabling much faster training than with random initial conditions.
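The abstract describes two reduction ideas only at a high level, so the following NumPy sketch is an illustrative reconstruction, not the paper's algorithm: a truncated-SVD low-rank factorization stands in for replacing an m×n weight matrix with a product of two smaller matrices (cutting multiplications from m·n to r·(m+n)), and a least-squares fold-in stands in for removing a hidden unit whose activations are correlated with another's. The function names (`low_rank_factor`, `merge_redundant`) and the choice of SVD are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def low_rank_factor(W, r):
    """Approximate an m x n matrix W by A @ B with A (m x r), B (r x n).

    Truncated SVD gives the best rank-r approximation in the least-squares
    sense; the paper's exact construction is not stated in the abstract.
    """
    U, s, Vt = np.linalg.svd(W, full_matrices=False)
    A = U[:, :r] * s[:r]   # m x r
    B = Vt[:r, :]          # r x n
    return A, B

def merge_redundant(H, W_out, i, j):
    """Remove hidden unit j whose activations satisfy h_j ~ c * h_i.

    H      : activations, one column per hidden unit (samples x units)
    W_out  : outgoing weight matrix (units x outputs)
    Folds unit j's outgoing weights into unit i's (w_i += c * w_j) so the
    reduced network's output is preserved, then deletes row j.
    """
    c = (H[:, i] @ H[:, j]) / (H[:, i] @ H[:, i])  # least-squares scale
    W_out = W_out.copy()
    W_out[i] += c * W_out[j]
    return np.delete(W_out, j, axis=0)

# Demo 1: a rank-2 weight matrix is factored exactly with r = 2.
W = rng.standard_normal((8, 2)) @ rng.standard_normal((2, 6))
A, B = low_rank_factor(W, 2)
print(np.allclose(A @ B, W))  # True

# Demo 2: hidden unit 2 duplicates unit 0 (scaled by 2); merging it into
# unit 0 and deleting it leaves the layer's output unchanged.
H = rng.standard_normal((100, 3))
H[:, 2] = 2.0 * H[:, 0]
W_out = rng.standard_normal((3, 4))
W_reduced = merge_redundant(H, W_out, 0, 2)
print(np.allclose(H @ W_out, np.delete(H, 2, axis=1) @ W_reduced))  # True
```

In demo 2 the merged weights are also exactly the kind of "very good initial conditions" the abstract mentions: retraining could start from `W_reduced` rather than from random weights.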
Keywords :
matrix algebra; neural nets; artificial neural network; feedforward neural network; hidden units; initial conditions; multilayer perceptron; multiplications; neurons; performance; retraining; synaptic weights; updated weights; weight matrix; Artificial neural networks; Ash; Cost function; Electronic mail; Feedforward systems; Multilayer perceptrons; Neural networks; Neurons; Redundancy; Testing;
Conference_Titel :
1991 International Conference on Acoustics, Speech, and Signal Processing (ICASSP-91)
Conference_Location :
Toronto, Ont.
Print_ISBN :
0-7803-0003-3
DOI :
10.1109/ICASSP.1991.150536