DocumentCode :
309303
Title :
Dynamically deactivating hidden neurons in a multilayer perceptron neural network
Author :
Amin, H. ; Curtis, K.M. ; Hayes-Gill, B.R.
Author_Institution :
Dept. of Electr. & Electron. Eng., Nottingham Univ., UK
Volume :
1
fYear :
1996
fDate :
13-16 Oct 1996
Firstpage :
291
Abstract :
In this paper we present an approach that terminates the processing of hidden nodes, within a multilayer perceptron (MLP) neural network, if they become inactive during the learning process. The determination of the activity or non-activity of hidden nodes is based on the mean deviation of changes in the average derivative of the hidden nodes within an interval of several iterations. A decreasing threshold value is used to evaluate the mean deviation and hence to deactivate the hidden nodes accordingly.
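The abstract describes the deactivation criterion only at a high level. Below is a minimal Python sketch of one possible reading, assuming a fixed-length monitoring window, a mean-absolute-deviation measure of the derivative changes, and a multiplicatively decreasing threshold; the class name, parameter values, and NumPy implementation are illustrative assumptions, not the authors' code.

import numpy as np

class HiddenNodeDeactivator:
    """Monitors hidden-node derivatives and flags quiescent nodes as inactive (illustrative sketch)."""

    def __init__(self, n_hidden, interval=50, threshold0=1e-3, decay=0.99):
        self.interval = interval                  # iterations per monitoring window (assumed length)
        self.threshold = threshold0               # initial threshold value (assumed)
        self.decay = decay                        # multiplicative threshold decrease (assumed schedule)
        self.active = np.ones(n_hidden, dtype=bool)
        self.buffer = []                          # per-iteration hidden-node derivatives

    def update(self, hidden_derivatives):
        """Call once per training iteration with the derivative of each hidden node."""
        self.buffer.append(np.asarray(hidden_derivatives, dtype=float))
        if len(self.buffer) < self.interval:
            return self.active
        window = np.stack(self.buffer)            # shape (interval, n_hidden)
        self.buffer = []
        changes = np.diff(window, axis=0)         # iteration-to-iteration changes in the derivatives
        # Mean deviation of the changes for each node over the interval.
        mean_dev = np.mean(np.abs(changes - changes.mean(axis=0)), axis=0)
        # Nodes whose mean deviation falls below the current threshold are marked inactive;
        # the caller would then skip them in further forward/backward processing.
        self.active &= mean_dev >= self.threshold
        self.threshold *= self.decay              # threshold decreases as training proceeds
        return self.active

# Hypothetical usage: feed per-iteration hidden-node derivatives and read back the activity mask.
deact = HiddenNodeDeactivator(n_hidden=10)
for step in range(200):
    fake_derivs = np.random.randn(10) * 0.5 ** (step // 50)   # derivatives shrinking over time
    active_mask = deact.update(fake_derivs)

Which quantity is averaged, how the deviation is computed, and how the threshold schedule is set are design choices the paper itself specifies; the values above are placeholders.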
Keywords :
backpropagation; multilayer perceptrons; MLP neural network; dynamic deactivation; hidden neurons deactivation; learning process; mean deviation evaluation; multilayer perceptron neural network; threshold value reduction; Backpropagation algorithms; Intelligent networks; Iterative algorithms; Mean square error methods; Monitoring; Multi-layer neural network; Multilayer perceptrons; Neural networks; Neurons; Nonlinear equations;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the Third IEEE International Conference on Electronics, Circuits, and Systems (ICECS '96)
Conference_Location :
Rodos, Greece
Print_ISBN :
0-7803-3650-X
Type :
conf
DOI :
10.1109/ICECS.1996.582807
Filename :
582807