Title :
Modified backpropagation algorithms for training the multilayer feedforward neural networks with hard-limiting neurons
Author :
Yu, Xiangui ; Loh, Nan K. ; Jullien, G.A. ; Miller, W.C.
Author_Institution :
Dept. of Electr. Eng., Windsor Univ., Ont., Canada
Abstract :
This paper introduces modified backpropagation algorithms for training multilayer feedforward neural networks with hard-limiting neurons. Transforming neuron activation functions, which are modified continuous sigmoidal functions with an adaptive steepness factor, are used in all the hidden layers. During training, this steepness factor grows from a small positive value toward infinity as the sum-squared error decreases. Thus, a multilayer feedforward neural network can be trained so that the resulting architecture is composed only of hard-limiting neurons. The learning algorithm is similar to the conventional backpropagation algorithm; only the derivatives of the hidden-neuron activation functions are modified. Extensive numerical simulations are presented to show the feasibility of the proposed algorithm. In addition, the numerical properties of the proposed algorithm are discussed and comparisons with other algorithms are made.
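The idea behind the adaptive steepness factor can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function and variable names are hypothetical, and the exact schedule by which the steepness factor grows with decreasing error is assumed, not taken from the paper.

```python
import math

def steep_sigmoid(x, beta):
    """Sigmoid with steepness factor beta; approaches a hard limiter as beta grows."""
    return 1.0 / (1.0 + math.exp(-beta * x))

def steep_sigmoid_deriv(x, beta):
    """Derivative used in the backward pass -- per the abstract, this is the
    only part of conventional backpropagation that is modified."""
    s = steep_sigmoid(x, beta)
    return beta * s * (1.0 - s)

def hard_limit(x):
    """Target activation after training: a hard-limiting (step) neuron."""
    return 1.0 if x >= 0.0 else 0.0

# As beta increases, the sigmoid output converges to the hard limiter's output,
# so the trained network can be realized entirely with hard-limiting neurons.
for beta in (1.0, 10.0, 100.0):
    print(beta, steep_sigmoid(0.5, beta), hard_limit(0.5))
```

For example, at `beta = 100` the sigmoid output at `x = 0.5` is already indistinguishable from the hard limiter's output of 1 to within floating-point noise.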
Keywords :
backpropagation; feedforward neural nets; transfer functions; adaptive steepness factors; hard-limiting neurons; hidden layers; learning algorithm; modified backpropagation algorithms; modified continuous sigmoidal functions; multilayer feedforward neural networks; neuron activation functions; numerical properties; numerical simulations; sum square error; training; Backpropagation algorithms; Electronic mail; Feedforward neural networks; Multi-layer neural network; Multilayer perceptrons; Neural networks; Neurons; Numerical simulation; Robotics and automation; Very large scale integration;
Conference_Titel :
Electrical and Computer Engineering, 1993. Canadian Conference on
Conference_Location :
Vancouver, BC
Print_ISBN :
0-7803-2416-1
DOI :
10.1109/CCECE.1993.332191