Title :
Efficient activation functions for the back-propagation neural network
Author :
Kenue, Surender K.
Author_Institution :
General Motors Res. Lab., Warren, MI, USA
Abstract :
Summary form only given. A new family of activation functions for the back-propagation algorithm is proposed, whose derivatives belong to the sech^n(x) family for n = 1, 2, …. The maximum value of the derivative ranges from 0.637 to 1.875 as n goes from 1 to 6, so a member of the family can be selected to suit the problem. Results of using this family of activation functions show orders-of-magnitude savings in computation. A discrete version of these functions is also proposed for efficient implementation. For the parity-8 problem with 16 hidden units, the new activation function f3 requires 300 epochs for learning, compared with 500000 epochs for the standard activation function.
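The abstract does not spell out the construction, but the quoted derivative maxima are reproduced exactly if each family member is taken to be the antiderivative of sech^n(x), scaled so the output saturates at -1 and +1 like tanh: the peak slope is then 2 divided by the integral of sech^n over the real line, giving 2/π ≈ 0.637 for n = 1 and 15/8 = 1.875 for n = 6. The sketch below is an illustration of that assumed normalization, not the paper's actual implementation; the function names `f` and `max_derivative` are hypothetical.

```python
import math

def sech(x: float) -> float:
    return 1.0 / math.cosh(x)

def _integral_sech_n(n: int, hi: float = 20.0, steps: int = 20001) -> float:
    """Trapezoid-rule estimate of the integral of sech^n over the real line.

    sech^n(x) decays like e^(-n|x|), so truncating at |x| = 20 loses nothing
    at double precision.
    """
    h = 2.0 * hi / (steps - 1)
    total = 0.0
    for i in range(steps):
        x = -hi + i * h
        w = 0.5 if i in (0, steps - 1) else 1.0
        total += w * sech(x) ** n
    return total * h

def f(n: int, x: float, steps: int = 2001) -> float:
    """Hypothetical family member: the antiderivative of sech^n from 0 to x,
    scaled so the output range is (-1, 1). For n = 2 this recovers tanh."""
    scale = 2.0 / _integral_sech_n(n)
    if steps < 2:
        return 0.0
    h = x / (steps - 1)  # signed step: handles negative x (odd function)
    total = 0.0
    for i in range(steps):
        t = i * h
        w = 0.5 if i in (0, steps - 1) else 1.0
        total += w * sech(t) ** n
    return scale * total * h

def max_derivative(n: int) -> float:
    # The derivative is scale * sech^n(x), maximized at x = 0 where sech(0) = 1.
    return 2.0 / _integral_sech_n(n)
```

Under this assumption `max_derivative(1)` ≈ 0.637 and `max_derivative(6)` = 1.875, matching the abstract's range, which suggests this normalization is the intended one; a larger peak slope lets the chosen member counteract the vanishing-gradient slowdown of the standard sigmoid (whose peak derivative is only 0.25).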
Keywords :
learning systems; neural nets; activation functions; back-propagation neural network; epochs; learning; maximum value; parity 8 problem; Artificial neural networks; Computational modeling; Convergence; Error correction; Humans; Laboratories; Logistics; Neural networks; Psychology; Vehicles;
Conference_Titel :
IJCNN-91-Seattle: International Joint Conference on Neural Networks, 1991
Conference_Location :
Seattle, WA
Print_ISBN :
0-7803-0164-1
DOI :
10.1109/IJCNN.1991.155549