DocumentCode :
1748948
Title :
Highly successful learning based on modified Catmull-Rom spline activation function
Author :
Sunat, Khamron ; Lursinsap, Chidchanok
Author_Institution :
Dept. of Math., Chulalongkorn Univ., Bangkok, Thailand
Volume :
4
fYear :
2001
fDate :
2001
Firstpage :
2783
Abstract :
A sigmoid-like activation function, a modified Catmull-Rom spline activation function, is proposed. The advantages of the proposed activation function are the low complexity of its hardware implementation and of its arithmetic operations. Since it yields a good error signal for updating the weight values, the usual gradient-based training algorithm can be used to train a network with the proposed activation function. Our experiments on well-established benchmarks, namely the LT, SONAR, IRIS, OCTANE, and 5-BIT COUNT data sets, show that the convergence speed and success rate of the proposed model exceed those of existing models using the sigmoidal function. The network obtained with the proposed activation function is also more compact than the network obtained with the sigmoidal function, in terms of arithmetic operations and network components.
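The abstract does not give the paper's exact knot placement or spline coefficients, but the general idea of a sigmoid-like Catmull-Rom spline activation can be sketched as a piecewise cubic interpolant through a few samples of a saturating curve. The uniform knot range, the use of `tanh` for the sample values, and the function name below are illustrative assumptions, not the authors' construction:

```python
import numpy as np

# Assumed setup (not from the paper): 7 uniform knots on [-3, 3],
# with sigmoid-like sample values taken from tanh at those knots.
KNOT_X = np.linspace(-3.0, 3.0, 7)   # uniform knots, spacing h = 1
KNOT_Y = np.tanh(KNOT_X)             # monotone, saturating samples

def catmull_rom_activation(x):
    """Evaluate the uniform Catmull-Rom interpolant through the knots at scalar x."""
    # Saturate outside the knot range, mimicking a sigmoid's flat tails.
    if x <= KNOT_X[0]:
        return float(KNOT_Y[0])
    if x >= KNOT_X[-1]:
        return float(KNOT_Y[-1])
    h = KNOT_X[1] - KNOT_X[0]
    i = min(max(int((x - KNOT_X[0]) // h), 0), len(KNOT_X) - 2)  # segment index
    t = (x - KNOT_X[i]) / h                                      # local parameter in [0, 1]
    # Duplicate endpoint values for the two boundary segments.
    p0 = KNOT_Y[max(i - 1, 0)]
    p1, p2 = KNOT_Y[i], KNOT_Y[i + 1]
    p3 = KNOT_Y[min(i + 2, len(KNOT_Y) - 1)]
    # Standard uniform Catmull-Rom basis: each segment is a cubic polynomial,
    # so evaluation needs only additions and multiplications (no exponential),
    # which is the source of the claimed low hardware/arithmetic complexity.
    return float(0.5 * (2 * p1
                        + (p2 - p0) * t
                        + (2 * p0 - 5 * p1 + 4 * p2 - p3) * t * t
                        + (3 * p1 - p0 - 3 * p2 + p3) * t * t * t))
```

Because Catmull-Rom segments interpolate their control points and join with continuous first derivatives, the resulting curve is smooth enough to supply the error signal a gradient-based trainer needs, while avoiding the exponential of the standard sigmoid.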
Keywords :
convergence; gradient methods; learning (artificial intelligence); neural nets; splines (mathematics); transfer functions; Catmull-Rom spline activation function; arithmetic operations; convergence; gradient based learning; neural networks; sigmoidal function; weight values; Backpropagation algorithms; Brain; Costs; Hardware; Humans; Iris; Mathematics; Neurons; Sonar; Spline;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the International Joint Conference on Neural Networks (IJCNN '01), 2001
Conference_Location :
Washington, DC
ISSN :
1098-7576
Print_ISBN :
0-7803-7044-9
Type :
conf
DOI :
10.1109/IJCNN.2001.938814
Filename :
938814