DocumentCode
274189
Title
Training networks with discontinuous activation functions
Author
Findlay, D.A.
fYear
1989
fDate
16-18 Oct 1989
Firstpage
361
Lastpage
363
Abstract
This paper presents a learning algorithm that can be used to train networks whose neurons have discontinuous or nondifferentiable activation functions. The algorithm has been demonstrated with several different neuron activation functions. Although it shares several features with the error back-propagation algorithm, the heuristic derivation presented does not rely on the highly mathematical derivation of error back-propagation. In the process, the error back-propagation learning algorithm is shown to be at least reasonable. It could be argued that the algorithm derived here succeeds simply because of its similarity to error back-propagation. Alternatively, the success of error back-propagation, which does not seem to suffer from the problems normally associated with gradient-descent procedures, may be due to its similarity to the algorithm presented.
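The abstract does not spell out the update rule, so the following is only an illustrative sketch of the general idea of training through a discontinuous activation with a back-propagation-style heuristic. It assumes a step (Heaviside) activation and a surrogate derivative of 1 in the backward pass; the data, network shape, and surrogate choice are hypothetical and are not taken from the paper.

import numpy as np

# Sketch (not the paper's algorithm): train a tiny network whose hidden
# units use a discontinuous step activation, with a back-propagation-like
# heuristic that substitutes a surrogate derivative of 1 for the
# undefined derivative of the step function.

rng = np.random.default_rng(0)

def step(x):
    return (x > 0).astype(float)      # discontinuous activation

# XOR-style toy data (hypothetical example task)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(scale=1.0, size=(2, 4))
b1 = np.zeros(4)
W2 = rng.normal(scale=1.0, size=(4, 1))
b2 = np.zeros(1)
lr = 0.1

for epoch in range(2000):
    # forward pass
    z1 = X @ W1 + b1
    h = step(z1)                      # nondifferentiable hidden layer
    out = h @ W2 + b2                 # linear output layer
    err = out - y

    # backward pass with the heuristic surrogate: d step / dz taken as 1
    grad_out = err / len(X)
    grad_W2 = h.T @ grad_out
    grad_b2 = grad_out.sum(axis=0)
    grad_h = grad_out @ W2.T          # back-propagated error signal
    grad_z1 = grad_h * 1.0            # surrogate derivative replaces 0/undefined
    grad_W1 = X.T @ grad_z1
    grad_b1 = grad_z1.sum(axis=0)

    W2 -= lr * grad_W2; b2 -= lr * grad_b2
    W1 -= lr * grad_W1; b1 -= lr * grad_b1

print(np.round(out, 2))               # inspect the learned outputs for the four inputs

The point of the sketch is only that an error signal can still be propagated through a layer whose activation has no useful derivative, by substituting a heuristic in its place; the paper's own derivation and choice of heuristic may differ.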
fLanguage
English
Publisher
IET
Conference_Titel
First IEE International Conference on Artificial Neural Networks (Conf. Publ. No. 313), 1989
Conference_Location
London
Type
conf
Filename
51993