DocumentCode :
1583752
Title :
Modification of Backpropagation Algorithm and Its Application for Neural Networks with Threshold Activation Function
Author :
Ptitchkin, V.A.
Author_Institution :
Belarusian State Univ. of Inf. & Radioelectronics, Minsk
Volume :
1
fYear :
2007
Firstpage :
227
Lastpage :
231
Abstract :
A method is proposed for determining the gradient of the quadratic quality index of a multi-layer neural network (MLNN) in a single forward pass. In this formulation, the dependence of the gradient on the derivatives of the activation functions (AF) becomes explicit. Replacing these derivatives with linearization coefficients of the activation functions makes it possible to determine the linearization coefficients of the quadratic quality index and to use them to compute new values of the synaptic matrices in the supervised learning procedure. As a result, the backpropagation algorithm (BPA) can be extended to networks with nondifferentiable and even discontinuous activation functions. As an example, a simple algorithm is proposed for determining the linearization coefficients of a threshold-type activation function.
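The following is a minimal, hypothetical sketch (in Python with NumPy) of the general idea summarized above: during the backward pass, the derivative of a discontinuous threshold activation is replaced by a linearization coefficient, so that a backpropagation-style update of the synaptic matrices remains possible. The particular coefficient chosen here (a finite-difference slope over a small window around the threshold), the toy network, the data, and the learning rate are illustrative assumptions, not the formulas or the algorithm given in the paper.

# Minimal sketch, assuming a window-based linearization coefficient for the step function.
import numpy as np

def threshold(x):
    # Discontinuous threshold activation: 0 for x < 0, 1 otherwise.
    return (x >= 0).astype(float)

def linearization_coefficient(x, width=1.0):
    # Assumed surrogate slope: treat the step as if it rose linearly
    # over a window of the given width centred on the threshold.
    return (np.abs(x) < width / 2).astype(float) / width

rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float).reshape(-1, 1)   # toy target

W1 = rng.normal(scale=0.5, size=(2, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
lr = 0.1

for epoch in range(200):
    # Forward pass with the true (discontinuous) activation.
    z1 = X @ W1 + b1
    h = threshold(z1)
    out = h @ W2 + b2
    err = out - y                    # quadratic quality index: mean of err**2

    # Backward pass: wherever backpropagation would use the derivative of
    # threshold(), the linearization coefficient is used instead.
    g_out = 2 * err / len(X)
    g_h = g_out @ W2.T
    g_z1 = g_h * linearization_coefficient(z1)

    W2 -= lr * h.T @ g_out;  b2 -= lr * g_out.sum(axis=0)
    W1 -= lr * X.T @ g_z1;   b1 -= lr * g_z1.sum(axis=0)

print("final mean squared error:", float(np.mean(err ** 2)))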
Keywords :
backpropagation; multilayer perceptrons; transfer function matrices; backpropagation algorithm; linearization coefficients; multilayer neural network; quadratic quality index; supervisory learning procedure; synaptic matrices; threshold activation function; Backpropagation algorithms; Equations; Feedforward neural networks; Informatics; Information processing; Multi-layer neural network; Neural networks; Neurons; Proposals; Supervised learning;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Third International Conference on Natural Computation (ICNC 2007)
Conference_Location :
Haikou
Print_ISBN :
978-0-7695-2875-5
Type :
conf
DOI :
10.1109/ICNC.2007.480
Filename :
4344187