DocumentCode :
3190885
Title :
Generalization of the cross-entropy error function to improve the error backpropagation algorithm
Author :
Oh, Sang-Hoon
Author_Institution :
Mobile Protocol & Signaling Section, Electron. & Telecommun. Res. Inst., Daejeon, South Korea
Volume :
3
fYear :
1997
fDate :
9-12 Jun 1997
Firstpage :
1856
Abstract :
This paper generalizes the cross-entropy error function to improve the EBP (error back-propagation) algorithm for multilayer perceptrons. The generalized error function reduces the probability that output nodes are near the wrong extreme value, as well as the correct extreme value, of the sigmoid function. As a result, the learning speed of the EBP algorithm is accelerated with improved generalization performance. The effectiveness of the proposed method is demonstrated on a handwritten digit recognition task.
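For context, the following is a minimal sketch (not taken from the paper) of the standard cross-entropy error and the resulting EBP output-layer error signal for sigmoid output nodes, i.e., the baseline that the paper generalizes; the paper's generalized error function itself is not reproduced here. The function names and the NumPy-based setup are illustrative assumptions.

```python
import numpy as np

def sigmoid(x):
    # Logistic sigmoid activation used for the output nodes.
    return 1.0 / (1.0 + np.exp(-x))

def cross_entropy_error(y, t, eps=1e-12):
    # Standard cross-entropy error for sigmoid outputs y and targets t:
    #   E = -sum( t*ln(y) + (1 - t)*ln(1 - y) )
    y = np.clip(y, eps, 1.0 - eps)  # guard against log(0)
    return -np.sum(t * np.log(y) + (1.0 - t) * np.log(1.0 - y))

def output_delta_cross_entropy(y, t):
    # With sigmoid outputs, dE/d(net input) for the cross-entropy error
    # simplifies to (y - t); the sigmoid-derivative factor that slows
    # MSE-based EBP when outputs saturate cancels out.
    return y - t

if __name__ == "__main__":
    net = np.array([2.0, -1.0, 0.5])      # hypothetical output-node net inputs
    t = np.array([1.0, 0.0, 0.0])          # one-hot target for a digit class
    y = sigmoid(net)
    print(cross_entropy_error(y, t))        # scalar error
    print(output_delta_cross_entropy(y, t)) # error signal back-propagated to hidden layer
```

The paper's contribution modifies this error function so that the error signal also discourages output nodes from sitting near the wrong extreme of the sigmoid, which this baseline sketch does not implement.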
Keywords :
backpropagation; entropy; errors; generalisation (artificial intelligence); multilayer perceptrons; optical character recognition; EBP; cross-entropy error function; error backpropagation; generalization; handwritten digit recognition task; multilayer perceptrons; probability; sigmoid function; Acceleration; Backpropagation algorithms; Databases; Error correction; Handwriting recognition; Iterative algorithms; Mobile communication; Multilayer perceptrons; Pattern recognition; Protocols;
fLanguage :
English
Publisher :
ieee
Conference_Title :
International Conference on Neural Networks, 1997
Conference_Location :
Houston, TX
Print_ISBN :
0-7803-4122-8
Type :
conf
DOI :
10.1109/ICNN.1997.614181
Filename :
614181