DocumentCode :
2737406
Title :
Neural classifiers using loss functions
Author :
Hrycej, Tomas
Author_Institution :
Daimler-Benz AG, Ulm-Boefingen, Germany
fYear :
1991
fDate :
8-14 Jul 1991
Abstract :
Summary form only given. While the vast majority of backpropagation-based classifiers use mean squared error (MSE) as the error measure, it can be shown that MSE is inadequate for classification for three reasons: (1) its minimum differs from the minimum of the misclassification loss, (2) it cannot account for class-specific misclassification losses, and (3) it slows learning by imposing unnecessary constraints. By contrast, using the misclassification loss as the error measure overcomes all these problems, and its use with backpropagation is as easy as that of MSE. Computational experiments with a differentiable approximation of the misclassification loss have confirmed its superiority over MSE in terms of both convergence speed and misclassification rates.
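The abstract's key idea can be illustrated with a minimal sketch. The exact differentiable approximation used in the paper is not given in the abstract; the version below is an assumption: a steep sigmoid of the classification margin (correct-class output minus the best competing output), which approaches the 0/1 misclassification loss while remaining differentiable for backpropagation, contrasted with plain MSE.

```python
# Illustrative sketch only -- NOT the paper's exact formulation.
# The margin definition and the sigmoid steepness `beta` are assumptions.
import numpy as np

def mse_loss(outputs, target_onehot):
    # Mean squared error over all output units; penalizes every deviation
    # from the one-hot target, not just classification errors.
    return np.mean((outputs - target_onehot) ** 2)

def smooth_misclassification_loss(outputs, target_index, beta=10.0):
    # Margin: correct-class output minus the strongest competing output.
    # A steep sigmoid of the negative margin approximates the 0/1 loss
    # (near 0 when correctly classified, near 1 when not) yet stays
    # differentiable, so it can drive backpropagation directly.
    wrong = np.delete(outputs, target_index)
    margin = outputs[target_index] - np.max(wrong)
    return 1.0 / (1.0 + np.exp(beta * margin))

outputs = np.array([0.6, 0.3, 0.1])  # network outputs for 3 classes
target = np.array([1.0, 0.0, 0.0])   # class 0 is the correct class
print(mse_loss(outputs, target))                  # nonzero despite correct decision
print(smooth_misclassification_loss(outputs, 0))  # close to 0: correctly classified
```

Note how MSE stays well above zero even though the example is classified correctly, whereas the smooth misclassification loss is already near its minimum; this is one concrete reading of the abstract's point (1) about the two minima differing.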
Keywords :
learning systems; neural nets; pattern recognition; backpropagation-based classifiers; class-specific misclassification; convergence speed; differentiable approximation; error measure; learning; mean squared error; misclassification loss; Backpropagation; Computer networks; Convergence; Linearity; Loss measurement; Multi-layer neural network; Neural networks;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
IJCNN-91-Seattle: International Joint Conference on Neural Networks, 1991
Conference_Location :
Seattle, WA
Print_ISBN :
0-7803-0164-1
Type :
conf
DOI :
10.1109/IJCNN.1991.155547
Filename :
155547