DocumentCode :
3614100
Title :
Boosting in probabilistic neural networks
Author :
J. Grim;P. Pudil;P. Somol
Author_Institution :
Inst. of Inf. Theor. & Autom., Acad. of Sci. of the Czech Republic, Prague, Czech Republic
Volume :
2
fYear :
2002
Firstpage :
136
Abstract :
The basic idea of boosting is to increase pattern recognition accuracy by combining classifiers derived from differently weighted versions of the original training data. Practical experiments have verified that the resulting classification performance can be improved by increasing the weights of misclassified training samples. However, in statistical pattern recognition the weighted data may influence the form of the estimated conditional distributions, and therefore the theoretically achievable classification error could increase. We prove that, in the case of maximum-likelihood estimation, the weighting of discrete data vectors is asymptotically equivalent to multiplying the estimated discrete conditional distributions by a positive bounded function. Consequently, Bayesian decision-making is shown to be asymptotically invariant with respect to arbitrary weighting of the data, provided that (a) the weighting function is defined identically for all classes and (b) the prior probabilities are properly modified.
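To make the invariance claim concrete, here is a sketch of the asymptotic argument under assumed notation (not the paper's own derivation): $w$ denotes the class-independent weighting function, $P(x \mid c)$ the true class-conditional distribution of a discrete data vector $x$, and $p(c)$ the prior probability of class $c$. The weighted maximum-likelihood estimate over the $N_c$ training samples of class $c$ converges to a reweighted conditional distribution:

$$
\hat P_w(x \mid c) \;=\; \frac{\sum_{n:\,c_n=c} w(x_n)\,\mathbf{1}[x_n = x]}{\sum_{n:\,c_n=c} w(x_n)}
\;\xrightarrow{\;N_c \to \infty\;}\;
\frac{w(x)\,P(x \mid c)}{\sum_{x'} w(x')\,P(x' \mid c)},
$$

i.e., the true distribution multiplied by the positive bounded function $w$, up to a class-dependent normalizing constant. Choosing the modified priors

$$
p_w(c) \;\propto\; p(c)\,\sum_{x'} w(x')\,P(x' \mid c),
$$

the factor $w(x)$, being identical for all classes, cancels in the posterior:

$$
\frac{p_w(c)\,\hat P_w(x \mid c)}{\sum_{c'} p_w(c')\,\hat P_w(x \mid c')}
\;\longrightarrow\;
\frac{p(c)\,w(x)\,P(x \mid c)}{\sum_{c'} p(c')\,w(x)\,P(x \mid c')}
\;=\; P(c \mid x),
$$

so the asymptotic Bayes decision is unchanged by the weighting.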
Keywords :
"Boosting","Intelligent networks","Neural networks","Pattern recognition","Bayesian methods","Decision making","Sampling methods","Training data","Maximum likelihood estimation","Information theory"
Publisher :
ieee
Conference_Titel :
Proceedings of the 16th International Conference on Pattern Recognition, 2002
ISSN :
1051-4651
Print_ISBN :
0-7695-1695-X
Type :
conf
DOI :
10.1109/ICPR.2002.1048256
Filename :
1048256