Title :
Boosting in probabilistic neural networks
Author :
J. Grim; P. Pudil; P. Somol
Author_Institution :
Inst. of Inf. Theor. & Autom., Acad. of Sci. of the Czech Republic, Prague, Czech Republic
Date :
2002
Abstract :
The basic idea of boosting is to increase pattern recognition accuracy by combining classifiers derived from differently weighted versions of the original training data. Practical experiments have verified that the resulting classification performance can be improved by increasing the weights of misclassified training samples. However, in statistical pattern recognition, the weighted data may influence the form of the estimated conditional distributions, and therefore the theoretically achievable classification error could increase. We prove that, in the case of maximum-likelihood estimation, the weighting of discrete data vectors is asymptotically equivalent to multiplying the estimated discrete conditional distributions by a positive bounded function. Consequently, Bayesian decision-making is shown to be asymptotically invariant with respect to arbitrary weighting of data, provided that (a) the weighting function is defined identically for all classes and (b) the prior probabilities are properly modified.
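The invariance claim in the abstract can be illustrated numerically. The following is a hedged sketch, not code from the paper: the discrete class-conditional distributions `p1`, `p2`, the priors, and the weighting function `w` are all hypothetical toy choices. It shows that the weighted maximum-likelihood estimate of a discrete distribution converges to the true distribution multiplied by `w(x)` (up to normalization), and that Bayes decisions are unchanged when the same `w` is applied to all classes and the priors are rescaled by the class-specific normalizers.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy two-class problem over a discrete alphabet of K symbols (hypothetical).
K = 5
p1 = np.array([0.4, 0.3, 0.1, 0.1, 0.1])   # true P(x | class 1)
p2 = np.array([0.1, 0.1, 0.2, 0.3, 0.3])   # true P(x | class 2)
priors = np.array([0.6, 0.4])

# Positive bounded weighting function, identical for all classes (condition (a)).
w = np.array([1.0, 2.0, 0.5, 3.0, 1.5])

def weighted_ml(samples, weights, K):
    """Weighted ML estimate of a discrete distribution: weighted relative frequencies."""
    counts = np.bincount(samples, weights=weights[samples], minlength=K)
    return counts / counts.sum()

n = 200_000
x1 = rng.choice(K, size=n, p=p1)
x2 = rng.choice(K, size=n, p=p2)

q1 = weighted_ml(x1, w, K)   # asymptotically w(x) * p1(x) / c1
q2 = weighted_ml(x2, w, K)   # asymptotically w(x) * p2(x) / c2

# Asymptotic equivalence: q_i(x) is proportional to w(x) * p_i(x).
assert np.allclose(q1, w * p1 / (w @ p1), atol=0.01)
assert np.allclose(q2, w * p2 / (w @ p2), atol=0.01)

# Modified priors P'_i proportional to P_i * c_i, with c_i = sum_x w(x) p_i(x)
# (condition (b)): the common factor w(x) then cancels in the posterior.
c = np.array([w @ p1, w @ p2])
mod_priors = priors * c
mod_priors /= mod_priors.sum()

# Posteriors for every symbol x: original vs weighted-with-modified-priors.
post_orig = priors[:, None] * np.vstack([p1, p2])
post_orig /= post_orig.sum(axis=0)
post_w = mod_priors[:, None] * np.vstack([q1, q2])
post_w /= post_w.sum(axis=0)

# The Bayes decision is the same for every symbol x.
assert (post_orig.argmax(axis=0) == post_w.argmax(axis=0)).all()
```

Dropping either condition breaks the invariance: a class-dependent weighting function no longer cancels across the posterior normalization, and unmodified priors leave a class-dependent constant in the decision rule.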
Keywords :
"Boosting","Intelligent networks","Neural networks","Pattern recognition","Bayesian methods","Decision making","Sampling methods","Training data","Maximum likelihood estimation","Information theory"
Conference_Title :
Proceedings of the 16th International Conference on Pattern Recognition, 2002
Print_ISBN :
0-7695-1695-X
DOI :
10.1109/ICPR.2002.1048256