DocumentCode
744820
Title
Upper bounds for error rates of linear combinations of classifiers
Author
Murua, Alejandro
Author_Institution
Insightful Corp., Seattle, WA, USA
Volume
24
Issue
5
fYear
2002
fDate
5/1/2002
Firstpage
591
Lastpage
602
Abstract
A useful notion of weak dependence between many classifiers constructed with the same training data is introduced. It is shown that if this weak dependence is low and the expected margins are large, then decision rules based on linear combinations of these classifiers can achieve error rates that decrease exponentially fast. Empirical results with randomized trees and with trees constructed via boosting and bagging show that weak dependence is present in these types of trees. Furthermore, these results also suggest a trade-off between weak dependence and expected margins, in the sense that to compensate for low expected margins there should be low mutual dependence between the classifiers involved in the linear combination.
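A minimal sketch, not from the paper, illustrating the quantities the abstract discusses: the expected margin of an unweighted linear combination of trees and a crude pairwise-agreement proxy for dependence between the trees. The dataset, the bagged-tree ensemble, and all parameter values are illustrative assumptions chosen for the example.

```python
# Illustrative only: estimates expected margins and mean pairwise agreement
# (a rough stand-in for mutual dependence) for a bagged tree ensemble.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary classification data (assumed, not from the paper).
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)

ens = BaggingClassifier(DecisionTreeClassifier(max_depth=3),
                        n_estimators=50, random_state=0).fit(X_tr, y_tr)

# Per-tree predictions on held-out data, mapped to {-1, +1}.
votes = np.array([2 * t.predict(X_te) - 1 for t in ens.estimators_])
y_pm = 2 * y_te - 1

# Margin of the unweighted linear combination: average vote times true label.
margins = y_pm * votes.mean(axis=0)
print("expected margin:", margins.mean())

# Mean pairwise agreement between trees, one crude indicator of dependence.
agree = (votes[:, None, :] == votes[None, :, :]).mean(axis=2)
off_diag = agree[~np.eye(len(votes), dtype=bool)]
print("mean pairwise agreement:", off_diag.mean())

# Error of the combined rule: a misclassification corresponds to margin <= 0.
print("ensemble error:", (margins <= 0).mean())
```

Lowering the trees' mutual agreement (e.g., via randomization) while keeping the expected margin from shrinking too much mirrors the trade-off the abstract describes.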
Keywords
error statistics; pattern classification; trees (mathematics); bagging; boosting; classification trees; decision rules; error rates; expected margins; exponential bounds; linear classifier combinations; machine learning; mutual dependence; randomized trees; training data; upper bounds; weak dependence; Error analysis;
fLanguage
English
Journal_Title
IEEE Transactions on Pattern Analysis and Machine Intelligence
Publisher
IEEE
ISSN
0162-8828
Type
jour
DOI
10.1109/34.1000235
Filename
1000235
Link To Document