DocumentCode :
2096426
Title :
The Compound Effect of Boosting and Stratified Sampling on Decision Tree Accuracy
Author :
Gill, A.A. ; Smith, George D.
Author_Institution :
Dept. of Comput. Sci., SZABIST, Islamabad
fYear :
2006
fDate :
13-14 Nov. 2006
Firstpage :
587
Lastpage :
592
Abstract :
It is generally recognized that recursive partitioning, as used in the construction of classification trees, is inherently unstable, particularly for small data sets. Classification accuracy and, by implication, tree structure are sensitive to changes in the training data. Successful approaches to counteract this effect include multiple-classifier methods, e.g. boosting, bagging or windowing. The downside of these multiple classification models, however, is the plethora of trees that result, often making it difficult to extract the classifier in a meaningful manner. We show that, by using some very weak knowledge at the sampling stage, when the data set is partitioned into training and test sets, a single decision tree classifier achieves more consistent and improved performance. The reductions in error rate are comparable with those attained using boosting. In addition, we demonstrate that combining such sampling with boosting yields significant reductions in error rates.
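Illustrative sketch (not the authors' code): the comparison described in the abstract can be approximated with scikit-learn by contrasting a plain random train/test split with a class-stratified split, for both a single decision tree and a boosted tree ensemble. The dataset, test fraction and hyperparameters below are assumptions for illustration only.

    # Sketch: stratified vs. random partitioning, single tree vs. boosting.
    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.ensemble import AdaBoostClassifier
    from sklearn.metrics import accuracy_score

    X, y = load_breast_cancer(return_X_y=True)  # stand-in data set

    for strat in (None, y):  # None = random split, y = class-stratified split
        X_tr, X_te, y_tr, y_te = train_test_split(
            X, y, test_size=0.3, random_state=0, stratify=strat)

        # Single decision tree grown by recursive partitioning.
        tree = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)

        # Boosted ensemble of trees (AdaBoost with default tree stumps).
        boosted = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X_tr, y_tr)

        label = "stratified" if strat is not None else "random"
        print(label,
              "single tree:", accuracy_score(y_te, tree.predict(X_te)),
              "boosted:", accuracy_score(y_te, boosted.predict(X_te)))

Note that the paper's "weak knowledge" at the sampling stage is stratification on the class attribute; repeating the loop over several random seeds would show the variance-reduction effect the abstract refers to.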
Keywords :
data mining; decision trees; learning (artificial intelligence); pattern classification; boosting; classification accuracy; data mining; decision tree accuracy; recursive partitioning; stratified sampling; Bagging; Boosting; Classification tree analysis; Data mining; Decision trees; Error analysis; Sampling methods; Testing; Training data; Tree data structures;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2006 International Conference on Emerging Technologies (ICET '06)
Conference_Location :
Peshawar
Print_ISBN :
1-4244-0502-5
Electronic_ISBN :
1-4244-0503-3
Type :
conf
DOI :
10.1109/ICET.2006.335905
Filename :
4136873