DocumentCode :
3090450
Title :
Incorporating Bagging into Boosting
Author :
Jain, Kunal ; Kulkarni, Santosh
Author_Institution :
Dept. of Comput. Sci., Univ. of Mumbai, Mumbai, India
fYear :
2012
fDate :
4-7 Dec. 2012
Firstpage :
443
Lastpage :
448
Abstract :
In classification learning, group (ensemble) learning approaches generally improve predictive accuracy. A group is formed by repeatedly applying a single base learning algorithm, and its members produce the final classification by voting. Boosting and Bagging are two popular methods of this kind, and both decrease the error rate of decision tree learning. Boosting is typically more accurate than Bagging, but the former is also more variable than the latter. In this paper, our aim is to review the state of the art in group learning techniques in the framework of imbalanced data sets. We propose a new group learning algorithm called Incorporating Bagging into Boosting (IB), which creates a number of subgroups by incorporating Bagging into Boosting. Experimental results on natural domains show that, on average, IB is more stable than either Bagging or Boosting. These characteristics make IB a good choice among group learning techniques.
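The abstract does not specify how the subgroups are built, so the following is a minimal sketch of one plausible reading: an AdaBoost-style outer loop in which each round's weak hypothesis is itself a small bagged subgroup of decision trees, trained on bootstrap samples drawn according to the current boosting weights. The function names (`ib_fit`, `ib_predict`), the subgroup size, and the use of depth-1 trees are illustrative assumptions, not the authors' method.

```python
# Hypothetical sketch of "Bagging inside Boosting" (IB); the exact
# subgroup construction in the paper is not given in this abstract.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

def ib_fit(X, y, n_rounds=10, bag_size=5, seed=0):
    """Boosting outer loop; each round trains a bagged subgroup of trees.
    Labels y are assumed to be in {-1, +1}."""
    rng = np.random.default_rng(seed)
    n = len(y)
    w = np.full(n, 1.0 / n)            # boosting weights over examples
    subgroups, alphas = [], []
    for _ in range(n_rounds):
        # Bagging step: bootstrap-sample according to the current
        # boosting weights and train one tree per sample.
        trees = []
        for _ in range(bag_size):
            idx = rng.choice(n, size=n, replace=True, p=w)
            trees.append(DecisionTreeClassifier(max_depth=1).fit(X[idx], y[idx]))
        # The subgroup's hypothesis is the majority vote of its trees
        # (bag_size is odd, so the vote cannot tie).
        pred = np.sign(np.mean([t.predict(X) for t in trees], axis=0))
        err = np.clip(np.sum(w * (pred != y)), 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)   # standard AdaBoost update
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        subgroups.append(trees)
        alphas.append(alpha)
    return subgroups, alphas

def ib_predict(X, subgroups, alphas):
    """Final classification: alpha-weighted vote over all subgroups."""
    score = sum(a * np.sign(np.mean([t.predict(X) for t in trees], axis=0))
                for trees, a in zip(subgroups, alphas))
    return np.sign(score)
```

Under this reading, the bagged vote inside each round smooths the weak hypothesis, which is one way the variance of plain Boosting could be reduced, consistent with the stability claim in the abstract.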
Keywords :
data handling; decision trees; learning (artificial intelligence); pattern classification; IB; classifier group learning approach; decision tree learning; group learning techniques; imbalanced data sets; incorporating bagging into boosting; Bagging; Boosting; Classification algorithms; Error analysis; Training; classification group learning algorithm; random forest;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Hybrid Intelligent Systems (HIS), 2012 12th International Conference on
Conference_Location :
Pune
Print_ISBN :
978-1-4673-5114-0
Type :
conf
DOI :
10.1109/HIS.2012.6421375
Filename :
6421375