DocumentCode
1675141
Title
Improved Boosting algorithm with adaptive filtration
Author
Gao, Yunlong ; Gao, Feng ; Guan, Xiaohong
Author_Institution
State Key Lab. for Manuf. Syst. Eng., Xi'an Jiaotong Univ., Xi'an, China
fYear
2010
Firstpage
3173
Lastpage
3178
Abstract
AdaBoost is known as an effective method for improving the performance of base classifiers, both theoretically and empirically. However, previous studies have shown that AdaBoost is prone to overfitting, especially in noisy cases. In addition, most current work on Boosting assumes that the loss function is fixed and therefore does not distinguish between the noisy and noise-free cases. In this paper, an improved Boosting algorithm with adaptive filtration is proposed. First, a filtering algorithm based on the Hoeffding inequality is designed to identify mislabeled or atypical samples. By introducing this filtering algorithm, we modify the loss function so that the influence of mislabeled or atypical samples is penalized. Experiments performed on eight UCI data sets show that the new Boosting algorithm almost always achieves considerably better classification accuracy than AdaBoost. Furthermore, experiments on data with artificially controlled noise indicate that the new Boosting algorithm is more robust to noise than AdaBoost.
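This record does not include the paper's formulas, so the sketch below only illustrates the general idea named in the abstract: an AdaBoost-style loop that uses a Hoeffding deviation bound to flag samples that are misclassified suspiciously often, and then penalizes their influence. All names, the threshold, and the down-weighting rule (halving flagged weights) are assumptions for illustration, not the authors' algorithm; labels are assumed to be in {-1, +1}.

import numpy as np
from sklearn.tree import DecisionTreeClassifier

def adaboost_with_hoeffding_filter(X, y, n_rounds=50, delta=0.05):
    # Illustrative sketch only; not the algorithm proposed in the paper.
    n = len(y)
    w = np.full(n, 1.0 / n)      # AdaBoost sample weights
    miss_counts = np.zeros(n)    # rounds on which each sample was misclassified
    learners, alphas = [], []

    for t in range(1, n_rounds + 1):
        stump = DecisionTreeClassifier(max_depth=1)
        stump.fit(X, y, sample_weight=w)
        miss = stump.predict(X) != y
        err = np.clip(np.dot(w, miss), 1e-10, 1.0 - 1e-10)
        alpha = 0.5 * np.log((1.0 - err) / err)
        learners.append(stump)
        alphas.append(alpha)

        # Hoeffding bound: with probability >= 1 - delta, the empirical
        # misclassification frequency deviates from its mean by at most eps.
        miss_counts += miss
        eps = np.sqrt(np.log(2.0 / delta) / (2.0 * t))
        suspects = (miss_counts / t) > 0.5 + eps  # persistently misclassified

        # Standard exponential reweighting, then penalize flagged samples
        # (halving their weight is an assumed rule, not the paper's).
        w *= np.exp(alpha * miss)
        w[suspects] *= 0.5
        w /= w.sum()

    def predict(X_new):
        votes = sum(a * h.predict(X_new) for a, h in zip(alphas, learners))
        return np.sign(votes)

    return predict

Usage would follow the usual pattern, e.g. predict = adaboost_with_hoeffding_filter(X_train, y_train) followed by y_hat = predict(X_test); the Hoeffding-based threshold 0.5 + eps is one plausible way to separate persistently misclassified (likely noisy) samples from ordinary hard ones.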
Keywords
adaptive filters; learning (artificial intelligence); signal classification; AdaBoost; Hoeffding inequality; UCI data sets; adaptive filtration; classification accuracy; filtering algorithm; improved boosting algorithm; loss function; overfitting; Algorithm design and analysis; Boosting; Classification algorithms; Filtering algorithms; Filtration; Noise measurement; Training; AdaBoost; Filter; overfitting; variable loss function;
fLanguage
English
Publisher
ieee
Conference_Titel
Intelligent Control and Automation (WCICA), 2010 8th World Congress on
Conference_Location
Jinan
Print_ISBN
978-1-4244-6712-9
Type
conf
DOI
10.1109/WCICA.2010.5553968
Filename
5553968