DocumentCode :
2511843
Title :
Multi-class AdaBoost with Hypothesis Margin
Author :
Jin, Xiaobo ; Hou, Xinwen ; Liu, Cheng-Lin
Author_Institution :
Nat. Lab. of Pattern Recognition, Chinese Acad. of Sci., Beijing, China
fYear :
2010
fDate :
23-26 Aug. 2010
Firstpage :
65
Lastpage :
68
Abstract :
Most AdaBoost algorithms for multi-class problems, such as AdaBoost.MH and LogitBoost, must decompose the multi-class classification task into multiple binary problems. This paper proposes a new multi-class AdaBoost algorithm based on the hypothesis margin, called AdaBoost.HM, which directly combines multi-class weak classifiers. The hypothesis margin maximizes the output for the positive class while minimizing the maximal output over the negative classes. We discuss the upper bound on the training error of AdaBoost.HM and of a previous multi-class learning algorithm, AdaBoost.M1. Our experiments, using feedforward neural networks as weak learners, show that the proposed AdaBoost.HM yields higher classification accuracy than AdaBoost.M1 and AdaBoost.MH while remaining computationally efficient in training.
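The hypothesis-margin quantity described in the abstract (the score of the true class minus the maximal score among the competing classes) can be sketched as follows. This is a minimal illustration, not the paper's exact algorithm; the `reweight` step and its `alpha` parameter are a generic AdaBoost-style update assumed for demonstration.

```python
import numpy as np

def hypothesis_margin(outputs, y):
    """Hypothesis margin of multi-class scores: the output for the
    positive (true) class minus the maximal output over the negative
    classes. A positive margin means a correct, confident prediction.

    outputs: (n_samples, n_classes) array of weak-classifier scores
    y:       (n_samples,) array of true class indices
    """
    n = outputs.shape[0]
    pos = outputs[np.arange(n), y]        # score of the true class
    masked = outputs.copy()
    masked[np.arange(n), y] = -np.inf     # exclude the true class
    neg_max = masked.max(axis=1)          # best competing-class score
    return pos - neg_max

def reweight(weights, margins, alpha):
    """Generic AdaBoost-style reweighting (illustrative, not the paper's
    exact update): shrink weights of examples with large positive
    margins, grow weights of examples with negative margins."""
    w = weights * np.exp(-alpha * margins)
    return w / w.sum()
```

For example, with scores `[[0.9, 0.1, 0.0], [0.2, 0.5, 0.3]]` and true classes `[0, 2]`, the margins are `0.8` (correct by a wide gap) and `-0.2` (the true class is outscored by class 1), so reweighting shifts mass toward the second example.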
Keywords :
Hamming codes; binary sequences; feedforward neural nets; heuristic programming; learning (artificial intelligence); pattern classification; AdaBoost.HM; feedforward neural networks; hypothesis margin; multiclass AdaBoost; multiclass classification; multiclass learning algorithm; multiclass weak classifiers; multiple binary problems; training error; upper bound; Accuracy; Additives; Artificial neural networks; Boosting; Error analysis; Training; Upper bound;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Pattern Recognition (ICPR), 2010 20th International Conference on
Conference_Location :
Istanbul
ISSN :
1051-4651
Print_ISBN :
978-1-4244-7542-1
Type :
conf
DOI :
10.1109/ICPR.2010.25
Filename :
5597629