Title :
Multi-class AdaBoost with Hypothesis Margin
Author :
Jin, Xiaobo ; Hou, Xinwen ; Liu, Cheng-Lin
Author_Institution :
Nat. Lab. of Pattern Recognition, Chinese Acad. of Sci., Beijing, China
Abstract :
Most AdaBoost algorithms for multi-class problems decompose the multi-class classification task into multiple binary problems, as in AdaBoost.MH and LogitBoost. This paper proposes a new multi-class AdaBoost algorithm based on the hypothesis margin, called AdaBoost.HM, which directly combines multi-class weak classifiers. The hypothesis margin maximizes the output on the positive class while minimizing the maximal output over the negative classes. We discuss the upper bounds of the training error for AdaBoost.HM and for a previous multi-class learning algorithm, AdaBoost.M1. Our experiments, using feedforward neural networks as weak learners, show that the proposed AdaBoost.HM yields higher classification accuracies than AdaBoost.M1 and AdaBoost.MH while remaining computationally efficient in training.
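The hypothesis-margin quantity described above can be illustrated with a minimal sketch: for a score vector f(x) and true class y, the margin is the positive-class output minus the largest negative-class output, and multi-class weak classifiers are combined by weighted voting. The weight update of AdaBoost.HM itself is not given in the abstract, so the function names and the toy weak learners below are illustrative assumptions only.

import numpy as np

def hypothesis_margin(scores, y):
    """scores: (n_classes,) outputs of a classifier; y: true class index."""
    negatives = np.delete(scores, y)      # outputs on the negative classes
    return scores[y] - negatives.max()    # positive-class output minus best negative output

def ensemble_scores(x, weak_learners, alphas):
    """Weighted sum of multi-class weak-classifier outputs (AdaBoost-style voting)."""
    return sum(a * h(x) for h, a in zip(weak_learners, alphas))

# Usage: two toy 3-class weak learners combined with weights 0.7 and 0.3.
h1 = lambda x: np.array([0.2, 0.5, 0.3])
h2 = lambda x: np.array([0.1, 0.6, 0.3])
f = ensemble_scores(None, [h1, h2], [0.7, 0.3])
print(hypothesis_margin(f, y=1))          # margin of the true class 1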
Keywords :
Hamming codes; binary sequences; feedforward neural nets; heuristic programming; learning (artificial intelligence); pattern classification; AdaBoost.HM; feedforward neural networks; hypothesis margin; multiclass AdaBoost; multiclass classification; multiclass learning algorithm; multiclass weak classifiers; multiple binary problems; training error; upper bound; Accuracy; Additives; Artificial neural networks; Boosting; Error analysis; Training; Upper bound;
Conference_Title :
2010 20th International Conference on Pattern Recognition (ICPR)
Conference_Location :
Istanbul
Print_ISBN :
978-1-4244-7542-1
DOI :
10.1109/ICPR.2010.25