DocumentCode :
1442862
Title :
On the Dual Formulation of Boosting Algorithms
Author :
Shen, Chunhua ; Li, Hanxi
Author_Institution :
Canberra Res. Lab., NICTA, Canberra, ACT, Australia
Volume :
32
Issue :
12
fYear :
2010
Firstpage :
2216
Lastpage :
2231
Abstract :
We study boosting algorithms from a new perspective. We show that the Lagrange dual problems of ℓ1-norm-regularized AdaBoost, LogitBoost, and soft-margin LPBoost with generalized hinge loss are all entropy maximization problems. By examining the dual problems of these boosting algorithms, we show that their success can be understood in terms of maintaining a better margin distribution: margins are maximized while the margin variance is simultaneously controlled. We also theoretically prove that, approximately, ℓ1-norm-regularized AdaBoost maximizes the average margin rather than the minimum margin. The duality formulation further enables us to develop column-generation-based optimization algorithms, which are totally corrective. We show that they achieve classification results almost identical to those of standard stagewise additive boosting algorithms, but with much faster convergence rates. Therefore, fewer weak classifiers are needed to build the ensemble using our proposed optimization technique.
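The totally corrective, column-generation scheme described in the abstract can be sketched roughly as follows. This is a hedged illustration, not the paper's algorithm: it uses decision stumps as the weak-learner pool, treats picking the stump with the largest edge as the column-generation step, and re-optimizes all ensemble weights with projected gradient descent on the logistic loss (a simple stand-in for the paper's entropy-regularized dual solver); all function names are invented for this sketch.

```python
import numpy as np

def stump_predict(X, feat, thresh, sign):
    # Decision stump: sign * (+1 if x[feat] > thresh else -1).
    return sign * np.where(X[:, feat] > thresh, 1.0, -1.0)

def best_stump(X, y, u):
    # "Column generation": return the stump maximizing the edge
    # sum_i u_i * y_i * h(x_i) under the current sample weights u.
    best, best_edge = None, -np.inf
    for feat in range(X.shape[1]):
        for thresh in np.unique(X[:, feat]):
            for sign in (1.0, -1.0):
                edge = np.sum(u * y * stump_predict(X, feat, thresh, sign))
                if edge > best_edge:
                    best, best_edge = (feat, thresh, sign), edge
    return best

def totally_corrective_boost(X, y, n_rounds=5, inner_steps=200, lr=0.5):
    # Hypothetical sketch of totally corrective boosting: after adding each
    # weak learner, ALL ensemble weights are re-optimized (here by projected
    # gradient on the logistic loss, keeping w >= 0 as in l1-regularized
    # boosting), unlike stagewise boosting which freezes past weights.
    n = len(y)
    stumps, H, w = [], np.empty((n, 0)), np.zeros(0)
    for _ in range(n_rounds):
        margins = (H @ w) * y
        u = 1.0 / (1.0 + np.exp(margins))   # sigma(-margin): LogitBoost-style weights
        u /= u.sum()
        stump = best_stump(X, y, u)
        stumps.append(stump)
        H = np.column_stack([H, stump_predict(X, *stump)])
        w = np.append(w, 0.0)
        for _ in range(inner_steps):        # totally corrective update
            s = 1.0 / (1.0 + np.exp((H @ w) * y))
            g = -(H * (y * s)[:, None]).mean(axis=0)
            w = np.maximum(w - lr * g, 0.0)
    return stumps, w

# Toy linearly separable data.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1.0, -1.0)
stumps, w = totally_corrective_boost(X, y)
H = np.column_stack([stump_predict(X, *s) for s in stumps])
acc = np.mean(np.where(H @ w >= 0, 1.0, -1.0) == y)
```

Because every weight is re-fit after each new column, each round makes the largest possible progress on the regularized loss, which is why far fewer weak learners are typically needed than with stagewise updates.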
Keywords :
convergence; learning (artificial intelligence); linear programming; Lagrange duality; entropy maximization; AdaBoost; LogitBoost; LPBoost; soft-margin LPBoost; ℓ1-norm-regularized AdaBoost; boosting algorithms; column-generation-based optimization; convergence rate; duality formulation; generalized hinge loss; margin distribution; margin variance
fLanguage :
English
Journal_Title :
IEEE Transactions on Pattern Analysis and Machine Intelligence
Publisher :
IEEE
ISSN :
0162-8828
Type :
jour
DOI :
10.1109/TPAMI.2010.47
Filename :
5432192