DocumentCode :
1416333
Title :
Boosting Through Optimization of Margin Distributions
Author :
Shen, Chunhua ; Li, Hanxi
Author_Institution :
Canberra Res. Lab., NICTA, Canberra, ACT, Australia
Volume :
21
Issue :
4
fYear :
2010
fDate :
1 April 2010
Firstpage :
659
Lastpage :
666
Abstract :
Boosting has attracted great interest in the machine learning community because of its impressive performance on classification and regression problems. The success of boosting algorithms may be interpreted in terms of the margin theory. Recently, it has been shown that bounds on the generalization error of classifiers can be obtained by explicitly taking the margin distribution of the training data into account. Most current boosting algorithms in practice optimize a convex loss function and do not make use of the margin distribution. In this brief, we design a new boosting algorithm, termed margin-distribution boosting (MDBoost), which directly maximizes the average margin and minimizes the margin variance at the same time, thereby optimizing the margin distribution. A totally corrective optimization algorithm based on column generation is proposed to implement MDBoost. Experiments on various data sets show that MDBoost outperforms AdaBoost and LPBoost in most cases.
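The following is a minimal sketch of the margin-distribution idea described in the abstract: choose nonnegative weak-learner weights that maximize the average margin while penalizing the margin variance. The weak-learner pool (decision stumps), the trade-off parameter lambda_, and the simplex constraint on the weights are illustrative assumptions; this is not the authors' exact MDBoost formulation and it omits the totally corrective column-generation solver.

# Illustrative sketch only; names and the exact objective/constraints are assumptions.
import numpy as np
from scipy.optimize import minimize

def stump_predictions(X):
    """Build a small pool of decision stumps (one threshold per feature median)."""
    preds = []
    for j in range(X.shape[1]):
        thr = np.median(X[:, j])
        preds.append(np.where(X[:, j] > thr, 1.0, -1.0))
        preds.append(np.where(X[:, j] <= thr, 1.0, -1.0))
    return np.array(preds)  # shape: (num_stumps, num_samples)

def fit_margin_distribution_weights(H, y, lambda_=1.0):
    """Weights w >= 0 with sum(w) = 1 maximizing mean margin - (lambda/2) * margin variance."""
    n_stumps = H.shape[0]

    def neg_objective(w):
        rho = y * (w @ H)  # margins rho_i = y_i * sum_j w_j h_j(x_i)
        return -(rho.mean() - 0.5 * lambda_ * rho.var())

    w0 = np.full(n_stumps, 1.0 / n_stumps)
    res = minimize(
        neg_objective, w0, method="SLSQP",
        bounds=[(0.0, None)] * n_stumps,
        constraints=[{"type": "eq", "fun": lambda w: w.sum() - 1.0}],
    )
    return res.x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 5))
    y = np.sign(X[:, 0] + 0.5 * X[:, 1] + 0.1 * rng.normal(size=200))
    H = stump_predictions(X)
    w = fit_margin_distribution_weights(H, y, lambda_=1.0)
    rho = y * (w @ H)
    print(f"mean margin = {rho.mean():.3f}, margin variance = {rho.var():.3f}")

In this sketch the full stump pool is fixed up front and the weights are re-solved jointly, which mimics the "totally corrective" character of the approach; the column-generation version would instead add one weak learner per iteration and re-solve the restricted problem.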
Keywords :
learning (artificial intelligence); boosting algorithm; classification problems; convex loss function; generalization error; machine learning; margin distributions; margin theory; margin-distribution boosting; optimization; regression problems; training data; AdaBoost; boosting; column generation; margin distribution; Algorithms; Automatic Data Processing; Computer Simulation; Generalization (Psychology); Humans; Neural Networks (Computer);
fLanguage :
English
Journal_Title :
Neural Networks, IEEE Transactions on
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/TNN.2010.2040484
Filename :
5411921