DocumentCode :
2948379
Title :
Convergence and Consistency of Recursive Boosting
Author :
Lozano, Aurélie C. ; Kulkarni, Sanjeev R.
Author_Institution :
Dept. of Electr. Eng., Princeton Univ., NJ
fYear :
2006
fDate :
9-14 July 2006
Firstpage :
2185
Lastpage :
2189
Abstract :
We study the convergence and consistency of boosting algorithms for classification. The standard method, as the sample size increases, say from m to m+1, is to re-initialize the boosting algorithm with an arbitrary prediction rule. In contrast to this "batch" approach, we propose a boosting procedure that is recursive in the sense that, for sample size m+1, the algorithm is restarted from the composite classifier obtained for sample size m at a specific point, the linking point. We adopt the regularization technique of early stopping, which consists of stopping the procedure based on the 1-norm of the composite classifier. We prove that such recursive boosting methods achieve consistency provided certain criteria on the stopping and linking points are met. We show that these conditions can be satisfied for widely used loss functions.
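The following is a minimal sketch, not the authors' procedure, of the idea described above: boosting rounds are stopped once the 1-norm of the composite classifier's coefficients reaches a sample-size-dependent cap, and when a new sample arrives the run is warm-started from a truncated version of the previous composite classifier rather than from scratch. The stump learner, the `stopping_norm` rule, and the `linking_fraction` truncation are illustrative assumptions, not details from the paper.

```python
import numpy as np


def best_stump(X, y, w):
    """Weighted-error-minimizing decision stump h(x) = s * sign(x[:, j] - t)."""
    best = (np.inf, 0, 0.0, 1)                      # (error, feature, threshold, sign)
    for j in range(X.shape[1]):
        for t in np.unique(X[:, j]):
            for s in (1, -1):
                pred = s * np.where(X[:, j] > t, 1, -1)
                err = np.sum(w * (pred != y))
                if err < best[0]:
                    best = (err, j, t, s)
    return best


def stump_predict(X, j, t, s):
    return s * np.where(X[:, j] > t, 1, -1)


def boost(X, y, ensemble, stopping_norm, max_rounds=200):
    """AdaBoost-style rounds, warm-started from `ensemble` (list of (alpha, j, t, s)),
    stopped early once the 1-norm of the coefficients reaches `stopping_norm`."""
    F = np.zeros(len(y))
    for alpha, j, t, s in ensemble:                 # score of the existing composite classifier
        F += alpha * stump_predict(X, j, t, s)
    for _ in range(max_rounds):
        if sum(abs(a) for a, *_ in ensemble) >= stopping_norm:
            break                                   # early stopping on the 1-norm
        w = np.exp(-y * F)
        w /= w.sum()
        err, j, t, s = best_stump(X, y, w)
        err = np.clip(err, 1e-12, 1 - 1e-12)
        alpha = 0.5 * np.log((1 - err) / err)
        ensemble.append((alpha, j, t, s))
        F += alpha * stump_predict(X, j, t, s)
    return ensemble


def recursive_boosting(stream, linking_fraction=0.5):
    """Process samples one at a time; for sample size m+1, restart boosting from a
    prefix of the previous composite classifier (a crude stand-in for the paper's
    linking point) instead of re-initializing."""
    X, y, ensemble = None, None, []
    for x_new, y_new in stream:
        X = np.vstack([X, x_new]) if X is not None else np.atleast_2d(x_new)
        y = np.append(y, y_new) if y is not None else np.array([y_new])
        ensemble = ensemble[: int(linking_fraction * len(ensemble))]   # linking step
        stopping_norm = np.sqrt(len(y))             # illustrative sample-size-dependent cap
        ensemble = boost(X, y, ensemble, stopping_norm)
    return ensemble


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = []
    for _ in range(30):
        x = rng.normal(size=2)
        data.append((x, 1 if x[0] + x[1] > 0 else -1))
    model = recursive_boosting(data)
    print(f"{len(model)} stumps, 1-norm = {sum(abs(a) for a, *_ in model):.2f}")
```

In this sketch the warm start saves the boosting rounds accumulated before the linking point, while the growing stopping norm plays the role of the paper's early-stopping regularization.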
Keywords :
pattern classification; batch approach; boosting algorithms; composite classifier; recursive boosting; regularization technique; Additives; Boosting; Classification algorithms; Convergence of numerical methods; Extraterrestrial measurements; Hafnium; Iterative algorithms; Joining processes; Pattern recognition; Predictive models;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2006 IEEE International Symposium on Information Theory
Conference_Location :
Seattle, WA
Print_ISBN :
1-4244-0505-X
Electronic_ISBN :
1-4244-0504-1
Type :
conf
DOI :
10.1109/ISIT.2006.261938
Filename :
4036357