DocumentCode :
3517325
Title :
Sparse boosting
Author :
Xiang, Zhen James ; Ramadge, Peter J.
Author_Institution :
Dept. of Electr. Eng., Princeton Univ., Princeton, NJ
fYear :
2009
fDate :
19-24 April 2009
Firstpage :
1625
Lastpage :
1628
Abstract :
We propose a boosting algorithm that seeks to minimize the AdaBoost exponential loss of a composite classifier using only a sparse set of base classifiers. The proposed algorithm is computationally efficient and, in test examples, produces composite classifiers that are sparser than and generalize as well as those produced by AdaBoost. The algorithm can be viewed as a coordinate descent method for the l1-regularized AdaBoost exponential loss function.
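As a rough illustration of the coordinate-descent view stated in the abstract, the following Python sketch performs cyclic coordinate descent on an l1-regularized exponential loss over a fixed pool of base classifiers. This is not the authors' implementation: the data layout (a matrix H of base-classifier outputs), the bounded 1-D search via scipy, and the names H, y, lam, and sparse_boost_cd are assumptions made for illustration only.

# Minimal sketch (assumed setup, not the paper's exact algorithm): cyclic
# coordinate descent on
#   L(alpha) = sum_i exp(-y_i * sum_j alpha_j h_j(x_i)) + lam * sum_j |alpha_j|
import numpy as np
from scipy.optimize import minimize_scalar

def sparse_boost_cd(H, y, lam=0.1, n_sweeps=20):
    """H: (n_samples, n_base) base-classifier outputs in {-1, +1};
    y: (n_samples,) labels in {-1, +1}; returns a sparse weight vector alpha."""
    n, m = H.shape
    alpha = np.zeros(m)
    margins = np.zeros(n)            # y_i * sum_j alpha_j * h_j(x_i)
    for _ in range(n_sweeps):
        for j in range(m):
            hj = y * H[:, j]         # coordinate j's contribution to the margins
            rest = margins - alpha[j] * hj

            def obj(a):              # 1-D slice of the regularized loss in alpha_j
                return np.exp(-(rest + a * hj)).sum() + lam * abs(a)

            res = minimize_scalar(obj, bounds=(-10.0, 10.0), method="bounded")
            # keep the coordinate at zero unless moving it actually lowers the loss
            alpha[j] = res.x if res.fun < obj(0.0) else 0.0
            margins = rest + alpha[j] * hj
    return alpha

The l1 penalty is what keeps most coordinates exactly at zero, so the resulting combination uses only a sparse subset of the base classifiers, mirroring the behavior the abstract describes.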
Keywords :
learning (artificial intelligence); minimisation; pattern classification; AdaBoost exponential loss minimization; composite classifier; sparse base classifier set; sparse boosting algorithm; Boosting; Classification algorithms; Compressed sensing; Iterative algorithms; Optimization methods; Pattern classification; Signal processing algorithms; Signal representations; Testing; Training data; Algorithms; Optimization methods; Pattern classification; Signal representations;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Acoustics, Speech and Signal Processing, 2009. ICASSP 2009. IEEE International Conference on
Conference_Location :
Taipei
ISSN :
1520-6149
Print_ISBN :
978-1-4244-2353-8
Electronic_ISBN :
1520-6149
Type :
conf
DOI :
10.1109/ICASSP.2009.4959911
Filename :
4959911