Author :
Xiang, Zhen James ; Ramadge, Peter J.
Author_Institution :
Dept. of Electr. Eng., Princeton Univ., Princeton, NJ
Abstract :
We propose a boosting algorithm that seeks to minimize the AdaBoost exponential loss of a composite classifier using only a sparse set of base classifiers. The proposed algorithm is computationally efficient and, in test examples, produces composite classifiers that are sparser than and generalize as well as those produced by AdaBoost. The algorithm can be viewed as a coordinate descent method for the l1-regularized AdaBoost exponential loss function.
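For concreteness, the l1-regularized AdaBoost exponential loss referenced in the abstract has the standard form sketched below; the notation (training pairs (x_i, y_i) with y_i in {-1, +1}, base classifiers h_t, weight vector alpha, and regularization parameter lambda) is assumed for illustration and may differ from the paper's own:

L(\alpha) = \sum_{i=1}^{n} \exp\!\Big( -y_i \sum_{t=1}^{T} \alpha_t h_t(x_i) \Big) + \lambda \sum_{t=1}^{T} |\alpha_t|

A coordinate descent step minimizes L over a single coordinate \alpha_t while holding the others fixed; combined with the l1 penalty, such updates tend to leave many weights exactly zero, which is consistent with the sparse composite classifiers described in the abstract.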
Keywords :
learning (artificial intelligence); minimisation; pattern classification; AdaBoost exponential loss minimization; composite classifier; sparse base classifier set; sparse boosting algorithm; Boosting; Classification algorithms; Compressed sensing; Iterative algorithms; Optimization methods; Pattern classification; Signal processing algorithms; Signal representations; Testing; Training data; Algorithms;
Conference_Titel :
Acoustics, Speech and Signal Processing, 2009. ICASSP 2009. IEEE International Conference on
Conference_Location :
Taipei
Print_ISBN :
978-1-4244-2353-8
ISSN :
1520-6149
DOI :
10.1109/ICASSP.2009.4959911