DocumentCode :
2239309
Title :
Tree Decomposition for Large-Scale SVM Problems
Author :
Chang, Fu ; Guo, Chien-Yang ; Lin, Xiao-Rong ; Liu, Chan-Cheng ; Lu, Chi-Jen
Author_Institution :
Inst. of Inf. Sci., Acad. Sinica, Taipei, Taiwan
fYear :
2010
fDate :
18-20 Nov. 2010
Firstpage :
233
Lastpage :
240
Abstract :
To handle problems created by large data sets, we propose a method that uses a decision tree to decompose a given data space and trains SVMs on the decomposed regions. Although there are other means of decomposing a data space, we show that the decision tree has several merits for large-scale SVM training. First, it can classify some data points on its own, thereby reducing the cost of SVM training on the remaining data points. Second, it is efficient for searching for the parameter values that maximize the validation accuracy, which helps maintain good test accuracy. For experimental data sets whose size can still be handled by current non-linear (kernel-based) SVM training techniques, the proposed method speeds up training by a factor of thousands while achieving comparable test accuracy.
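The following sketch illustrates the decomposition idea described in the abstract; it is not the authors' DTSVM implementation. It assumes scikit-learn's DecisionTreeClassifier and SVC, and the leaf-size and purity parameters are illustrative assumptions only.

```python
# Minimal sketch: partition the data space with a decision tree, then
# train one kernel SVM per leaf region that the tree cannot settle on
# its own. Nearly pure leaves are classified by the tree itself, which
# is one of the cost-saving merits the abstract cites.
import numpy as np
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC


def train_tree_decomposed_svm(X, y, min_leaf_size=500, purity=0.99):
    """Fit a decision tree on (X, y) and an SVM on each impure leaf.

    Assumes integer class labels (0..K-1). Thresholds are hypothetical.
    """
    tree = DecisionTreeClassifier(min_samples_leaf=min_leaf_size).fit(X, y)
    leaf_ids = tree.apply(X)  # leaf index of every training point
    leaf_models = {}
    for leaf in np.unique(leaf_ids):
        idx = leaf_ids == leaf
        y_leaf = y[idx]
        majority = int(np.bincount(y_leaf).argmax())
        if np.mean(y_leaf == majority) >= purity:
            # Leaf is (nearly) pure: let the tree decide, skip SVM training.
            leaf_models[leaf] = majority
        else:
            leaf_models[leaf] = SVC(kernel="rbf").fit(X[idx], y_leaf)
    return tree, leaf_models


def predict(tree, leaf_models, X):
    """Route each point to its leaf and apply that leaf's label or SVM."""
    leaf_ids = tree.apply(X)
    out = np.empty(len(X), dtype=int)
    for leaf in np.unique(leaf_ids):
        idx = leaf_ids == leaf
        model = leaf_models[leaf]
        out[idx] = model.predict(X[idx]) if isinstance(model, SVC) else model
    return out
```

Because each SVM is trained on only the points in its region, the quadratic-or-worse cost of kernel SVM training applies to much smaller subsets, which is the source of the speed-up claimed in the abstract.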
Keywords :
decision trees; pattern classification; support vector machines; SVM training; data point classification; data sets; decision tree; kernel-based training technique; nonlinear training technique; tree decomposition; CART; CBD; DTSVM; LASVM; LIBLINEAR; LIBSVM; RDSVM; bagging; large-scale SVM; tree decomposition;
fLanguage :
English
Publisher :
IEEE
Conference_Title :
Technologies and Applications of Artificial Intelligence (TAAI), 2010 International Conference on
Conference_Location :
Hsinchu City
Print_ISBN :
978-1-4244-8668-7
Electronic_ISBN :
978-0-7695-4253-9
Type :
conf
DOI :
10.1109/TAAI.2010.47
Filename :
5695459