DocumentCode :
578121
Title :
Sparse LS-SVM two-steps model selection method
Author :
Sun, Binbin ; Yeung, Daniel S.
Author_Institution :
Dept. of Comput. Sci., Harbin Inst. of Technol., Harbin, China
Volume :
2
fYear :
2012
fDate :
15-17 July 2012
Firstpage :
460
Lastpage :
465
Abstract :
Least Squares Support Vector Machine (LS-SVM) replaces the hinge loss of the standard SVM with a least squares loss, which simplifies the original quadratic programming training problem to solving a linear system. A sparse LS-SVM is obtained through a pruning procedure. The performance of a sparse LS-SVM depends on the selection of hyper-parameters (i.e., the kernel and penalty parameters). Currently, cross-validation (CV) and leave-one-out (LOO) validation are the most common hyper-parameter selection methods for LS-SVM. However, CV is computationally expensive, while LOO yields a high variance of validation error, which may mislead hyper-parameter selection. Moreover, selecting the kernel and penalty parameters simultaneously requires searching a high-dimensional parameter space. In this work, we propose a new two-step hyper-parameter selection method. The Distance Between Two Classes (DBTC) method is adopted to select the kernel parameters by maximizing the between-class separation of the projected samples in the feature space. The data distribution, however, is not helpful for penalty parameter selection; we therefore propose to select the penalty parameter by minimizing a Localized Generalization Error, enhancing the generalization capability of the LS-SVM. Experimental results show that, compared to existing methods, the proposed two-step method yields better LS-SVMs in terms of average testing accuracy.
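The abstract's opening claim — that the LS-SVM least squares loss turns training into a linear system solve — can be illustrated with a minimal sketch. This is not the authors' implementation; it is the standard LS-SVM dual system (in its common regression-style form applied to ±1 labels), with an assumed RBF kernel and illustrative function names:

```python
import numpy as np

def rbf_kernel(X1, X2, sigma):
    # Gaussian (RBF) kernel matrix between two sample sets
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def lssvm_train(X, y, sigma, gamma):
    # LS-SVM training reduces to one linear system:
    #   [ 0   1^T           ] [b]     [0]
    #   [ 1   K + I/gamma   ] [alpha] [y]
    # gamma is the penalty parameter, sigma the kernel parameter --
    # the two hyper-parameters the paper's two-step method selects.
    n = len(y)
    K = rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = 1.0
    A[1:, 0] = 1.0
    A[1:, 1:] = K + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], y))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, dual coefficients alpha

def lssvm_predict(X_train, b, alpha, sigma, X_test):
    # Decision function: sign of the kernel expansion plus bias
    return np.sign(rbf_kernel(X_test, X_train, sigma) @ alpha + b)
```

Every training sample receives a nonzero dual coefficient here, which is why a separate pruning step is needed to obtain the sparse LS-SVM the paper works with.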
Keywords :
least squares approximations; quadratic programming; support vector machines; between-class separation maximization; distance-between-two classes method; feature space; hinge loss function; kernel parameters; least square loss function; least square support vector machine; linear system solving problem; localized generalization error minimization; penalty parameters; pruning procedure; quadratic programming training method; sparse LS-SVM two-steps model selection method; two-step hyper-parameter selection method; Abstracts; Heart; Sonar; Distance Between Two Classes; Localized Generalization Error Model; model selection; sensitivity measure; sparse LS-SVMs;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Machine Learning and Cybernetics (ICMLC), 2012 International Conference on
Conference_Location :
Xian
ISSN :
2160-133X
Print_ISBN :
978-1-4673-1484-8
Type :
conf
DOI :
10.1109/ICMLC.2012.6358967
Filename :
6358967