DocumentCode
3211877
Title
Tuning SVM hyperparameters in the primal
Author
Dongyuan Huang; Xiaoyun Chen
Author_Institution
Coll. of Math. & Comput. Sci., Fuzhou Univ., Fuzhou, China
Volume
1
fYear
2010
fDate
13-14 Sept. 2010
Firstpage
201
Lastpage
204
Abstract
Choosing optimal hyperparameters for Support Vector Machines (SVMs) is difficult but essential in SVM design. This is usually done by minimizing estimates of the generalization error, such as the k-fold cross-validation error or an upper bound on the leave-one-out (LOO) error. However, most existing approaches concentrate on the dual optimization problem of the SVM. In this paper, we consider the task of tuning hyperparameters in the primal. We derive a smooth validation function from the k-fold cross-validation error and then tune the hyperparameters by minimizing this smooth validation function with a quasi-Newton optimization technique. Experimental results show not only that our approach is much faster and more precise than the grid search method, but also that tuning hyperparameters in the primal is more efficient than in the dual, owing to the advantages offered by the primal formulation.
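The paper's derivation of the smooth validation function is not reproduced in this record. As a rough illustration of the general idea only, the Python sketch below tunes the hyperparameters (C, gamma) of a kernel SVM trained in the primal (squared hinge loss, decision function expanded over the training points) by minimizing a sigmoid-smoothed k-fold cross-validation error with a quasi-Newton optimizer (L-BFGS-B). The sigmoid surrogate, the numerically approximated outer gradient, and all function names are illustrative assumptions, not the authors' actual formulation.

    # Illustrative sketch only, not the authors' code. Assumes an RBF kernel,
    # a squared-hinge primal objective, and a sigmoid surrogate for the
    # 0/1 validation loss.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.special import expit
    from sklearn.datasets import make_classification
    from sklearn.model_selection import KFold

    def rbf_kernel(A, B, gamma):
        # Gaussian (RBF) kernel matrix between the rows of A and the rows of B.
        sq = (np.sum(A ** 2, axis=1)[:, None] + np.sum(B ** 2, axis=1)[None, :]
              - 2.0 * A @ B.T)
        return np.exp(-gamma * sq)

    def train_primal_svm(K, y, C):
        # Kernel SVM trained in the primal with the smooth squared hinge loss:
        # the decision function is f(x) = sum_i beta_i k(x_i, x), and we minimize
        # 0.5 * beta^T K beta + C * sum_i max(0, 1 - y_i f(x_i))^2 with L-BFGS.
        n = K.shape[0]

        def obj_grad(beta):
            f = K @ beta
            margin = 1.0 - y * f
            active = margin > 0
            loss = 0.5 * beta @ K @ beta + C * np.sum(margin[active] ** 2)
            grad = K @ beta - 2.0 * C * (K[:, active] @ (y[active] * margin[active]))
            return loss, grad

        return minimize(obj_grad, np.zeros(n), jac=True, method="L-BFGS-B").x

    def smooth_cv_error(log_params, X, y, n_splits=5, sharpness=5.0):
        # Smoothed k-fold cross-validation error: the 0/1 loss on each validation
        # point is replaced by a sigmoid of its signed margin, so the estimate
        # varies smoothly with the hyperparameters (C, gamma).
        C, gamma = np.exp(log_params)
        folds = KFold(n_splits=n_splits, shuffle=True, random_state=0)
        errs = []
        for tr, va in folds.split(X):
            K_tr = rbf_kernel(X[tr], X[tr], gamma)
            beta = train_primal_svm(K_tr, y[tr], C)
            f_va = rbf_kernel(X[va], X[tr], gamma) @ beta
            errs.append(np.mean(expit(-sharpness * y[va] * f_va)))
        return float(np.mean(errs))

    if __name__ == "__main__":
        X, y = make_classification(n_samples=200, n_features=10, random_state=0)
        y = 2 * y - 1  # labels in {-1, +1}

        # Outer quasi-Newton search over log(C) and log(gamma); the gradient of
        # the smoothed CV error is approximated numerically here for simplicity.
        res = minimize(smooth_cv_error, x0=np.log([1.0, 0.1]), args=(X, y),
                       method="L-BFGS-B")
        C_opt, gamma_opt = np.exp(res.x)
        print("tuned C=%.4g, gamma=%.4g, smoothed CV error=%.4f"
              % (C_opt, gamma_opt, res.fun))

Working in the primal keeps the training objective smooth in the model coefficients, so the inner fit and the outer hyperparameter search can reuse the same quasi-Newton machinery; the paper's own smooth validation function and its gradients would replace the numerical approximation used in this sketch.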
Keywords
errors; optimisation; support vector machines; SVM hyperparameter; dual optimization problem; grid search method; k-fold cross-validation error; leave-one-out error; primal; quasi-Newton optimization technique; smooth validation function; support vector machine; tuning hyperparameter; Artificial neural networks; Heart
fLanguage
English
Publisher
ieee
Conference_Titel
2010 Second International Conference on Computational Intelligence and Natural Computing Proceedings (CINC)
Conference_Location
Wuhan
Print_ISBN
978-1-4244-7705-0
Type
conf
DOI
10.1109/CINC.2010.5643857
Filename
5643857