DocumentCode :
1923169
Title :
Optimizing Support Vector regression hyperparameters based on cross-validation
Author :
Ito, Kentaro ; Nakano, Ryohei
Author_Institution :
Nagoya Inst. of Technol., Japan
Volume :
3
fYear :
2003
fDate :
20-24 July 2003
Firstpage :
2077
Abstract :
This paper proposes a method to optimize the hyperparameters of Support Vector (SV) regression so that the cross-validation error is minimized. The performance of SV regression depends on hyperparameters such as ε (the width of the insensitive tube), C (the penalty factor), and σ (the kernel parameter). The paper embeds the cross-validation procedure in learning, optimizing these hyperparameters jointly with the training of the corresponding SV regression models; the learning is performed by a coordinate descent method. Since the error surface produced by the usual ε-insensitive l1 loss is not smooth, and hence unsuitable for this approach, we introduce the ε-insensitive l2 loss. Experiments show that the l2 loss produces very smooth error surfaces on which the coordinate descent works well, reaching the model whose validation performance is globally optimal.
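A minimal sketch of the general idea, not the authors' implementation: coordinate descent over the SVR hyperparameters (C, ε, γ), where each coordinate update is a line search on a log-spaced grid that minimizes k-fold cross-validation error. It uses scikit-learn's SVR, which implements the standard ε-insensitive l1 loss rather than the paper's smoother l2 variant, and grid line searches rather than the paper's exact update rules; the toy data and grid ranges are illustrative assumptions.

```python
# Illustrative coordinate descent over SVR hyperparameters,
# minimizing k-fold cross-validation MSE (not the paper's exact method).
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

def cv_error(X, y, params, k=5):
    """Mean k-fold cross-validation MSE for the given hyperparameters."""
    model = SVR(kernel="rbf", C=params["C"],
                epsilon=params["epsilon"], gamma=params["gamma"])
    scores = cross_val_score(model, X, y, cv=k,
                             scoring="neg_mean_squared_error")
    return -scores.mean()

def coordinate_descent(X, y, params, grids, n_sweeps=10):
    """Cycle through the hyperparameters, line-searching each on a log
    grid while the others stay fixed; stop when a sweep yields no gain."""
    best = cv_error(X, y, params)
    for _ in range(n_sweeps):
        improved = False
        for name, grid in grids.items():
            for value in grid:
                trial = {**params, name: value}
                err = cv_error(X, y, trial)
                if err < best:
                    best, params, improved = err, trial, True
        if not improved:
            break
    return params, best

if __name__ == "__main__":
    # Synthetic 1-D regression problem for demonstration only.
    rng = np.random.default_rng(0)
    X = rng.uniform(-3, 3, size=(200, 1))
    y = np.sin(X).ravel() + 0.1 * rng.standard_normal(200)
    grids = {name: np.logspace(-3, 3, 13)
             for name in ("C", "epsilon", "gamma")}
    start = {"C": 1.0, "epsilon": 0.1, "gamma": 1.0}
    params, err = coordinate_descent(X, y, start, grids)
    print(f"best hyperparameters: {params}, CV MSE: {err:.4f}")
```

Because each step only ever accepts a lower cross-validation error, the search is monotone; the paper's point is that this kind of descent behaves well precisely when the l2 loss makes the validation-error surface smooth.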
Keywords :
learning (artificial intelligence); optimisation; regression analysis; support vector machines; coordinate descent method; hyperparameter optimisation; kernel function parameter; cross-validation minimisation; penalty factor; support vector regression; training; Kernel; Neural networks; Optimization methods; Pattern recognition; Smoothing methods; Training data; Vectors;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of the International Joint Conference on Neural Networks, 2003
ISSN :
1098-7576
Print_ISBN :
0-7803-7898-9
Type :
conf
DOI :
10.1109/IJCNN.2003.1223728
Filename :
1223728