Title :
Optimizing Support Vector regression hyperparameters based on cross-validation
Author :
Ito, Kentaro ; Nakano, Ryohei
Author_Institution :
Nagoya Inst. of Technol., Japan
Abstract :
This paper proposes a method to optimize the hyperparameters of Support Vector (SV) regression so that the cross-validation error is minimized. The performance of SV regression depends on hyperparameters such as ε (the width of the insensitive tube), C (a penalty factor), and σ (a kernel parameter). This paper employs the cross-validation procedure to optimize these hyperparameters while training the corresponding SV regression models; the learning is performed by a coordinate descent method. Since the error surface produced by the usual ε-insensitive l1 loss is not smooth, and thus unsuitable for our approach, we introduce the ε-insensitive l2 loss instead. Experiments show that the l2 loss produces very smooth error surfaces and that our coordinate descent works well, reaching a model whose validation performance is globally optimal.
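The scheme described in the abstract can be sketched roughly as follows: fix all hyperparameters but one, minimize the cross-validation error along that coordinate, and cycle through (ε, C, σ). A minimal illustration using scikit-learn is given below; note the assumptions it makes: scikit-learn's `SVR` uses the standard ε-insensitive l1 loss (not the l2 loss the paper introduces), the coordinate update here is a simple grid line-search rather than the paper's descent procedure, and the toy sinc data are invented for the example.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

# Toy regression data (hypothetical): noisy samples of the sinc function.
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sinc(X).ravel() + rng.normal(0, 0.1, 80)

def cv_error(params):
    """5-fold cross-validation MSE for an RBF-kernel SVR with the given hyperparameters."""
    model = SVR(kernel="rbf", C=params["C"], epsilon=params["epsilon"], gamma=params["gamma"])
    scores = cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error")
    return -scores.mean()

# Initial hyperparameters and a search grid per coordinate.
# (gamma plays the role of the kernel parameter sigma: gamma = 1 / (2 sigma^2).)
params = {"C": 1.0, "epsilon": 0.1, "gamma": 1.0}
grids = {
    "C": np.logspace(-1, 3, 9),
    "epsilon": np.linspace(0.01, 0.5, 9),
    "gamma": np.logspace(-2, 1, 9),
}

# Coordinate descent on the CV error: optimize one hyperparameter at a
# time by line-search over its grid, holding the others fixed.
for sweep in range(3):
    for name, grid in grids.items():
        errs = [cv_error({**params, name: v}) for v in grid]
        params[name] = grid[int(np.argmin(errs))]

best_err = cv_error(params)
```

With a smooth error surface (the motivation for the paper's l2 loss), such coordinate-wise minimization can reach the model with the lowest validation error; with the non-smooth l1 loss, the line-searches may stall on plateaus.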
Keywords :
learning (artificial intelligence); optimisation; regression analysis; support vector machines; coordinate descent method; hyperparameter optimisation; kernel function parameter; minimum cross validation; penalty factor; support vector regression; training; Kernel; Neural networks; Optimization methods; Pattern recognition; Smoothing methods; Training data; Vectors;
Conference_Titel :
Proceedings of the International Joint Conference on Neural Networks, 2003
Print_ISBN :
0-7803-7898-9
DOI :
10.1109/IJCNN.2003.1223728