DocumentCode :
2766364
Title :
Revised Optimizer of SVR Hyperparameters Minimizing Cross-Validation Error
Author :
Karasuyama, Masayuki ; Kitakoshi, Daisuke ; Nakano, Ryohei
Author_Institution :
Nagoya Inst. of Technol., Nagoya
fYear :
2006
fDate :
16-21 July 2006
Firstpage :
319
Lastpage :
326
Abstract :
The performance of support vector regression (SVR) depends strongly on its hyperparameters, such as the insensitive-zone thickness ε, the penalty factor C, and the RBF kernel parameter σ. A method called MCV-SVR was previously proposed that optimizes the SVR hyperparameters so as to minimize a cross-validation error. However, as pointed out in this paper, MCV-SVR (and its variants) suffers from numerical instability in the gradient calculation, which can degrade performance. This paper therefore introduces a new method for computing the gradient of the parameters with respect to the hyperparameters. The revised optimizers incorporating the new method are shown to be free from the instability problem. Our experiments on three data sets showed that the revised optimizers considerably improved the generalization performance of MCV-SVR and its variant, and outperformed other methods such as multi-layer perceptrons and SVR with practical settings of the hyperparameters.
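A minimal sketch of the underlying idea, treating the cross-validation error as an objective over the hyperparameters (C, ε, σ): this is not the paper's MCV-SVR method, which derives analytic gradients of the SVR parameters with respect to the hyperparameters; here a derivative-free optimizer stands in for that gradient step, and the dataset and starting values are illustrative.

```python
# Sketch: tune SVR hyperparameters (C, epsilon, RBF gamma) by minimizing
# 5-fold cross-validation MSE. NOT the paper's MCV-SVR gradient method;
# Nelder-Mead replaces the analytic hyperparameter gradients it derives.
# Note: scikit-learn parameterizes the RBF kernel by gamma = 1 / (2 * sigma^2).
import numpy as np
from scipy.optimize import minimize
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score
from sklearn.datasets import make_friedman1

X, y = make_friedman1(n_samples=200, noise=0.3, random_state=0)

def cv_error(log_theta):
    """Cross-validation MSE as a function of log-hyperparameters.
    Optimizing in log space keeps C, epsilon, gamma positive."""
    C, epsilon, gamma = np.exp(log_theta)
    model = SVR(kernel="rbf", C=C, epsilon=epsilon, gamma=gamma)
    scores = cross_val_score(model, X, y, cv=5,
                             scoring="neg_mean_squared_error")
    return -scores.mean()

# Start from a "practical" default setting and let the optimizer refine it.
theta0 = np.log([1.0, 0.1, 1.0 / X.shape[1]])
result = minimize(cv_error, theta0, method="Nelder-Mead")
C, epsilon, gamma = np.exp(result.x)
print(f"C={C:.3g}, epsilon={epsilon:.3g}, gamma={gamma:.3g}, "
      f"CV-MSE={result.fun:.4f}")
```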
Keywords :
gradient methods; radial basis function networks; regression analysis; support vector machines; RBF kernel parameter; cross-validation error; instability problem; pattern classification; penalty factor; support vector regression; Computer science; Kernel; Lagrangian functions; Multilayer perceptrons; Neural networks; Optimization methods; Pattern classification; Support vector machine classification; Support vector machines; Training data;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2006 International Joint Conference on Neural Networks (IJCNN '06)
Conference_Location :
Vancouver, BC
Print_ISBN :
0-7803-9490-9
Type :
conf
DOI :
10.1109/IJCNN.2006.246698
Filename :
1716109