Title :
Estimating the Leave-one-out Error for Support Vector Regression
Author :
Liu, Jingxu ; Tan, Yuejin
Author_Institution :
Dept. of Inf. Syst. & Manage., Nat. Univ. of Defense Technol., Changsha
Abstract :
Tuning multiple parameters for support vector machines is usually done by minimizing some estimate of the generalization error. The leave-one-out error is an unbiased estimate of the true generalization error, but computing it is time-consuming. Inspired by the leave-one-out bound for classification presented by Joachims, a new bound on the leave-one-out error of support vector regression is derived in this paper. Once the solution of the optimization problem for support vector regression is obtained, the bound can be computed with very little additional work. Experiments on benchmark datasets illustrate its ability to estimate the generalization error. The bound can be used to tune hyperparameters for support vector regression.
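Illustration (not from the paper): the abstract's bound is not reproduced in this record, so the sketch below only shows the baseline it is meant to avoid, namely computing the exact leave-one-out error for support vector regression by refitting the model n times per hyperparameter setting. The dataset, kernel, and hyperparameter values (C, gamma, epsilon) are illustrative assumptions.

import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import LeaveOneOut

# Synthetic regression data (assumed for illustration only).
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(100, 1))
y = np.sin(X).ravel() + 0.1 * rng.standard_normal(100)

def loo_error(C, gamma, epsilon):
    """Exact leave-one-out mean absolute error: refits the SVR n times."""
    errors = []
    for train_idx, test_idx in LeaveOneOut().split(X):
        model = SVR(C=C, gamma=gamma, epsilon=epsilon)
        model.fit(X[train_idx], y[train_idx])
        errors.append(abs(model.predict(X[test_idx])[0] - y[test_idx][0]))
    return float(np.mean(errors))

# Tuning hyperparameters by minimizing this estimate needs n refits per
# candidate setting; a bound computed from a single SVR solution, as the
# paper proposes, avoids this cost.
print(loo_error(C=1.0, gamma=0.5, epsilon=0.1))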
Keywords :
optimisation; regression analysis; support vector machines; leave-one-out error estimation; optimization problem; support vector machines; support vector regression; Computer errors; Information management; Kernel; Management information systems; Noise level; Support vector machine classification; Support vector machines; Technology management; Testing; Training data;
Conference_Title :
Neural Networks and Brain, 2005. ICNN&B '05. International Conference on
Conference_Location :
Beijing
Print_ISBN :
0-7803-9422-4
DOI :
10.1109/ICNNB.2005.1614599