DocumentCode
2708780
Title
Sparse support vector regressors based on forward basis selection
Author
Muraoka, Shigenori ; Abe, Shigeo
Author_Institution
Graduate School of Engineering, Kobe University, Kobe, Japan
fYear
2009
fDate
14-19 June 2009
Firstpage
2183
Lastpage
2187
Abstract
Support vector regressors (SVRs) usually give sparse solutions, but as a regression problem becomes more difficult, the number of support vectors increases and sparsity is lost. To solve this problem, we propose sparse support vector regressors (S-SVRs) trained in a reduced empirical feature space. First, by forward selection, we select the training samples that minimize the regression error estimated by kernel least squares. Then, in the reduced empirical feature space spanned by the selected, mapped training samples, we train the SVR in the dual form. Since the mapped support vectors obtained by training the S-SVR are expressed as a linear combination of the selected, mapped training samples, the support vectors, in the sense that they form the solution, are the selected training samples. By computer simulation, we compare the performance of the proposed method with that of the regular SVR and that of a sparse SVR based on Cholesky factorization.
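The forward selection step described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the RBF kernel, the ridge regularizer, and all parameter values are assumptions made for the sketch. At each step, the candidate sample whose kernel column most reduces the kernel least-squares fitting error is added to the basis; the selected samples then span the reduced empirical feature space in which the SVR is trained.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    """RBF kernel matrix between row sets A (n x d) and B (m x d)."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def forward_basis_selection(X, y, n_basis, gamma=1.0, ridge=1e-8):
    """Greedily pick training samples whose kernel columns minimize
    the kernel least-squares error (a sketch of the paper's forward
    selection step; gamma and ridge are illustrative assumptions)."""
    n = len(X)
    K = rbf_kernel(X, X, gamma)              # full n x n kernel matrix
    selected, remaining = [], list(range(n))
    for _ in range(n_basis):
        best_err, best_j = np.inf, None
        for j in remaining:
            cols = selected + [j]
            Ks = K[:, cols]                   # n x |S| design matrix
            # ridge-regularized least squares: min_a ||Ks a - y||^2
            a = np.linalg.solve(Ks.T @ Ks + ridge * np.eye(len(cols)),
                                Ks.T @ y)
            err = np.sum((Ks @ a - y) ** 2)
            if err < best_err:
                best_err, best_j = err, j
        selected.append(best_j)
        remaining.remove(best_j)
    return selected                           # indices spanning the reduced space
```

In this reading, the returned indices identify the training samples whose mapped images span the reduced empirical feature space; an SVR trained in that space expresses its solution using only those samples, which is the source of the sparsity the paper targets.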
Keywords
error statistics; least squares approximations; matrix decomposition; regression analysis; support vector machines; Cholesky factorization; S-SVR forward basis selection; kernel least squares approximation error; mapped training data sample; regression error estimation; sparse support vector regressor; Approximation error; Computer simulation; Eigenvalues and eigenfunctions; Function approximation; Kernel; Least squares approximation; Least squares methods; Neural networks; Training data; Vectors
fLanguage
English
Publisher
ieee
Conference_Titel
International Joint Conference on Neural Networks (IJCNN 2009)
Conference_Location
Atlanta, GA
ISSN
1098-7576
Print_ISBN
978-1-4244-3548-7
Electronic_ISBN
1098-7576
Type
conf
DOI
10.1109/IJCNN.2009.5178742
Filename
5178742
Link To Document