DocumentCode :
3661002
Title :
Optimizing working sets for training support vector regressors by Newton's method
Author :
Shigeo Abe
Author_Institution :
Kobe University, Japan
fYear :
2015
fDate :
7/1/2015
Firstpage :
1
Lastpage :
8
Abstract :
In this paper, we train support vector regressors (SVRs) by fusing sequential minimal optimization (SMO) and Newton's method. We use the SVR formulation that includes absolute-value variables. The partial derivative of an absolute-value variable with respect to its associated variable is indefinite when that variable is zero. We determine the derivative value according to whether the optimal solution lies in the positive region, in the negative region, or at zero. In selecting the working set, we use the method that we previously developed for the SVM: in addition to the pair of variables selected by SMO, loop variables, i.e., variables that repeatedly appear during training, are added to the working set. With this method, the working set size is determined automatically. We demonstrate the validity of our method over SMO using several benchmark data sets.
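The working-set selection described in the abstract can be illustrated with a short sketch. This is a minimal illustration under stated assumptions, not the paper's implementation: the function name select_working_set, the violation-score vector, the appearance_counts bookkeeping, and the loop_threshold rule are all hypothetical, used only to show how the SMO-selected pair can be augmented with repeatedly appearing ("loop") variables so that the working set size grows automatically.

```python
import numpy as np

def select_working_set(violation, appearance_counts, loop_threshold=3):
    """Hypothetical sketch: choose the most-violating SMO pair, then add
    'loop variables', i.e. indices that have entered the working set
    repeatedly in earlier iterations. appearance_counts is updated in
    place, so the working set size is determined automatically."""
    i = int(np.argmax(violation))   # most violating index
    j = int(np.argmin(violation))   # its counterpart in the SMO pair
    working_set = {i, j}

    # Augment with loop variables (assumed threshold rule, not from the paper).
    working_set.update(k for k, c in appearance_counts.items()
                       if c >= loop_threshold)

    # Track how often each index has been selected so far.
    for k in working_set:
        appearance_counts[k] = appearance_counts.get(k, 0) + 1

    return sorted(working_set)

# Usage on toy violation scores: repeated calls let the set grow.
counts = {}
for _ in range(5):
    print(select_working_set(np.array([0.9, -0.7, 0.1, 0.4, -0.2]), counts))
```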
Keywords :
Integrated circuits
Publisher :
ieee
Conference_Titel :
2015 International Joint Conference on Neural Networks (IJCNN)
Electronic_ISSN :
2161-4407
Type :
conf
DOI :
10.1109/IJCNN.2015.7280309
Filename :
7280309