DocumentCode
394117
Title
Training of support vector regressors based on the steepest ascent method
Author
Hirokawa, Y.; Abe, Shigeo
Author_Institution
Graduate Sch. of Sci. & Technol., Kobe Univ., Japan
Volume
2
fYear
2002
fDate
18-22 Nov. 2002
Firstpage
552
Abstract
In this paper, we propose a new method for training support vector regressors. Our method partitions the variables into two sets: a working set consisting of more than two variables, and a set whose variables are kept fixed. We then optimize the variables in the working set by the steepest ascent method. If the Hessian matrix associated with the working set is not positive definite, we compute corrections only for the independent variables in the working set. We evaluate our method on two benchmark data sets and show that increasing the working set size speeds up training of support vector regressors.
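As a rough illustration of the working-set idea described in the abstract (not the authors' exact algorithm), the sketch below trains an RBF-kernel support vector regressor by projected steepest ascent on the dual objective, updating a small working set of variables per iteration while the rest stay fixed. The random working-set selection, the fixed learning rate eta, the bias-in-kernel trick that drops the dual equality constraint, and the omission of the Hessian positive-definiteness check are all simplifying assumptions.

import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gaussian (RBF) kernel matrix between row sets A and B.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def train_svr_working_set(X, y, C=10.0, eps=0.1, gamma=1.0,
                          ws_size=4, eta=0.05, n_iters=5000, seed=0):
    # Projected steepest ascent on the SVR dual: each iteration updates
    # only a working set of variables; all other variables are fixed.
    # The bias is embedded in the kernel (K + 1), so the usual equality
    # constraint on the dual variables is dropped (an assumption made
    # here for simplicity, not taken from the paper).
    rng = np.random.default_rng(seed)
    n = len(y)
    K = rbf_kernel(X, X, gamma) + 1.0          # +1 embeds the bias term
    a = np.zeros(n)                            # alpha
    a_star = np.zeros(n)                       # alpha*
    for _ in range(n_iters):
        ws = rng.choice(n, size=ws_size, replace=False)   # working set
        Kb = K[ws] @ (a - a_star)
        # Gradient of the dual objective w.r.t. the working-set variables:
        # W = -eps*sum(a+a*) + y.(a-a*) - 0.5*(a-a*)' K (a-a*).
        grad_a = -eps + y[ws] - Kb
        grad_a_star = -eps - y[ws] + Kb
        # Steepest-ascent step, projected back onto the box [0, C].
        a[ws] = np.clip(a[ws] + eta * grad_a, 0.0, C)
        a_star[ws] = np.clip(a_star[ws] + eta * grad_a_star, 0.0, C)
    beta = a - a_star
    def predict(Xq):
        return rbf_kernel(Xq, X, gamma) @ beta + beta.sum()
    return predict

# Toy usage: fit a noisy sine curve and report the training error.
X = np.linspace(0, 2 * np.pi, 60)[:, None]
y = np.sin(X[:, 0]) + 0.05 * np.random.default_rng(1).normal(size=60)
predict = train_svr_working_set(X, y)
print(np.abs(predict(X) - y).mean())   # mean absolute training error

Restricting each update to a working set keeps the per-iteration cost at O(|W| n) and the memory for second-order information bounded by the working-set size; as the abstract reports, larger working sets reduce the number of passes needed and hence speed up training.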
Keywords
Hessian matrices; differential equations; function approximation; learning (artificial intelligence); optimization; support vector machines; time series; Mackey-Glass differential equation; steepest ascent method; support vector regressor training; water purification plant; benchmark testing; Lagrangian functions; optimization methods; purification; quadratic programming; support vector machine classification; training data
fLanguage
English
Publisher
ieee
Conference_Titel
Proceedings of the 9th International Conference on Neural Information Processing (ICONIP '02)
Print_ISBN
981-04-7524-1
Type
conf
DOI
10.1109/ICONIP.2002.1198117
Filename
1198117