Title :
Global Convergence of SMO Algorithm for Support Vector Regression
Author :
Takahashi, Norikazu ; Guo, Jun ; Nishi, Tetsuo
Author_Institution :
Dept. of Comput. Sci. & Commun. Eng., Kyushu Univ., Fukuoka
Date :
1 June 2008
Abstract :
Global convergence of the sequential minimal optimization (SMO) algorithm for support vector regression (SVR) is studied in this paper. Given l training samples, SVR is formulated as a convex quadratic programming (QP) problem with l pairs of variables. We prove that if two pairs of variables violating the optimality conditions are chosen for updating in each step and the subproblems are solved in a certain way, then the SMO algorithm always stops within a finite number of iterations after finding an optimal solution. Efficient implementation techniques for the SMO algorithm are also presented and compared experimentally with other SMO algorithms.
Keywords :
convergence; convex quadratic programming (QP); global convergence; sequential minimal optimization (SMO); support vector machines; support vector regression (SVR);
Journal_Title :
Neural Networks, IEEE Transactions on
DOI :
10.1109/TNN.2007.915116
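Code_Sketch :
The abstract describes an SMO scheme that repeatedly selects variables violating the optimality conditions and solves small analytic subproblems. The sketch below is only an illustration of that general idea, not the authors' algorithm: it uses the simplified single-variable form beta_i = alpha_i - alpha_i^* of the epsilon-SVR dual (the paper works with the full pairs), a maximal-violating-pair selection rule, and a candidate-enumeration solver for the one-dimensional piecewise-quadratic subproblem. The function name smo_svr and all parameter choices are illustrative assumptions.

import numpy as np

# Minimal sketch (NOT the paper's method): SMO-style solver for the eps-SVR dual
#   minimize   0.5 * beta^T K beta + eps * sum_i |beta_i| - y^T beta
#   subject to sum_i beta_i = 0,   -C <= beta_i <= C
# where beta_i = alpha_i - alpha_i^* (simplified reformulation, an assumption here).

def smo_svr(K, y, C=1.0, eps=0.1, tol=1e-6, max_iter=10000):
    K = np.asarray(K, dtype=float)
    y = np.asarray(y, dtype=float)
    n = y.size
    beta = np.zeros(n)          # feasible start: sum(beta) = 0
    grad = -y.copy()            # gradient of the smooth part, K beta - y

    for _ in range(max_iter):
        # One-sided slopes of the nonsmooth term eps*|beta_i| when beta_i moves up/down.
        slope_up = np.where(beta >= 0, eps, -eps)
        slope_down = np.where(beta <= 0, eps, -eps)
        # Benefit of increasing beta_i (needs beta_i < C) and decreasing beta_j
        # (needs beta_j > -C); moving the pair (i, j) keeps sum(beta) fixed.
        up = np.where(beta < C - 1e-12, -(grad + slope_up), -np.inf)
        down = np.where(beta > -C + 1e-12, grad - slope_down, -np.inf)
        i, j = int(np.argmax(up)), int(np.argmax(down))
        if up[i] + down[j] < tol:
            break               # optimality condition satisfied

        s = beta[i] + beta[j]   # conserved by the two-variable update
        lo, hi = max(-C, s - C), min(C, s + C)
        eta = K[i, i] + K[j, j] - 2.0 * K[i, j]
        lin = grad[i] - grad[j] - eta * beta[i]   # linear term of the 1-D subproblem

        # g(t) = 0.5*eta*t^2 + lin*t + eps*(|t| + |s-t|) is piecewise quadratic with
        # breakpoints at t = 0 and t = s; enumerate all candidate minimizers on [lo, hi].
        def g(t):
            return 0.5 * eta * t * t + lin * t + eps * (abs(t) + abs(s - t))
        cands = [lo, hi, float(np.clip(0.0, lo, hi)), float(np.clip(s, lo, hi))]
        if eta > 1e-12:
            for sg_t in (1.0, -1.0):
                for sg_st in (1.0, -1.0):
                    t = -(lin + eps * (sg_t - sg_st)) / eta
                    cands.append(float(np.clip(t, lo, hi)))
        t_new = min(cands, key=g)

        d = t_new - beta[i]
        beta[i] += d
        beta[j] -= d
        grad += d * (K[:, i] - K[:, j])   # rank-two update of K beta - y

    return beta

Given a kernel Gram matrix K, the resulting regression function is f(x) = sum_i beta_i k(x_i, x) + b, with the bias b recoverable from the optimality conditions of the unbounded variables; this usage note is likewise a standard SVR fact, not a detail taken from the paper.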