DocumentCode :
671692
Title :
The convergence rate of linearly separable SMO
Author :
Lopez, J. ; Dorronsoro, Jose R.
Author_Institution :
Inst. de Ing. del Conocimiento (IIC), Univ. Autonoma de Madrid, Madrid, Spain
fYear :
2013
fDate :
4-9 Aug. 2013
Firstpage :
1
Lastpage :
7
Abstract :
It is well known that the dual function value sequence generated by SMO has a linear convergence rate when the kernel matrix is positive definite, and that sublinear convergence holds for a general kernel matrix. In this paper we prove that, when applied to hard-margin, i.e., linearly separable, SVM problems, the SMO algorithm achieves a linear convergence rate without any condition on the kernel matrix. Moreover, we also show linear convergence for the multiplier sequence generated by SMO, for the corresponding weight vectors, and for the KKT gap usually used to control the number of SMO iterations. This gives a fairly complete picture of the convergence of the various sequences that SMO generates. While linear SMO convergence for the general L1 soft-margin SVM problem remains open, the approach followed here may lead to such a general result.
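For illustration only (not taken from the paper), the following minimal Python/NumPy sketch shows the hard-margin SMO iteration the abstract refers to: maximal-violating-pair selection, the analytic two-variable update, and the KKT gap used as the stopping criterion. Function name, tolerance, and iteration cap are illustrative assumptions.

import numpy as np

def smo_hard_margin(K, y, tol=1e-6, max_iter=10000):
    # Sketch of SMO for the hard-margin SVM dual:
    #   min 1/2 a'Qa - e'a  s.t.  y'a = 0,  a >= 0,  with Q_ij = y_i y_j K_ij.
    # Stops when the KKT gap m(a) - M(a) drops below tol.
    n = len(y)
    alpha = np.zeros(n)
    grad = -np.ones(n)                       # gradient of the dual at alpha = 0
    gap = np.inf
    for _ in range(max_iter):
        yg = -y * grad                       # -y_t * grad_t
        up = (y > 0) | (alpha > 0)           # I_up  (upper bound C = +inf here)
        low = (y < 0) | (alpha > 0)          # I_low
        i = np.where(up)[0][np.argmax(yg[up])]
        j = np.where(low)[0][np.argmin(yg[low])]
        gap = yg[i] - yg[j]                  # KKT gap controlling the iterations
        if gap <= tol:
            break
        eta = K[i, i] + K[j, j] - 2.0 * K[i, j]
        delta = gap / max(eta, 1e-12)        # unconstrained optimal step for the pair
        if y[i] < 0:
            delta = min(delta, alpha[i])     # keep alpha_i >= 0
        if y[j] > 0:
            delta = min(delta, alpha[j])     # keep alpha_j >= 0
        alpha[i] += y[i] * delta
        alpha[j] -= y[j] * delta
        grad += delta * y * (K[:, i] - K[:, j])   # incremental gradient update
    return alpha, gap

Under these assumptions, the multiplier sequence is the successive values of alpha, and for a linear kernel the weight-vector sequence mentioned in the abstract would be recovered as w = X.T @ (alpha * y).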
Keywords :
convergence; iterative methods; matrix algebra; optimisation; KKT gap; SMO algorithm; SMO iteration; SVM soft margin problem; dual function value sequence; general matrix; kernel matrix; linear SMO convergence; linear convergence rate; linearly separable SMO; linearly separable SVM problems; multiplier sequence; sublinear convergence; weight vectors; Convergence; Equations; Indexes; Kernel; Standards; Support vector machines; Vectors
fLanguage :
English
Publisher :
ieee
Conference_Titel :
The 2013 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
Dallas, TX
ISSN :
2161-4393
Print_ISBN :
978-1-4673-6128-6
Type :
conf
DOI :
10.1109/IJCNN.2013.6707034
Filename :
6707034