DocumentCode :
2396946
Title :
On improving sequential minimal optimization
Author :
Wu, Zhi-Li ; Li, Chun-Hung
Author_Institution :
Dept. of Comput. Sci., Hong Kong Baptist Univ., China
Volume :
7
fYear :
2004
fDate :
26-29 Aug. 2004
Firstpage :
4308
Abstract :
Platt's sequential minimal optimization (SMO) has been widely adopted in modern implementations of support vector machines. This work points out that caching the gradients of only the unbounded support vectors, as in standard SMO, harms efficiency; a better principle is to cache the gradients of all vectors that are checked frequently. The paper also shows that the search for working pairs maximizing the gradient difference can be conducted more aggressively. Extending the search to pairs that maximize the change in the objective incurs no extra kernel evaluations, yet yields a better convergence rate and comparable runtime.
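To make the working-pair idea concrete, the following is a minimal sketch (not the authors' code) of the "maximal violating pair" selection rule used in SMO-style dual SVM solvers, which picks the pair (i, j) with the largest gradient difference; all names and the array layout are illustrative assumptions.

```python
import numpy as np

def select_working_pair(grad, y, alpha, C):
    """Pick the working pair (i, j) with the maximal gradient difference
    (the 'maximal violating pair' heuristic of SMO-style solvers).

    grad  : gradient of the dual objective, one entry per training point
    y     : labels in {-1, +1}
    alpha : current dual variables
    C     : box constraint on alpha
    """
    # i comes from the "up" set: points whose alpha can still increase.
    up = ((y == +1) & (alpha < C)) | ((y == -1) & (alpha > 0))
    # j comes from the "down" set: points whose alpha can still decrease.
    down = ((y == +1) & (alpha > 0)) | ((y == -1) & (alpha < C))

    score = -y * grad                 # signed gradient used in the KKT check
    i = np.where(up)[0][np.argmax(score[up])]
    j = np.where(down)[0][np.argmin(score[down])]
    return i, j, score[i] - score[j]  # gap below a tolerance => KKT satisfied
```

A solver would call this each iteration, optimize the two chosen alphas analytically, update the cached gradients, and stop once the returned gap falls below a tolerance.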
Keywords :
convergence of numerical methods; optimisation; support vector machines; convergence rate; gradients caching; kernel evaluations; maximisation; sequential minimal optimization; support vector machines; Computer science; Costs; Indexing; Kernel; Lagrangian functions; Machine learning; Quadratic programming; Runtime; Support vector machine classification; Support vector machines;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Proceedings of 2004 International Conference on Machine Learning and Cybernetics
Print_ISBN :
0-7803-8403-2
Type :
conf
DOI :
10.1109/ICMLC.2004.1384594
Filename :
1384594