Title :
On improving sequential minimal optimization
Author :
Wu, Zhi-Li ; Li, Chun-Hung
Author_Institution :
Dept. of Comput. Sci., Hong Kong Baptist Univ., China
Abstract :
Platt's sequential minimal optimization has been widely adopted in modern implementations of support vector machines. This work points out that caching the gradients only for unbounded support vectors in sequential minimal optimization limits efficiency; a better principle is to cache the gradients of all vectors that are checked frequently. This paper also shows that the search for working pairs maximizing the gradient difference can be conducted more aggressively. Extending the search to pairs maximizing the change in the objective incurs no extra kernel evaluations, yet demonstrates a better convergence rate and comparable runtime.
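The two selection rules contrasted in the abstract can be illustrated with a short sketch. The Python snippet below is not the authors' implementation; it assumes a precomputed kernel matrix and a cached gradient vector, and the function name and signature are hypothetical. It places the first-order rule (pick the pair with the largest gradient difference, the maximal violating pair) next to the second-order rule (extend the search to the pair with the largest estimated objective change), which reads only one kernel row and therefore costs no additional kernel evaluations.

import numpy as np

def select_working_pair(K, y, alpha, grad, C, tau=1e-12):
    """Hypothetical sketch of SMO working-pair selection (illustration only).

    K     : precomputed kernel matrix, shape (n, n)
    y     : labels in {-1, +1}, shape (n,)
    alpha : current dual variables, shape (n,)
    grad  : cached gradient of the dual objective, shape (n,)
    C     : box constraint
    Returns (i, j_first_order, j_second_order).
    """
    f = -y * grad
    # Variables whose alpha may still move up / down inside the box [0, C].
    up = ((y == 1) & (alpha < C)) | ((y == -1) & (alpha > 0))
    down = ((y == 1) & (alpha > 0)) | ((y == -1) & (alpha < C))

    # First index: steepest feasible ascent direction.
    i = np.where(up)[0][np.argmax(f[up])]

    # First-order rule: second index with the largest gradient difference.
    j_first_order = np.where(down)[0][np.argmin(f[down])]

    # Second-order rule: among the violating candidates, pick the index whose
    # pair with i gives the largest estimated objective change. Only row i of
    # the kernel matrix is read, so no extra kernel evaluations are needed
    # compared with the first-order rule. (Assumes the problem is not yet
    # optimal, so the candidate set is non-empty.)
    cand = np.where(down & (f < f[i]))[0]
    b = f[i] - f[cand]                                   # gradient differences
    a = K[i, i] + K[cand, cand] - 2.0 * y[i] * y[cand] * K[i, cand]
    a = np.maximum(a, tau)                               # guard flat curvature
    j_second_order = cand[np.argmax(b * b / a)]
    return i, j_first_order, j_second_order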
Keywords :
convergence of numerical methods; optimisation; support vector machines; convergence rate; gradients caching; kernel evaluations; maximisation; sequential minimal optimization; support vector machines; Computer science; Costs; Indexing; Kernel; Lagrangian functions; Machine learning; Quadratic programming; Runtime; Support vector machine classification; Support vector machines;
Conference_Titel :
Machine Learning and Cybernetics, 2004. Proceedings of 2004 International Conference on
Print_ISBN :
0-7803-8403-2
DOI :
10.1109/ICMLC.2004.1384594