Title :
On Extending the SMO Algorithm Sub-Problem
Author :
Sentelle, Christopher; Georgiopoulos, Michael; Anagnostopoulos, Georgios C.; Young, Cynthia
Author_Institution :
Univ. of Central Florida, Orlando
Abstract :
The support vector machine (SVM) is a widely employed machine learning model owing to its repeatedly demonstrated superior generalization performance. The sequential minimal optimization (SMO) algorithm is one of the most popular SVM training approaches: it is fast and easy to implement, but its working set size is limited to 2 points. Faster training times may result if the working set size is increased without significantly increasing the computational complexity. In this paper, we extend the 2-point SMO formulation to a 4-point formulation and address the theoretical issues associated with this extension. We show that increasing the working set size of the SMO algorithm reduces the number of iterations required for convergence and shows promise for reducing the overall training time.
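For context, the standard 2-point SMO sub-problem that the paper extends can be solved analytically in closed form. The sketch below shows that classical 2-point update (following Platt's original SMO, not the paper's 4-point formulation); all variable names and the simplified interface are illustrative assumptions.

```python
def smo_two_point_update(alpha1, alpha2, y1, y2, E1, E2,
                         k11, k12, k22, C):
    """Illustrative sketch of the classical 2-point SMO sub-problem
    (not this paper's 4-point extension): analytically optimize a pair
    of Lagrange multipliers subject to the box constraints
    0 <= alpha <= C and the equality constraint
    y1*alpha1 + y2*alpha2 = const.
    E1, E2 are the prediction errors for the two points; k11, k12, k22
    are the kernel evaluations among them."""
    # Feasible bounds [L, H] for alpha2, derived from the box and
    # equality constraints.
    if y1 != y2:
        L = max(0.0, alpha2 - alpha1)
        H = min(C, C + alpha2 - alpha1)
    else:
        L = max(0.0, alpha1 + alpha2 - C)
        H = min(C, alpha1 + alpha2)
    # Second derivative of the objective along the constraint line.
    eta = k11 + k22 - 2.0 * k12
    if eta <= 0.0 or L >= H:
        # Degenerate case: no progress can be made on this pair.
        return alpha1, alpha2
    # Unconstrained optimum along the constraint line, clipped to [L, H].
    a2_new = alpha2 + y2 * (E1 - E2) / eta
    a2_new = min(H, max(L, a2_new))
    # The equality constraint then determines alpha1.
    a1_new = alpha1 + y1 * y2 * (alpha2 - a2_new)
    return a1_new, a2_new
```

Because only two multipliers change per iteration, each step is cheap but many iterations may be needed; enlarging the working set, as the paper proposes, trades a more involved sub-problem for fewer iterations.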
Keywords :
convergence of numerical methods; generalisation (artificial intelligence); iterative methods; learning (artificial intelligence); optimisation; pattern classification; support vector machines; 2-point SMO formulation; 4-point SMO formulation; SMO algorithm subproblem; SVM training approaches; classification model; convergence; iteration method; machine learning model; repeatedly demonstrated superior generalization performance; sequential minimal optimization; support vector machine; Computer science; Convergence; Kernel; Machine learning; Machine learning algorithms; Neural networks; Nonlinear equations; Risk management; Support vector machine classification; Support vector machines;
Conference_Title :
Neural Networks, 2007. IJCNN 2007. International Joint Conference on
Conference_Location :
Orlando, FL
Print_ISBN :
978-1-4244-1379-9
Electronic_ISBN :
1098-7576
DOI :
10.1109/IJCNN.2007.4371075