DocumentCode :
1551408
Title :
Successive overrelaxation for support vector machines
Author :
Mangasarian, Olvi L. ; Musicant, David R.
Author_Institution :
Dept. of Comput. Sci., Wisconsin Univ., Madison, WI, USA
Volume :
10
Issue :
5
fYear :
1999
fDate :
9/1/1999
Firstpage :
1032
Lastpage :
1037
Abstract :
Successive overrelaxation (SOR) for symmetric linear complementarity problems and quadratic programs is used to train a support vector machine (SVM) for discriminating between the elements of two massive datasets, each with millions of points. Because SOR handles one point at a time, similar to Platt's sequential minimal optimization (SMO) algorithm (1999), which handles two constraints at a time, and Joachims' SVMlight (1998), which handles a small number of points at a time, SOR can process very large datasets that need not reside in memory. The algorithm converges linearly to a solution. Encouraging numerical results are presented on datasets with up to 10 000 000 points. Such massive discrimination problems cannot be processed by conventional linear or quadratic programming methods, and to our knowledge have not been solved by other methods. On smaller problems, SOR was faster than SVMlight and comparable to or faster than SMO.
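A minimal sketch of the kind of projected SOR iteration the abstract describes, applied to a bound-constrained SVM dual. This is not the authors' exact formulation: the choice of penalty parameter C, relaxation factor omega, and folding the bias into the Gram matrix via a "+1" term are assumptions made here for illustration; the in-memory Gram matrix would be replaced by row-at-a-time computation for truly massive datasets.

```python
import numpy as np

def sor_svm(X, y, C=1.0, omega=1.3, n_iter=100, tol=1e-6):
    """Projected SOR on min_{0<=u<=C} 0.5*u'Qu - e'u,
    with Q_ij = y_i y_j (x_i.x_j + 1); the +1 absorbs the bias
    (an assumed, common way to reach the symmetric LCP form)."""
    y = y.astype(float)
    Q = (y[:, None] * y[None, :]) * (X @ X.T + 1.0)
    n = X.shape[0]
    u = np.zeros(n)
    for _ in range(n_iter):
        u_old = u.copy()
        for i in range(n):            # one point at a time, Gauss-Seidel style
            grad_i = Q[i] @ u - 1.0   # uses the freshest components of u
            u[i] = min(C, max(0.0, u[i] - omega * grad_i / Q[i, i]))
        if np.linalg.norm(u - u_old) < tol:
            break
    # Recover primal weights and bias from the augmented formulation.
    w = X.T @ (u * y)
    b = np.sum(u * y)
    return w, b

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = np.vstack([rng.normal(-1, 1, (50, 2)), rng.normal(1, 1, (50, 2))])
    y = np.array([-1.0] * 50 + [1.0] * 50)
    w, b = sor_svm(X, y)
    print("training accuracy:", np.mean(np.sign(X @ w + b) == y))
```

Because each update touches only row i of Q, the iteration needs just one data point in memory at a time, which is the property that lets SOR scale to datasets that do not fit in memory.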
Keywords :
convergence; learning (artificial intelligence); linear programming; pattern recognition; quadratic programming; relaxation theory; SMO algorithm; SOR; SVMlight; linear convergence; massive discrimination problems; quadratic programs; sequential minimal optimization algorithm; successive overrelaxation; support vector machines; symmetric linear complementarity problems; Constraint optimization; Convergence; Equations; Kernel; Mathematical programming; Military computing; Quadratic programming; Support vector machines; Time factors;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.788643
Filename :
788643