DocumentCode :
814333
Title :
A formal analysis of stopping criteria of decomposition methods for support vector machines
Author :
Lin, Chih-Jen
Author_Institution :
Dept. of Comput. Sci. & Inf. Eng., Nat. Taiwan Univ., Taipei, Taiwan
Volume :
13
Issue :
5
fYear :
2002
fDate :
9/1/2002
Firstpage :
1045
Lastpage :
1052
Abstract :
In a previous paper, the author (2001) proved the convergence of a commonly used decomposition method for support vector machines (SVMs). However, its stopping criterion, which is based on the gap of the violation of the optimality condition, has lacked a theoretical justification. It is essential that this gap asymptotically approaches zero, so that existing implementations are guaranteed to stop within a finite number of iterations once a specified tolerance is reached. Here, we prove this result and illustrate it with two extensions: ν-SVM and the multiclass SVM of Crammer and Singer (2001). A further result shows that, in the final iterations of the decomposition method, only a particular set of variables is still being modified, which supports the use of the shrinking and caching techniques in some existing implementations. Finally, we prove the asymptotic convergence of a decomposition method for this multiclass SVM, and discuss how this convergence proof differs from the one in another paper by Lin.
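For concreteness, the gap mentioned in the abstract is, in commonly used decomposition solvers for the C-SVM dual, the difference between the largest and smallest directional derivatives over the feasible index sets, and the method stops once this difference drops below a tolerance ε. The sketch below is not taken from the paper; it assumes the standard C-SVM dual formulation, and the names alpha, grad, y, and C are illustrative.

import numpy as np

def violation_gap(alpha, grad, y, C):
    # Maximal violation of the optimality condition for the C-SVM dual
    #   min_alpha 0.5 * alpha' Q alpha - e' alpha,  y' alpha = 0,  0 <= alpha_i <= C.
    # alpha: current dual variables, grad: gradient of the dual objective,
    # y: labels in {+1, -1}, C: upper bound on alpha (illustrative names).
    up = ((y > 0) & (alpha < C)) | ((y < 0) & (alpha > 0))   # I_up(alpha)
    low = ((y > 0) & (alpha > 0)) | ((y < 0) & (alpha < C))  # I_low(alpha)
    m = np.max(-y[up] * grad[up])
    M = np.min(-y[low] * grad[low])
    return m - M

# Example stopping test inside a decomposition loop:
# if violation_gap(alpha, grad, y, C) <= 1e-3:
#     break

The paper's result guarantees that this gap approaches zero along the iterates, so a loop using such a test terminates for any tolerance ε > 0.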
Keywords :
convergence; learning (artificial intelligence); learning automata; optimisation; ν-SVM; asymptotic convergence; caching; decomposition methods; formal analysis; multiclass SVM; shrinking; stopping criteria; support vector machines; Computer science; Convergence; Kernel; Matrix decomposition; Support vector machines; Upper bound;
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/TNN.2002.1031937
Filename :
1031937