DocumentCode :
1368418
Title :
The analysis of decomposition methods for support vector machines
Author :
Chang, Chih-Chung ; Hsu, Chih-Wei ; Lin, Chih-Jen
Author_Institution :
Dept. of Comput. Sci. & Inf. Eng., Nat. Taiwan Univ., Taipei, Taiwan
Volume :
11
Issue :
4
fYear :
2000
fDate :
1 July 2000
Firstpage :
1003
Lastpage :
1008
Abstract :
The support vector machine (SVM) is a promising technique for pattern recognition. It requires the solution of a large dense quadratic programming problem, and traditional optimization methods cannot be applied directly because of memory restrictions. To date, few methods can cope with this memory problem; an important one is the “decomposition method,” for which no convergence proof has been available so far. We connect this method to projected gradient methods and provide a convergence proof for one version of decomposition methods. An extension to the bound-constrained formulation of SVM is also provided. We then show that the convergence proof remains valid for general decomposition methods whose working set selection meets a simple requirement.
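The abstract describes the decomposition idea only at a high level. As an illustration, the following is a minimal Python/NumPy sketch of one common SMO-style decomposition variant for the SVM dual QP, using a size-two "maximal violating pair" working set. This is an assumed, generic variant for illustration, not the specific algorithm analyzed in the paper; the function name smo_decomposition and its parameters are hypothetical.

import numpy as np

# Minimal decomposition (SMO-style) sketch for the SVM dual:
#   min_a  (1/2) a^T Q a - e^T a   s.t.  y^T a = 0,  0 <= a_i <= C,
# with Q_ij = y_i y_j K(x_i, x_j).  Illustrative only: the working set
# is a maximal violating pair of size two, one common selection rule.
def smo_decomposition(K, y, C=1.0, tol=1e-3, max_iter=10_000):
    n = len(y)
    Q = (y[:, None] * y[None, :]) * K      # Q_ij = y_i y_j K_ij
    alpha = np.zeros(n)
    G = -np.ones(n)                        # gradient Q a - e at a = 0
    for _ in range(max_iter):
        # Working set selection: pick the pair that violates the
        # KKT conditions the most (maximal violating pair).
        up = ((y == 1) & (alpha < C)) | ((y == -1) & (alpha > 0))
        low = ((y == -1) & (alpha < C)) | ((y == 1) & (alpha > 0))
        if not up.any() or not low.any():
            break
        v = -y * G
        i = np.where(up)[0][np.argmax(v[up])]
        j = np.where(low)[0][np.argmin(v[low])]
        if v[i] - v[j] < tol:              # KKT satisfied within tol
            break
        # Solve the two-variable subproblem analytically, then project
        # back onto the box [0, C]^2 while preserving y^T a = const.
        quad = max(Q[i, i] + Q[j, j] - 2.0 * y[i] * y[j] * Q[i, j], 1e-12)
        delta = (v[i] - v[j]) / quad
        old_i, old_j = alpha[i], alpha[j]
        s = y[i] * old_i + y[j] * old_j
        alpha[i] = np.clip(old_i + y[i] * delta, 0.0, C)
        alpha[j] = np.clip(y[j] * (s - y[i] * alpha[i]), 0.0, C)
        alpha[i] = np.clip(y[i] * (s - y[j] * alpha[j]), 0.0, C)
        # Only two columns of Q are needed per iteration; a real solver
        # computes them on demand (this sketch materializes Q for
        # brevity), which is what keeps the memory requirement small.
        G += Q[:, i] * (alpha[i] - old_i) + Q[:, j] * (alpha[j] - old_j)
    return alpha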
Keywords :
convergence; gradient methods; neural nets; pattern recognition; quadratic programming; bound-constrained formulation; convergence proof; decomposition methods; large dense quadratic programming problem; projected gradient methods; support vector machines; working set selection; Computer science; Convergence; Gradient methods; Newton method; Optimization methods; Pattern recognition; Quadratic programming; Support vector machine classification; Support vector machines; Upper bound
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/72.857780
Filename :
857780