DocumentCode :
816428
Title :
Global Convergence of Decomposition Learning Methods for Support Vector Machines
Author :
Takahashi, N. ; Nishi, T.
Author_Institution :
Dept. of Comput. Sci. & Commun. Eng., Kyushu Univ., Fukuoka
Volume :
17
Issue :
6
fYear :
2006
Firstpage :
1362
Lastpage :
1369
Abstract :
Decomposition methods are well-known techniques for solving quadratic programming (QP) problems arising in support vector machines (SVMs). In each iteration of a decomposition method, a small number of variables is selected and a QP problem restricted to only those variables is solved. Since large matrix computations are not required, decomposition methods are applicable to large QP problems. In this paper, we give a rigorous analysis of the global convergence of general decomposition methods for SVMs. We first introduce a relaxed version of the optimality condition for the QP problems and then prove that a decomposition method reaches a solution satisfying this relaxed optimality condition within a finite number of iterations, under a very mild condition on how the variables are selected.
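The abstract describes the general decomposition scheme: repeatedly pick a small working set of variables and solve the restricted QP exactly. A minimal sketch of this idea is the two-variable case (SMO-style, following Platt's scheme rather than the specific method analyzed in this paper), where the restricted QP has a closed-form solution. The naive working-set selection below (`j = (i + 1) % n`) is a placeholder for illustration, not a selection rule from the paper.

```python
import numpy as np

def decomposition_svm(X, y, C=1.0, tol=1e-4, max_iter=200):
    """SMO-style decomposition sketch for the SVM dual QP (linear kernel).

    Each iteration selects a two-variable working set and solves the
    restricted QP analytically, keeping sum(alpha * y) = 0 invariant.
    """
    n = len(y)
    K = X @ X.T                      # linear kernel matrix
    alpha = np.zeros(n)
    b = 0.0
    for _ in range(max_iter):
        changed = 0
        for i in range(n):
            Ei = (alpha * y) @ K[:, i] + b - y[i]
            # Pick i only if it violates the KKT conditions beyond tol
            if (y[i] * Ei < -tol and alpha[i] < C) or (y[i] * Ei > tol and alpha[i] > 0):
                j = (i + 1) % n      # simplistic second-variable choice (placeholder)
                Ej = (alpha * y) @ K[:, j] + b - y[j]
                ai_old, aj_old = alpha[i], alpha[j]
                # Box constraints for alpha[j] implied by 0 <= alpha <= C
                if y[i] != y[j]:
                    L, H = max(0.0, aj_old - ai_old), min(C, C + aj_old - ai_old)
                else:
                    L, H = max(0.0, ai_old + aj_old - C), min(C, ai_old + aj_old)
                if L >= H:
                    continue
                eta = K[i, i] + K[j, j] - 2.0 * K[i, j]
                if eta <= 0:
                    continue
                # Closed-form solution of the two-variable restricted QP
                alpha[j] = np.clip(aj_old + y[j] * (Ei - Ej) / eta, L, H)
                if abs(alpha[j] - aj_old) < 1e-7:
                    continue
                alpha[i] = ai_old + y[i] * y[j] * (aj_old - alpha[j])
                # Update bias so the updated pair moves toward the KKT conditions
                b1 = b - Ei - y[i]*(alpha[i]-ai_old)*K[i, i] - y[j]*(alpha[j]-aj_old)*K[i, j]
                b2 = b - Ej - y[i]*(alpha[i]-ai_old)*K[i, j] - y[j]*(alpha[j]-aj_old)*K[j, j]
                b = (b1 + b2) / 2.0
                changed += 1
        if changed == 0:             # no KKT violator found: stop
            break
    w = (alpha * y) @ X
    return alpha, w, b
```

On a small linearly separable set, the returned `(w, b)` separates the classes and the equality constraint `sum(alpha * y) = 0` is preserved throughout, since each pairwise update moves `alpha[i]` and `alpha[j]` in compensating directions.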
Keywords :
convergence; learning (artificial intelligence); matrix decomposition; quadratic programming; support vector machines; decomposition learning methods; global convergence; iteration method; quadratic programming; support vector machines; Computer science; Convergence; Learning systems; Machine learning; Matrix decomposition; Neural networks; Pattern recognition; Quadratic programming; Signal processing algorithms; Support vector machines; Decomposition method; global convergence; quadratic programming (QP); support vector machines (SVMs); termination; Algorithms; Artificial Intelligence; Information Storage and Retrieval; Information Theory; Pattern Recognition, Automated; Programming, Linear; Signal Processing, Computer-Assisted;
fLanguage :
English
Journal_Title :
Neural Networks, IEEE Transactions on
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/TNN.2006.880584
Filename :
4012045
Link To Document :