DocumentCode
1263902
Title
A comparison of methods for multiclass support vector machines
Author
Hsu, Chih-Wei; Lin, Chih-Jen
Author_Institution
Dept. of Comput. Sci. & Inf. Eng., Nat. Taiwan Univ., Taipei, Taiwan
Volume
13
Issue
2
fYear
2002
fDate
3/1/2002
Firstpage
415
Lastpage
425
Abstract
Support vector machines (SVMs) were originally designed for binary classification. How to effectively extend them to multiclass classification is still an ongoing research issue. Several methods have been proposed in which a multiclass classifier is typically constructed by combining several binary classifiers. Some authors have also proposed methods that consider all classes at once. Because it is computationally more expensive to solve multiclass problems, comparisons of these methods on large-scale problems have not been seriously conducted. In particular, methods that solve the multiclass SVM in one step require a much larger optimization problem, so experiments have so far been limited to small data sets. In this paper we give decomposition implementations for two such "all-together" methods. We then compare their performance with three methods based on binary classification: "one-against-all," "one-against-one," and directed acyclic graph SVM (DAGSVM). Our experiments indicate that the "one-against-one" and DAG methods are more suitable for practical use than the other methods. Results also show that, for large problems, the methods that consider all data at once in general need fewer support vectors.
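Illustration (not part of the original record): the abstract compares multiclass strategies built from binary SVMs. The following minimal Python sketch shows the "one-against-one" and "one-against-all" constructions using scikit-learn; it is not the authors' decomposition implementation, and the data set, kernel, and parameter choices are assumptions chosen only for demonstration.

# Minimal sketch: one-against-one vs. one-against-all multiclass SVMs.
# Assumptions: scikit-learn is available; the iris data set and an RBF
# kernel with default-style parameters stand in for the paper's benchmarks.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.multiclass import OneVsOneClassifier, OneVsRestClassifier
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# One-against-one: trains k(k-1)/2 binary SVMs, one per pair of classes,
# and predicts by voting among them.
ovo = OneVsOneClassifier(SVC(kernel="rbf", C=1.0, gamma="scale"))
ovo.fit(X_train, y_train)

# One-against-all: trains k binary SVMs, each separating one class from
# the remaining classes, and predicts with the highest-scoring classifier.
ova = OneVsRestClassifier(SVC(kernel="rbf", C=1.0, gamma="scale"))
ova.fit(X_train, y_train)

print("one-against-one accuracy:", ovo.score(X_test, y_test))
print("one-against-all accuracy:", ova.score(X_test, y_test))

The "all-together" methods discussed in the paper instead solve a single larger optimization problem over all classes and are not shown here.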
Keywords
learning automata; pattern classification; DAGSVM; SVMs; binary classifiers; decomposition; directed acyclic graph SVM; multiclass classification; optimization; support vector machines; Computer science; Large-scale systems; Optimization methods; Support vector machine classification; Support vector machines; Training data
fLanguage
English
Journal_Title
IEEE Transactions on Neural Networks
Publisher
IEEE
ISSN
1045-9227
Type
jour
DOI
10.1109/72.991427
Filename
991427