DocumentCode :
3492575
Title :
Efficient reduction of support vectors in kernel-based methods
Author :
Kobayashi, Takumi ; Otsu, Nobuyuki
Author_Institution :
Nat. Inst. of Adv. Ind. Sci. & Technol., Tsukuba, Japan
fYear :
2009
fDate :
7-10 Nov. 2009
Firstpage :
2077
Lastpage :
2080
Abstract :
Kernel-based methods, e.g., the support vector machine (SVM), achieve high classification performance. However, classification becomes time-consuming as the number of support vectors defining the classifier increases. In this paper, we propose a method for reducing the computational cost of classification by kernel-based methods while retaining high performance. Using linear algebra on the kernel Gram matrix of the support vectors (SVs), the method efficiently prunes, at low computational cost, redundant SVs that are unnecessary for constructing the classifier. The pruning is guided by evaluating the performance of the SVM classifier formed by the reduced SV set. In classification experiments with SVM on various datasets, the feasibility of the evaluation criterion and the effectiveness of the proposed method are demonstrated.
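The abstract only outlines the approach, so the following is a rough illustrative sketch rather than the authors' algorithm: it greedily drops support vectors whose feature-space images are nearly spanned by the remaining SVs, judging redundancy from the kernel Gram matrix alone and folding the pruned weights back into the retained coefficients. The use of scikit-learn, an RBF kernel, the tolerance value, and the synthetic dataset are all assumptions made for the example.

```python
# Illustrative sketch only (not the paper's exact method): prune SVs whose
# feature-space images are nearly linear combinations of the retained ones,
# using only the kernel Gram matrix, and re-weight the retained SVs.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel

X, y = make_classification(n_samples=400, n_features=10, random_state=0)
gamma = 0.1
clf = SVC(kernel="rbf", gamma=gamma, C=1.0).fit(X, y)

sv = clf.support_vectors_            # support vectors, shape (n_sv, d)
alpha = clf.dual_coef_.ravel().copy()  # signed dual coefficients y_i * alpha_i
K = rbf_kernel(sv, sv, gamma=gamma)  # kernel Gram matrix of the SVs

# Naive greedy pruning (quadratic cost, for illustration only): repeatedly
# remove the SV with the smallest approximation residual in feature space.
keep = list(range(len(sv)))
tol = 1e-3
while len(keep) > 1:
    best_i, best_res, best_b, best_rest = None, np.inf, None, None
    for i in keep:
        rest = [j for j in keep if j != i]
        # Solve K_RR b = K_R,i: projection of phi(x_i) onto span{phi(x_j), j in R}.
        b = np.linalg.solve(K[np.ix_(rest, rest)] + 1e-10 * np.eye(len(rest)),
                            K[rest, i])
        res = K[i, i] - K[rest, i] @ b   # squared residual in the RKHS
        if res < best_res:
            best_i, best_res, best_b, best_rest = i, res, b, rest
    if best_res > tol:
        break
    # Fold the pruned SV's weight into the coefficients of the retained SVs.
    for b_j, j in zip(best_b, best_rest):
        alpha[j] += alpha[best_i] * b_j
    keep.remove(best_i)

print(f"reduced {len(sv)} SVs to {len(keep)}")

# Reduced decision function: f(x) = sum_{j in keep} alpha_j k(x, x_j) + b
def decision(Xq):
    Kq = rbf_kernel(Xq, sv[keep], gamma=gamma)
    return Kq @ alpha[keep] + clf.intercept_[0]

agree = np.mean(np.sign(decision(X)) == np.sign(clf.decision_function(X)))
print(f"agreement with full SVM decisions: {agree:.3f}")
```

In this sketch the pruning criterion is purely geometric (the residual of approximating a pruned SV in the span of the retained ones); the paper instead evaluates the classification performance of the reduced classifier, so the stopping rule shown here is a stand-in.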
Keywords :
matrix algebra; pattern classification; support vector machines; classification performances; computational cost; kernel Gram matrix; kernel-based methods; linear algebra; support vector machine; support vectors; Computational efficiency; Euclidean distance; Kernel; Large-scale systems; Linear algebra; Performance evaluation; Support vector machine classification; Support vector machines; Kernel-based method; Reduction of support vectors; Support vector machine;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Image Processing (ICIP), 2009 16th IEEE International Conference on
Conference_Location :
Cairo
ISSN :
1522-4880
Print_ISBN :
978-1-4244-5653-6
Electronic_ISBN :
1522-4880
Type :
conf
DOI :
10.1109/ICIP.2009.5414339
Filename :
5414339