Title :
Feature selection and fast training of subspace based support vector machines
Author :
Kitamura, Takuya ; Takeuchi, Syogo ; Abe, Shigeo
Author_Institution :
Grad. Sch. of Eng., Kobe Univ., Kobe, Japan
Abstract :
In this paper, we propose two methods for subspace-based support vector machines (SS-SVMs), which comprise subspace-based least squares support vector machines (SSLS-SVMs) and subspace-based linear programming support vector machines (SSLP-SVMs): 1) optimal selection of the dictionaries of each class subspace from the standpoint of class separability, and 2) speeding up the training of SS-SVMs. In the first method, for SSLS-SVMs we select the dictionaries with optimized weights, and for SSLP-SVMs we select the dictionaries without non-negative constraints. In the second method, the empirical feature space is obtained using only the training data belonging to a class instead of all the training data. Thus both the dimension of the empirical feature space and the training cost are reduced. We demonstrate the effectiveness of the proposed methods compared with the conventional method on two-class benchmark datasets.
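The abstract's second method builds the empirical feature space from a single class's training samples rather than the whole training set. The following Python sketch illustrates the general idea of a class-wise empirical feature map via the eigendecomposition of the class Gram matrix; the RBF kernel, the function names, and the whitening-style projection are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def rbf_kernel(A, B, gamma=1.0):
    # Gram matrix of the RBF kernel between rows of A and rows of B (assumed kernel choice).
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)

def class_empirical_feature_map(X_class, gamma=1.0, tol=1e-10):
    """Sketch of a mapping into the empirical feature space spanned by the
    training samples of one class (instead of all training data).

    Its dimension is at most the number of samples in X_class, so the
    per-class map is cheaper to build and use than one spanned by the
    full training set.
    """
    K = rbf_kernel(X_class, X_class, gamma)      # class Gram matrix
    eigval, eigvec = np.linalg.eigh(K)           # spectral decomposition
    keep = eigval > tol                          # drop near-zero directions
    # Projection Lambda^{-1/2} U^T applied to the kernel vector k(x, X_class)
    proj = eigvec[:, keep] / np.sqrt(eigval[keep])

    def phi(X):
        return rbf_kernel(X, X_class, gamma) @ proj

    return phi

# Usage example with synthetic data for one class
rng = np.random.default_rng(0)
X_pos = rng.normal(size=(50, 4))
phi_pos = class_empirical_feature_map(X_pos, gamma=0.5)
print(phi_pos(rng.normal(size=(5, 4))).shape)    # (5, rank of the class Gram matrix)
```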
Keywords :
least squares approximations; linear programming; support vector machines; SSLP-SVM; SSLS-SVM; feature selection; subspace-based least squares support vector machines; subspace-based linear programming support vector machines; Heart;
Conference_Titel :
The 2010 International Joint Conference on Neural Networks (IJCNN)
Conference_Location :
Barcelona
Print_ISBN :
978-1-4244-6916-1
DOI :
10.1109/IJCNN.2010.5596566