DocumentCode
3373267
Title
Unbiased support vector classifiers
Author
Navia-Vázquez, A.; Pérez-Cruz, F.; Artés-Rodríguez, A.; Figueiras-Vidal, A.R.
Author_Institution
DTSC, Univ. Carlos III de Madrid, Spain
fYear
2001
fDate
2001
Firstpage
183
Lastpage
192
Abstract
Support Vector Classifiers (SVC) are claimed to provide a natural mechanism for implementing Structural Risk Minimization (SRM), obtaining machines with good generalization capabilities. SVC leads to the optimal hyperplane (maximal margin) criterion for separable datasets but, in the nonseparable case, a functional with an additional term has to be minimized. The particular form of this extra term makes the minimization solvable via Quadratic Programming (QP), but it is only a rather coarse approximation to the number of errors. We propose an unbiased implementation of SVC by introducing a more appropriate "error counting" term. This way, the number of classification errors is truly minimized (hence the "unbiased" appellative), while the maximal margin solution is still obtained in the separable case. QP can no longer be used to solve the new minimization problem, so we instead apply an iterated Weighted Least Squares (WLS) procedure. Computer experiments show that the proposed method is superior to the classical approach in terms of both classification error and machine complexity.
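To make the abstract's contrast concrete, the following LaTeX sketch sets the classical soft-margin functional beside an error-counting variant. The first objective is the standard one; the exact form of the paper's "error counting" term is an assumption here, since the record only describes it qualitatively.

% Classical soft-margin SVC: solvable by QP, but the slack sum
% only upper-bounds the number of training errors.
\min_{w,\,b,\,\xi} \; \frac{1}{2}\|w\|^2 + C \sum_{i=1}^{n} \xi_i
\quad \text{s.t.} \quad y_i\,(w^\top \phi(x_i) + b) \ge 1 - \xi_i, \quad \xi_i \ge 0

% Assumed error-counting variant: \Theta is the Heaviside step, so the
% penalty counts only the true misclassifications (points with \xi_i \ge 1).
\min_{w,\,b,\,\xi} \; \frac{1}{2}\|w\|^2 + C \sum_{i=1}^{n} \Theta(\xi_i - 1)
\quad \text{s.t.} \quad y_i\,(w^\top \phi(x_i) + b) \ge 1 - \xi_i, \quad \xi_i \ge 0

Under this reading, the second objective is non-convex and non-differentiable, so QP no longer applies, which is consistent with the abstract's move to an iterated Weighted Least Squares solver.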
Keywords
computational complexity; learning automata; least squares approximations; quadratic programming; classification error; machine complexity; structural risk minimization; unbiased support vector classifiers; weighted least squares procedure
fLanguage
English
Publisher
ieee
Conference_Title
Neural Networks for Signal Processing XI, 2001. Proceedings of the 2001 IEEE Signal Processing Society Workshop
Conference_Location
North Falmouth, MA
ISSN
1089-3555
Print_ISBN
0-7803-7196-8
Type
conf
DOI
10.1109/NNSP.2001.943123
Filename
943123