DocumentCode
445975
Title
Fβ support vector machines
Author
Callut, Jérôme ; Dupont, Pierre
Author_Institution
Dept. of Comput. Sci. & Eng., Univ. Catholique de Louvain, Louvain-la-Neuve, Belgium
Volume
3
fYear
2005
fDate
31 July-4 Aug. 2005
Firstpage
1443
Abstract
In this paper we introduce Fβ SVMs, a new parametrization of support vector machines. It allows an SVM to be optimized in terms of Fβ, a classical information retrieval criterion, instead of the usual classification rate. Experiments illustrate the advantages of this approach over the traditional 2-norm soft-margin SVM when precision and recall are of unequal importance. An automatic model selection procedure based on the generalization Fβ score is introduced. It relies on the results of Chapelle, Vapnik et al. (2002) on the use of gradient-based techniques in SVM model selection. The derivatives of an Fβ loss function with respect to the hyperparameter C and the width σ of a Gaussian kernel are formally defined. The model is then selected by performing a gradient descent of the Fβ loss function over the set of hyperparameters. Experiments on artificial and real-life data show the benefits of this method when the Fβ score is the criterion of interest.
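The Fβ criterion referred to in the abstract is the standard information retrieval measure combining precision and recall, with β weighting recall relative to precision. A minimal sketch of that standard definition (the function name is illustrative, not from the paper):

```python
def f_beta(precision: float, recall: float, beta: float = 1.0) -> float:
    """Classical IR F-beta score: F_beta = (1 + beta^2) * P * R / (beta^2 * P + R).

    beta > 1 weights recall more heavily; beta < 1 favors precision.
    """
    if precision == 0.0 and recall == 0.0:
        return 0.0  # convention: undefined ratio treated as zero
    b2 = beta * beta
    return (1.0 + b2) * precision * recall / (b2 * precision + recall)

# With beta = 1, F1 is the harmonic mean of precision and recall.
print(f_beta(0.5, 0.5))        # 0.5
print(f_beta(0.8, 0.4, beta=2.0))
```

The paper's contribution is to make this score differentiable with respect to the SVM hyperparameters (C and the Gaussian kernel width σ) so it can be optimized directly by gradient descent.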
Keywords
support vector machines; Fβ loss function; Fβ support vector machines; automatic model selection; gradient descent method; gradient-based technique; information retrieval; polynomials; support vector machine classification; upper bound
fLanguage
English
Publisher
ieee
Conference_Titel
Proceedings of the 2005 IEEE International Joint Conference on Neural Networks (IJCNN '05)
Print_ISBN
0-7803-9048-2
Type
conf
DOI
10.1109/IJCNN.2005.1556087
Filename
1556087
Link To Document