DocumentCode :
2487495
Title :
A least square kernel machine with box constraints
Author :
Basak, Jayanta
Author_Institution :
IBM India Res. Lab., New Delhi
fYear :
2008
fDate :
8-11 Dec. 2008
Firstpage :
1
Lastpage :
4
Abstract :
In this paper, we present a least square kernel machine with box constraints (LSKMBC). Existing least square machines assume Gaussian hyperpriors and consequently express the optima of the regularized squared loss as a set of linear equations. The generalized LASSO framework deviates from the Gaussian-hyperprior assumption and employs the more general Huber loss function. In our approach, we instead consider uniform priors and derive the loss functional for a given margin, which is treated as a model-selection parameter. The framework not only differs from existing least square kernel machines but also does not require the Mercer condition to be satisfied. We experimentally validate the performance of the classifier and show that it outperforms SVM and LSSVM on certain real-life datasets.
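The abstract describes a least-squares kernel classifier whose coefficients are restricted to a box rather than obtained from an unconstrained linear system. The sketch below illustrates that general idea only: it minimizes a regularized squared loss over kernel-expansion coefficients subject to box constraints, solved with a bound-constrained quasi-Newton method. The objective, the bound `C`, the regularizer `lam`, and the RBF kernel are illustrative assumptions, not the paper's exact LSKMBC formulation (in particular, the paper's margin/model-selection parameter is not reproduced here).

```python
import numpy as np
from scipy.optimize import minimize

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix between row sets X and Y
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_box_ls_kernel(K, y, C=1.0, lam=1e-2):
    """Minimize ||y - K a||^2 + lam * a^T K a subject to the box
    -C <= a_i <= C. Illustrative stand-in for a box-constrained
    least-squares kernel machine; not the paper's exact objective."""
    n = len(y)

    def obj(a):
        r = y - K @ a
        return r @ r + lam * (a @ K @ a)

    def grad(a):
        return -2.0 * K @ (y - K @ a) + 2.0 * lam * (K @ a)

    res = minimize(obj, np.zeros(n), jac=grad, method="L-BFGS-B",
                   bounds=[(-C, C)] * n)
    return res.x

# Toy two-class problem with well-separated Gaussian blobs
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 0.3, (20, 2)),
               rng.normal(1.0, 0.3, (20, 2))])
y = np.array([-1.0] * 20 + [1.0] * 20)

K = rbf_kernel(X, X)
alpha = fit_box_ls_kernel(K, y)
pred = np.sign(K @ alpha)          # decision function on training points
acc = (pred == y).mean()
```

Because the optimizer only needs the kernel matrix and gradient evaluations, any symmetric similarity function can be plugged in, which loosely mirrors the paper's point that the method does not hinge on Mercer-condition satisfiability.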
Keywords :
Gaussian processes; least squares approximations; support vector machines; Gaussian hyperpriors; LSSVM; general Huber loss function; generalized LASSO framework; least square kernel machine with box constraints; linear equations; regularized squared loss; Bayesian methods; Equations; Gaussian distribution; Hilbert space; Kernel; Lagrangian functions; Least squares methods; Support vector machine classification; Support vector machines; Testing;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Pattern Recognition, 2008. ICPR 2008. 19th International Conference on
Conference_Location :
Tampa, FL
ISSN :
1051-4651
Print_ISBN :
978-1-4244-2174-9
Electronic_ISBN :
1051-4651
Type :
conf
DOI :
10.1109/ICPR.2008.4761717
Filename :
4761717