DocumentCode :
1356855
Title :
Feature Extraction Using Constrained Approximation and Suppression
Author :
Washizawa, Yoshikazu
Author_Institution :
Brain Sci. Inst., RIKEN, Wako, Japan
Volume :
21
Issue :
2
fYear :
2010
Firstpage :
201
Lastpage :
210
Abstract :
In this paper, we systematize a family of constrained quadratic classifiers that belong to the class of one-class classifiers. One-class classifiers, such as the single-class support vector machine and subspace methods, are widely used for pattern classification and detection problems because they have many advantages over binary classifiers. Within this framework, we interpret subspace methods as rank-constrained quadratic classifiers. We also introduce two constraints and a method of suppressing the effect of competing classes, which make these classifiers more accurate while retaining their advantages over binary classifiers. Experimental results demonstrate the advantages of our methods over conventional classifiers.
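As a hedged illustration of the rank-constrained quadratic view of subspace methods mentioned in the abstract (not code from the paper itself), the following minimal CLAFIC-style sketch scores a sample by the quadratic form x^T U U^T x, where U holds the top-r eigenvectors of the class autocorrelation matrix; the function names and toy data are hypothetical.

import numpy as np

def clafic_subspace(X, r):
    # Fit a CLAFIC-style class subspace from training vectors X (n_samples x d).
    # The r leading eigenvectors of the class autocorrelation matrix span the
    # subspace; the resulting classifier is the rank-r quadratic form
    # f(x) = x^T U U^T x = ||U^T x||^2.
    R = X.T @ X / X.shape[0]                         # class autocorrelation matrix (d x d)
    eigvals, eigvecs = np.linalg.eigh(R)
    U = eigvecs[:, np.argsort(eigvals)[::-1][:r]]    # top-r eigenvectors
    return U

def subspace_score(U, x):
    # One-class score: squared norm of the projection of x onto the subspace,
    # i.e. the quadratic classifier x^T P x with P = U U^T and rank(P) = r.
    return float(np.linalg.norm(U.T @ x) ** 2)

# Toy usage: accept x as belonging to the class if its score exceeds a
# threshold chosen on held-out data (threshold selection is not shown here).
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 10))
U = clafic_subspace(X_train, r=3)
print(subspace_score(U, X_train[0]))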
Keywords :
constraint handling; feature extraction; least squares approximation; pattern classification; principal component analysis; support vector machines; constrained approximation; constrained quadratic classifiers; constrained suppression; single-class support vector machine; subspace methods; class feature information compression (CLAFIC); rank reduction; regularization; Algorithms; Databases, Factual; Pattern Recognition, Automated; Reproducibility of Results
fLanguage :
English
Journal_Title :
IEEE Transactions on Neural Networks
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/TNN.2009.2034852
Filename :
5353616