DocumentCode :
1237092
Title :
A General Wrapper Approach to Selection of Class-Dependent Features
Author :
Wang, Lipo ; Zhou, Nina ; Chu, Feng
Author_Institution :
Sch. of Electr. & Electron. Eng., Nanyang Technol. Univ., Singapore
Volume :
19
Issue :
7
fYear :
2008
fDate :
7/1/2008
Firstpage :
1267
Lastpage :
1278
Abstract :
In this paper, we argue that for a C-class classification problem, C 2-class classifiers, each discriminating one class from all other classes and each having its own characteristic input feature subset, should in general outperform, or at least match the performance of, a C-class classifier with a single input feature subset. For each class, we select a desirable feature subset, i.e., the one that yields the lowest classification error rate for that class using a classifier and a given feature subset search algorithm. To compare all models fairly, we propose a weighting method for the class-dependent classifier, i.e., assigning a weight to each model's output before the comparison is carried out. The method's performance is evaluated on two artificial data sets and several real-world benchmark data sets, with the support vector machine (SVM) as the classifier, and with RELIEF, class separability, and minimal-redundancy-maximal-relevancy (mRMR) as attribute importance measures. Our results indicate that the class-dependent feature subsets found by our approach can effectively remove irrelevant or redundant features, while maintaining or improving (sometimes substantially) the classification accuracy, in comparison with other feature selection methods.
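The scheme described above can be sketched as a one-against-all ensemble in which each binary SVM sees only its own class-dependent feature subset and its output is weighted before the class with the largest weighted score is chosen. The following is a minimal sketch, assuming the per-class feature subsets have already been found by a wrapper search (e.g., over a RELIEF or mRMR ranking, not reproduced here) and using a placeholder uniform weighting; the paper's exact weighting rule is not given in the abstract.

import numpy as np
from sklearn.svm import SVC


class ClassDependentOVA:
    """One-against-all SVMs, each trained on its own feature subset."""

    def __init__(self, feature_subsets, weights=None):
        # feature_subsets: dict mapping class label -> list of feature indices
        # weights: dict mapping class label -> float (assumed uniform if None)
        self.feature_subsets = feature_subsets
        self.weights = weights
        self.models = {}

    def fit(self, X, y):
        for c, feats in self.feature_subsets.items():
            # One binary SVM per class, trained only on that class's features.
            clf = SVC(kernel="rbf")
            clf.fit(X[:, feats], (y == c).astype(int))
            self.models[c] = clf
        if self.weights is None:
            # Placeholder assumption: uniform weights for all binary models.
            self.weights = {c: 1.0 for c in self.models}
        return self

    def predict(self, X):
        # Weight each binary model's decision value, then pick the largest.
        labels = list(self.feature_subsets.keys())
        scores = np.column_stack([
            self.weights[c] * self.models[c].decision_function(X[:, feats])
            for c, feats in self.feature_subsets.items()
        ])
        return np.array([labels[i] for i in scores.argmax(axis=1)])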
Keywords :
error statistics; feature extraction; image classification; support vector machines; C 2-class classifiers; C-class classification problem; RELIEF; attribute importance measures; class separability; class-dependent features; classification error rate; feature selection methods; feature subset search algorithm; general wrapper approach; minimal-redundancy-maximal-relevancy; support vector machine; Accuracy; Data mining; Degradation; Error analysis; Feature extraction; Linear discriminant analysis; Neural networks; Principal component analysis; Support vector machine classification; Support vector machines; Class-dependent feature extraction; class-dependent feature selection; feature importance ranking; minimal-redundancy–maximal-relevancy (mRMR); one-against-all; one-against-one; support vector machine (SVM);
fLanguage :
English
Journal_Title :
Neural Networks, IEEE Transactions on
Publisher :
IEEE
ISSN :
1045-9227
Type :
jour
DOI :
10.1109/TNN.2008.2000395
Filename :
4531777