Title of article :
Integrated classifier hyperplane placement and feature selection
Author/Authors :
Chinneck, John W.
Issue Information :
Journal with serial number, year 2012
Pages :
11
From page :
8193
To page :
8203
Abstract :
The process of placing a separating hyperplane for data classification is normally disconnected from the process of selecting the features to use. A conceptually simple but computationally explosive approach to feature selection is to apply the hyperplane placement process to every possible subset of features and select the smallest set that provides reasonable classification accuracy. Two ways to speed up this process are (i) using a faster filtering criterion in place of a complete hyperplane placement, and (ii) using a greedy forward or backward sequential selection method. This paper introduces a new filtering criterion that is very fast: maximizing the drop in the sum of infeasibilities in a linear-programming transformation of the problem. It also shows how the linear-programming transformation can be applied to reduce the number of features after a separating hyperplane has already been placed, while maintaining the separation originally induced by that hyperplane. Finally, a new and highly effective integrated method that selects features while simultaneously placing the separating hyperplane is introduced.
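The filtering criterion described in the abstract rests on a linear-programming transformation in which elastic variables absorb any violation of the separating constraints, so that the optimal objective value is the "sum of infeasibilities" (zero exactly when the classes are linearly separable). The sketch below is a minimal illustration of that idea under stated assumptions, not the paper's formulation: the particular LP layout, the scipy solver, and the greedy loop are illustrative choices only.

```python
# Minimal sketch (assumption, not the paper's exact method): an elastic LP whose
# optimal objective is the "sum of infeasibilities" for a two-class dataset,
# and a greedy forward selection loop that uses the drop in that sum as a
# fast filtering criterion.  Requires numpy and scipy.
import numpy as np
from scipy.optimize import linprog

def sum_of_infeasibilities(X, y):
    """Solve  min sum(e)  s.t.  y_i (w . x_i - b) + e_i >= 1,  e_i >= 0,
    with w and b free.  X is (n, d), y is in {-1, +1}.
    The optimum is 0 iff the two classes are linearly separable."""
    n, d = X.shape
    # Decision vector z = [w (d), b (1), e (n)]; only the elastics enter the objective.
    c = np.concatenate([np.zeros(d + 1), np.ones(n)])
    # Constraint rewritten for linprog's A_ub z <= b_ub form:
    #   -y_i (x_i . w) + y_i b - e_i <= -1
    A_ub = np.hstack([-y[:, None] * X, y[:, None], -np.eye(n)])
    b_ub = -np.ones(n)
    bounds = [(None, None)] * (d + 1) + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.fun, res.x[:d], res.x[d]

def greedy_forward_selection(X, y, max_features):
    """Illustrative greedy forward selection: at each step add the feature
    giving the largest drop in the sum of infeasibilities."""
    selected, remaining = [], list(range(X.shape[1]))
    current = float("inf")
    while remaining and len(selected) < max_features:
        scores = {j: sum_of_infeasibilities(X[:, selected + [j]], y)[0]
                  for j in remaining}
        best = min(scores, key=scores.get)
        if scores[best] >= current:      # no further drop: stop
            break
        selected.append(best)
        remaining.remove(best)
        current = scores[best]
        if current == 0.0:               # subset already yields linear separation
            break
    return selected
```

One LP solve per candidate feature is far cheaper than a full hyperplane-placement run, which is the point of using the drop in the sum of infeasibilities as a filtering criterion; the paper's integrated method goes further by selecting features and placing the hyperplane in a single procedure.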
Keywords :
Classification, Feature selection, Linear programming, Maximum feasible subsystem
Journal title :
Expert Systems with Applications
Serial Year :
2012
Record number :
2352063