Title of article :
Nearest neighbor estimate of conditional mutual information in feature selection
Author/Authors :
Tsimpiris, Alkiviadis and Vlachos, Ioannis and Kugiumtzis, Dimitris
Issue Information :
Journal issue with serial year 2012
Pages :
12
From page :
12697
To page :
12708
Abstract :
Mutual information (MI) is used in feature selection to evaluate two key properties of optimal features: the relevance of a feature to the class variable and the redundancy among similar features. Conditional mutual information (CMI), i.e., the MI of a candidate feature with the class variable conditioned on the features already selected, is a natural extension of MI, but it has not been applied so far due to estimation complications for high-dimensional distributions. We propose a nearest neighbor estimate of CMI, appropriate for high-dimensional variables, and build an iterative scheme for sequential feature selection with a termination criterion, called CMINN. We show that CMINN is equivalent to MI-based feature selection filters, such as mRMR and MaxiMin, in the presence of only single-feature effects, and is more appropriate for combined feature effects. We compare CMINN to mRMR and MaxiMin on simulated datasets involving combined effects and confirm the superiority of CMINN in selecting the correct features (indicated also by the termination criterion) and achieving the best classification accuracy. The application to ten benchmark databases shows that CMINN obtains the same or higher classification accuracy compared to mRMR and MaxiMin with a smaller cardinality of the selected feature subset.
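To illustrate the idea described in the abstract, the sketch below shows a k-nearest-neighbor estimate of conditional mutual information I(X; Y | Z) in the Frenzel-Pompe / KSG style, together with a greedy forward-selection loop that picks, at each step, the feature with the largest estimated CMI to the class given the already selected features. This is a minimal illustration under stated assumptions, not the authors' exact CMINN implementation: the function names, the treatment of the class label as a numeric column, and the simple zero-threshold stopping rule are all hypothetical choices made for the example.

```python
import numpy as np
from scipy.special import digamma
from scipy.spatial import cKDTree


def cmi_knn(x, y, z, k=5):
    """k-NN estimate of I(X; Y | Z) (Frenzel-Pompe / KSG style, max-norm).

    x, y, z: arrays of shape (n_samples, n_dims); z may have 0 columns,
    in which case the plain mutual information I(X; Y) is estimated.
    """
    xz, yz, xyz = np.hstack([x, z]), np.hstack([y, z]), np.hstack([x, y, z])
    n = xyz.shape[0]

    # distance to the k-th nearest neighbor in the joint (X, Y, Z) space
    eps = cKDTree(xyz).query(xyz, k=k + 1, p=np.inf)[0][:, -1]

    def count_within(data, radii):
        # number of neighbors strictly inside each radius, excluding the point itself
        tree = cKDTree(data)
        return np.array([len(tree.query_ball_point(p, r - 1e-12, p=np.inf))
                         for p, r in zip(data, radii)]) - 1

    n_xz = count_within(xz, eps)
    n_yz = count_within(yz, eps)
    if z.shape[1] > 0:
        n_z = count_within(z, eps)
        return digamma(k) + np.mean(digamma(n_z + 1)
                                    - digamma(n_xz + 1) - digamma(n_yz + 1))
    # empty conditioning set: standard KSG estimate of I(X; Y)
    return digamma(k) + digamma(n) + np.mean(-digamma(n_xz + 1) - digamma(n_yz + 1))


def sequential_cmi_selection(X, y, k=5, threshold=0.0, max_features=None):
    """Greedy forward selection driven by the CMI estimate above.

    At each step the feature maximizing I(feature; class | selected) is added;
    selection stops when the best CMI falls to `threshold` (a hypothetical
    stand-in for the termination criterion mentioned in the abstract).
    """
    n, d = X.shape
    y2 = y.reshape(-1, 1).astype(float)
    selected, remaining = [], list(range(d))
    while remaining and (max_features is None or len(selected) < max_features):
        Z = X[:, selected] if selected else np.empty((n, 0))
        best_score, best_j = max((cmi_knn(X[:, [j]], y2, Z, k), j) for j in remaining)
        if best_score <= threshold:
            break
        selected.append(best_j)
        remaining.remove(best_j)
    return selected
```

A usage note: on a feature matrix `X` of shape (n_samples, n_features) and class labels `y`, `sequential_cmi_selection(X, y, k=5)` returns the indices of the chosen features in selection order; the conditioning on previously selected features is what distinguishes this scheme from plain MI ranking filters.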
Keywords :
Conditional mutual information , MaxiMin , Nearest neighbor estimate , Feature selection , Classification , mRMR
Journal title :
Expert Systems with Applications
Serial Year :
2012
Record number :
2352710