DocumentCode :
3724161
Title :
Efficient Approximate Solutions to Mutual Information Based Global Feature Selection
Author :
Hemanth Venkateswara;Prasanth Lade;Binbin Lin;Jieping Ye;Sethuraman Panchanathan
Author_Institution :
Arizona State Univ., Tempe, AZ, USA
Year :
2015
Firstpage :
1009
Lastpage :
1014
Abstract :
Mutual Information (MI) is often used for feature selection when developing classifier models. Estimating the MI for a subset of features is often intractable. We demonstrate that, under assumptions of conditional independence, the MI for a subset of features can be expressed in terms of the Conditional Mutual Information (CMI) between pairs of features. However, selecting the features with the highest CMI turns out to be a hard combinatorial problem. In this work, we apply two distinct global methods, the Truncated Power Method (TPower) and Low Rank Bilinear Approximation (LowRank), to solve the feature selection problem. These algorithms provide good approximations to the NP-hard CMI-based feature selection problem. We experimentally demonstrate the effectiveness of these procedures across multiple datasets and compare them with existing MI-based global and iterative feature selection procedures.
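To illustrate the kind of procedure the abstract describes, below is a minimal sketch of a truncated power iteration for selecting k features by approximately maximizing x^T Q x subject to a sparsity constraint, where Q would hold pairwise mutual-information scores. This is an assumption-based illustration of the general TPower technique, not the authors' implementation; the matrix values and function name `tpower` are hypothetical.

```python
def tpower(Q, k, iters=100):
    """Hypothetical sketch: truncated power iteration that keeps only the
    k largest entries of the iterate at each step, so the support of the
    final vector indicates the selected features."""
    n = len(Q)
    # start from a uniform unit vector
    x = [1.0 / n ** 0.5] * n
    for _ in range(iters):
        # power step: y = Q x
        y = [sum(Q[i][j] * x[j] for j in range(n)) for i in range(n)]
        # truncate: zero out all but the k largest-magnitude entries
        top = set(sorted(range(n), key=lambda i: abs(y[i]), reverse=True)[:k])
        y = [y[i] if i in top else 0.0 for i in range(n)]
        # renormalize to unit length
        norm = sum(v * v for v in y) ** 0.5
        if norm == 0.0:
            break
        x = [v / norm for v in y]
    # selected features = support of the sparse vector
    return [i for i, v in enumerate(x) if v != 0.0]

# toy symmetric "MI-score" matrix (illustrative values):
# features 0 and 2 score highly together
Q = [[1.0, 0.1, 0.9],
     [0.1, 1.0, 0.1],
     [0.9, 0.1, 1.0]]
print(sorted(tpower(Q, k=2)))  # expect features 0 and 2
```

On this toy matrix the iteration quickly locks onto the pair of features with the largest combined score, which is the behavior a global approximation method exploits in place of exhaustive subset search.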
Keywords :
"Approximation methods","Random variables","Entropy","Mutual information","Computational modeling","Estimation"
Publisher :
ieee
Conference_Titel :
2015 IEEE International Conference on Data Mining (ICDM)
ISSN :
1550-4786
Type :
conf
DOI :
10.1109/ICDM.2015.140
Filename :
7373427