DocumentCode :
506634
Title :
Mutual information based on Renyi's entropy feature selection
Author :
Liu Can-Tao ; Hu Bao-Gang
Author_Institution :
Nat. Lab. of Pattern Recognition, Chinese Acad. of Sci., Beijing, China
Volume :
1
fYear :
2009
fDate :
20-22 Nov. 2009
Firstpage :
816
Lastpage :
820
Abstract :
The feature selection problem has become a focus of pattern classification research, and mutual information plays an increasingly important role in feature selection algorithms. We propose a normalized mutual information feature selection (NMIFS) method based on Renyi's quadratic entropy, which reduces computational complexity by relying on an efficient estimation of the mutual information. We then combine NMIFS with a wrapper into a two-stage feature selection algorithm, which helps to find a more characteristic feature subset. We perform experiments comparing the efficiency and classification accuracy with other MI-based feature selection algorithms. Results show that our method leads to a promising improvement in computational complexity.
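The abstract describes two ingredients: an efficient Parzen-window estimate of Renyi's quadratic entropy and a greedy, normalized-MI filter stage that is later combined with a wrapper. The sketch below is not the authors' implementation; it illustrates the general shape of such a method. The histogram-based normalized MI, the kernel width sigma, the bin count, and the selection criterion (relevance minus mean redundancy) are illustrative assumptions standing in for the paper's Renyi-entropy-based estimator, and all function names are hypothetical.
```python
# Minimal sketch of (1) a Parzen-window estimator of Renyi's quadratic entropy
# and (2) a greedy NMIFS-style forward selection using normalized mutual
# information. The histogram MI is a stand-in, not the paper's estimator.
import numpy as np


def renyi_quadratic_entropy(x, sigma=0.5):
    """H2(X) = -log of the information potential, estimated with a Gaussian
    Parzen window: V = (1/N^2) * sum_ij G(x_i - x_j; 2*sigma^2)."""
    x = np.asarray(x, dtype=float).ravel()
    diff = x[:, None] - x[None, :]
    two_var = 2.0 * sigma ** 2
    kernel = np.exp(-diff ** 2 / (2.0 * two_var)) / np.sqrt(2.0 * np.pi * two_var)
    return -np.log(kernel.mean())


def normalized_mi(a, b, bins=8):
    """Histogram-based normalized MI: I(A;B) / min(H(A), H(B)).
    (A common normalization, used here only as an illustrative stand-in.)"""
    joint, _, _ = np.histogram2d(a, b, bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    nz = pxy > 0
    mi = np.sum(pxy[nz] * np.log(pxy[nz] / (px[:, None] * py[None, :])[nz]))
    h = lambda p: -np.sum(p[p > 0] * np.log(p[p > 0]))
    return mi / max(min(h(px), h(py)), 1e-12)


def nmifs_select(X, y, k):
    """Greedy filter stage: at each step pick the feature maximizing
    relevance to the class minus mean normalized redundancy with the
    already-selected features."""
    n_features = X.shape[1]
    relevance = [normalized_mi(X[:, j], y) for j in range(n_features)]
    selected, remaining = [], list(range(n_features))
    while len(selected) < k and remaining:
        scores = []
        for j in remaining:
            redundancy = (np.mean([normalized_mi(X[:, j], X[:, s]) for s in selected])
                          if selected else 0.0)
            scores.append(relevance[j] - redundancy)
        best = remaining[int(np.argmax(scores))]
        selected.append(best)
        remaining.remove(best)
    return selected


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    y = rng.integers(0, 2, size=200)
    X = np.column_stack([y + 0.3 * rng.normal(size=200),   # informative
                         y + 0.3 * rng.normal(size=200),   # redundant copy
                         rng.normal(size=200)])            # pure noise
    print("H2 of feature 0:", round(renyi_quadratic_entropy(X[:, 0]), 3))
    print("Selected features:", nmifs_select(X, y, k=2))
```
In the paper's two-stage design, a wrapper (e.g., cross-validated classifier accuracy) would then refine the candidate subset produced by this filter stage.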
Keywords :
computational complexity; entropy; pattern classification; Renyi's quadratic entropy feature selection; computational complexity; normalized mutual information; pattern classification; Automation; Computational complexity; Computer science; Degradation; Entropy; Filters; Laboratories; Machine learning algorithms; Mutual information; Pattern recognition; NMIFS; Renyi Entropy; estimation of entropy; feature selection; mutual information;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
2009 IEEE International Conference on Intelligent Computing and Intelligent Systems (ICIS 2009)
Conference_Location :
Shanghai
Print_ISBN :
978-1-4244-4754-1
Electronic_ISBN :
978-1-4244-4738-1
Type :
conf
DOI :
10.1109/ICICISYS.2009.5358033
Filename :
5358033