DocumentCode :
2053618
Title :
When classifier selection meets information theory: A unifying view
Author :
Hady, Mohamed Farouk Abdel ; Schwenker, Friedhelm ; Palm, Günther
Author_Institution :
Inst. of Neural Inf. Process., Univ. of Ulm, Ulm, Germany
fYear :
2010
fDate :
7-10 Dec. 2010
Firstpage :
314
Lastpage :
319
Abstract :
Classifier selection aims to reduce the size of an ensemble of classifiers in order to improve its efficiency and classification accuracy. Recently, an information-theoretic view was presented for feature selection. It derives a space of possible selection criteria and shows that several feature selection criteria from the literature are points within this continuous space. The contribution of this paper is to export this information-theoretic view to solve an open issue in ensemble learning, namely classifier selection. We investigate several information-theoretic selection criteria that are used to rank classifiers.
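Illustrative sketch (not from the record): one way such information-theoretic ranking can be realized is an mRMR-style greedy selection, scoring each classifier by the mutual information between its predictions and the true labels, penalized by its redundancy with already-selected members. The helper name `select_classifiers`, the `classifiers` list, and the validation set `(X_val, y_val)` are hypothetical; scikit-learn is assumed available.

```python
# Hedged sketch of information-theoretic classifier selection (mRMR-style),
# not the authors' exact criteria: relevance I(prediction; label) minus
# average redundancy with members already chosen for the pruned ensemble.
import numpy as np
from sklearn.metrics import mutual_info_score


def select_classifiers(classifiers, X_val, y_val, k):
    """Greedily pick k ensemble members from fitted `classifiers`."""
    preds = [clf.predict(X_val) for clf in classifiers]          # predictions on held-out data
    relevance = [mutual_info_score(y_val, p) for p in preds]     # I(prediction; label)

    selected, remaining = [], list(range(len(classifiers)))
    while remaining and len(selected) < k:
        def score(i):
            if not selected:
                return relevance[i]
            # redundancy: average mutual information with already-selected members
            redundancy = np.mean(
                [mutual_info_score(preds[i], preds[j]) for j in selected]
            )
            return relevance[i] - redundancy
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return [classifiers[i] for i in selected]
```

Different points in the continuous space of criteria correspond to different weightings of the relevance and redundancy terms; the sketch above fixes them at equal weight purely for illustration.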
Keywords :
feature extraction; learning (artificial intelligence); pattern classification; classifier selection; ensemble learning; feature selection; information theory; Accuracy; Decision trees; Entropy; Joints; Mutual information; Random variables; Redundancy; classification; data mining; decision trees; ensemble learning; ensemble pruning; information theory;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
Soft Computing and Pattern Recognition (SoCPaR), 2010 International Conference of
Conference_Location :
Paris
Print_ISBN :
978-1-4244-7897-2
Type :
conf
DOI :
10.1109/SOCPAR.2010.5686645
Filename :
5686645