DocumentCode :
506898
Title :
Derivations of Normalized Mutual Information in Binary Classifications
Author :
Wang, Yong ; Hu, Bao Gang
Author_Institution :
Beijing Grad. Sch., Chinese Acad. of Sci., Beijing, China
Volume :
1
fYear :
2009
fDate :
14-16 Aug. 2009
Firstpage :
155
Lastpage :
163
Abstract :
Although conventional performance indexes such as accuracy are commonly used in classifier selection and evaluation, information-based criteria such as mutual information are becoming popular in feature and model selection. In this work, we analyze a classifier learning model that maximizes the normalized mutual information (NI) criterion, which is novel and well defined over a compact range for classifier evaluation. We derive closed-form relations between normalized mutual information and accuracy, precision, and recall in binary classifications. By exploring these relations, we show that NI is in fact a set of nonlinear functions, sharing a concordant power-exponent form, of each performance index. The relations can also be expressed in terms of precision and recall, or of false-alarm rate and hit rate (recall).
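As a concrete illustration of the quantity under study, the sketch below computes a normalized mutual information score from a binary confusion matrix, assuming the common normalization NI = I(T;Y)/H(T) (mutual information between the true labels T and the predictions Y, divided by the entropy of T); the paper's exact definition and its closed-form relations to accuracy, precision, and recall may differ, so this is an illustrative assumption rather than the authors' formulation.

```python
import numpy as np

def normalized_mutual_information(confusion):
    """confusion: 2x2 count matrix, rows = true class, columns = predicted class.
    Assumes NI = I(T;Y) / H(T); the paper's normalization may differ."""
    p = np.asarray(confusion, dtype=float)
    p = p / p.sum()                    # joint distribution p(t, y)
    pt = p.sum(axis=1, keepdims=True)  # marginal p(t), shape (2, 1)
    py = p.sum(axis=0, keepdims=True)  # marginal p(y), shape (1, 2)

    # Mutual information I(T; Y); zero cells contribute nothing (0 * log 0 = 0).
    prod = pt @ py                     # outer product of marginals, shape (2, 2)
    mask = p > 0
    mi = np.sum(p[mask] * np.log2(p[mask] / prod[mask]))

    # Entropy of the true labels, H(T).
    ht = -np.sum(pt[pt > 0] * np.log2(pt[pt > 0]))
    return mi / ht

# Example: TN=80, FP=10, FN=5, TP=5 (first row = true negatives, second row = true positives).
cm = [[80, 10],
      [5, 5]]
print(normalized_mutual_information(cm))  # a value in [0, 1]
```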
Keywords :
information theory; nonlinear functions; pattern classification; performance index; binary classification; classifier evaluation; classifier learning model; classifier selection; information-based criteria; normalized mutual information maximization; power-exponent form; Bayesian methods; Bonding; Entropy; Fuzzy systems; Information analysis; Laboratories; Machine learning; Mutual information; Pattern recognition; Performance analysis; model evaluation; normalized mutual information
fLanguage :
English
Publisher :
IEEE
Conference_Title :
Sixth International Conference on Fuzzy Systems and Knowledge Discovery (FSKD '09), 2009
Conference_Location :
Tianjin
Print_ISBN :
978-0-7695-3735-1
Type :
conf
DOI :
10.1109/FSKD.2009.342
Filename :
5358633