DocumentCode :
3661380
Title :
Linear discriminant analysis with an information divergence criterion
Author :
Matthew Emigh;Evan Kriminger;José C. Príncipe
Author_Institution :
Department of Electrical and Computer Engineering, University of Florida, Gainesville, 32611, United States
fYear :
2015
fDate :
7/1/2015 12:00:00 AM
Firstpage :
1
Lastpage :
6
Abstract :
Linear discriminant analysis seeks to find a one-dimensional projection of a dataset to alleviate the problems associated with classifying high-dimensional data. The earliest methods, based on second-order statistics, often fail on multimodal datasets. Information-theoretic criteria do not suffer in such cases, and they allow for projections to spaces of more than one dimension and with multiple classes. These approaches are based on maximizing the mutual information between the projected data and the labels. However, mutual information is computationally demanding and vulnerable to class imbalance. In this paper we propose an information-theoretic criterion for learning discriminants based on the Euclidean distance divergence between classes. This objective more directly seeks projections that separate the classes and performs well in the presence of class imbalance. We demonstrate its effectiveness on real datasets, and we provide extensions to the multi-class and multi-dimensional cases.
Keywords :
"Euclidean distance","Ionosphere"
Publisher :
ieee
Conference_Titel :
Neural Networks (IJCNN), 2015 International Joint Conference on
Electronic_ISBN :
2161-4407
Type :
conf
DOI :
10.1109/IJCNN.2015.7280693
Filename :
7280693