Title :
Discriminative components of data
Author :
Peltonen, Jaakko ; Kaski, Samuel
Author_Institution :
Neural Networks Res. Center, Helsinki Univ. of Technol., Finland
Abstract :
A simple probabilistic model is introduced to generalize classical linear discriminant analysis (LDA) in finding components that are informative of, or relevant for, data classes. The components maximize the predictability of the class distribution, which is asymptotically equivalent to 1) maximizing mutual information with the classes and 2) finding principal components in the so-called learning or Fisher metrics. The Fisher metric measures only distances that are relevant to the classes, that is, distances that cause changes in the class distribution. The components have applications in data exploration, visualization, and dimensionality reduction. In empirical experiments, the method outperformed both more classical methods and a Rényi entropy-based alternative, at essentially equivalent computational cost.
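The core idea in the abstract, finding a linear component that maximizes the predictability of the class distribution, can be illustrated with a minimal sketch. This is not the authors' estimator from the paper; it simply fits a logistic model on a one-dimensional projection by gradient ascent on the class log-likelihood, with toy data and all parameter choices (learning rate, iteration count) assumed for illustration.

```python
import numpy as np

# Hedged sketch: learn a 1-D linear component w that makes the class
# labels as predictable as possible from the projected data z = Xw,
# in the spirit of the abstract's probabilistic generalization of LDA.
# A logistic model P(class 1 | z) is fit jointly with w by plain
# gradient ascent on the Bernoulli log-likelihood.

rng = np.random.default_rng(0)

# Toy data: two Gaussian classes that differ only along axis 0.
n = 200
X = np.vstack([rng.normal([0, 0], 1.0, (n, 2)),
               rng.normal([3, 0], 1.0, (n, 2))])
y = np.repeat([0, 1], n)

w = rng.normal(size=2)          # projection direction (the component)
a, b = 1.0, 0.0                 # logistic scale and bias on the projection
lr = 0.05
for _ in range(500):
    z = X @ w                                 # projected coordinate
    p = 1.0 / (1.0 + np.exp(-(a * z + b)))    # P(class 1 | z)
    g = y - p                                 # d log-likelihood / d logit
    a += lr * np.mean(g * z)
    b += lr * np.mean(g)
    w += lr * a * (X.T @ g) / len(y)
    w /= np.linalg.norm(w)                    # keep the component unit-length

direction = w
print(direction)
```

Since the two classes differ only along the first axis, the learned unit direction should end up nearly aligned with it; the maximum-likelihood objective here plays the role of the class-predictability criterion described in the abstract.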
Keywords :
entropy; learning (artificial intelligence); probability; Fisher metrics; data discriminative components; generalize classical linear discriminant analysis; simple probabilistic model; Computational efficiency; Covariance matrix; Data analysis; Data visualization; Entropy; Information analysis; Linear discriminant analysis; Mutual information; Neural networks; Predictive models; Component model; discriminant analysis; exploratory data analysis; learning metrics; mutual information; Algorithms; Artificial Intelligence; Cluster Analysis; Computing Methodologies; Databases, Factual; Discriminant Analysis; Information Storage and Retrieval; Models, Statistical; Neural Networks (Computer); Pattern Recognition, Automated
Journal_Title :
Neural Networks, IEEE Transactions on
DOI :
10.1109/TNN.2004.836194