Title :
A Semi-definite Positive Linear Discriminant Analysis and Its Applications
Author :
Deguang Kong; Chris Ding
Author_Institution :
Dept. of Comput. Sci. & Eng., Univ. of Texas at Arlington, Arlington, TX, USA
Abstract :
Linear Discriminant Analysis (LDA) is widely used for dimension reduction in classification tasks. However, the standard LDA formulation is not semi-definite positive (s.d.p.), and it is therefore difficult to obtain a globally optimal solution when standard LDA is combined with other loss functions or with graph embedding. In this paper, we present an alternative approach to LDA. We rewrite the LDA criterion as a convex formulation (semi-definite positive LDA, i.e., sdpLDA) using the largest eigenvalue of the generalized eigenvalue problem of standard LDA. We give applications by incorporating sdpLDA as a regularization term into discriminant regression analysis. Another application incorporates sdpLDA into standard Laplacian embedding, utilizing the supervised information to improve Laplacian embedding performance. The proposed sdpLDA formulation can be used for both multi-class and multi-label classification tasks. Extensive experimental results on 10 multi-class datasets demonstrate the promise of the proposed method.
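The core construction described above, forming an s.d.p. matrix from the largest generalized eigenvalue of the standard LDA problem, can be sketched as follows. This is a minimal illustration only, assuming the standard between-class and within-class scatter definitions; the function names and the small regularization term are illustrative choices, not taken from the paper:

```python
import numpy as np
from scipy.linalg import eigh

def scatter_matrices(X, y):
    """Between-class (Sb) and within-class (Sw) scatter for data X (n, d), labels y."""
    mean = X.mean(axis=0)
    d = X.shape[1]
    Sw = np.zeros((d, d))
    Sb = np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mc = Xc.mean(axis=0)
        Sw += (Xc - mc).T @ (Xc - mc)
        diff = (mc - mean).reshape(-1, 1)
        Sb += len(Xc) * (diff @ diff.T)
    return Sb, Sw

def sdp_lda_matrix(X, y, reg=1e-6):
    """Build M = lam_max * Sw - Sb, which is semi-definite positive by construction.

    lam_max is the largest eigenvalue of the generalized problem Sb w = lam Sw w,
    so w^T Sb w <= lam_max * w^T Sw w for all w, making M >= 0. A convex term
    like trace(W^T M W) can then be combined with other convex losses.
    """
    Sb, Sw = scatter_matrices(X, y)
    Sw = Sw + reg * np.eye(Sw.shape[0])  # ridge term keeps Sw positive definite
    lam_max = eigh(Sb, Sw, eigvals_only=True)[-1]  # eigenvalues in ascending order
    M = lam_max * Sw - Sb
    return M, lam_max
```

Checking that the smallest eigenvalue of `M` is (numerically) non-negative verifies the s.d.p. property that makes the reformulation amenable to convex optimization.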
Keywords :
convex programming; data analysis; eigenvalues and eigenfunctions; graph theory; pattern classification; regression analysis; LDA criterion; classification tasks; convex formulation; dimension reduction; discriminant regression analysis; eigenvalue; global optimal solution; graph embedding; sdpLDA; semi-definite positive linear discriminant analysis; standard Laplacian embedding; Accuracy; Convex functions; Eigenvalues and eigenfunctions; Kernel; Laplace equations; Linear regression; Standards; LDA; kernel LDA; multi-class; multi-label; semi-definite positive;
Conference_Titel :
2012 IEEE 12th International Conference on Data Mining (ICDM)
Conference_Location :
Brussels
Print_ISBN :
978-1-4673-4649-8
DOI :
10.1109/ICDM.2012.111