Author_Institution :
Sch. of Comput. Sci. & Technol., Huazhong Univ. of Sci. & Technol., Wuhan, China
Abstract :
In this paper, we study discriminative analysis of symmetric positive definite (SPD) matrices on Lie groups (LGs), namely, transforming an LG into a dimension-reduced one by optimizing data separability. In particular, we take the space of SPD matrices, e.g., covariance matrices, as a concrete example of LGs, which has proved to be a powerful tool for high-order image feature representation. The discriminative transformation of an LG is achieved by optimizing the within-class compactness as well as the between-class separability based on the popular graph embedding framework. A new kernel, based on the geodesic distance between two samples in the dimension-reduced LG, is then defined and fed into classical kernel-based classifiers, e.g., the support vector machine, for various visual classification tasks. Extensive experiments on five public datasets, i.e., Scene-15, Caltech101, UIUC-Sport, MIT-Indoor, and VOC07, demonstrate the effectiveness of discriminative analysis for SPD matrices on LGs, and state-of-the-art performance is reported.
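To make the kernel construction concrete: the abstract does not specify which geodesic metric is used, so the sketch below assumes the common log-Euclidean distance on SPD matrices, d(A, B) = ||log(A) − log(B)||_F, and a Gaussian kernel built on it. The function names, the bandwidth `sigma`, and the toy covariance data are all illustrative, not from the paper.

```python
import numpy as np

def spd_logm(M):
    # Matrix logarithm of an SPD matrix via its eigendecomposition:
    # M = V diag(w) V^T  =>  log(M) = V diag(log w) V^T
    w, V = np.linalg.eigh(M)
    return (V * np.log(w)) @ V.T

def log_euclidean_dist(A, B):
    # Geodesic distance under the log-Euclidean metric (an assumption;
    # the paper's exact metric is not stated in the abstract)
    return np.linalg.norm(spd_logm(A) - spd_logm(B), "fro")

def geodesic_kernel(A, B, sigma=1.0):
    # Gaussian kernel on the geodesic distance; usable as a
    # precomputed kernel for an SVM
    d = log_euclidean_dist(A, B)
    return np.exp(-d**2 / (2.0 * sigma**2))

# Toy example: covariance descriptors of two random feature sets,
# regularized to stay strictly positive definite
rng = np.random.default_rng(0)
A = np.cov(rng.standard_normal((50, 3)), rowvar=False) + 1e-6 * np.eye(3)
B = np.cov(rng.standard_normal((50, 3)), rowvar=False) + 1e-6 * np.eye(3)
print(log_euclidean_dist(A, B), geodesic_kernel(A, B))
```

In practice one would evaluate this kernel over all training pairs to form a Gram matrix and pass it to a kernel classifier (e.g., an SVM with a precomputed kernel); the log-Euclidean choice has the convenient property that the resulting Gaussian kernel is positive definite.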
Keywords :
Lie groups; Lie group (LG); symmetric positive definite (SPD) matrices; covariance matrices; discriminative analysis; graph embedding; geodesic distance; kernel-based classifiers; feature extraction; image representation; matrix algebra; data handling; manifolds; visual classification