DocumentCode :
553120
Title :
Semi-supervised geometric mean of Kullback-Leibler divergences for subspace selection
Author :
Si-Bao Chen ; Hai-Xian Wang ; Xing-Yi Zhang ; Bin Luo
Author_Institution :
Key Laboratory of Intelligent Computing & Signal Processing of Ministry of Education, Anhui University, Hefei, China
Volume :
2
fYear :
2011
fDate :
26-28 July 2011
Firstpage :
1232
Lastpage :
1235
Abstract :
Subspace selection is widely adopted in many areas of pattern recognition. A recent method, maximizing the geometric mean of Kullback-Leibler (KL) divergences of class pairs (MGMD), is a successful approach to subspace selection that significantly alleviates the class separation problem. In many applications, however, labeled data are very limited while unlabeled data are easily obtained, and estimating the divergences of class pairs from inadequate labeled data is unstable. To exploit unlabeled data for subspace selection, semi-supervised MGMD (SSMGMD) is proposed, which uses the graph Laplacian as a normalization term. A quasi-Newton method is adopted to solve the resulting optimization problem. Experiments on synthetic data and real image data demonstrate the validity of SSMGMD.
Keywords :
geometry; optimisation; pattern recognition; Kullback-Leibler divergence; quasi-Newton method; SSMGMD; class separation problem; graph Laplacian; optimization problem; semi-supervised MGMD; semi-supervised geometric mean; subspace selection; Covariance matrix; Educational institutions; Laplace equations; Manifolds; Optimization; Symmetric matrices; Training;
fLanguage :
English
Publisher :
IEEE
Conference_Title :
2011 Eighth International Conference on Fuzzy Systems and Knowledge Discovery (FSKD)
Conference_Location :
Shanghai
Print_ISBN :
978-1-61284-180-9
Type :
conf
DOI :
10.1109/FSKD.2011.6019712
Filename :
6019712