DocumentCode :
1797454
Title :
Linear Subspace Learning via sparse dimension reduction
Author :
Ming Yin ; Yi Guo ; Junbin Gao
Author_Institution :
Sch. of Autom., Guangdong Univ. of Technol., Guangzhou, China
fYear :
2014
fDate :
6-11 July 2014
Firstpage :
3540
Lastpage :
3547
Abstract :
Linear Subspace Learning (LSL) has been widely used in many areas of information processing, such as dimensionality reduction, data mining, pattern recognition and computer vision. Recent years have witnessed several notable extensions of PCA in LSL. One is the recent L1-norm maximization principal component analysis (L1Max-PCA), which aims to learn a linear subspace efficiently. L1Max-PCA mimics PCA by replacing the covariance-based dispersion with the so-called L1-norm dispersion in the mapped feature space. However, its solution is difficult to interpret intuitively. In this paper, a novel subspace learning approach based on sparse dimension reduction is proposed, which enforces sparsity on the mapped data to better recover cluster structures. The optimization problem is solved efficiently via the Alternating Direction Method (ADM). Experimental results show that the proposed method is effective in subspace learning.
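The abstract describes learning a linear subspace while enforcing sparsity on the projected data. As an illustration only (not the authors' exact ADM formulation, whose details are not given here), the flavor of such an approach can be sketched as alternating minimization of an L1-regularized reconstruction objective: with the projection held fixed, the sparse codes are obtained by soft-thresholding; with the codes fixed, the orthonormal projection is updated via an orthogonal Procrustes step. All function names and the parameter `lam` below are illustrative assumptions:

```python
import numpy as np

def soft_threshold(A, t):
    # Elementwise soft-thresholding: the proximal operator of the L1 norm.
    return np.sign(A) * np.maximum(np.abs(A) - t, 0.0)

def sparse_subspace(X, k, lam=0.1, n_iter=50, seed=0):
    """Illustrative alternating minimization (an assumption, not the paper's
    exact algorithm) for
        min_{W, Y} ||X - W Y||_F^2 + lam * ||Y||_1,  subject to W^T W = I,
    where X is a (d, n) data matrix with samples as columns and k is the
    target subspace dimension."""
    rng = np.random.default_rng(seed)
    d, n = X.shape
    # Random orthonormal initialization of the projection basis.
    W, _ = np.linalg.qr(rng.standard_normal((d, k)))
    for _ in range(n_iter):
        # Y-step: with orthonormal W fixed, the L1-regularized least-squares
        # solution is a soft-thresholded projection of the data.
        Y = soft_threshold(W.T @ X, lam / 2.0)
        # W-step: orthogonal Procrustes update, W = U V^T from the SVD
        # of X Y^T, which maximizes tr(W^T X Y^T) over orthonormal W.
        U, _, Vt = np.linalg.svd(X @ Y.T, full_matrices=False)
        W = U @ Vt
    return W, Y
```

Increasing `lam` drives more entries of the mapped data `Y` to exactly zero, which is the sparsity property the abstract argues helps recover cluster structure.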
Keywords :
image classification; learning (artificial intelligence); pattern clustering; ADM; alternating direction method; cluster structure recovery; face image classification; linear subspace learning; optimization problem; sparse dimension reduction; Databases; Educational institutions; Face; Minimization; Optimization; Principal component analysis; Robustness; Alternating Direction Method; L1-norm; Subspace learning; principal component analysis (PCA);
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Neural Networks (IJCNN), 2014 International Joint Conference on
Conference_Location :
Beijing
Print_ISBN :
978-1-4799-6627-1
Type :
conf
DOI :
10.1109/IJCNN.2014.6889461
Filename :
6889461