Title :
Low-Rank Transfer Subspace Learning
Author :
Ming Shao ; Claris Castillo ; Zhenghong Gu ; Yun Fu
Author_Institution :
Northeastern University, Boston, MA, USA
Abstract :
One of the most important challenges in machine learning is performing effective learning when only limited training data are available. In many settings, however, sufficient training data are available from other, related domains (the source). Transfer learning aims to transfer knowledge learned from a source domain to a target domain by handling the subtle differences between the two. In this paper, we propose a novel framework that solves this knowledge transfer problem via low-rank representation constraints. This is achieved by finding an optimal subspace in which each datum in the target domain can be linearly represented by the corresponding subspace in the source domain. Extensive experiments on several databases, i.e., Yale B, CMU PIE, and UB Kin Face, validate the effectiveness of the proposed approach and show its superiority over existing, well-established methods.
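The core constraint described in the abstract — each target datum linearly represented by source data through a low-rank coefficient matrix — can be illustrated with a small NumPy sketch. This is a toy example, not the authors' implementation: it uses synthetic data in a shared subspace and relies on the known closed-form result that, for noiseless feasible data, the minimum-nuclear-norm solution of X_t = X_s Z is the pseudoinverse solution (Liu et al., LRR).

```python
import numpy as np

# Toy illustration (assumption: synthetic data, not the paper's setup).
rng = np.random.default_rng(0)

d, n_s, n_t, r = 20, 30, 25, 3                 # feature dim, sample counts, subspace rank
basis = rng.standard_normal((d, r))            # shared r-dimensional subspace
X_s = basis @ rng.standard_normal((r, n_s))    # source samples in the subspace
X_t = basis @ rng.standard_normal((r, n_t))    # target samples in the same subspace

# Minimum-nuclear-norm coefficients satisfying X_t = X_s @ Z:
# for noiseless data this is the pseudoinverse solution.
Z = np.linalg.pinv(X_s) @ X_t

# Each target sample is reconstructed exactly from source samples,
# and Z is low-rank (rank r, far below min(n_s, n_t)).
print(np.allclose(X_s @ Z, X_t))
print(np.linalg.matrix_rank(Z, tol=1e-8))
```

In the paper's actual formulation the data are additionally projected into a learned subspace and corrupted samples are handled via a sparse error term; the sketch above only shows why a low-rank Z couples the two domains.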
Keywords :
database management systems; learning (artificial intelligence); CMU PIE database; UB Kin Face database; Yale B database; knowledge transfer problem; low-rank representation constraint; machine learning; training data; transfer subspace learning; Convergence; Databases; Knowledge transfer; Learning systems; Machine learning; Principal component analysis; Silicon; domain adaptation; low-rank; transfer learning;
Conference_Titel :
2012 IEEE 12th International Conference on Data Mining (ICDM)
Conference_Location :
Brussels, Belgium
Print_ISBN :
978-1-4673-4649-8
DOI :
10.1109/ICDM.2012.102