DocumentCode
3328384
Title
Generalized Domain-Adaptive Dictionaries
Author
Shekhar, Sumit ; Patel, Vishal M. ; Nguyen, Hien V. ; Chellappa, Rama
Author_Institution
Univ. of Maryland, College Park, MD, USA
fYear
2013
fDate
23-28 June 2013
Firstpage
361
Lastpage
368
Abstract
Data-driven dictionaries have produced state-of-the-art results in various classification tasks. However, when the target data has a different distribution than the source data, the learned sparse representation may not be optimal. In this paper, we investigate whether it is possible to optimally represent both source and target with a common dictionary. Specifically, we describe a technique that jointly learns projections of the data in the two domains and a latent dictionary that can succinctly represent both domains in the projected low-dimensional space. An efficient optimization technique is presented, which can be easily kernelized and extended to multiple domains. The algorithm is further modified to learn a common discriminative dictionary, which can then be used for classification. The proposed approach requires no explicit correspondence between the source and target domains and performs well even when only a few labels are available in the target domain. Various recognition experiments show that the method performs on par with, or better than, competing state-of-the-art methods.
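The abstract describes an alternating scheme: learn per-domain projections together with a latent dictionary that sparsely codes both domains in a shared low-dimensional space. The NumPy sketch below illustrates that general idea only and is not the paper's algorithm; the OMP coder, PCA initialization, least-squares dictionary update, Procrustes-style projection step, and all parameter names (latent_dim, n_atoms, n_nonzero, n_iters) are illustrative assumptions.

import numpy as np

def sparse_code_omp(D, Y, n_nonzero):
    # Orthogonal Matching Pursuit, column by column: each code uses at most
    # n_nonzero dictionary atoms.
    X = np.zeros((D.shape[1], Y.shape[1]))
    for j in range(Y.shape[1]):
        y = Y[:, j]
        residual = y.copy()
        support = []
        coef = np.zeros(0)
        for _ in range(n_nonzero):
            idx = int(np.argmax(np.abs(D.T @ residual)))
            if idx not in support:
                support.append(idx)
            coef, *_ = np.linalg.lstsq(D[:, support], y, rcond=None)
            residual = y - D[:, support] @ coef
        X[support, j] = coef
    return X

def learn_shared_dictionary(Y1, Y2, latent_dim=20, n_atoms=40,
                            n_nonzero=3, n_iters=10, seed=0):
    # Y1 (d1 x n1) and Y2 (d2 x n2) are feature-by-sample matrices; assumes
    # d1, d2, n1, n2 >= latent_dim.
    rng = np.random.default_rng(seed)
    # Initialize each projection with the domain's top principal directions.
    P1 = np.linalg.svd(Y1, full_matrices=False)[0][:, :latent_dim].T
    P2 = np.linalg.svd(Y2, full_matrices=False)[0][:, :latent_dim].T
    D = rng.standard_normal((latent_dim, n_atoms))
    D /= np.linalg.norm(D, axis=0)
    n1 = Y1.shape[1]
    for _ in range(n_iters):
        # (a) Pool the projected data and compute shared sparse codes.
        Z = np.hstack([P1 @ Y1, P2 @ Y2])
        X = sparse_code_omp(D, Z, n_nonzero)
        # (b) Dictionary update: least-squares fit to the pooled codes,
        # followed by atom renormalization.
        D = Z @ np.linalg.pinv(X)
        D /= np.maximum(np.linalg.norm(D, axis=0), 1e-12)
        # (c) Projection update: Procrustes-style refit of each domain to its
        # latent reconstruction D @ X_i, keeping the rows of P_i orthonormal.
        X1, X2 = X[:, :n1], X[:, n1:]
        U, _, Vt = np.linalg.svd(Y1 @ X1.T @ D.T, full_matrices=False)
        P1 = Vt.T @ U.T
        U, _, Vt = np.linalg.svd(Y2 @ X2.T @ D.T, full_matrices=False)
        P2 = Vt.T @ U.T
    return P1, P2, D

For feature-by-sample matrices Y1 and Y2, learn_shared_dictionary(Y1, Y2) returns the two projections and the shared dictionary. The discriminative variant mentioned in the abstract would additionally couple the sparse codes to class labels, and the kernelized and multi-domain extensions would replace the linear projections and add further domain terms.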
Keywords
data structures; dictionaries; learning (artificial intelligence); classification tasks; common dictionary; common discriminative dictionary; data projection learning; data-driven dictionaries; generalized domain-adaptive dictionaries; learned sparse representation; optimization technique; projected low-dimensional space; source data; target data; target domain; Cost function; Dictionaries; Joints; Kernel; Robustness; Sparse matrices
fLanguage
English
Publisher
ieee
Conference_Title
2013 IEEE Conference on Computer Vision and Pattern Recognition (CVPR)
Conference_Location
Portland, OR, USA
ISSN
1063-6919
Type
conf
DOI
10.1109/CVPR.2013.53
Filename
6618897