DocumentCode :
3166330
Title :
Accelerating Active Learning with Transfer Learning
Author :
Kale, David ; Liu, Yan
Author_Institution :
Dept. of Comput. Sci., Univ. of Southern California, Los Angeles, CA, USA
fYear :
2013
fDate :
7-10 Dec. 2013
Firstpage :
1085
Lastpage :
1090
Abstract :
Active learning, transfer learning, and related techniques are unified by a core theme: efficient and effective use of available data. Active learning offers scalable solutions for building effective supervised learning models while minimizing annotation effort. Transfer learning utilizes existing labeled data from one task to help learn related tasks for which limited labeled data are available. There has been limited research, however, on how to combine these two techniques. In this paper, we present a simple and principled transfer active learning framework that leverages pre-existing labeled data from related tasks to improve the performance of an active learner. We derive an intuitive bound on the generalization error of the classifiers learned by this algorithm, which provides insight into the algorithm's behavior and the problem in general. Experimental results on several well-known transfer learning data sets confirm our theoretical analysis and demonstrate the effectiveness of our approach.
Keywords :
learning (artificial intelligence); pattern classification; active learning; classifiers generalization error; labeled data; supervised learning models; transfer learning; Acceleration; Algorithm design and analysis; Labeling; Query processing; Supervised learning; Training; Upper bound; Active Learning; Learning Theory; Machine Learning; Transfer Learning;
fLanguage :
English
Publisher :
ieee
Conference_Titel :
2013 IEEE 13th International Conference on Data Mining (ICDM)
Conference_Location :
Dallas, TX, USA
ISSN :
1550-4786
Type :
conf
DOI :
10.1109/ICDM.2013.160
Filename :
6729602