DocumentCode :
1309582
Title :
When Does Cotraining Work in Real Data?
Author :
Du, Jun ; Ling, Charles X. ; Zhou, Zhi-Hua
Author_Institution :
Dept. of Comput. Sci., Univ. of Western Ontario, London, ON, Canada
Volume :
23
Issue :
5
fYear :
2011
fDate :
5/1/2011 12:00:00 AM
Firstpage :
788
Lastpage :
799
Abstract :
Cotraining, a paradigm of semisupervised learning, promises to effectively alleviate the shortage of labeled examples in supervised learning. The standard two-view cotraining requires the data set to be described by two views of features, and previous studies have shown that cotraining works well if the two views satisfy the sufficiency and independence assumptions. In practice, however, these two assumptions are often not known or ensured (even when the two views are given). More commonly, most supervised data sets are described by one set of attributes (one view); thus, they need to be split into two views in order to apply the standard two-view cotraining. In this paper, we first propose a novel approach to empirically verify the two assumptions of cotraining given two views. Then, we design several methods to split single-view data sets into two views, in order to make cotraining work reliably well. Our empirical results show that, given a whole or a large labeled training set, our view verification and splitting methods are quite effective. Unfortunately, cotraining is called for precisely when the labeled training set is small. We show that, given small labeled training sets, the two cotraining assumptions are difficult to verify and view splitting is unreliable. Our conclusions on cotraining's effectiveness are therefore mixed. If two views are given and known to satisfy the two assumptions, cotraining works well. Otherwise, based on small labeled training sets, verifying the assumptions or splitting a single view into two views is unreliable; thus, it is uncertain whether standard cotraining will work.
Keywords :
learning (artificial intelligence); labeled training set; semisupervised learning; two-view cotraining; view splitting method; view verification method; Semisupervised learning; cotraining; independence assumption; single-view; sufficiency assumption; view splitting
fLanguage :
English
Journal_Title :
Knowledge and Data Engineering, IEEE Transactions on
Publisher :
ieee
ISSN :
1041-4347
Type :
jour
DOI :
10.1109/TKDE.2010.158
Filename :
5560662