Title :
Boosting crowdsourcing with expert labels: Local vs. global effects
Author :
Qiang Liu; Alexander Ihler; John Fisher
Date :
7/1/2015
Abstract :
Crowdsourcing provides an inexpensive yet effective approach to large-scale data and information collection. However, human judgments are inherently noisy, ambiguous, and sometimes biased, and should be calibrated with additional (usually much more expensive) expert or ground-truth labels. In this work, we study the optimal allocation of true labels to best calibrate the crowdsourced labels. We frame the problem as a submodular optimization and propose a greedy allocation strategy that exhibits an interesting trade-off between a local effect, which encourages acquiring true labels for the most uncertain items, and a global effect, which favors true labels for the most “influential” items, whose information can propagate to help predict other items. We show that exploiting and monitoring the global effect yields a significantly better selection strategy, and also provides potentially valuable information for other tasks such as designing stopping rules.
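The greedy allocation pattern described in the abstract can be illustrated with a minimal sketch. This is not the paper's actual objective: the submodular information measure over crowdsourced labels is not reproduced here, and the `neighbors` structure and `coverage` function below are stand-in assumptions used only to show how greedy selection of marginal gains favors "influential" items whose information propagates to others (the global effect).

```python
# Illustrative sketch only: a generic monotone submodular coverage
# objective standing in for the paper's information measure.

def greedy_select(items, gain, budget):
    """Pick `budget` items, each step taking the largest marginal gain."""
    selected = []
    for _ in range(budget):
        best = max((i for i in items if i not in selected),
                   key=lambda i: gain(selected + [i]) - gain(selected))
        selected.append(best)
    return selected

# Hypothetical toy data: each item "reveals" a set of neighbors whose
# labels it helps predict; an item with many uncovered neighbors is
# more "influential" in the sense of the abstract's global effect.
neighbors = {
    "a": {"a", "b", "c"},   # influential: propagates to b and c
    "b": {"b"},
    "c": {"c"},
    "d": {"d", "e"},
    "e": {"e"},
}

def coverage(chosen):
    covered = set()
    for i in chosen:
        covered |= neighbors[i]
    return len(covered)

picked = greedy_select(list(neighbors), coverage, budget=2)
# Greedy first picks "a" (marginal gain 3), then "d" (marginal gain 2),
# preferring influential items over the singletons b, c, e.
```

Because the stand-in objective is monotone submodular, this greedy strategy enjoys the classic (1 − 1/e) approximation guarantee, which is what makes greedy allocation a natural choice in this setting.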
Keywords :
"Crowdsourcing","Approximation methods","Entropy","Uncertainty","Resource management","Electronic mail","Optimization"
Conference_Titel :
2015 18th International Conference on Information Fusion (Fusion)