  • DocumentCode
    3060125
  • Title
    An Entropic Score to Rank Annotators for Crowdsourced Labeling Tasks
  • Author
    Raykar, Vikas C.; Yu, Shipeng
  • Author_Institution
    Siemens Healthcare, Malvern, PA, USA
  • fYear
    2011
  • fDate
    15-17 Dec. 2011
  • Firstpage
    29
  • Lastpage
    32
  • Abstract
    With the advent of crowdsourcing services it has become cheap and reasonably effective to get a dataset labeled by multiple annotators in a short amount of time. Various methods have been proposed to estimate the consensus labels by correcting for the bias of annotators with different kinds of expertise. Often we have low-quality annotators or spammers -- annotators who assign labels randomly (e.g., without actually looking at the instance). Spammers can drive up the cost of acquiring labels and can potentially degrade the quality of the consensus labels. In this paper we propose a score (based on the reduction in entropy) that can be used to rank the annotators -- with spammers having a score close to zero and good annotators having a score close to one.
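    The abstract states only that the score is "based on the reduction in entropy"; the sketch below illustrates one natural reading of that idea -- the normalized mutual information I(Y; A)/H(Y) between the true labels Y and an annotator's labels A, which is near zero for an annotator who labels at random and near one for a near-perfect annotator. The function names and the use of known true labels (in practice one would substitute the estimated consensus labels) are illustrative assumptions, not the paper's exact formulation.

    ```python
    import numpy as np

    def entropy(p):
        """Shannon entropy (in bits) of a discrete distribution."""
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def entropic_score(true_labels, annotator_labels):
        """Normalized reduction in entropy of the true label Y after
        observing the annotator's label A:
            (H(Y) - H(Y|A)) / H(Y) = I(Y; A) / H(Y).
        Close to 0 for a spammer labeling at random, 1 for a perfect annotator."""
        y = np.asarray(true_labels)
        a = np.asarray(annotator_labels)
        n = y.size
        # Marginal entropy H(Y) from the empirical label frequencies.
        _, counts = np.unique(y, return_counts=True)
        h_y = entropy(counts / n)
        # Conditional entropy H(Y|A) = sum over a of P(A=a) * H(Y|A=a).
        h_y_given_a = 0.0
        for val in np.unique(a):
            mask = a == val
            _, c = np.unique(y[mask], return_counts=True)
            h_y_given_a += (mask.sum() / n) * entropy(c / mask.sum())
        return (h_y - h_y_given_a) / h_y

    # A 99%-accurate annotator scores near 1; a random spammer scores near 0.
    rng = np.random.default_rng(0)
    truth = rng.integers(0, 2, 10_000)
    good = np.where(rng.random(10_000) < 0.99, truth, 1 - truth)
    spammer = rng.integers(0, 2, 10_000)
    print(entropic_score(truth, good))     # roughly 0.9
    print(entropic_score(truth, spammer))  # roughly 0.0
    ```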
  • Keywords
    data handling; entropy; learning (artificial intelligence); annotator ranking; crowdsourced labeling tasks; crowdsourcing services; entropic score; spammers; Accuracy; Entropy; Estimation; Gold; Integrated circuits; Labeling; Uncertainty; crowdsourcing; entropic score; ranking annotators
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Title
    2011 Third National Conference on Computer Vision, Pattern Recognition, Image Processing and Graphics (NCVPRIPG)
  • Conference_Location
    Hubli, Karnataka, India
  • Print_ISBN
    978-1-4577-2102-1
  • Type
    conf
  • DOI
    10.1109/NCVPRIPG.2011.14
  • Filename
    6132993