• DocumentCode
    25869
  • Title
    Creating Experts From the Crowd: Techniques for Finding Workers for Difficult Tasks
  • Author
    Gottlieb, Luke ; Friedland, Gerald ; Choi, Jang-Young ; Kelm, Pascal ; Sikora, Thomas
  • Author_Institution
    Audio & Multimedia Res. Directive, Int. Comput. Sci. Inst., Berkeley, CA, USA
  • Volume
    16
  • Issue
    7
  • fYear
    2014
  • fDate
    Nov. 2014
  • Firstpage
    2075
  • Lastpage
    2079
  • Abstract
    Crowdsourcing is currently used for a range of applications, either by exploiting unsolicited user-generated content, such as spontaneously annotated images, or by utilizing explicit crowdsourcing platforms such as Amazon Mechanical Turk to mass-outsource artificial-intelligence-type jobs. However, crowdsourcing is most often seen as the best option only for tasks that require no more of workers than ordinary human intuition. This article describes our methods for identifying workers for crowdsourced tasks that are difficult for both machines and humans. It discusses the challenges we encountered in qualifying annotators and the steps we took to select the individuals most likely to do well at these tasks.
  • Keywords
    social networking (online); video signal processing; Amazon Mechanical Turk; annotators; artificial-intelligence-type jobs; crowdsourcing; multimodal location estimation; social media video; unsolicited user-generated content; Cities and towns; Crowdsourcing; Estimation; Reliability; Tutorials; Videos; Visualization; Annotation; cheat detection; crowdsourcing; mechanical turk; multimodal location estimation; qualification
  • fLanguage
    English
  • Journal_Title
    Multimedia, IEEE Transactions on
  • Publisher
    IEEE
  • ISSN
    1520-9210
  • Type
    jour
  • DOI
    10.1109/TMM.2014.2347268
  • Filename
    6877717
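For use in citation managers, the fields above map onto a BibTeX entry roughly as follows (a sketch assembled only from this record; the entry key is arbitrary, and the journal name is expanded from the record's inverted form):

```bibtex
@article{gottlieb2014creating,
  author  = {Gottlieb, Luke and Friedland, Gerald and Choi, Jang-Young
             and Kelm, Pascal and Sikora, Thomas},
  title   = {Creating Experts From the Crowd: Techniques for Finding
             Workers for Difficult Tasks},
  journal = {IEEE Transactions on Multimedia},
  year    = {2014},
  month   = nov,
  volume  = {16},
  number  = {7},
  pages   = {2075--2079},
  issn    = {1520-9210},
  doi     = {10.1109/TMM.2014.2347268}
}
```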