• DocumentCode
    266109
  • Title

    Exploring relevance assessment using crowdsourcing for faceted and ambiguous queries

  • Author

    Ravana, Sri Devi ; Samimi, Parnia ; Dabir Ashtyani, Parisa

  • Author_Institution
    Dept. of Inf. Syst., Univ. of Malaya, Kuala Lumpur, Malaysia
  • fYear
    2014
  • fDate
    27-29 Aug. 2014
  • Firstpage
    771
  • Lastpage
    776
  • Abstract
    Relevance assessments are usually generated by human experts, which can be a time-consuming, difficult, and potentially expensive process. Recently, crowdsourcing has been proposed as a fast and cheap method for making relevance assessments in a semi-automatic way. However, previous work on the limits of crowdsourcing in IR evaluation remains inadequate and needs further investigation, especially given the varying nature of queries used during Web search. In this study, we observed the responses from crowdsourced workers and experts in assessing the judgments, compared the agreement between the relevance judgments made by an expert assessor and a crowdsourced worker, and finally explored the consistency in system ranking when these two sets of relevance judgments were used to score the systems. Two commercial search engines were compared using two different types of queries, namely faceted and ambiguous queries. In general, both sets of judgments ranked the systems in the same order, although the absolute system scores varied slightly. However, the findings show that the type of query used does influence the agreement between assessors and the system performance measure.
  • Keywords
    Internet; query processing; search engines; Web search; ambiguous queries; crowdsourcing; faceted queries; information retrieval; judgment assessment; relevance assessment; Computer science; Google; Information systems; evaluation;
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Titel
    Science and Information Conference (SAI), 2014
  • Conference_Location
    London
  • Print_ISBN
    978-0-9893-1933-1
  • Type
    conf
  • DOI
    10.1109/SAI.2014.6918273
  • Filename
    6918273