  • DocumentCode
    3549045
  • Title
    Evaluating image retrieval
  • Author
    Shirahatti, Nikhil V.; Barnard, Kobus
  • Author_Institution
    Electr. & Comput. Eng., Arizona Univ., Tucson, AZ, USA
  • Volume
    1
  • fYear
    2005
  • fDate
    20-25 June 2005
  • Firstpage
    955
  • Abstract
    We present a comprehensive strategy for evaluating image retrieval algorithms. Because automated image retrieval is only meaningful in its service to people, performance characterization must be grounded in human evaluation. Thus we have collected a large data set of human evaluations of retrieval results, both for query by image example and query by text. The data is independent of any particular image retrieval algorithm and can be used to evaluate and compare many such algorithms without further data collection. The data and calibration software are available on-line. We develop and validate methods for generating sensible evaluation data, calibrating for disparate evaluators, mapping image retrieval system scores to the human evaluation results, and comparing retrieval systems. We demonstrate the process by providing grounded comparison results for several algorithms.
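    (An illustrative code sketch of this evaluation pipeline follows the record below.)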
  • Keywords
    human factors; image retrieval; performance evaluation; automated image retrieval; disparate evaluators; evaluation data; human evaluation; image query; image retrieval algorithm; performance characterization; Biomedical imaging; Calibration; Computer Society; Computer science; Computer vision; Content based retrieval; Humans; Image retrieval; Information retrieval; Vocabulary
  • fLanguage
    English
  • Publisher
    IEEE
  • Conference_Title
    2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR 2005)
  • ISSN
    1063-6919
  • Print_ISBN
    0-7695-2372-2
  • Type
    conf
  • DOI
    10.1109/CVPR.2005.147
  • Filename
    1467369
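
The abstract outlines a four-step pipeline: collect human relevance judgments, calibrate for disparate evaluators, map retrieval system scores onto the calibrated judgments, and compare systems. The sketch below illustrates that flow in Python under stated assumptions: per-evaluator z-scoring stands in for the paper's calibration step, and a simple binned monotonic lookup stands in for its score-to-judgment mapping. All function names and the toy data are hypothetical; this is not the authors' released data or calibration software.

```python
# Hypothetical sketch of the evaluation pipeline described in the abstract:
# calibrate disparate human evaluators, map system scores to calibrated
# human judgments, and read off a grounded quality estimate. The z-scoring
# and binned lookup are illustrative stand-ins, not the paper's method.
from statistics import mean, pstdev

def calibrate(ratings_by_evaluator):
    """Z-score each evaluator's ratings so different rating scales become
    comparable, then average per item across evaluators.
    ratings_by_evaluator: {evaluator: {item_id: raw_rating}}"""
    per_item = {}
    for ratings in ratings_by_evaluator.values():
        values = list(ratings.values())
        mu, sigma = mean(values), pstdev(values) or 1.0
        for item, r in ratings.items():
            per_item.setdefault(item, []).append((r - mu) / sigma)
    return {item: mean(zs) for item, zs in per_item.items()}

def fit_score_mapping(system_scores, human_scores, n_bins=5):
    """Build a monotonic lookup from system score to expected human
    judgment: sort items by system score, average human scores per bin."""
    items = sorted(system_scores, key=system_scores.get)
    size = max(1, len(items) // n_bins)
    bins = []
    for i in range(0, len(items), size):
        chunk = items[i:i + size]
        upper = system_scores[chunk[-1]]
        bins.append((upper, mean(human_scores[it] for it in chunk)))

    def predict(score):
        for upper, h in bins:
            if score <= upper:
                return h
        return bins[-1][1]
    return predict

if __name__ == "__main__":
    # Toy data: two evaluators rate four retrieved images on different scales.
    raw = {"e1": {"a": 4, "b": 2, "c": 5, "d": 1},
           "e2": {"a": 7, "b": 5, "c": 9, "d": 3}}
    human = calibrate(raw)
    # Hypothetical similarity scores from one retrieval system.
    scores = {"a": 0.8, "b": 0.4, "c": 0.9, "d": 0.1}
    predict = fit_score_mapping(scores, human, n_bins=2)
    print(predict(0.85))  # predicted calibrated human judgment
```

Any monotonic regression (isotonic regression, for instance) could replace the binned lookup; the point the sketch makes is only that system scores are interpreted through calibrated human judgments rather than raw ratings, which is what lets different retrieval algorithms be compared on common ground.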