Encyclopedia of Database Systems

2018 Edition
| Editors: Ling Liu, M. Tamer Özsu

Multimedia Retrieval Evaluation

  • Thijs Westerveld
Reference work entry
DOI: https://doi.org/10.1007/978-1-4614-8265-9_235


Multimedia retrieval evaluation is the activity of measuring the effectiveness of one or more multimedia search techniques: how well the results retrieved by a system or technique satisfy the information need described by a topic. A common way of evaluating multimedia retrieval systems is to compare them to each other in community-wide benchmarks. In such benchmarks, participants are invited to submit their retrieval results for a given set of topics, the relevance of the submitted items is judged, and effectiveness measures are reported for each submission. Efficiency of the techniques is typically not taken into account, but may be studied separately.
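The benchmark workflow described above — a submitted ranking per topic, judged relevance, and an effectiveness measure per submission — can be illustrated with a minimal sketch. Average precision is one standard effectiveness measure; the item identifiers, run, and relevance judgments below are purely hypothetical, not drawn from any real benchmark.

```python
# Sketch of benchmark-style evaluation: score one topic's submitted
# ranking against a set of judged-relevant items ("qrels").
# All identifiers below are illustrative.

def average_precision(ranked_ids, relevant_ids):
    """Average of precision@k over the ranks where relevant items occur,
    divided by the total number of judged-relevant items."""
    relevant_ids = set(relevant_ids)
    if not relevant_ids:
        return 0.0
    hits = 0
    precision_sum = 0.0
    for k, item_id in enumerate(ranked_ids, start=1):
        if item_id in relevant_ids:
            hits += 1
            precision_sum += hits / k  # precision at rank k
    return precision_sum / len(relevant_ids)

# One topic's submitted ranking and its judged-relevant items (hypothetical).
run = ["v12", "v07", "v33", "v02", "v19"]
qrels = {"v07", "v02"}

score = average_precision(run, qrels)  # relevant items at ranks 2 and 4
```

Averaging this score over all topics in the benchmark yields mean average precision (MAP), a measure commonly reported when comparing submissions.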

Historical Background

Until the mid-1990s, no commonly used evaluation methodology existed for multimedia retrieval. An important reason for this is that the field has merely been a...



Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  1. Teezir Search Solutions, Ede, Netherlands

Section editors and affiliations

  • Jeffrey Xu Yu
  1. The Chinese University of Hong Kong, Hong Kong, China