Synonyms
Evaluation of XML retrieval effectiveness; Performance metrics
Definition
An evaluation metric is used to evaluate the effectiveness of information retrieval systems and to justify theoretical and/or pragmatic developments of these systems. It consists of a set of measures that follow a common underlying evaluation methodology.
There are many metrics that can be used to evaluate the effectiveness of structured text retrieval systems. These metrics are based on different evaluation assumptions, incorporate different hypotheses of the expected user behavior, and implement their own evaluation methodologies to handle the level of overlap among the units of retrieval.
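For illustration, most of these metrics build on the classic set-based notions of precision (the fraction of retrieved units that are relevant) and recall (the fraction of relevant units that are retrieved). The following is a minimal sketch of these two base measures; the function name and inputs are illustrative, not taken from the entry, and real structured-retrieval metrics extend this idea to handle overlapping retrieval units and graded relevance:

```python
def precision_recall(retrieved, relevant):
    """Compute set-based precision and recall.

    retrieved: iterable of retrieved unit identifiers (e.g., XML element paths)
    relevant:  iterable of unit identifiers judged relevant
    """
    retrieved, relevant = set(retrieved), set(relevant)
    hits = retrieved & relevant  # relevant units that were actually retrieved
    precision = len(hits) / len(retrieved) if retrieved else 0.0
    recall = len(hits) / len(relevant) if relevant else 0.0
    return precision, recall


# Example: two units retrieved, one of the two relevant units found
p, r = precision_recall(["/article[1]/sec[1]", "/article[1]/sec[2]"],
                        ["/article[1]/sec[2]", "/article[1]/sec[3]"])
# p == 0.5, r == 0.5
```

Note that treating units as an unordered set, as above, ignores both ranking and the overlap problem (e.g., retrieving a section and one of its paragraphs double-counts the same text), which is precisely what the specialized XML retrieval metrics discussed in this entry are designed to address.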
Historical Background
Over the past five years, the Initiative for the Evaluation of XML Retrieval (INEX) has investigated various aspects of structured text retrieval, focusing in particular on XML retrieval. Major advances, both in approaches to XML retrieval and in its evaluation, have been made...
© 2018 Springer Science+Business Media, LLC, part of Springer Nature
Pehcevski, J., Piwowarski, B. (2018). Evaluation Metrics for Structured Text Retrieval. In: Liu, L., Özsu, M.T. (eds) Encyclopedia of Database Systems. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-8265-9_152