Abstract
Evaluation of information systems in commercial and industrial settings differs in important ways from academic evaluation of methodology. These differences stem from the differing organisational priorities of practice and research. Some of those priorities can be adjusted; others must simply be accommodated if evaluation is to be included in an operational development pipeline.
© 2019 Springer Nature Switzerland AG
Cite this chapter
Karlgren, J. (2019). Adopting Systematic Evaluation Benchmarks in Operational Settings. In: Ferro, N., Peters, C. (eds) Information Retrieval Evaluation in a Changing World. The Information Retrieval Series, vol 41. Springer, Cham. https://doi.org/10.1007/978-3-030-22948-1_25
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-22947-4
Online ISBN: 978-3-030-22948-1