Abstract
The paper makes three points of significance for IR research: (1) The Cranfield paradigm of IR evaluation seems to lose power when one looks at human rather than system performance. (2) Searchers using IR systems in real life issue rather short queries, which individually often perform poorly; used in sessions, however, they may be surprisingly effective. Searchers' strategies have not been sufficiently described and therefore cannot be properly understood, supported, or evaluated. (3) Searchers in real life seek to optimize the entire information access process, not just result quality. Evaluating output alone is insufficient to explain searcher behavior.
© 2009 Springer-Verlag Berlin Heidelberg
Järvelin, K. (2009). Explaining User Performance in Information Retrieval: Challenges to IR Evaluation. In: Azzopardi, L., et al. Advances in Information Retrieval Theory. ICTIR 2009. Lecture Notes in Computer Science, vol 5766. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-04417-5_28
Print ISBN: 978-3-642-04416-8
Online ISBN: 978-3-642-04417-5