Abstract
This study addresses the Relevant-in-Context retrieval task and examines when systems that return focused answers are justified over full-document retrieval. Under some circumstances, full document retrieval is sufficient for finding relevant material effectively: Relevant-in-Context retrieval brings no improvement when the retrieved documents are thoroughly (i.e., densely) relevant, or when the relevant material is located at the start of the document. Using the INEX data, we perform a topic-wise analysis focusing on these qualities of the retrieved relevant documents. In addition, we evaluate the submitted INEX runs with various measures in order to study how different T2I (tolerance to irrelevance) values affect the mutual rankings of the systems, and to measure how well the systems locate the relevant material within a document.
Copyright information
© 2011 Springer-Verlag Berlin Heidelberg
Cite this paper
Arvola, P., Vainio, J. (2011). The Potential Benefit of Focused Retrieval in Relevant-in-Context Task. In: Geva, S., Kamps, J., Schenkel, R., Trotman, A. (eds) Comparative Evaluation of Focused Retrieval. INEX 2010. Lecture Notes in Computer Science, vol 6932. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-23577-1_2
Print ISBN: 978-3-642-23576-4
Online ISBN: 978-3-642-23577-1