
Lexical and Algorithmic Stemming Compared for 9 European Languages with Hummingbird SearchServer™ at CLEF 2003

  • Stephen Tomlinson
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3237)

Abstract

Hummingbird participated in the monolingual information retrieval tasks of the Cross-Language Evaluation Forum (CLEF) 2003: for natural language queries in 9 European languages (German, French, Italian, Spanish, Dutch, Finnish, Swedish, Russian and English), find all the relevant documents (with high precision) in the CLEF 2003 document sets. SearchServer produced the highest mean average precision score of the submitted automatic Title+Description runs for German, Finnish and Dutch, the CLEF languages for which SearchServer could find words which are parts of compounds. In a comparison of experimental SearchServer lexical stemmers with Porter’s algorithmic stemmers, the biggest differences (most of them significant) were for languages in which compound words are frequent. For the other languages, typically the lexical stemmers performed inflectional stemming while the algorithmic stemmers often additionally performed derivational stemming; these differences did not pass a significance test.
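The abstract attributes SearchServer's strongest results (German, Finnish, Dutch) to its ability to find words that are parts of compounds. The paper's abstract does not describe the splitting mechanism itself, so the sketch below is only a generic illustration of dictionary-based compound decomposition: a greedy recursive splitter that tries the longest known head word first, against a hypothetical toy lexicon. It is not SearchServer's implementation.

```python
def split_compound(word, lexicon, min_part=3):
    """Decompose a compound into lexicon words, preferring longer heads.

    Returns the list of parts if the whole word decomposes into lexicon
    entries, otherwise the word itself as a single part. `min_part` keeps
    very short fragments from being treated as words.
    """
    # Try the longest candidate head first, down to the minimum part length.
    for cut in range(len(word), min_part - 1, -1):
        head = word[:cut]
        if head not in lexicon:
            continue
        if cut == len(word):
            return [head]  # the whole word is itself a lexicon entry
        rest = split_compound(word[cut:], lexicon, min_part)
        if all(part in lexicon for part in rest):
            return [head] + rest
    return [word]  # no full decomposition found


# Toy German lexicon (illustrative only).
lexicon = {"wetter", "bericht", "haus"}
print(split_compound("wetterbericht", lexicon))  # → ['wetter', 'bericht']
print(split_compound("xyzbericht", lexicon))     # → ['xyzbericht']
```

Indexing both the compound and its parts lets a query term such as "Bericht" match documents containing only "Wetterbericht", which is the kind of match an algorithmic suffix-stripper alone cannot produce.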

Keywords

Relevant Document · Average Precision · Query Expansion · Compound Word · Test Collection
These keywords were added by machine and not by the authors. This process is experimental and the keywords may be updated as the learning algorithm improves.



Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Stephen Tomlinson
  1. Hummingbird, Ottawa, Canada
