© 2009

Evaluating Systems for Multilingual and Multimodal Information Access

9th Workshop of the Cross-Language Evaluation Forum, CLEF 2008, Aarhus, Denmark, September 17-19, 2008, Revised Selected Papers

  • Carol Peters
  • Thomas Deselaers
  • Nicola Ferro
  • Julio Gonzalo
  • Gareth J. F. Jones
  • Mikko Kurimo
  • Thomas Mandl
  • Anselmo Peñas
  • Vivien Petras
Conference proceedings CLEF 2008

Part of the Lecture Notes in Computer Science book series (LNCS, volume 5706)

Table of contents

  1. Front Matter
  2. What Happened in CLEF 2008

    1. Carol Peters
      Pages 1-14
  3. Part I: Multilingual Textual Document Retrieval (Ad Hoc)

    1. Eneko Agirre, Giorgio Maria Di Nunzio, Nicola Ferro, Thomas Mandl, Carol Peters
      Pages 15-37

    1. Alessio Bosca, Luca Dini
      Pages 42-49
    2. Dong Nguyen, Arnold Overwijk, Claudia Hauff, Dolf R. B. Trieschnigg, Djoerd Hiemstra, Franciska de Jong
      Pages 58-65
    3. André Pinto Geraldo, Viviane P. Moreira
      Pages 66-74
    4. Jens Kürsten, Thomas Wilhelm, Maximilian Eibl
      Pages 75-82
  5. Persian@CLEF

    1. Reza Karimpour, Amineh Ghorbani, Azadeh Pishdad, Mitra Mohtarami, Abolfazl AleAhmad, Hadi Amiri et al.
      Pages 89-96
    2. Zahra Aghazade, Nazanin Dehghani, Leili Farzinvash, Razieh Rahimi, Abolfazl AleAhmad, Hadi Amiri et al.
      Pages 97-104
    3. Abolfazl AleAhmad, Ehsan Kamalloo, Arash Zareh, Masoud Rahgozar, Farhad Oroumchian
      Pages 105-112
  6. Robust-WSD

    1. Fernando Martínez-Santiago, José M. Perea-Ortega, Miguel A. García-Cumbreras
      Pages 113-117
    2. Annalina Caputo, Pierpaolo Basile, Giovanni Semeraro
      Pages 126-133
    3. Sergio Navarro, Fernando Llopis, Rafael Muñoz
      Pages 134-137
    4. José R. Pérez-Agüera, Hugo Zaragoza
      Pages 138-145
    5. Jacques Guyot, Gilles Falquet, Saïd Radhouani, Karim Benzineb
      Pages 146-154
    6. Andreas Juffinger, Roman Kern, Michael Granitzer
      Pages 155-162

About these proceedings


The ninth campaign of the Cross-Language Evaluation Forum (CLEF) for European languages was held from January to September 2008. There were seven main evaluation tracks in CLEF 2008 plus two pilot tasks. The aim, as usual, was to test the performance of a wide range of multilingual information access (MLIA) systems or system components. This year, 100 groups, mainly but not only from academia, participated in the campaign. Most of the groups were from Europe, but there was also a good contingent from North America and Asia, plus a few participants from South America and Africa. Full details regarding the design of the tracks, the methodologies used for evaluation, and the results obtained by the participants can be found in the different sections of these proceedings.

The results of the CLEF 2008 campaign were presented at a two-and-a-half-day workshop held in Aarhus, Denmark, September 17–19, and attended by 150 researchers and system developers. The annual workshop, held in conjunction with the European Conference on Digital Libraries, plays an important role by providing the opportunity for all the groups that have participated in the evaluation campaign to get together to compare approaches and exchange ideas. The schedule of the workshop was divided between plenary track overviews and parallel, poster, and breakout sessions presenting this year's experiments and discussing ideas for the future. There were several invited talks.


Cross-Language Evaluation Forum; Wiki; answer validation; cross-language; cross-language queries; cross-lingual; data mining; image retrieval; information retrieval; machine learning; medical images; natural language processing; semantic analysis; video retrieval

Editors and affiliations

  • Carol Peters
    • 1
  • Thomas Deselaers
    • 2
  • Nicola Ferro
    • 3
  • Julio Gonzalo
    • 4
  • Gareth J. F. Jones
    • 5
  • Mikko Kurimo
    • 6
  • Thomas Mandl
    • 7
  • Anselmo Peñas
    • 4
  • Vivien Petras
    • 8
  1. Istituto di Scienza e Tecnologie dell’Informazione, CNR, Pisa, Italy
  2. RWTH Aachen University, Aachen, Germany
  3. University of Padua, Padua, Italy
  4. LSI-UNED, Madrid, Spain
  5. Dublin City University, Dublin 9, Ireland
  6. Helsinki University of Technology, Espoo, Finland
  7. University of Hildesheim, Hildesheim, Germany
  8. Humboldt University Berlin, Berlin, Germany
