Efficient Question Answering with Question Decomposition and Multiple Answer Streams

  • Sven Hartrumpf
  • Ingo Glöckner
  • Johannes Leveling
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5706)

Abstract

The German question answering (QA) system IRSAW (formerly: InSicht) participated in QA@CLEF for the fifth time. IRSAW was introduced in 2007 by integrating the deep answer producer InSicht, several shallow answer producers, and a logical validator. InSicht follows a deep QA approach: it transforms documents into semantic representations using a parser, draws inferences over these representations with rules, and matches the semantic representations derived from questions and documents. For QA@CLEF 2008, InSicht was improved mainly in two areas: the coreference resolver was trained on question series instead of newspaper texts to better handle follow-up questions, and questions are now decomposed by several methods on the level of semantic representations. On the shallow processing side, the number of answer producers was increased from two to four by adding FACT, a fact index, and SHASE, a shallow semantic network matcher. The answer validator introduced in 2007 was replaced by the faster RAVE validator, designed for logic-based answer validation under time constraints. Using RAVE to merge the results of the answer producers, monolingual German runs and bilingual runs with English and Spanish as source languages were produced, the latter relying on the machine translation web service Promt. An error analysis shows the main problems of the precision-oriented deep answer producer InSicht and the potential offered by the recall-oriented shallow answer producers.
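The merging step described above, in which a validator scores candidates pooled from several deep and shallow answer producers, can be illustrated with a minimal sketch. All names here (`AnswerCandidate`, `validate`, `merge_streams`, the example scores) are hypothetical illustrations of the general idea, not actual IRSAW or RAVE code:

```python
# Hedged sketch: merging multiple answer streams via a validator.
# Producer names echo the abstract (InSicht, FACT, SHASE); the
# scoring logic is invented purely for illustration.
from dataclasses import dataclass

@dataclass
class AnswerCandidate:
    answer: str
    producer: str      # e.g. "InSicht" (deep) or "FACT" (shallow)
    score: float       # producer-internal confidence

def validate(candidate: AnswerCandidate) -> float:
    # Stand-in for logic-based validation under time constraints:
    # here we simply damp shallow producers' confidences slightly.
    penalty = 1.0 if candidate.producer == "InSicht" else 0.9
    return candidate.score * penalty

def merge_streams(streams):
    # Pool candidates from all producers, score each with the
    # validator, and keep the best-scoring entry per surface form.
    best = {}
    for stream in streams:
        for cand in stream:
            s = validate(cand)
            if cand.answer not in best or s > best[cand.answer][1]:
                best[cand.answer] = (cand, s)
    # Rank merged answers by validation score, highest first.
    return sorted(best.values(), key=lambda t: t[1], reverse=True)

deep = [AnswerCandidate("Berlin", "InSicht", 0.8)]
shallow = [AnswerCandidate("Berlin", "FACT", 0.7),
           AnswerCandidate("Bonn", "SHASE", 0.6)]
ranked = merge_streams([deep, shallow])
print([(c.answer, round(s, 2)) for c, s in ranked])
```

Under these made-up scores, the deep producer's "Berlin" wins over the shallow duplicate, and the merged ranking places "Berlin" ahead of "Bonn"; the actual RAVE validator uses logical criteria rather than a fixed penalty.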

Keywords

Machine Translation · Semantic Representation · Semantic Network · Query Expansion · Question Answering



Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Sven Hartrumpf (1)
  • Ingo Glöckner (1)
  • Johannes Leveling (2)
  1. Intelligent Information and Communication Systems (IICS), University of Hagen (FernUniversität in Hagen), Hagen, Germany
  2. Centre for Next Generation Localisation (CNGL), Dublin City University, Dublin 9, Ireland
