Abstract

Natural Language Processing (NLP) techniques ought to be really useful for people building adaptive hypermedia (AH) systems. This talk explores the gap between theory and practice, illustrating it with examples of things that do (and don't) work, and suggests a way of closing it. The examples are mainly drawn from collaborative work I've been involved with over the last decade, on a series of AH systems using NLP: ILEX, SOLE, M-PIRO and Methodius. In theory, NLP sub-systems should help find, filter and format information for re-presentation in AH systems, so there ought to be plenty of cross-fertilisation between NLP and AH. Some projects have indeed brought the two together effectively; on the formatting (or information presentation) side in particular, natural language generation systems have allowed quite fine-grained personalisation of information to the language, interests and history of individual users. But in practice, NLP has been less useful to AH than one might have expected. One reason for this is that the information to be presented has to come from somewhere, and NLP support for AH authors is not as good as it should be. Arguably, where NLP could really make a difference is on the finding and filtering side: state-of-the-art information extraction tools can increase author productivity, and help make fine-grained personalisation more practical.

Copyright information

© Springer-Verlag Berlin Heidelberg 2006

Authors and Affiliations

  • Jon Oberlander
  1. School of Informatics, University of Edinburgh, Edinburgh, UK