Evaluating Performance Indicators for Adaptive Information Filtering

  • Conference paper
Internet Applications (ICSC 1999)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 1749)


Abstract

The task of information filtering is to classify documents from a stream as either relevant or non-relevant according to a particular user interest, with the objective of reducing the information load. When an information filter is used in an environment that changes over time, methods for adapting the filter should be considered in order to retain classification performance. We favor a methodology that attempts to detect changes and adapts the information filter only when necessary. Thus the amount of user feedback required to provide new training data can be minimized. Nevertheless, detecting changes may itself require expensive hand-labeling of documents. This paper explores two methods for assessing performance indicators without user feedback: the first is based on performance estimation, and the second counts uncertain classification decisions. Empirical results for a simulated change scenario with real-world text data show that our adaptive information filter can perform well in changing domains.
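
As a rough illustration of the second indicator, the sketch below counts uncertain classification decisions per document batch and requests new training data only when that count drifts beyond a control limit derived from earlier batches. This is a hypothetical Python sketch, not the authors' implementation: the uncertainty margin, the control-limit factor, and the `score`/`retrain` callbacks are illustrative assumptions.

```python
# Hypothetical sketch of a feedback-free change indicator: count the
# classification decisions whose score falls close to the decision boundary,
# and flag a possible change when that count exceeds a control-chart style
# limit computed from previous batches. All names and thresholds are
# illustrative assumptions.

from dataclasses import dataclass, field
from typing import Callable, Iterable, List


@dataclass
class ChangeDetector:
    """Tracks the number of uncertain decisions per batch and flags a batch
    whose count exceeds mean + sigma_factor * std of the batches seen so far."""
    uncertainty_margin: float = 0.1   # |score - 0.5| below this is "uncertain"
    sigma_factor: float = 3.0         # control-limit factor (assumed value)
    history: List[int] = field(default_factory=list)

    def count_uncertain(self, scores: Iterable[float]) -> int:
        return sum(1 for s in scores if abs(s - 0.5) < self.uncertainty_margin)

    def change_suspected(self, scores: Iterable[float]) -> bool:
        count = self.count_uncertain(scores)
        flagged = False
        if len(self.history) >= 2:
            mean = sum(self.history) / len(self.history)
            var = sum((c - mean) ** 2 for c in self.history) / (len(self.history) - 1)
            flagged = count > mean + self.sigma_factor * var ** 0.5
        self.history.append(count)
        return flagged


def filter_stream(batches: Iterable[List[str]],
                  score: Callable[[str], float],
                  retrain: Callable[[List[str]], None]) -> None:
    """Classify each batch; ask for user feedback (hand-labeled training data)
    only when the indicator suggests the domain may have changed."""
    detector = ChangeDetector()
    for batch in batches:
        scores = [score(doc) for doc in batch]
        relevant = [doc for doc, s in zip(batch, scores) if s >= 0.5]
        # ... deliver `relevant` documents to the user ...
        if detector.change_suspected(scores):
            retrain(batch)  # costly hand-labeling is requested only here
```

In this sketch the classifier itself is untouched between change alarms, which mirrors the paper's goal of minimizing user feedback; how scores are produced and how retraining is done are left to the surrounding system.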




Copyright information

© 1999 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Lanquillon, C. (1999). Evaluating Performance Indicators for Adaptive Information Filtering. In: Hui, L.C.K., Lee, D.L. (eds.) Internet Applications. ICSC 1999. Lecture Notes in Computer Science, vol 1749. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-46652-9_2

  • DOI: https://doi.org/10.1007/978-3-540-46652-9_2

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-66903-6

  • Online ISBN: 978-3-540-46652-9
