
The Role of Unlabeled Data in Supervised Learning

  • Tom M. Mitchell
Part of the Philosophical Studies Series book series (PSSP, volume 99)

Abstract

Most computational models of supervised learning rely only on labeled training examples and ignore the possible role of unlabeled data. This is true both for cognitive science models of learning such as SOAR [Newell 1990] and ACT–R [Anderson et al. 1995], and for machine learning and data mining algorithms such as decision tree learning and inductive logic programming (see, e.g., [Mitchell 1997]). In this paper we consider the potential role of unlabeled data in supervised learning. We present an algorithm and experimental results demonstrating that unlabeled data can significantly improve learning accuracy in certain practical problems. We then identify the abstract problem structure that enables the algorithm to successfully utilize this unlabeled data, and prove that unlabeled data will boost learning accuracy for problems in this class. The problem class we identify includes problems where the features describing the examples are redundantly sufficient for classifying the example, a notion we make precise in this paper. This class includes many natural learning problems faced by humans, such as learning a semantic lexicon over noun phrases in natural language, and learning to recognize objects from multiple sensor inputs. We argue that models of human and animal learning should more strongly consider the potential role of unlabeled data, and that many natural learning problems fit the class we identify.
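The setting described above — examples with two redundantly sufficient views, where a classifier trained on each view labels unlabeled examples for the other — can be sketched roughly as follows. This is only an illustrative toy, not the paper's actual algorithm or experiments: the one-dimensional data, the nearest-centroid learner, and the function names (`co_train`, `predict`, `train_centroid`) are all assumptions made for the sketch.

```python
# Toy co-training sketch (in the spirit of Blum and Mitchell [1998]).
# Each example has two views (X1[i], X2[i]); each view alone suffices
# to classify. A simple nearest-centroid learner stands in for an
# arbitrary supervised learner.

def train_centroid(X, y):
    """Fit a nearest-centroid classifier on one view (floats)."""
    centroids = {}
    for label in set(y):
        pts = [x for x, lab in zip(X, y) if lab == label]
        centroids[label] = sum(pts) / len(pts)
    return centroids

def predict(centroids, x):
    """Return (label, confidence), confidence = margin between the
    two nearest centroids."""
    dists = sorted((abs(x - c), lab) for lab, c in centroids.items())
    (d0, lab0), (d1, _) = dists[0], dists[1]
    return lab0, d1 - d0

def co_train(X1, X2, y, U1, U2, rounds=3):
    """Grow the labeled set by letting each view's classifier label
    its most confident unlabeled example for the other."""
    X1, X2, y = list(X1), list(X2), list(y)
    unlabeled = list(zip(U1, U2))
    for _ in range(rounds):
        h1 = train_centroid(X1, y)
        h2 = train_centroid(X2, y)
        if not unlabeled:
            break
        for h, view in ((h1, 0), (h2, 1)):
            if not unlabeled:
                break
            # Pick the unlabeled example this classifier is most sure of.
            best = max(unlabeled, key=lambda u: predict(h, u[view])[1])
            label, _ = predict(h, best[view])
            unlabeled.remove(best)
            X1.append(best[0])
            X2.append(best[1])
            y.append(label)
    return train_centroid(X1, y), train_centroid(X2, y)
```

With two labeled examples per view and a pool of unlabeled ones, e.g. `co_train([0.0, 1.0], [0.0, 1.0], [0, 1], [0.1, 0.9], [0.1, 0.9])`, the unlabeled points are absorbed into the training set and both per-view classifiers sharpen their centroids — a miniature of the accuracy boost the paper analyzes.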

Keywords

Noun Phrase, Supervised Learning, Audio Signal, Unlabeled Data, Word Sense


References

  1. Anderson et al. [1995], Production system models of complex cognition. In Proceedings of the Seventeenth Annual Conference of the Cognitive Science Society (pp. 9–12). Hillsdale, NJ: Lawrence Erlbaum Associates.
  2. Blum and Mitchell [1998], Combining labeled and unlabeled data with co-training, COLT-98. Available at http://www.cs.cmu.edu/~webkb.
  3. Craven et al. [1998], Learning to extract symbolic knowledge from the World Wide Web. In Proceedings of the 15th National Conference on Artificial Intelligence (AAAI-98). Available at http://www.cs.cmu.edu.
  4. de Sa [1994], Learning classification with unlabeled data, NIPS-6, 1994.
  5. de Sa and Ballard [1998], Category learning through multi-modality sensing, Neural Computation 10(5), 1998.
  6. Riloff and Jones [1999], Learning dictionaries for information extraction by multi-level bootstrapping, AAAI-99. Available at http://www.cs.cmu.edu/~webkb.
  7. Mitchell [1997], Machine Learning. New York: McGraw Hill, 1997. See http://www.cs.cmu.edu/~webkb.
  8. Newell [1990], Unified Theories of Cognition. Cambridge, MA: Harvard University Press, 1990.
  9. Yarowsky [1995], Unsupervised word sense disambiguation rivaling supervised methods. Proceedings of the 33rd Annual Meeting of the ACL, pp. 189–196.

Copyright information

© Springer Science+Business Media Dordrecht 2004

Authors and Affiliations

  • Tom M. Mitchell
  1. Carnegie Mellon University, USA
