
Feature Construction during Tree Learning

  • Conference paper
GWAI-91 15. Fachtagung für Künstliche Intelligenz

Part of the book series: Informatik-Fachberichte (volume 285)

Abstract

This paper addresses the “problem of new terms” in the context of learning decision trees with the ID3 approach. We present an algorithm for efficiently constructing new features from given primitive features and relate it to constructive induction. In our approach, feature construction is integrated with the selection of a (new) feature for building the decision tree into a single process; hence, appropriate features are constructed during tree generation. The constructed features are represented as sets. Although the search space of possible features is exponential, a geometric interpretation shows that the algorithm runs in linear time and space. Moreover, we show that it finds features that are optimal with respect to ID3's tree construction procedure. We report experimental results and, besides considerations of the size of the generated trees, discuss the important issue of how comprehensible these trees are. In particular, we are interested in the intelligibility of the discovered features.
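The paper's algorithm itself is not reproduced on this page, but the core idea — a set-valued feature over an attribute's values that is optimal for ID3's information-gain criterion, found without enumerating the exponential number of subsets — can be illustrated with a classical related construction: for a two-class problem, an optimal subset split of a nominal attribute can be found by ordering the values by their positive-class proportion and scanning only the prefix subsets (a result also used for subset splits in CART). The sketch below illustrates that idea under those assumptions; it is not the authors' algorithm, and all names in it are invented for illustration.

```python
import math
from collections import defaultdict

def entropy(pos, neg):
    """Binary entropy (in bits) of a class distribution with
    pos positive and neg negative examples."""
    total = pos + neg
    if total == 0 or pos == 0 or neg == 0:
        return 0.0
    p = pos / total
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def best_value_subset(examples):
    """Find a subset S of an attribute's values whose binary test
    'value in S' maximizes information gain on a two-class sample.

    examples: list of (attribute_value, label) pairs, label in {0, 1}.

    For two classes it suffices to sort the distinct values by their
    positive-class proportion and evaluate only the prefix subsets,
    so the scan is linear in the number of distinct values (after
    sorting), rather than exponential.
    """
    counts = defaultdict(lambda: [0, 0])   # value -> [neg_count, pos_count]
    for value, label in examples:
        counts[value][label] += 1
    total_neg = sum(c[0] for c in counts.values())
    total_pos = sum(c[1] for c in counts.values())
    n = total_neg + total_pos
    base = entropy(total_pos, total_neg)

    # Order values by their positive-class proportion.
    ordered = sorted(counts, key=lambda v: counts[v][1] / sum(counts[v]))

    best_gain, best_subset = 0.0, set()
    subset, in_neg, in_pos = set(), 0, 0
    for v in ordered[:-1]:                 # the full value set has gain 0
        subset.add(v)
        in_neg += counts[v][0]
        in_pos += counts[v][1]
        out_neg, out_pos = total_neg - in_neg, total_pos - in_pos
        in_n, out_n = in_neg + in_pos, out_neg + out_pos
        remainder = (in_n / n) * entropy(in_pos, in_neg) \
                  + (out_n / n) * entropy(out_pos, out_neg)
        if base - remainder > best_gain:
            best_gain, best_subset = base - remainder, set(subset)
    return best_subset, best_gain
```

For example, on a sample where values `c` and `d` occur only with negative labels and `a` and `b` only with positive ones, the scan recovers the set-valued feature `value in {c, d}` with the maximal gain of 1 bit.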




Copyright information

© 1991 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Mehlsam, G., Kaindl, H., Barth, W. (1991). Feature Construction during Tree Learning. In: Christaller, T. (eds) GWAI-91 15. Fachtagung für Künstliche Intelligenz. Informatik-Fachberichte, vol 285. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-662-02711-0_6

  • DOI: https://doi.org/10.1007/978-3-662-02711-0_6

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-54558-3

  • Online ISBN: 978-3-662-02711-0
