
Decision Trees

  • Chapter in Pattern Recognition

Part of the book series: Undergraduate Topics in Computer Science (UTICS)

Abstract

A decision tree is a tree in which each non-leaf (internal) node is associated with a decision and each leaf node is generally associated with an outcome or class label. Each internal node tests one or more attribute values, giving rise to two or more links or branches. Each link in turn corresponds to a possible value of the decision. These links are mutually distinct and collectively exhaustive: from any internal node it is possible to follow exactly one link, and every possible value is covered by some link.
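
To make the structure concrete, here is a minimal sketch (not the authors' implementation) of a decision tree with internal nodes that test a single attribute and leaf nodes that carry class labels; the attribute names, values, and the example tree are hypothetical. Note that each internal node has exactly one outgoing branch per possible attribute value, so the branches are mutually distinct and collectively exhaustive.

```python
class Leaf:
    """A leaf node carries an outcome or class label."""
    def __init__(self, label):
        self.label = label


class Internal:
    """An internal node tests one attribute; each possible value of that
    attribute has exactly one outgoing branch (mutually distinct and
    collectively exhaustive)."""
    def __init__(self, attribute, branches):
        self.attribute = attribute
        self.branches = branches  # dict: attribute value -> child node


def classify(node, example):
    """Follow exactly one branch at each internal node until a leaf is reached."""
    while isinstance(node, Internal):
        node = node.branches[example[node.attribute]]
    return node.label


# Hypothetical tree: decide whether to play tennis from weather attributes.
tree = Internal("outlook", {
    "sunny":    Internal("humidity", {"high": Leaf("no"),   "normal": Leaf("yes")}),
    "overcast": Leaf("yes"),
    "rain":     Internal("wind",     {"strong": Leaf("no"), "weak":   Leaf("yes")}),
})

print(classify(tree, {"outlook": "sunny", "humidity": "normal"}))  # -> "yes"
```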



Author information

Correspondence to M. Narasimha Murty.


Copyright information

© 2011 Universities Press (India) Pvt. Ltd.

About this chapter

Cite this chapter

Murty, M.N., Devi, V.S. (2011). Decision Trees. In: Pattern Recognition. Undergraduate Topics in Computer Science. Springer, London. https://doi.org/10.1007/978-0-85729-495-1_6


  • DOI: https://doi.org/10.1007/978-0-85729-495-1_6

  • Publisher Name: Springer, London

  • Print ISBN: 978-0-85729-494-4

  • Online ISBN: 978-0-85729-495-1

  • eBook Packages: Computer Science (R0)
