Decision Trees

Part of the Undergraduate Topics in Computer Science book series (UTICS, volume 0)


A decision tree is a tree in which each non-leaf (internal) node is associated with a decision and each leaf node is associated with an outcome or class label. Each internal node tests one or more attribute values, giving rise to two or more links or branches; each link in turn is associated with a possible value of the decision. These links are mutually exclusive and collectively exhaustive: for any given input, exactly one link can be followed, and every possible value is covered by some link.
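The structure described above can be sketched directly in code. The following is a minimal illustration, not taken from the chapter: the tree, attribute names, and class labels are hypothetical. Each internal node tests a single categorical attribute and keeps one branch per possible value (so the branches are mutually exclusive and collectively exhaustive), and classification follows exactly one path from the root to a leaf.

```python
class Leaf:
    """A leaf node holding a class label."""
    def __init__(self, label):
        self.label = label

class Node:
    """An internal node testing one categorical attribute.

    `branches` maps each possible attribute value to a subtree,
    so exactly one branch applies to any example (mutually
    exclusive) and every value has a branch (collectively
    exhaustive).
    """
    def __init__(self, attribute, branches):
        self.attribute = attribute
        self.branches = branches

def classify(tree, example):
    """Follow one branch per internal node until a leaf is reached."""
    while isinstance(tree, Node):
        tree = tree.branches[example[tree.attribute]]
    return tree.label

# Hypothetical hand-built tree for a play-tennis-style decision.
tree = Node("outlook", {
    "sunny": Leaf("no"),
    "overcast": Leaf("yes"),
    "rain": Node("wind", {"weak": Leaf("yes"), "strong": Leaf("no")}),
})

print(classify(tree, {"outlook": "rain", "wind": "weak"}))
```

Learning algorithms such as ID3 and C4.5 construct such trees automatically by choosing, at each node, the attribute whose test best separates the training examples; the sketch above shows only how a finished tree is used for classification.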


Keywords: Decision tree · Leaf node · Class label · Pattern classification · Categorical feature





Copyright information

© Universities Press (India) Pvt. Ltd. 2011

Authors and Affiliations

  1. Dept. of Computer Science and Automation, Indian Institute of Science, Bangalore, India
