Abstract
A decision tree is a tree in which each non-leaf (internal) node is associated with a decision and each leaf node with an outcome or class label. Each internal node tests one or more attribute values, leading to two or more links or branches; each link corresponds to a possible value of the decision. These links are mutually distinct and collectively exhaustive: from any internal node it is possible to follow exactly one link, and every possible attribute value is covered by some link.
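The structure described above can be sketched in a few lines of code. This is a minimal illustration, not an implementation from the chapter; the class and attribute names (`Node`, `Leaf`, `outlook`, `humidity`) are invented for the example, and each internal node is assumed to test a single attribute whose values index mutually distinct, collectively exhaustive branches.

```python
class Leaf:
    """Leaf node: associated with an outcome or class label."""
    def __init__(self, label):
        self.label = label


class Node:
    """Internal node: tests one attribute. The branches dict maps each
    possible attribute value to a subtree, so the links are mutually
    distinct (one entry per value) and collectively exhaustive
    (every expected value has an entry)."""
    def __init__(self, attribute, branches):
        self.attribute = attribute
        self.branches = branches  # attribute value -> subtree


def classify(tree, example):
    """Follow exactly one link at each internal node until a leaf is reached."""
    while isinstance(tree, Node):
        tree = tree.branches[example[tree.attribute]]
    return tree.label


# Toy tree over hypothetical weather attributes.
tree = Node("outlook", {
    "sunny": Node("humidity", {"high": Leaf("no"), "normal": Leaf("yes")}),
    "overcast": Leaf("yes"),
    "rain": Leaf("yes"),
})

print(classify(tree, {"outlook": "sunny", "humidity": "high"}))  # no
```

Classification is thus a single root-to-leaf walk: at each internal node exactly one branch matches the example's attribute value.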
© 2011 Universities Press (India) Pvt. Ltd.
Cite this chapter
Murty, M.N., Devi, V.S. (2011). Decision Trees. In: Pattern Recognition. Undergraduate Topics in Computer Science. Springer, London. https://doi.org/10.1007/978-0-85729-495-1_6
Publisher Name: Springer, London
Print ISBN: 978-0-85729-494-4
Online ISBN: 978-0-85729-495-1
eBook Packages: Computer Science (R0)