Abstract
This paper proposes a new method, based on the multi-edited nearest neighbor rule, that prevents decision tree algorithms from growing trees of unnecessarily large size and hence partially alleviates the problem of "over-training" (overfitting). For this purpose, two useful properties of the multi-edited nearest neighbor rule are investigated. Experiments show that the proposed method can drastically reduce the size of the resulting trees, significantly enhance their understandability, and at the same time improve test accuracy when the control parameter takes an appropriate value.
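The abstract does not reproduce the algorithm itself, but the multi-edited nearest neighbor rule it builds on is the classic iterative editing procedure: repeatedly partition the labelled training set into blocks, classify each block by 1-NN against the next block, discard misclassified samples, and stop once no samples have been removed for several consecutive rounds. The sketch below is a minimal illustration of that generic procedure, not the authors' exact method; the function names, the parameter `max_stable`, and the stopping rule are assumptions for illustration.

```python
import random
from math import dist


def nn_label(x, refs):
    # 1-NN: return the label of the reference sample closest to x.
    # refs is a list of (feature_tuple, label) pairs.
    return min(refs, key=lambda r: dist(x, r[0]))[1]


def multiedit(samples, n_subsets=3, max_stable=3, seed=0):
    """Multi-edit a labelled sample set (illustrative sketch).

    samples: list of (feature_tuple, label) pairs.
    Repeatedly shuffles the data into n_subsets blocks, classifies each
    block by 1-NN against the following block, and discards misclassified
    points, until max_stable consecutive rounds remove nothing.
    """
    rng = random.Random(seed)
    data = list(samples)
    stable = 0
    while stable < max_stable and len(data) > n_subsets:
        rng.shuffle(data)
        blocks = [data[i::n_subsets] for i in range(n_subsets)]
        kept = []
        for i, block in enumerate(blocks):
            refs = blocks[(i + 1) % n_subsets]  # classify against next block
            if not refs:
                kept.extend(block)
                continue
            # keep only the samples the 1-NN rule classifies correctly
            kept.extend(s for s in block if nn_label(s[0], refs) == s[1])
        stable = stable + 1 if len(kept) == len(data) else 0
        data = kept
    return data
```

Editing in this way tends to strip points that fall on the wrong side of the class boundary, so a decision tree grown on the edited set needs fewer splits to fit the remaining, cleaner data — which is the size-reduction effect the abstract reports.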
© 2003 Springer-Verlag Berlin Heidelberg
Cite this paper
Ye, C.-Z., Yang, J., Yao, L.-X., Chen, N.-Y. (2003). Improving Performance of Decision Tree Algorithms with Multi-edited Nearest Neighbor Rule. In: Whang, K.-Y., Jeon, J., Shim, K., Srivastava, J. (eds) Advances in Knowledge Discovery and Data Mining. PAKDD 2003. Lecture Notes in Computer Science, vol 2637. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-36175-8_39
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-04760-5
Online ISBN: 978-3-540-36175-6