Improving Performance of Decision Tree Algorithms with Multi-edited Nearest Neighbor Rule

  • Conference paper

Advances in Knowledge Discovery and Data Mining (PAKDD 2003)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 2637)

Abstract

This paper proposes a new method based on the multi-edited nearest neighbor rule that prevents decision tree algorithms from growing trees of unnecessarily large size and hence partially alleviates the problem of “over-training”. For this purpose, two useful properties of the multi-edited nearest neighbor rule are investigated. Experiments show that the proposed method drastically reduces the size of the resulting trees, significantly enhances their understandability, and also improves test accuracy when the control parameter takes an appropriate value.
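The multi-edited nearest neighbor rule referenced in the abstract is commonly attributed to the Devijver–Kittler MULTIEDIT procedure: the training set is randomly split into several subsets, each sample is classified by a 1-NN rule referenced on a *different* subset, misclassified samples are discarded, and the process repeats until a full pass removes nothing. The sketch below is an illustrative reconstruction of that generic procedure, not the authors' exact implementation; the function names, the number of splits, and the stopping rule are assumptions.

```python
import numpy as np

def one_nn_predict(ref_X, ref_y, X):
    # Label each row of X with the class of its single nearest
    # neighbor (squared Euclidean distance) in the reference set.
    d = ((X[:, None, :] - ref_X[None, :, :]) ** 2).sum(axis=2)
    return ref_y[d.argmin(axis=1)]

def multiedit(X, y, n_splits=3, max_rounds=20, rng=None):
    """Sketch of Devijver-Kittler MULTIEDIT: repeatedly discard
    samples that a 1-NN classifier, referenced on a different
    random fold, mislabels.  Stops when a round removes nothing."""
    rng = np.random.default_rng(rng)
    X, y = X.copy(), y.copy()
    for _ in range(max_rounds):
        idx = rng.permutation(len(X))
        folds = np.array_split(idx, n_splits)
        keep = np.ones(len(X), dtype=bool)
        for i in range(n_splits):
            ref = folds[(i + 1) % n_splits]    # reference = next fold
            tgt = folds[i]                     # fold being edited
            pred = one_nn_predict(X[ref], y[ref], X[tgt])
            keep[tgt[pred != y[tgt]]] = False  # drop misclassified
        if keep.all():
            break                              # a clean round: done
        X, y = X[keep], y[keep]
    return X, y
```

Training a decision tree on the edited set `(Xe, ye)` rather than the raw data is what would then shrink the tree: points near class boundaries or with noisy labels are exactly the ones that force extra splits, and editing removes them before induction begins.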




Copyright information

© 2003 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Ye, Cz., Yang, J., Yao, Lx., Chen, Ny. (2003). Improving Performance of Decision Tree Algorithms with Multi-edited Nearest Neighbor Rule. In: Whang, KY., Jeon, J., Shim, K., Srivastava, J. (eds) Advances in Knowledge Discovery and Data Mining. PAKDD 2003. Lecture Notes in Computer Science, vol 2637. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-36175-8_39

  • DOI: https://doi.org/10.1007/3-540-36175-8_39

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-04760-5

  • Online ISBN: 978-3-540-36175-6

  • eBook Packages: Springer Book Archive
