Boosting Cost-Sensitive Trees

  • Conference paper
Discovery Science (DS 1998)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 1532)

Abstract

This paper explores two techniques for boosting cost-sensitive trees. The two techniques differ in whether the misclassification cost information is utilized during training. We demonstrate that each of these techniques is good at different aspects of cost-sensitive classification. We also show that both techniques provide a means to overcome the weaknesses of their base cost-sensitive tree induction algorithm.
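
To make the distinction concrete, the sketch below illustrates the two places misclassification costs can enter a boosted classifier. It is an illustration only, not the paper's algorithms: scikit-learn's AdaBoostClassifier stands in for the boosted trees, and the dataset and cost matrix are invented for the example.

```python
# Hypothetical illustration: AdaBoostClassifier stands in for boosted trees;
# the data and the cost matrix are made up for this sketch.
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

rng = np.random.RandomState(0)

# Toy two-class problem; assume class 1 is the expensive-to-miss class.
X = rng.randn(600, 4)
y = (X[:, 0] + 0.5 * X[:, 1] > 1.0).astype(int)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# cost[i, j] = cost of predicting class j when the true class is i.
cost = np.array([[0.0, 1.0],
                 [5.0, 0.0]])

# Variant A: costs ignored during training, used only at classification time.
# The ensemble's class-probability estimates are combined with the cost matrix
# and the class with minimum expected cost is predicted.
clf_a = AdaBoostClassifier(n_estimators=50, random_state=0).fit(X_tr, y_tr)
expected_cost = clf_a.predict_proba(X_te) @ cost   # shape (n_samples, n_classes)
pred_a = expected_cost.argmin(axis=1)

# Variant B: costs injected during training by weighting every instance in
# proportion to the cost of misclassifying its true class.
instance_weight = cost.sum(axis=1)[y_tr]
clf_b = AdaBoostClassifier(n_estimators=50, random_state=0)
clf_b.fit(X_tr, y_tr, sample_weight=instance_weight)
pred_b = clf_b.predict(X_te)

def total_cost(y_true, y_pred):
    """Total misclassification cost incurred on the test set."""
    return cost[y_true, y_pred].sum()

print("costs used at classification time only:", total_cost(y_te, pred_a))
print("costs folded into training weights:    ", total_cost(y_te, pred_b))
```

The only point of the sketch is where the cost matrix enters the pipeline; which variant incurs the lower total cost depends on the data and the cost matrix.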

Copyright information

© 1998 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Ting, K.M., Zheng, Z. (1998). Boosting Cost-Sensitive Trees. In: Arikawa, S., Motoda, H. (eds) Discovery Science. DS 1998. Lecture Notes in Computer Science (LNAI), vol 1532. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-49292-5_22

  • DOI: https://doi.org/10.1007/3-540-49292-5_22

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-65390-5

  • Online ISBN: 978-3-540-49292-4
