Experiments with Cost-Sensitive Feature Evaluation

  • Marko Robnik-Šikonja
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2837)

Abstract

Many machine learning tasks include feature evaluation as one of their important components. This work is concerned with attribute estimation in problems where the class distribution is unbalanced or the misclassification costs are unequal. We test several common attribute evaluation heuristics and propose cost-sensitive adaptations of them. The new measures are tested on problems designed to reveal their strengths and weaknesses.
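One common way to make an impurity-based attribute evaluation heuristic cost-sensitive is to reweight the class probabilities by per-class misclassification costs (the "altered priors" idea from Breiman et al.). The sketch below is illustrative only, not the paper's exact method: the `cost` vector, function names, and the specific reweighting are assumptions introduced for this example.

```python
import math
from collections import Counter

def entropy(probs):
    """Shannon entropy (bits) of a probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def class_probs(labels, cost=None):
    """Class probabilities; if a per-class cost dict is given,
    reweight each class by its misclassification cost
    (Breiman-style altered priors). Illustrative assumption,
    not necessarily the adaptation used in the paper."""
    counts = Counter(labels)
    n = len(labels)
    if cost is None:
        return [c / n for c in counts.values()]
    w = {k: counts[k] / n * cost[k] for k in counts}
    z = sum(w.values())
    return [v / z for v in w.values()]

def info_gain(labels, feature_values, cost=None):
    """Information gain of a discrete feature, optionally
    computed over cost-weighted class probabilities."""
    base = entropy(class_probs(labels, cost))
    n = len(labels)
    remainder = 0.0
    for v in set(feature_values):
        subset = [y for y, x in zip(labels, feature_values) if x == v]
        remainder += len(subset) / n * entropy(class_probs(subset, cost))
    return base - remainder
```

With a skewed class distribution such as three negatives and one positive, assigning the minority class a cost of 3 makes the reweighted distribution uniform, so the baseline entropy rises and attributes that separate the costly class are rewarded more.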

Keywords

Class Problem · Class Distribution · Cost Information · Cost Matrix · Gain Ratio

References

  1. Breiman, L., Friedman, J.H., Olshen, R.A., Stone, C.J.: Classification and Regression Trees. Wadsworth, Belmont (1984)
  2. Dietterich, T.G., Kearns, M., Mansour, Y.: Applying the weak learning framework to understand and improve C4.5. In: Saitta, L. (ed.) Machine Learning: Proceedings of the Thirteenth International Conference (ICML 1996), pp. 96–103. Morgan Kaufmann, San Francisco (1996)
  3. Drummond, C., Holte, R.C.: Exploiting the cost (in)sensitivity of decision tree splitting criteria. In: Proceedings of the Seventeenth International Conference on Machine Learning (ICML 2000), pp. 239–246 (2000)
  4. Elkan, C.: The foundations of cost-sensitive learning. In: Proceedings of the Seventeenth International Joint Conference on Artificial Intelligence, IJCAI 2001 (2001)
  5. Kira, K., Rendell, L.A.: A practical approach to feature selection. In: Sleeman, D., Edwards, P. (eds.) Machine Learning: Proceedings of International Conference (ICML 1992), pp. 249–256. Morgan Kaufmann, San Francisco (1992)
  6. Kononenko, I.: Estimating attributes: analysis and extensions of Relief. In: Bergadano, F., De Raedt, L. (eds.) ECML 1994. LNCS, vol. 784, pp. 171–182. Springer, Heidelberg (1994)
  7. Kononenko, I.: On biases in estimating multi-valued attributes. In: Proceedings of the International Joint Conference on Artificial Intelligence (IJCAI 1995), pp. 1034–1040. Morgan Kaufmann, San Francisco (1995)
  8. Kukar, M., Kononenko, I., Grošelj, C., Kralj, K., Fettich, J.: Analysing and improving the diagnosis of ischaemic heart disease with machine learning. Artificial Intelligence in Medicine 16, 25–50 (1999)
  9. Margineantu, D.D.: On class-probability estimates and cost-sensitive evaluation of classifiers. In: Workshop on Cost-Sensitive Learning at the Seventeenth International Conference on Machine Learning (WCSL at ICML 2000) (2000)
  10. Margineantu, D.D., Dietterich, T.G.: Bootstrap methods for the cost-sensitive evaluation of classifiers. In: Machine Learning: Proceedings of the Seventeenth International Conference on Machine Learning (ICML 2000), pp. 583–590. Morgan Kaufmann, San Francisco (2000)
  11. Quinlan, J.R.: C4.5: Programs for Machine Learning. Morgan Kaufmann, San Francisco (1993)
  12. Robnik-Šikonja, M., Kononenko, I.: Theoretical and empirical analysis of ReliefF and RReliefF. Machine Learning Journal (2003), http://www.kluweronline.com/issn/0885-6125/ (forthcoming; also available as a technical report at http://lkm.fri.uni-lj.si/rmarko/)
  13. Turney, P.D., Boz, O.: On-line cost-sensitive learning bibliography (1996–2001), http://home.ptd.net/olcay/cost-sensitive.html

Copyright information

© Springer-Verlag Berlin Heidelberg 2003

Authors and Affiliations

  • Marko Robnik-Šikonja
  1. Faculty of Computer and Information Science, University of Ljubljana, Ljubljana, Slovenia