Induction of Decision Trees Using RELIEFF

  • I. Kononenko
  • E. Simec
Part of the International Centre for Mechanical Sciences book series (CISM, volume 363)


In the context of machine learning from examples, this paper deals with the problem of estimating the quality of attributes with and without dependencies between them. Greedy search prevents current inductive machine learning algorithms from detecting significant dependencies between the attributes. Recently, Kira and Rendell developed the RELIEF algorithm for estimating the quality of attributes, which is able to detect dependencies between attributes. We show a strong relation between RELIEF's estimates and the impurity functions that are usually used for heuristic guidance of inductive learning algorithms. We propose to use RELIEFF, an extended version of RELIEF, instead of myopic impurity functions. We have reimplemented Assistant, a system for top-down induction of decision trees, using RELIEFF as the estimator of attributes at each selection step. The algorithm is tested on several artificial and several real-world problems. The results show the advantage of the presented approach to inductive learning and open a wide range of possibilities for using RELIEFF.
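The attribute estimation idea behind RELIEF can be sketched as follows. This is a minimal illustrative version for a two-class problem with numeric attributes, not the authors' RELIEFF, which additionally averages over the k nearest hits and misses and extends the scheme to multi-class and noisy data; all names here are illustrative.

```python
import numpy as np

def relief(X, y, n_iter=None, rng=None):
    """Minimal sketch of Kira & Rendell's RELIEF for a two-class
    problem with numeric attributes.  For each sampled instance it
    finds the nearest hit (same class) and nearest miss (other class)
    and updates the attribute weights accordingly."""
    rng = np.random.default_rng(rng)
    n, d = X.shape
    n_iter = n if n_iter is None else n_iter
    w = np.zeros(d)
    for _ in range(n_iter):
        i = rng.integers(n)
        x = X[i]
        dist = np.abs(X - x).sum(axis=1)   # Manhattan distance to all instances
        dist[i] = np.inf                   # never pick the instance itself
        same = y == y[i]
        hit = np.argmin(np.where(same, dist, np.inf))    # nearest same-class
        miss = np.argmin(np.where(~same, dist, np.inf))  # nearest other-class
        # Reward attributes that separate the instance from its nearest
        # miss; penalise attributes that separate it from its nearest hit.
        w += (np.abs(x - X[miss]) - np.abs(x - X[hit])) / n_iter
    return w

# Parity (XOR) problem: class = a0 XOR a1, a2 is irrelevant.  A myopic
# impurity function scores each attribute in isolation and finds all three
# equally uninformative, while RELIEF's neighbourhood-based estimate ranks
# the two interacting attributes a0 and a1 above a2.
X = np.array([[a, b, c] for a in (0, 1) for b in (0, 1) for c in (0, 1)],
             dtype=float)
y = X[:, 0].astype(int) ^ X[:, 1].astype(int)
w = relief(X, y, n_iter=200, rng=0)
print(w)  # a0 and a1 receive positive weights, a2 a negative one
```

On this parity data every nearest hit differs only in the irrelevant attribute and every nearest miss differs in exactly one of the interacting attributes, which is why the weights separate cleanly even though no single attribute is correlated with the class.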


Keywords: Decision Tree · Training Instance · Inductive Logic Programming · Artificial Data · Greedy Search




References

  1. Breiman L., Friedman J.H., Olshen R.A., Stone C.J.: Classification and Regression Trees, Wadsworth International Group, 1984.
  2. Cestnik B.: Estimating probabilities: A crucial task in machine learning, Proc. European Conference on Artificial Intelligence 90, Stockholm, August 1990, 147–149.
  3. Cestnik B., Bratko I.: On estimating probabilities in tree pruning, Proc. European Working Session on Learning (Porto, March 1991), Y. Kodratoff (ed.), Springer Verlag, 1991, 138–150.
  4. Cestnik B., Kononenko I., Bratko I.: ASSISTANT 86: A knowledge elicitation tool for sophisticated users, in: Progress in Machine Learning (Eds. I. Bratko, N. Lavrac), Sigma Press, Wilmslow, England, 1987.
  5. Dietterich T.G., Shavlik J.W. (eds.): Readings in Machine Learning, Morgan Kaufmann, 1990.
  6. Dolsak B., Muggleton S.: The application of inductive logic programming to finite element mesh design, in: Inductive Logic Programming (Ed. S. Muggleton), Academic Press, 1992.
  7. Dzeroski S.: Handling noise in inductive logic programming, M.Sc. Thesis, University of Ljubljana, Faculty of Electrical Engineering and Computer Science, Ljubljana, Slovenia, 1991.
  8. Fayyad U.M.: On the induction of decision trees for multiple concept learning, Ph.D. Thesis, The University of Michigan, 1991.
  9. Fayyad U.M., Irani K.B.: The attribute selection problem in decision tree generation, Proc. AAAI-92 (San Jose, CA, July 1992), MIT Press, 1992.
  10. Hunt E., Marin J., Stone P.: Experiments in Induction, Academic Press, New York, 1966.
  11. Kira K., Rendell L. (a): A practical approach to feature selection, Proc. Intern. Conf. on Machine Learning (Aberdeen, July 1992), (Eds. D. Sleeman, P. Edwards), Morgan Kaufmann, 1992, 249–256.
  12. Kira K., Rendell L. (b): The feature selection problem: traditional methods and new algorithm, Proc. AAAI-92 (San Jose, CA, July 1992), MIT Press, 1992.
  13. Kononenko I.: Semi-naive Bayesian classifier, Proc. European Working Session on Learning (Ed. Y. Kodratoff), Springer Verlag, 1991, 206–219.
  14. Kononenko I.: Inductive and Bayesian learning in medical diagnosis, Applied Artificial Intelligence, 7 (1993), 317–337.
  15. Kononenko I.: Estimating attributes: Analysis and extensions of RELIEF, Proc. European Conf. on Machine Learning (Catania, April 1994), (Eds. L. De Raedt, F. Bergadano), Springer Verlag, 1994, 171–182.
  16. Kononenko I., Bratko I.: Information based evaluation criterion for classifier's performance, Machine Learning, 6 (1991), 67–80.
  17. Mantaras R.L.: ID3 Revisited: A distance based criterion for attribute selection, Proc. Int. Symp. on Methodologies for Intelligent Systems, Charlotte, North Carolina, U.S.A., Oct. 1989.
  18. Michalski R.S., Chilausky R.L.: Learning by being told and learning from examples: An experimental comparison of the two methods of knowledge acquisition in the context of developing an expert system for soybean disease diagnosis, International Journal of Policy Analysis and Information Systems, 4 (1980), 125–161.
  19. Michalski R.S., Tecuci G. (eds.): Machine Learning: A Multistrategy Approach, Vol. IV, Morgan Kaufmann, 1994.
  20. Mladenic D.: Combinatorial optimization in inductive concept learning, Proc. 10th Intern. Conf. on Machine Learning (Amherst, June 1993), Morgan Kaufmann, 1993, 205–211.
  21. Muggleton S. (ed.): Inductive Logic Programming, Academic Press, 1992.
  22. Murphy P.M., Aha D.W.: UCI Repository of machine learning databases [Machine-readable data repository], Irvine, CA: University of California, Department of Information and Computer Science, 1991.
  23. Niblett T., Bratko I.: Learning decision rules in noisy domains, Proc. Expert Systems 86, Brighton, UK, December 1986.
  24. Pompe U., Kovacic M., Kononenko I.: SFOIL: Stochastic approach to inductive logic programming, Proc. Slovenian Conf. on Electrical Engineering and Computer Science, Portoroz, Slovenia, Sept. 1993, 189–192.
  25. Quinlan J.R.: Induction of decision trees, Machine Learning, 1 (1986), 81–106.
  26. Smyth P., Goodman R.M.: Rule induction using information theory, in: Knowledge Discovery in Databases (Eds. G. Piatetsky-Shapiro, W. Frawley), MIT Press, 1990.
  27. Smyth P., Goodman R.M., Higgins C.: A hybrid rule-based Bayesian classifier, Proc. European Conf. on Artificial Intelligence, Stockholm, August 1990, 610–615.

Copyright information

© Springer-Verlag Wien 1995

Authors and Affiliations

  • I. Kononenko (1)
  • E. Simec (1)

  1. University of Ljubljana, Ljubljana, Slovenia
