Inference Based Classifier: Efficient Construction of Decision Trees for Sparse Categorical Attributes

  • Shih-Hsiang Lo
  • Jian-Chih Ou
  • Ming-Syan Chen
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 2737)


Classification is an important problem in data mining and machine learning, and the decision tree approach has been identified as an efficient means for classification. According to our observation on real data, the distribution of attributes with respect to information gain is very sparse: only a few attributes are major discriminating attributes, where a discriminating attribute is one by whose value we are likely to distinguish one tuple from another. In this paper, we propose an efficient decision tree classifier for categorical attributes of sparse distribution. In essence, the proposed Inference Based Classifier (abbreviated as IBC) can alleviate the "overfitting" problem of conventional decision tree classifiers. In addition, IBC has the advantage of deciding the splitting number automatically based on the generated partitions. IBC is empirically compared to C4.5, SLIQ, and K-means based classifiers. The experimental results show that IBC significantly outperforms the companion methods in execution efficiency for datasets with categorical attributes of sparse distribution while attaining approximately the same classification accuracy. Consequently, IBC is considered an accurate and efficient classifier for sparse categorical attributes.
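The sparsity the abstract describes can be made concrete with a standard information-gain computation over categorical attributes. The sketch below is purely illustrative of that observation (it is not the paper's IBC algorithm, and all names and the toy data are hypothetical): when only one attribute discriminates the target class, the gain distribution across attributes is sharply skewed.

```python
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy of a sequence of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(rows, labels, attr_index):
    """Entropy reduction obtained by splitting on one categorical attribute."""
    n = len(labels)
    # Partition the labels by the attribute's value.
    parts = {}
    for row, label in zip(rows, labels):
        parts.setdefault(row[attr_index], []).append(label)
    weighted = sum(len(p) / n * entropy(p) for p in parts.values())
    return entropy(labels) - weighted

# Toy dataset with two categorical attributes: the first fully determines
# the class, the second carries no information, so the gain distribution
# across attributes is "sparse" in the sense the abstract describes.
rows = [("a", "x"), ("a", "y"), ("b", "x"), ("b", "y")]
labels = ["+", "+", "-", "-"]
gains = [information_gain(rows, labels, i) for i in range(2)]
# gains[0] = 1.0 (perfect discriminator), gains[1] = 0.0 (uninformative)
```

A conventional gain-based splitter would rank attribute 0 far above attribute 1 here; the paper's premise is that real categorical data often look like this, with most attributes contributing near-zero gain.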


Keywords: Decision Tree, Information Gain, GINI Index, Categorical Attribute, Target Class




  1. Breiman, L., Friedman, J.H., Olshen, R.A., Stone, C.J.: Classification and Regression Trees. Wadsworth, Belmont (1984)
  2. NASA Ames Research Center: Introduction to IND Version 2.1. GA23-2475-02 edition (1992)
  3. Cheeseman, P., Kelly, J., Self, M., et al.: AutoClass: A Bayesian classification system. In: 5th Int'l Conf. on Machine Learning. Morgan Kaufmann, San Francisco (1988)
  4. Chou, P.A.: Optimal Partitioning for Classification and Regression Trees. IEEE Transactions on Pattern Analysis and Machine Intelligence 13(4) (1991)
  5. Fayyad, U.: On the Induction of Decision Trees for Multiple Concept Learning. PhD thesis, The University of Michigan, Ann Arbor (1991)
  6. Fayyad, U., Irani, K.B.: Multi-interval discretization of continuous-valued attributes for classification learning. In: Proc. of the 13th International Joint Conference on Artificial Intelligence (1993)
  7. Goldberg, D.E.: Genetic Algorithms in Search, Optimization and Machine Learning. Morgan Kaufmann, San Francisco (1989)
  8. Han, J., Kamber, M.: Data Mining: Concepts and Techniques. Morgan Kaufmann Publishers, San Francisco (2000)
  9. Mehta, M., Agrawal, R., Rissanen, J.: SLIQ: A fast scalable classifier for data mining. In: Apers, P.M.G., Bouzeghoub, M., Gardarin, G. (eds.) EDBT 1996. LNCS, vol. 1057. Springer, Heidelberg (1996)
  10. Mehta, M., Rissanen, J., Agrawal, R.: MDL-based decision tree pruning. In: Int'l Conference on Knowledge Discovery in Databases and Data Mining (1995)
  11. Michie, D., Spiegelhalter, D.J., Taylor, C.C.: Machine Learning, Neural and Statistical Classification. Ellis Horwood (1994)
  12. Quinlan, J.R.: C4.5: Programs for Machine Learning. Morgan Kaufmann, San Francisco (1993)
  13. Quinlan, J.R.: Induction of decision trees. Machine Learning (1986)
  14. Quinlan, J.R., Rivest, R.L.: Inferring decision trees using the minimum description length principle. Information and Computation (1989)
  15. Rastogi, R., Shim, K.: PUBLIC: A Decision Tree Classifier that Integrates Building and Pruning. In: Proceedings of the 24th International Conference on Very Large Data Bases, New York City, New York, USA, August 24-27 (1998)
  16. Ripley, B.D.: Pattern Recognition and Neural Networks. Cambridge University Press, Cambridge (1996)
  17. Shafer, J., Agrawal, R., Mehta, M.: SPRINT: A scalable parallel classifier for data mining. In: Proc. of the VLDB Conference, Bombay, India (1996)

Copyright information

© Springer-Verlag Berlin Heidelberg 2003

Authors and Affiliations

  • Shih-Hsiang Lo (1)
  • Jian-Chih Ou (1)
  • Ming-Syan Chen (1)

  1. Department of Electrical Engineering, National Taiwan University, Taipei, Taiwan, ROC
