An Application of Maximum Probabilistic-Based Rough Set on ID3

  • Utpal Pal
  • Sharmistha Bhattacharya (Halder)
Conference paper
Part of the Smart Innovation, Systems and Technologies book series (SIST, volume 106)


The Iterative Dichotomiser 3 (ID3) algorithm recursively partitions the problem domain, representing the resulting partitions as a decision tree. However, the classification accuracy of ID3 decreases when dealing with very large datasets. The maximum probabilistic-based rough set (MPBRS) is a sophisticated approach to reducing insignificant features. This paper presents an application of MPBRS to ID3 as a method for eliminating insignificant attributes. For comparison, the paper also investigates ID3 combined with the Pawlak rough set and with the Bayesian decision-theoretic rough set (BDTRS). Experimental results, obtained with the R language on datasets from the UCI Machine Learning Repository, show that MPBRS-based ID3 induces an enriched decision tree and thus improves classification accuracy.
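The paper's MPBRS-based reduction itself is not reproduced here; as a minimal illustrative sketch, the following shows only the core ID3 splitting criterion the abstract refers to, i.e. choosing the attribute with maximum information gain. The toy dataset and all function names are hypothetical, not from the paper.

```python
from collections import Counter
import math

def entropy(labels):
    """Shannon entropy of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(rows, labels, attr_index):
    """Information gain from partitioning the rows on one attribute,
    as used by ID3 to pick the next splitting attribute."""
    n = len(labels)
    partitions = {}
    for row, label in zip(rows, labels):
        partitions.setdefault(row[attr_index], []).append(label)
    remainder = sum(len(part) / n * entropy(part) for part in partitions.values())
    return entropy(labels) - remainder

# Hypothetical toy data: two attributes (outlook, windy) and a yes/no decision.
rows = [("sunny", "no"), ("sunny", "yes"), ("rain", "no"), ("rain", "yes")]
labels = ["no", "no", "yes", "yes"]

gains = [info_gain(rows, labels, i) for i in range(2)]
best = max(range(2), key=lambda i: gains[i])  # ID3 splits on attribute 0 (outlook)
```

An attribute-reduction step such as MPBRS would run before this, removing insignificant attributes so that the tree is induced only over the reduced feature set.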


Keywords: ID3 · MPBRS · Feature reduction



Copyright information

© Springer Nature Singapore Pte Ltd. 2019

Authors and Affiliations

  1. Tripura University, Agartala, India
