Proposal for a New Reduct Method for Decision Tables and an Improved STRIM

  • Conference paper

Data Mining and Big Data (DMBD 2017)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 10387)

Abstract

Rough Set theory is widely used to estimate and/or induce the knowledge structure of if-then rules from a decision table after a reduct of the table. The purpose of a reduct is to restrict the decision table to the condition attributes that are necessary and sufficient for inducing the rules. After briefly summarizing the reduct concept, this paper retests the conventional reduct methods on simulation datasets and points out several problems with them. A new reduct method based on a statistical viewpoint is then proposed and confirmed to be valid by applying it to the same simulation datasets. The new reduct method is incorporated into STRIM (Statistical Test Rule Induction Method), where it plays an effective role in rule induction. STRIM with the new reduct method is also applied to a UCI dataset and is shown to be useful and effective for estimating the if-then rules hidden behind the decision table of interest.
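To make the reduct concept above concrete, the following sketch implements the classical rough-set definition on a toy table: a reduct is a minimal subset of condition attributes that preserves the decision dependency (positive region) of the full attribute set. This is only an illustration of the standard notion summarized above, not the new reduct method proposed in the paper; the toy table and the attribute names C1, C2, C3, D are hypothetical.

```python
# Minimal sketch of the classical rough-set reduct (brute force), for illustration only.
from itertools import combinations

def positive_region_size(rows, attrs, decision):
    """Count rows whose condition-attribute class maps to exactly one decision value."""
    classes = {}
    for row in rows:
        key = tuple(row[a] for a in attrs)
        classes.setdefault(key, set()).add(row[decision])
    consistent = {k for k, d in classes.items() if len(d) == 1}
    return sum(1 for row in rows if tuple(row[a] for a in attrs) in consistent)

def reducts(rows, condition_attrs, decision):
    """Enumerate all minimal attribute subsets that preserve the full positive region."""
    full = positive_region_size(rows, condition_attrs, decision)
    found = []
    for k in range(1, len(condition_attrs) + 1):
        for subset in combinations(condition_attrs, k):
            if positive_region_size(rows, subset, decision) == full:
                # skip supersets of an already-found (smaller) reduct
                if not any(set(r) <= set(subset) for r in found):
                    found.append(subset)
    return found

# Hypothetical toy decision table: C1..C3 are condition attributes, D is the decision.
table = [
    {"C1": 1, "C2": 0, "C3": 1, "D": "yes"},
    {"C1": 1, "C2": 1, "C3": 1, "D": "yes"},
    {"C1": 0, "C2": 0, "C3": 1, "D": "no"},
    {"C1": 0, "C2": 1, "C3": 0, "D": "no"},
]
print(reducts(table, ["C1", "C2", "C3"], "D"))  # -> [('C1',)] for this toy table
```

The brute-force enumeration is exponential in the number of condition attributes and is shown only to pin down the definition; the paper's contribution is a statistically grounded alternative. As one possible reading of a "statistical viewpoint" (an assumption for illustration, not the paper's actual test, which the abstract does not specify), one could test each condition attribute's association with the decision attribute, for example with a chi-squared test of independence, and treat non-significant attributes as candidates for removal:

```python
# Hedged illustration of a statistical relevance check (not the paper's method).
import numpy as np
from scipy.stats import chi2_contingency

# Hypothetical contingency table: rows = values of one condition attribute,
# columns = decision classes, cells = observed counts in the decision table.
observed = np.array([[30, 5],
                     [4, 31]])
chi2, p_value, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, p = {p_value:.4f}")  # a small p-value suggests the attribute matters
```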

References

  1. Pawlak, Z.: Rough sets. Int. J. Comput. Inf. Sci. 11(5), 341–356 (1982)

  2. Grzymala-Busse, J.W.: LERS — a system for learning from examples based on rough sets. In: Słowiński, R. (ed.) Intelligent Decision Support — Handbook of Applications and Advances of the Rough Sets Theory. Theory and Decision Library, vol. 11, pp. 3–18. Kluwer Academic Publishers, Amsterdam (1992)

  3. Skowron, A., Rauszer, C.: The discernibility matrices and functions in information systems. In: Słowiński, R. (ed.) Intelligent Decision Support — Handbook of Applications and Advances of the Rough Sets Theory. Theory and Decision Library, vol. 11, pp. 331–362. Kluwer Academic Publishers, Amsterdam (1992)

  4. Pawlak, Z.: Rough set fundamentals. KFIS Autumn Conference Tutorial, pp. 1–32 (1996)

  5. Ślęzak, D.: Various approaches to reasoning with frequency based decision reducts: a survey. In: Polkowski, L., Tsumoto, S., Lin, T.Y. (eds.) Rough Set Methods and Applications, vol. 56, pp. 235–285. Physica-Verlag, Heidelberg (2000)

  6. Bao, Y.G., Du, X.Y., Deng, M.G., Ishii, N.: An efficient method for computing all reducts. Trans. Jpn. Soc. Artif. Intell. 19(3), 166–173 (2004)

  7. Jia, X., Shang, L., Zhou, Z., Yao, Y.: Generalized attribute reduct in rough set theory. Knowl.-Based Syst. 91, 204–218 (2016)

  8. Matsubayashi, T., Kato, Y., Saeki, T.: A new rule induction method from a decision table using a statistical test. In: Li, T., Nguyen, H.S., Wang, G., Grzymala-Busse, J., Janicki, R., Hassanien, A.E., Yu, H. (eds.) RSKT 2012. LNCS (LNAI), vol. 7414, pp. 81–90. Springer, Heidelberg (2012). doi:10.1007/978-3-642-31900-6_11

  9. Kato, Y., Saeki, T., Mizuno, S.: Studies on the necessary data size for rule induction by STRIM. In: Lingras, P., Wolski, M., Cornelis, C., Mitra, S., Wasilewski, P. (eds.) RSKT 2013. LNCS (LNAI), vol. 8171, pp. 213–220. Springer, Heidelberg (2013). doi:10.1007/978-3-642-41299-8_20

  10. Kato, Y., Saeki, T., Mizuno, S.: Considerations on rule induction procedures by STRIM and their relationship to VPRS. In: Kryszkiewicz, M., Cornelis, C., Ciucci, D., Medina-Moreno, J., Motoda, H., Raś, Z.W. (eds.) RSEISP 2014. LNCS (LNAI), vol. 8537, pp. 198–208. Springer, Cham (2014). doi:10.1007/978-3-319-08729-0_19

  11. Kato, Y., Saeki, T., Mizuno, S.: Proposal of a statistical test rule induction method by use of the decision table. Appl. Soft Comput. 28, 160–166 (2015)

  12. Asuncion, A., Newman, D.J.: UCI machine learning repository. University of California, School of Information and Computer Science, Irvine (2007). http://www.ics.uci.edu/~mlearn/MLRepository.html

  13. Walpole, R.E., Myers, R.H., Myers, S.L., Ye, K.: Probability and Statistics for Engineers and Scientists, 8th edn., pp. 374–377. Pearson Prentice Hall, New Jersey (2007)

  14. Walpole, R.E., Myers, R.H., Myers, S.L., Ye, K.: Probability and Statistics for Engineers and Scientists, 8th edn., pp. 361–364. Pearson Prentice Hall, New Jersey (2007)

Author information

Correspondence to Tetsuro Saeki.

Copyright information

© 2017 Springer International Publishing AG

About this paper

Cite this paper

Fei, J., Saeki, T., Kato, Y. (2017). Proposal for a New Reduct Method for Decision Tables and an Improved STRIM. In: Tan, Y., Takagi, H., Shi, Y. (eds) Data Mining and Big Data. DMBD 2017. Lecture Notes in Computer Science, vol. 10387. Springer, Cham. https://doi.org/10.1007/978-3-319-61845-6_37

  • DOI: https://doi.org/10.1007/978-3-319-61845-6_37

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-61844-9

  • Online ISBN: 978-3-319-61845-6

  • eBook Packages: Computer Science (R0)
