
Feature Selection with Adjustable Criteria

  • Conference paper

Rough Sets, Fuzzy Sets, Data Mining, and Granular Computing (RSFDGrC 2005)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 3641)

Abstract

We present a study of a rough set based approach to feature selection. Instead of relying on significance or support alone, the Parameterized Average Support Heuristic (PASH) considers the overall quality of the potential set of rules and produces a rule set whose support is distributed in a balanced way over all decision classes. The adjustable parameters of PASH allow users with different levels of approximation needs to extract predictive rules that may be ignored by other methods. This paper fine-tunes the PASH heuristic and provides experimental results for PASH.
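To make the idea concrete, the following is a minimal sketch of what an adjustable-criteria selection loop of this kind could look like: a greedy forward search that scores each candidate feature subset by how much rule support it yields per decision class and how evenly that support is spread. The names (pash_like_score, select_features), the data layout (rows as tuples of condition-attribute values with the decision attribute at decision_index), and the single alpha parameter are illustrative assumptions; this is not the paper's exact PASH formula.

from collections import Counter, defaultdict

def pash_like_score(rows, features, decision_index, alpha=0.5):
    # Partition the data by the values of the selected features; each block
    # is a potential rule predicting its majority decision class.
    blocks = defaultdict(list)
    for row in rows:
        blocks[tuple(row[i] for i in features)].append(row[decision_index])

    # Accumulate, per decision class, the fraction of rows covered by rules
    # that predict that class.
    support_per_class = defaultdict(float)
    for decisions in blocks.values():
        majority_class, count = Counter(decisions).most_common(1)[0]
        support_per_class[majority_class] += count / len(rows)

    classes = {row[decision_index] for row in rows}
    supports = [support_per_class.get(c, 0.0) for c in classes]
    average = sum(supports) / len(supports)
    # Balance term: 1.0 when every class receives equal support,
    # 0.0 when some class receives no support at all.
    balance = min(supports) / max(supports) if max(supports) > 0 else 0.0
    # alpha is the adjustable knob trading average support against a
    # balanced distribution over decision classes (hypothetical parameter).
    return alpha * average + (1.0 - alpha) * balance


def select_features(rows, n_condition_attrs, decision_index, alpha=0.5):
    # Greedy forward selection: repeatedly add the attribute that most
    # improves the heuristic score, stopping when no candidate helps.
    selected, remaining = [], list(range(n_condition_attrs))
    best_score = 0.0
    while remaining:
        candidate, cand_score = max(
            ((f, pash_like_score(rows, selected + [f], decision_index, alpha))
             for f in remaining),
            key=lambda pair: pair[1],
        )
        if cand_score <= best_score:
            break
        selected.append(candidate)
        remaining.remove(candidate)
        best_score = cand_score
    return selected

For a table with four condition attributes and the decision attribute in the last column, select_features(rows, 4, 4, alpha=0.7) would favour larger average support, while a smaller alpha would favour a more balanced support distribution over the decision classes.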




Copyright information

© 2005 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Yao, J., Zhang, M. (2005). Feature Selection with Adjustable Criteria. In: Ślęzak, D., Wang, G., Szczuka, M., Düntsch, I., Yao, Y. (eds) Rough Sets, Fuzzy Sets, Data Mining, and Granular Computing. RSFDGrC 2005. Lecture Notes in Computer Science, vol 3641. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11548669_22


  • DOI: https://doi.org/10.1007/11548669_22

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-28653-0

  • Online ISBN: 978-3-540-31825-5

  • eBook Packages: Computer Science, Computer Science (R0)
