An Accumulative Points/Votes Based Approach for Feature Selection

  • Hamid Parvin
  • Behrouz Minaei-Bidgoli
  • Sajad Parvin
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7042)

Abstract

This paper proposes an ensemble-based approach to feature selection, aimed at overcoming the parameter sensitivity of existing feature selection methods. Rather than requiring the user to tune a threshold, the algorithm automatically sweeps over the possible threshold values. Each threshold value yields a subset of features, and every feature in these subsets receives a score. An ensemble step then selects the features with the highest accumulated scores. The resulting method requires no parameter setting, and it is shown that basing it on a fuzzy-entropy measure yields more reliably selected features than previous methods. Empirical results show that in most cases the method's efficacy is not noticeably decreased, while the method becomes free of any parameter tuning.
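The voting scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' exact algorithm: it assumes some per-feature relevance score is already available (the paper uses a fuzzy-entropy-based measure; any measure works for the sketch), and the function name and toy data are hypothetical.

```python
import numpy as np

def accumulative_feature_votes(relevance, thresholds, k):
    """Threshold-ensemble voting for feature selection (illustrative sketch).

    relevance  -- 1-D array of per-feature relevance scores
                  (e.g. a fuzzy-entropy-based measure).
    thresholds -- candidate cutoff values swept automatically,
                  replacing a single user-chosen threshold.
    k          -- number of features to select.
    """
    relevance = np.asarray(relevance, dtype=float)
    votes = np.zeros_like(relevance)
    for t in thresholds:
        # Each threshold yields a feature subset; every member gets a point.
        votes[relevance >= t] += 1
    # The ensemble keeps the features with the highest accumulated votes.
    return np.argsort(votes)[::-1][:k]

# Toy example: 5 features with made-up relevance scores.
scores = [0.9, 0.2, 0.7, 0.4, 0.85]
selected = accumulative_feature_votes(
    scores, thresholds=np.linspace(0.1, 0.9, 9), k=3
)
print(sorted(selected.tolist()))  # features 0, 2 and 4 win the vote
```

Because every threshold contributes a vote, a feature that survives many cutoffs is selected regardless of where any single cutoff would have been placed, which is what removes the parameter sensitivity.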

Keywords

Feature Selection · Ensemble Methods · Fuzzy Entropy

Copyright information

© Springer-Verlag Berlin Heidelberg 2011

Authors and Affiliations

  • Hamid Parvin¹
  • Behrouz Minaei-Bidgoli¹
  • Sajad Parvin¹

  1. Nourabad Mamasani Branch, Islamic Azad University, Nourabad Mamasani, Iran