Efficient Feature Selection Algorithm Based on Population Random Search with Adaptive Memory Strategies

  • Ilya Hodashinsky
  • Konstantin Sarin
  • Artyom Slezkin
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 874)

Abstract

The effectiveness of classifier training methods depends significantly on the number of features that describe the dataset to be classified. This research proposes a new approach to feature selection that combines random and heuristic search strategies. A solution is represented as a binary vector whose size is determined by the number of features in the dataset. New solutions are generated randomly using normal and uniform distributions. The heuristic underlying the proposed approach is formulated as follows: the chance that a feature is included in the next generation is proportional to the frequency of its occurrence in the previous best solutions. For feature selection, the algorithm is used together with a fuzzy classifier. The method is tested on several datasets from the KEEL repository, and a comparison with analogs is presented. To compare feature selection algorithms, we compute the values of their efficiency criterion, which reflects both the accuracy of classification and the speed of finding the appropriate features.
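
To make the idea concrete, the following minimal Python sketch illustrates a population random search with an adaptive-memory heuristic of the kind described above: candidate feature subsets are binary masks, and the probability that a feature enters a new candidate is proportional to its frequency among the best solutions found so far, perturbed with uniform and normal draws. The fitness function is a placeholder for any wrapper criterion (e.g., cross-validated accuracy of the fuzzy classifier restricted to the selected features); the population size, memory size, smoothing constants, and noise scale are illustrative assumptions, not values taken from the paper.

    import numpy as np

    def population_random_search(fitness, n_features, pop_size=20, n_best=5,
                                 iterations=100, rng=None):
        """Sketch of binary population random search with an adaptive-memory
        heuristic: the probability that a feature appears in a new candidate
        is proportional to how often it occurred in the best solutions so far.
        `fitness` maps a 0/1 mask of length n_features to a score to maximize.
        All numeric settings here are illustrative assumptions."""
        rng = np.random.default_rng(rng)
        # Initial population: uniformly random binary masks.
        population = rng.integers(0, 2, size=(pop_size, n_features))
        scores = np.array([fitness(ind) for ind in population])

        for _ in range(iterations):
            # Adaptive memory: occurrence frequency of each feature among the best solutions.
            best_idx = np.argsort(scores)[-n_best:]
            freq = population[best_idx].mean(axis=0)
            # Smooth the frequencies so no feature is permanently excluded or fixed.
            prob = 0.9 * freq + 0.05
            # New candidates: include each feature with probability `prob`
            # (uniform draw), with the threshold perturbed by Gaussian noise.
            noise = rng.normal(0.0, 0.1, size=(pop_size, n_features))
            candidates = (rng.uniform(size=(pop_size, n_features)) < prob + noise).astype(int)
            cand_scores = np.array([fitness(ind) for ind in candidates])
            # Keep the better of the old and new solution in each population slot.
            improved = cand_scores > scores
            population[improved] = candidates[improved]
            scores[improved] = cand_scores[improved]

        best = population[np.argmax(scores)]
        return best, scores.max()

In a wrapper setting, fitness would typically balance classification accuracy against subset size, for example cross-validated accuracy of the classifier on the selected features minus a small penalty per selected feature.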

Keywords

Feature selection · Classification · Population random search · Adaptive memory strategies

Acknowledgements

This work was supported by the Russian Foundation for Basic Research, project no. 16-07-00034.

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  • Ilya Hodashinsky¹
  • Konstantin Sarin¹
  • Artyom Slezkin¹
  1. Tomsk State University of Control Systems and Radioelectronics, Tomsk, Russia