
Memetic Computing, Volume 10, Issue 3, pp 291–300

PSO with surrogate models for feature selection: static and dynamic clustering-based methods

  • Hoai Bach Nguyen
  • Bing Xue
  • Peter Andreae
Regular Research Paper

Abstract

Feature selection is an important but often expensive process, especially when a dataset contains a large number of instances. This problem can be addressed by evaluating candidate feature subsets on a small training set, i.e. a surrogate set. In this work, we propose to use a hierarchical clustering method to build various surrogate sets, which allows us to analyze how surrogate sets of different quality and size affect the selected feature subsets. Furthermore, a dynamic surrogate model is proposed to automatically adjust the surrogate set for each dataset. Based on these ideas, a feature selection system is developed using particle swarm optimization as the search mechanism. The experiments show that the hierarchical clustering method builds better surrogate sets that reduce the computational time, improve the feature selection performance, and alleviate overfitting. The dynamic method automatically chooses suitable surrogate sets to further improve the classification accuracy.
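To make the static clustering-based idea concrete, below is a minimal sketch combining a hierarchically clustered surrogate set with a standard binary PSO search. It assumes scikit-learn's AgglomerativeClustering and KNeighborsClassifier; the surrogate fraction, the KNN-based fitness, and the function names (build_surrogate_set, fitness, binary_pso) are illustrative assumptions, not the authors' implementation.

```python
# Sketch of a clustering-based surrogate set plus binary PSO feature
# selection; choices below are illustrative, not the paper's exact method.
import numpy as np
from sklearn.cluster import AgglomerativeClustering
from sklearn.datasets import load_wine
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

def build_surrogate_set(X, y, fraction=0.3):
    """Cluster the instances hierarchically and keep one representative
    per cluster: the member nearest the cluster centroid."""
    n_clusters = max(1, int(len(X) * fraction))
    labels = AgglomerativeClustering(n_clusters=n_clusters).fit_predict(X)
    keep = []
    for c in range(n_clusters):
        members = np.flatnonzero(labels == c)
        centroid = X[members].mean(axis=0)
        keep.append(members[np.argmin(
            np.linalg.norm(X[members] - centroid, axis=1))])
    return X[keep], y[keep]

def fitness(mask, X_sur, y_sur):
    """Accuracy of a KNN trained on the selected features, evaluated
    on the cheap surrogate set only."""
    mask = mask.astype(bool)
    if not mask.any():
        return 0.0
    Xtr, Xte, ytr, yte = train_test_split(
        X_sur[:, mask], y_sur, test_size=0.3, random_state=0)
    return KNeighborsClassifier(n_neighbors=3).fit(Xtr, ytr).score(Xte, yte)

def binary_pso(X_sur, y_sur, n_particles=20, n_iter=30,
               w=0.7, c1=1.5, c2=1.5, seed=0):
    """Standard binary PSO with a sigmoid transfer function."""
    rng = np.random.default_rng(seed)
    d = X_sur.shape[1]
    pos = (rng.random((n_particles, d)) < 0.5).astype(float)
    vel = rng.uniform(-1, 1, (n_particles, d))
    pbest = pos.copy()
    pbest_fit = np.array([fitness(p, X_sur, y_sur) for p in pos])
    gbest = pbest[pbest_fit.argmax()].copy()
    gbest_fit = pbest_fit.max()
    for _ in range(n_iter):
        r1, r2 = rng.random((2, n_particles, d))
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
        vel = np.clip(vel, -6, 6)  # velocity clamp, keeps sigmoid sensitive
        pos = (rng.random((n_particles, d)) < 1 / (1 + np.exp(-vel))).astype(float)
        fit = np.array([fitness(p, X_sur, y_sur) for p in pos])
        better = fit > pbest_fit
        pbest[better], pbest_fit[better] = pos[better], fit[better]
        if fit.max() > gbest_fit:
            gbest, gbest_fit = pos[fit.argmax()].copy(), fit.max()
    return gbest.astype(bool), gbest_fit

X, y = load_wine(return_X_y=True)
X_sur, y_sur = build_surrogate_set(X, y, fraction=0.3)
mask, acc = binary_pso(X_sur, y_sur)
print(f"selected {mask.sum()}/{X.shape[1]} features, surrogate accuracy {acc:.3f}")
```

Keeping one representative per cluster preserves the class and density structure of the full training data while shrinking the fitness-evaluation cost; the dynamic variant described in the paper additionally adapts the surrogate set per dataset, which this static sketch does not reproduce.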

Keywords

Surrogate model · Feature selection · Particle swarm optimization · Clustering · Classification


Copyright information

© Springer-Verlag GmbH Germany, part of Springer Nature 2018

Authors and Affiliations

1. Evolutionary Computation Research Group, Victoria University of Wellington, Wellington, New Zealand
