Applications of Flower Pollination Algorithm in Feature Selection and Knapsack Problems

  • Hossam M. Zawbaa
  • E. Emary
Chapter
Part of the Studies in Computational Intelligence book series (SCI, volume 744)

Abstract

This chapter presents one of the recently proposed bio-inspired optimization methods, namely, the flower pollination algorithm (FPA). Owing to its ability to adaptively explore a large search space that may contain many local optima, FPA has been employed to solve many real-world problems. Here, FPA is used to handle the feature selection problem in a wrapper-based approach, searching the space of features for an optimal subset that maximizes a given criterion. This feature selection methodology was applied to classification and regression data sets and was found to be successful. Moreover, FPA was applied to the knapsack problem, where data sets of different dimensions were adopted to assess its performance. On all of these problems, FPA was benchmarked against the bat algorithm (BA), the genetic algorithm (GA), and particle swarm optimization (PSO), and was found to be very competitive.
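For readers who want a concrete picture of the optimizer described above, the following is a minimal Python sketch of the standard FPA update rules from Yang's original formulation, not the chapter's own code: with a switch probability p, a flower moves globally via a Lévy flight toward the current best solution; otherwise it mixes two randomly chosen flowers (local pollination). The defaults here (p = 0.8, Lévy exponent λ = 1.5, population size, bounds) are common choices and are assumed rather than taken from the chapter.

```python
import numpy as np
from math import gamma, pi, sin

def levy_step(dim, lam=1.5):
    """Draw a Levy-distributed step of length `dim` via Mantegna's algorithm."""
    sigma = (gamma(1 + lam) * sin(pi * lam / 2) /
             (gamma((1 + lam) / 2) * lam * 2 ** ((lam - 1) / 2))) ** (1 / lam)
    u = np.random.normal(0.0, sigma, dim)
    v = np.random.normal(0.0, 1.0, dim)
    return u / np.abs(v) ** (1 / lam)

def fpa(fitness, dim, n=25, p=0.8, iters=200, lb=-5.0, ub=5.0):
    """Minimize `fitness` over [lb, ub]^dim with the flower pollination algorithm."""
    pop = np.random.uniform(lb, ub, (n, dim))        # one "flower" (solution) per row
    fit = np.array([fitness(x) for x in pop])
    best = pop[fit.argmin()].copy()
    for _ in range(iters):
        for i in range(n):
            if np.random.rand() < p:                 # global pollination: Levy flight toward best
                cand = pop[i] + levy_step(dim) * (best - pop[i])
            else:                                    # local pollination: mix two random flowers
                j, k = np.random.choice(n, 2, replace=False)
                cand = pop[i] + np.random.rand() * (pop[j] - pop[k])
            cand = np.clip(cand, lb, ub)
            f = fitness(cand)
            if f < fit[i]:                           # greedy replacement of the parent flower
                pop[i], fit[i] = cand, f
        best = pop[fit.argmin()].copy()
    return best, fit.min()

# Example: minimize the sphere function in 10 dimensions.
best, value = fpa(lambda x: float(np.sum(x ** 2)), dim=10)
```

For the feature selection and knapsack problems treated in the chapter, the continuous positions would additionally be mapped to binary selection vectors (for example by thresholding or a sigmoid transfer function), and the fitness would combine the wrapper criterion (classification or regression error) with the size of the selected subset, or the knapsack value with a penalty for violating the capacity constraint; those exact mappings are defined in the chapter itself and are not reproduced here.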

Keywords

Flower pollination algorithm · Bio-inspired optimization · Evolutionary computation · Feature selection · Knapsack problem

Copyright information

© Springer International Publishing AG 2018

Authors and Affiliations

  1. Faculty of Computers and Information, Beni-Suef University, Beni Suef, Egypt
  2. Faculty of Computers and Information, Cairo University, Giza, Egypt