
Binary Thermal Exchange Optimization for Feature Selection

  • Chapter in Data Management and Analysis

Part of the book series: Studies in Big Data (SBD, volume 65)

Abstract

Feature Selection (FS) is a preprocessing step that becomes mandatory when dealing with data that have a large set of features. The FS process is known to be an NP-hard optimization problem; therefore, metaheuristic algorithms have proved their ability to tackle it, as they have for other optimization problems. Thermal Exchange Optimization (TEO) is a recent population-based metaheuristic algorithm based on Newton's law of cooling. In this paper, a binary version of the TEO algorithm (called BTEO) was used as the search strategy in a wrapper feature selection method for the first time in the literature. Both K-Nearest Neighbors (KNN) and Decision Tree (DT) classifiers were used in the evaluation process, and eighteen well-known UCI datasets were utilized to assess the performance of the proposed approach. To demonstrate its efficiency, three popular wrapper FS methods that use nature-inspired algorithms as search strategies, namely the Genetic Algorithm (GA), Particle Swarm Optimization (PSO), and the Grey Wolf Optimizer (GWO), were used for comparison, and the results show the effectiveness of the proposed approach in solving different feature selection tasks.
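
To make the wrapper setup concrete, the sketch below is a minimal, illustrative Python implementation of the three ingredients an approach like BTEO combines: a KNN-based wrapper fitness that trades classification error against subset size, a Newton's-law-of-cooling style position update of the kind TEO uses, and an S-shaped (sigmoid) transfer function that maps continuous positions to binary feature masks. It is not the authors' exact BTEO; the function names, the cross-validation setup, the weight alpha, and the choice of sigmoid transfer are assumptions made for illustration.

# Minimal sketch (assumed, not the chapter's exact BTEO): a KNN wrapper
# fitness, a cooling-style update, and an S-shaped binarization step.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def wrapper_fitness(mask, X, y, alpha=0.99):
    """Weighted sum of KNN classification error and selected-feature ratio."""
    if mask.sum() == 0:                      # penalize empty feature subsets
        return 1.0
    acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                          X[:, mask.astype(bool)], y, cv=5).mean()
    return alpha * (1.0 - acc) + (1.0 - alpha) * mask.sum() / mask.size

def cooling_update(T_obj, T_env, beta, t):
    """Continuous TEO-style move: the object's 'temperature' (position) decays
    toward its paired environment object, T(t) = T_env + (T_0 - T_env)*exp(-beta*t)."""
    return T_env + (T_obj - T_env) * np.exp(-beta * t)

def binarize(position, rng):
    """Map a real-valued position to a 0/1 feature mask with an S-shaped
    (sigmoid) transfer function, a common choice for binary metaheuristics."""
    prob = 1.0 / (1.0 + np.exp(-position))
    return (rng.random(prob.shape) < prob).astype(int)

# Example usage on some dataset X, y (hypothetical):
# rng = np.random.default_rng(0)
# mask = binarize(rng.normal(size=X.shape[1]), rng)
# score = wrapper_fitness(mask, X, y)

In a complete wrapper loop, each search agent's continuous position would be moved with cooling_update, converted to a feature mask with binarize, and scored with wrapper_fitness; the GA, PSO, and GWO baselines mentioned above differ only in the update rule they plug into this same evaluation pipeline.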

Copyright information

© 2020 Springer Nature Switzerland AG

About this chapter

Cite this chapter

Taradeh, M., Mafarja, M. (2020). Binary Thermal Exchange Optimization for Feature Selection. In: Alhajj, R., Moshirpour, M., Far, B. (eds) Data Management and Analysis. Studies in Big Data, vol 65. Springer, Cham. https://doi.org/10.1007/978-3-030-32587-9_14

  • DOI: https://doi.org/10.1007/978-3-030-32587-9_14

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-030-32586-2

  • Online ISBN: 978-3-030-32587-9

  • eBook Packages: Engineering (R0)
