Multimodal States of Matter Search

  • Erik Cuevas
  • Daniel Zaldívar
  • Marco Pérez-Cisneros
Chapter
Part of the Studies in Computational Intelligence book series (SCI, volume 775)

Abstract

The aim of multi-modal optimization is to detect as many global and local optima as possible in a single run. Identifying several solutions is particularly important for some problems, because the best solution may turn out to be inapplicable due to practical limitations. The States of Matter Search (SMS) is a metaheuristic technique. Although SMS is efficient at finding the global optimum, it fails to provide multiple solutions within a single run. Under these circumstances, a new version called the Multi-modal States of Matter Search (MSMS) has been proposed.
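The MSMS algorithm itself is not reproduced here. As a minimal illustration of the goal described in the abstract (locating several optima in one run), the sketch below combines random-restart hill climbing with distance-based de-duplication, a simple niching idea; all function names and parameters are illustrative assumptions, not part of SMS or MSMS.

```python
import math
import random

def hill_climb(f, x, lo, hi, step=0.1, iters=200, decay=0.98):
    """Greedy local ascent with a shrinking step size (maximization)."""
    for _ in range(iters):
        candidates = [x, max(lo, x - step), min(hi, x + step)]
        x = max(candidates, key=f)  # move to the best neighbour (or stay)
        step *= decay
    return x

def multimodal_search(f, lo, hi, restarts=100, min_dist=1.0, seed=1):
    """Collect distinct local optima: restart from random points and keep a
    solution only if it lies far from every optimum found so far."""
    rng = random.Random(seed)
    found = []
    for _ in range(restarts):
        x = hill_climb(f, rng.uniform(lo, hi), lo, hi)
        if all(abs(x - y) > min_dist for y in found):
            found.append(x)
    return sorted(found)

# sin(x) on [0, 20] has interior maxima near pi/2, 5*pi/2 and 9*pi/2,
# plus a boundary maximum at x = 20.
peaks = multimodal_search(math.sin, 0.0, 20.0)
print(peaks)
```

A single-objective optimizer would return only one of these points; the distance filter is what turns the run into a multi-solution search, which is the property MSMS adds to SMS in a far more principled way.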


Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Erik Cuevas (1)
  • Daniel Zaldívar (1)
  • Marco Pérez-Cisneros (1)
  1. CUCEI, Universidad de Guadalajara, Guadalajara, Mexico
