Large Scale Problems in Practice: The Effect of Dimensionality on the Interaction Among Variables

  • Fabio Caraffini
  • Ferrante Neri
  • Giovanni Iacca
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10199)


This article studies the correlation between pairs of variables as a function of problem dimensionality. Two tests, based on the Pearson and Spearman coefficients, have been designed and used in this work. In total, 86 test problems ranging between 10 and 1000 variables have been studied. Under the most commonly used experimental conditions, the correlation between pairs of variables appears, from the perspective of the search algorithm, to decrease consistently with dimensionality. This effect is not due to dimensionality modifying the nature of the problem; it is a consequence of the experimental conditions, since the computational feasibility of the experiments imposes an extremely shallow search in high dimensions. Increasing the budget and population size exponentially with the dimensionality remains practically impossible. Nonetheless, since real-world applications may require that large scale problems be tackled despite the limited budget, an algorithm can quickly improve upon initial guesses if it integrates the knowledge that an apparently weak correlation between pairs of variables occurs, regardless of the nature of the problem.
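The two coefficients underlying the tests can be sketched as follows. This is a minimal illustration of the Pearson and Spearman correlation coefficients themselves, not of the authors' full test design; the rank computation assumes the samples contain no tied values.

```python
import numpy as np

def pearson(x, y):
    # Pearson's r: covariance of the two samples normalised by the
    # product of their standard deviations (linear correlation).
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    xc, yc = x - x.mean(), y - y.mean()
    return float((xc @ yc) / np.sqrt((xc @ xc) * (yc @ yc)))

def spearman(x, y):
    # Spearman's rho: Pearson's r applied to the ranks of the samples,
    # capturing monotone (not necessarily linear) dependence.
    # Double argsort yields ranks; valid only when there are no ties.
    rank = lambda v: np.argsort(np.argsort(v)).astype(float)
    return pearson(rank(np.asarray(v := x)), rank(np.asarray(y)))
```

For a monotone but nonlinear relation such as y = x^3, Spearman's rho is exactly 1 while Pearson's r falls below 1, which is why the two coefficients are used side by side.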


Keywords: Large scale optimization · Covariance matrix · Correlation



This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement No 665347.



Copyright information

© Springer International Publishing AG 2017

Authors and Affiliations

  • Fabio Caraffini (1, 2)
  • Ferrante Neri (1, 2)
  • Giovanni Iacca (1, 2)
  1. Centre for Computational Intelligence, De Montfort University, Leicester, UK
  2. INCAS3, Assen, The Netherlands
