
Total Memory Optimiser: Proof of Concept and Compromises

  • Maurice Clerc
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 10103)

Abstract

For most usual optimisation problems, the Nearer is Better assumption is true (in probability). Classical iterative algorithms take this property into account, either explicitly or implicitly, by forgetting some of the information collected during the process, on the assumption that it is no longer useful. However, when the property is not globally true, i.e. for deceptive problems, it may be necessary to keep all the sampled points and their values, and to exploit this increasing amount of information. Such a basic Total Memory Optimiser is presented here. We experimentally show that this technique can outperform classical methods on small deceptive problems. As it becomes very expensive in computing time when the dimension of the problem increases, a few compromises are suggested to speed it up.
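The paper's algorithmic details are not reproduced on this page, but the core idea stated in the abstract, namely never discarding a sampled point and exploiting the full archive when choosing where to sample next, can be illustrated with a minimal sketch. Everything below is a hypothetical illustration, not the author's actual method: the inverse-distance-weighted surrogate idw_estimate, the candidate-screening rule, and the names total_memory_minimise and f are all assumptions introduced here.

    import math
    import random

    def idw_estimate(archive, x, power=2.0):
        # Inverse-distance-weighted estimate of f(x) built from ALL
        # archived samples (a simple stand-in for a surrogate model).
        num, den = 0.0, 0.0
        for point, value in archive:
            d = math.dist(point, x)
            if d == 0.0:
                return value          # exact hit: reuse the stored value
            w = 1.0 / d ** power
            num += w * value
            den += w
        return num / den

    def total_memory_minimise(f, bounds, budget=200, candidates=50):
        # Never discard a sample: 'archive' only grows. Each iteration
        # screens random candidates with a surrogate built from the whole
        # archive and spends the single true evaluation on the most
        # promising candidate.
        def rand_point():
            return [random.uniform(lo, hi) for lo, hi in bounds]
        x0 = rand_point()
        archive = [(x0, f(x0))]
        while len(archive) < budget:
            cands = [rand_point() for _ in range(candidates)]
            best = min(cands, key=lambda x: idw_estimate(archive, x))
            archive.append((best, f(best)))
        return min(archive, key=lambda pv: pv[1])

    if __name__ == "__main__":
        # A deceptive-looking 1-D test: a wide shallow basin around 0.1
        # hides a narrow, deeper basin near 0.9.
        def f(x):
            return min((x[0] - 0.1) ** 2 + 0.2, 100.0 * (x[0] - 0.9) ** 2)
        print(total_memory_minimise(f, bounds=[(0.0, 1.0)]))

Because each surrogate query scans the entire archive, the cost of this loop grows quadratically with the evaluation budget, which is consistent with the computing-time concern raised in the abstract and with the need for the compromises it mentions.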

Keywords

Search space, Combinatorial problem, Classical optimisers, Surrogate function, Global correlation


Copyright information

© Springer International Publishing AG 2016

Authors and Affiliations

  1. Maurice Clerc, Independent Consultant, Groisy, France
