
OneMax Is Not the Easiest Function for Fitness Improvements

  • Conference paper
  • In: Evolutionary Computation in Combinatorial Optimization (EvoCOP 2023)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 13987)

Abstract

We study the \((1:s+1)\) success rule for controlling the population size of the \((1,\lambda)\)-EA. It was shown by Hevia Fajardo and Sudholt that this parameter control mechanism can run into problems for large \(s\) if the fitness landscape is too easy. They conjectured that this problem is worst for the OneMax benchmark, since in some well-established sense OneMax is known to be the easiest fitness landscape.

In this paper we disprove this conjecture. We show that there exist \(s\) and \(\varepsilon\) such that the self-adjusting \((1,\lambda)\)-EA with the \((1:s+1)\)-rule optimizes OneMax efficiently when started with \(\varepsilon n\) zero-bits, but does not find the optimum in polynomial time on Dynamic BinVal. Hence, we show that there are landscapes where the problem of the \((1:s+1)\)-rule for controlling the population size of the \((1,\lambda)\)-EA is more severe than for OneMax. The key insight is that, while OneMax is the easiest function for decreasing the distance to the optimum, it is not the easiest fitness landscape with respect to finding fitness-improving steps.
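To make the mechanism concrete, the following is a minimal Python sketch of the self-adjusting \((1,\lambda)\)-EA with the \((1:s+1)\) success rule on OneMax. The update factor F, the rounding of \(\lambda\), and the tie-breaking are illustrative assumptions, not the paper's exact formalization.

```python
import random

def onemax(x):
    """OneMax: the number of one-bits in the bit string."""
    return sum(x)

def mutate(x, n):
    """Standard bit mutation: flip each bit independently with probability 1/n."""
    return [1 - b if random.random() < 1.0 / n else b for b in x]

def self_adjusting_ea(n, s, F=1.5, max_gens=10**6):
    """(1,lambda)-EA with the (1:s+1) success rule for the offspring
    population size: shrink lambda by the factor F after a success
    (a fitness improvement), grow it by F^(1/s) otherwise, so that
    in equilibrium roughly one in s+1 generations is successful."""
    x = [random.randint(0, 1) for _ in range(n)]
    lam = 1.0
    for gen in range(1, max_gens + 1):
        offspring = [mutate(x, n) for _ in range(max(1, round(lam)))]
        best = max(offspring, key=onemax)
        if onemax(best) > onemax(x):
            lam = max(1.0, lam / F)      # success: fewer offspring next time
        else:
            lam = lam * F ** (1.0 / s)   # failure: more offspring next time
        x = best                         # comma selection: always accept
        if onemax(x) == n:
            return gen
    return None
```

Note the non-elitist (comma) selection: when \(\lambda\) is small, a generation can lose fitness, which is precisely why landscapes that yield fitness improvements too easily can keep \(\lambda\) too small for large \(s\).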

M. Kaufmann—The author was supported by the Swiss National Science Foundation [grant number 200021_192079].

X. Zou—The author was supported by the Swiss National Science Foundation [grant number CR-SII5_173721].


Notes

  1. Traditionally the most popular value is \(s=4\), leading to the famous one-fifth rule [2].

  2. Empirically they found the threshold between the two regimes to be around \(s\approx 3.4\).

  3. See [23, Section 2.1] for a detailed discussion of the target population size \(\lambda^*\).

  4. A function \(f:\{0,1\}^n \rightarrow \mathbb{R}\) is monotone if flipping a zero-bit into a one-bit always improves the fitness. In the dynamic monotone setting, selection may be based on a different function in each generation, but it must always be a monotone function. The formal definition is not relevant for this paper, but can be found in [23].

  5. In Lemma 7 and Lemma 13 we consider a constant \(\varepsilon > 0\), introduce an \(\varepsilon' = \varepsilon'(\varepsilon)\), and look at all states in the range \((\varepsilon \pm \varepsilon')n\). Again, we implicitly assume that \((\varepsilon - \varepsilon')n\) and \((\varepsilon + \varepsilon')n\) are integers, since we use these values in the calculations.

  6. The subscript indicates dependency on \(\lambda\), i.e., for all \(c,C > 0\) there exists \(\lambda_0\) such that for all \(\lambda \ge \lambda_0\) we have \(\tilde{\varepsilon}(\lambda) \le c/\lambda\) and \(\tilde{s}(\lambda) \ge C\).

  7. Defined as \(\textsc{BinVal}(x) = \sum_{i=1}^{n} 2^{i-1} x_i\) and \(\textsc{Binary}(x) = \sum_{i=1}^{\lfloor n/2 \rfloor} x_i n + \sum_{i=\lfloor n/2 \rfloor + 1}^{n} x_i\); a code sketch of these functions follows these notes.

  8. We note that there are other types of "easiness", e.g. with respect to a fixed budget.
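For concreteness, here is a sketch of the two functions defined in note 7, together with one generation of Dynamic BinVal, understood as BinVal evaluated under a fresh uniformly random assignment of the weights to the bit positions in each generation (following [31]); the paper's formal definition may differ in details.

```python
import random

def binval(x):
    """BinVal(x) = sum_{i=1}^{n} 2^{i-1} x_i."""
    return sum(2 ** i * b for i, b in enumerate(x))

def binary(x):
    """Binary(x): each of the first floor(n/2) bits weighs n,
    each of the remaining bits weighs 1."""
    h = len(x) // 2
    return len(x) * sum(x[:h]) + sum(x[h:])

def dynamic_binval(x, rng=random):
    """One generation of Dynamic BinVal: the weights 2^0, ..., 2^{n-1}
    are assigned to the positions by a uniformly random permutation,
    redrawn in every generation."""
    positions = list(range(len(x)))
    rng.shuffle(positions)
    return sum(2 ** positions[i] * b for i, b in enumerate(x))
```

Under any such weight assignment, flipping a zero-bit into a one-bit always increases the value, so Dynamic BinVal stays within the dynamic monotone setting of note 4.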

References

  1. Antipov, D., Doerr, B., Yang, Q.: The efficiency threshold for the offspring population size of the \((\mu , \lambda )\) EA. In: Genetic and Evolutionary Computation Conference (GECCO), pp. 1461–1469 (2019)


  2. Auger, A.: Benchmarking the (1+1) evolution strategy with one-fifth success rule on the BBOB-2009 function testbed. In: Genetic and Evolutionary Computation Conference (GECCO), pp. 2447–2452 (2009)


  3. Badkobeh, G., Lehre, P.K., Sudholt, D.: Unbiased black-box complexity of parallel search. In: Bartz-Beielstein, T., Branke, J., Filipič, B., Smith, J. (eds.) PPSN 2014. LNCS, vol. 8672, pp. 892–901. Springer, Cham (2014). https://doi.org/10.1007/978-3-319-10762-2_88


  4. Böttcher, S., Doerr, B., Neumann, F.: Optimal fixed and adaptive mutation rates for the LeadingOnes problem. In: Schaefer, R., Cotta, C., Kołodziej, J., Rudolph, G. (eds.) PPSN 2010. LNCS, vol. 6238, pp. 1–10. Springer, Heidelberg (2010). https://doi.org/10.1007/978-3-642-15844-5_1


  5. Colin, S., Doerr, B., Férey, G.: Monotonic functions in EC: anything but monotone! In: Genetic and Evolutionary Computation Conference (GECCO), pp. 753–760 (2014)


  6. Corus, D., He, J., Jansen, T., Oliveto, P.S., Sudholt, D., Zarges, C.: On easiest functions for mutation operators in bio-inspired optimisation. Algorithmica 78(2), 714–740 (2017)


  7. Devroye, L.: The compound random search. Ph.D. dissertation, Purdue Univ., West Lafayette, IN (1972)


  8. Doerr, B.: Probabilistic tools for the analysis of randomized optimization heuristics. In: Theory of Evolutionary Computation. NCS, pp. 1–87. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-29414-4_1


  9. Doerr, B., Doerr, C.: Theory of parameter control for discrete black-box optimization: provable performance gains through dynamic parameter choices. In: Theory of Evolutionary Computation. NCS, pp. 271–321. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-29414-4_6


  10. Doerr, B., Doerr, C., Ebel, F.: From black-box complexity to designing new genetic algorithms. Theor. Comput. Sci. 567, 87–104 (2015)


  11. Doerr, B., Doerr, C., Lengler, J.: Self-adjusting mutation rates with provably optimal success rules. Algorithmica 83(10), 3108–3147 (2021)


  12. Doerr, B., Doerr, C., Yang, J.: Optimal parameter choices via precise black-box analysis. Theor. Comput. Sci. 801, 1–34 (2020)


  13. Doerr, B., Johannsen, D., Winzen, C.: Multiplicative drift analysis. Algorithmica 64, 673–697 (2012)


  14. Doerr, B., Witt, C., Yang, J.: Runtime analysis for self-adaptive mutation rates. Algorithmica 83(4), 1012–1053 (2021)


  15. Doerr, C., Wagner, M.: Simple on-the-fly parameter selection mechanisms for two classical discrete black-box optimization benchmark problems. In: Genetic and Evolutionary Computation Conference (GECCO), pp. 943–950 (2018)


  16. Droste, S.: A rigorous analysis of the compact genetic algorithm for linear functions. Nat. Comput. 5(3), 257–283 (2006)


  17. Eiben, A.E., Hinterding, R., Michalewicz, Z.: Parameter control in evolutionary algorithms. IEEE Trans. Evol. Comput. 3, 124–141 (1999)


  18. He, J., Chen, T., Yao, X.: On the easiest and hardest fitness functions. IEEE Trans. Evol. Comput. 19(2), 295–305 (2014)


  19. Hevia Fajardo, M.A., Sudholt, D.: Self-adjusting population sizes for non-elitist evolutionary algorithms: why success rates matter. arXiv preprint arXiv:2104.05624 (2021)

  20. Hevia Fajardo, M.A., Sudholt, D.: Self-adjusting population sizes for non-elitist evolutionary algorithms: why success rates matter. In: Genetic and Evolutionary Computation Conference (GECCO), pp. 1151–1159 (2021)


  21. Jansen, T.: On the brittleness of evolutionary algorithms. In: Stephens, C.R., Toussaint, M., Whitley, D., Stadler, P.F. (eds.) FOGA 2007. LNCS, vol. 4436, pp. 54–69. Springer, Heidelberg (2007). https://doi.org/10.1007/978-3-540-73482-6_4


  22. Kaufmann, M., Larcher, M., Lengler, J., Zou, X.: OneMax is not the easiest function for fitness improvements. arXiv preprint arXiv:2204.07017 (2022)

  23. Kaufmann, M., Larcher, M., Lengler, J., Zou, X.: Self-adjusting population sizes for the \((1, \lambda )\)-EA on monotone functions (2022). https://arxiv.org/abs/2204.00531

  24. Kaufmann, M., Larcher, M., Lengler, J., Zou, X.: Self-adjusting population sizes for the \((1, \lambda )\)-EA on monotone functions. In: Rudolph, G., Kononova, A.V., Aguirre, H., Kerschke, P., Ochoa, G., Tusar, T. (eds.) Parallel Problem Solving from Nature (PPSN) XVII. PPSN 2022. Lecture Notes in Computer Science, vol. 13399, pp. 569–585. Springer, Cham (2022). https://doi.org/10.1007/978-3-031-14721-0_40

  25. Kern, S., Müller, S.D., Hansen, N., Büche, D., Ocenasek, J., Koumoutsakos, P.: Learning probability distributions in continuous evolutionary algorithms - a comparative review. Nat. Comput. 3(1), 77–112 (2004)


  26. Kötzing, T.: Concentration of first hitting times under additive drift. Algorithmica 75(3), 490–506 (2016)


  27. Lehre, P., Qin, X.: More precise runtime analyses of non-elitist evolutionary algorithms in uncertain environments. Algorithmica 1–46 (2022). https://doi.org/10.1007/s00453-022-01044-5

  28. Lengler, J.: A general dichotomy of evolutionary algorithms on monotone functions. IEEE Trans. Evol. Comput. 24(6), 995–1009 (2019)


  29. Lengler, J.: Drift analysis. In: Theory of Evolutionary Computation. NCS, pp. 89–131. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-29414-4_2


  30. Lengler, J., Meier, J.: Large population sizes and crossover help in dynamic environments. In: Bäck, T., et al. (eds.) PPSN 2020. LNCS, vol. 12269, pp. 610–622. Springer, Cham (2020). https://doi.org/10.1007/978-3-030-58112-1_42


  31. Lengler, J., Riedi, S.: Runtime analysis of the \((\mu + 1)\)-EA on the dynamic BinVal function. In: Zarges, C., Verel, S. (eds.) EvoCOP 2021. LNCS, vol. 12692, pp. 84–99. Springer, Cham (2021). https://doi.org/10.1007/978-3-030-72904-2_6


  32. Lengler, J., Schaller, U.: The \((1+1)\)-EA on noisy linear functions with random positive weights. In: Symposium Series on Computational Intelligence (SSCI), pp. 712–719. IEEE (2018)


  33. Oliveto, P., Witt, C.: On the analysis of the simple genetic algorithm. Theor. Comput. Sci. 545, 2–19 (2014)


  34. Rechenberg, I.: Evolutionsstrategien. In: Schneider, B., Ranft, U. (eds.) Simulationsmethoden in der Medizin und Biologie. Medizinische Informatik und Statistik, vol. 8, pp. 83–114. Springer, Berlin (1978). https://doi.org/10.1007/978-3-642-81283-5_8


  35. Rowe, J.E., Sudholt, D.: The choice of the offspring population size in the (1, \(\lambda \)) evolutionary algorithm. Theor. Comput. Sci. 545, 20–38 (2014)


  36. Schumer, M., Steiglitz, K.: Adaptive step size random search. IEEE Trans. Autom. Control 13(3), 270–276 (1968)


  37. Sudholt, D.: A new method for lower bounds on the running time of evolutionary algorithms. IEEE Trans. Evol. Comput. 17(3), 418–435 (2012)


  38. Witt, C.: Tight bounds on the optimization time of a randomized search heuristic on linear functions. Comb. Probab. Comput. 22(2), 294–318 (2013)



Acknowledgements

We thank Dirk Sudholt for helpful discussions during the Dagstuhl seminar 22081 “Theory of Randomized Optimization Heuristics” and Mario Hevia Fajardo for sharing his simulation code for comparison.

Author information


Corresponding author

Correspondence to Xun Zou.


Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Switzerland AG

About this paper


Cite this paper

Kaufmann, M., Larcher, M., Lengler, J., Zou, X. (2023). OneMax Is Not the Easiest Function for Fitness Improvements. In: Pérez Cáceres, L., Stützle, T. (eds) Evolutionary Computation in Combinatorial Optimization. EvoCOP 2023. Lecture Notes in Computer Science, vol 13987. Springer, Cham. https://doi.org/10.1007/978-3-031-30035-6_11


  • DOI: https://doi.org/10.1007/978-3-031-30035-6_11

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-031-30034-9

  • Online ISBN: 978-3-031-30035-6

  • eBook Packages: Computer Science, Computer Science (R0)
