A Review of No Free Lunch Theorems, and Their Implications for Metaheuristic Optimisation

  • Thomas Joyce
  • J. Michael Herrmann
Part of the Studies in Computational Intelligence book series (SCI, volume 744)


The No Free Lunch Theorem states that, averaged over all optimisation problems, all non-resampling optimisation algorithms perform equally well. To explain the relevance of these theorems for metaheuristic optimisation, we present a detailed discussion of the No Free Lunch Theorem and of various extensions, including some that have not previously appeared in the literature. We then show that an understanding of the No Free Lunch theorems puts us in a position to ask about the specific dynamics of an optimisation algorithm, and how those dynamics relate to the properties of optimisation problems.
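The averaging claim in the abstract can be checked empirically on a toy problem. The sketch below (the tiny search space X = {0, 1, 2}, value set Y = {0, 1, 2}, and the two fixed visiting orders are illustrative assumptions, not taken from the chapter) compares two deterministic non-resampling search algorithms over *all* objective functions f: X → Y and confirms that their average best-found value after m evaluations is identical, as the theorem predicts:

```python
# Minimal empirical illustration of the NFL averaging claim.
# Assumptions (not from the chapter): X = Y = {0, 1, 2}, performance is
# the best objective value seen after m evaluations, and each algorithm
# is a fixed non-resampling visiting order over X.
import itertools


def run(order, f, m):
    """Evaluate f at the first m points of a fixed visiting order
    (a non-resampling algorithm) and return the best value found."""
    return max(f[x] for x in order[:m])


def average_best(order, m, X=(0, 1, 2), Y=(0, 1, 2)):
    """Average best-of-m performance over ALL functions f: X -> Y."""
    funcs = [dict(zip(X, vals)) for vals in itertools.product(Y, repeat=len(X))]
    return sum(run(order, f, m) for f in funcs) / len(funcs)


# Two distinct non-resampling algorithms: ascending vs. descending order.
for m in (1, 2, 3):
    a = average_best((0, 1, 2), m)
    b = average_best((2, 1, 0), m)
    assert a == b  # identical average performance over all problems
    print(f"m={m}: {a:.4f} == {b:.4f}")
```

On any single function the two orders can of course differ sharply; the equality only emerges once performance is averaged over the full set of |Y|^|X| = 27 objective functions, which is exactly the (unrealistic) uniform averaging that the chapter's later discussion scrutinises.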


Keywords: No Free Lunch (NFL) · Optimisation · Search · Metaheuristics



Copyright information

© Springer International Publishing AG 2018

Authors and Affiliations

  1. The University of Edinburgh, Edinburgh, Scotland
