
Designing an Optimal Search Algorithm with Respect to Prior Information

  • Chapter
Theory and Principled Methods for the Design of Metaheuristics

Part of the book series: Natural Computing Series ((NCS))


Abstract

There are many optimization algorithms, most of them with many parameters. When you know which family of problems you face, you would like to design the optimization algorithm that is best for this family (e.g., best on average with respect to a given probability distribution over this family of problems). This chapter is devoted to this framework: we assume that we know a probability distribution from which the fitness function is drawn, and we look for the optimal optimization algorithm. This can be based (i) on experiments, i.e., tuning the parameters on a set of problems; (ii) on mathematical approaches that automatically build an optimization algorithm from a probability distribution on fitness functions (reinforcement learning approaches); or (iii) on simplified versions of the latter with a more reasonable computational cost (Gaussian processes for optimization).
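Approach (iii) can be illustrated by a minimal Bayesian-optimization sketch: a Gaussian process is fitted to the evaluations observed so far, and the next point is chosen by maximizing expected improvement over a candidate grid. This is only an illustrative sketch, not the chapter's own algorithm; the kernel, its length scale, the candidate grid, and the test function `(x - 0.7)**2` are all assumptions made here for the example.

```python
import numpy as np
from math import erf, sqrt, pi

def rbf_kernel(a, b, length=0.3):
    # Squared-exponential covariance between two sets of 1-D points.
    d = a[:, None] - b[None, :]
    return np.exp(-0.5 * (d / length) ** 2)

def gp_posterior(x_train, y_train, x_query, noise=1e-6):
    # Standard GP regression: mean = K*^T K^-1 y, var = k(x,x) - K*^T K^-1 K*.
    K = rbf_kernel(x_train, x_train) + noise * np.eye(len(x_train))
    Ks = rbf_kernel(x_train, x_query)
    alpha = np.linalg.solve(K, y_train)
    v = np.linalg.solve(K, Ks)
    mu = Ks.T @ alpha
    var = 1.0 - np.sum(Ks * v, axis=0)       # prior variance is 1 for this kernel
    return mu, np.maximum(var, 1e-12)        # clamp tiny negative values

def expected_improvement(mu, var, best):
    # EI for minimization: E[max(best - f, 0)] with f ~ N(mu, var).
    sigma = np.sqrt(var)
    z = (best - mu) / sigma
    cdf = 0.5 * (1.0 + np.vectorize(erf)(z / sqrt(2)))
    pdf = np.exp(-0.5 * z ** 2) / sqrt(2 * pi)
    return (best - mu) * cdf + sigma * pdf

def bayes_opt(f, budget=15, seed=0):
    # Small random initial design, then EI-guided sampling on a fixed grid.
    rng = np.random.default_rng(seed)
    xs = list(rng.uniform(0.0, 1.0, 3))
    ys = [f(x) for x in xs]
    grid = np.linspace(0.0, 1.0, 201)
    for _ in range(budget - len(xs)):
        mu, var = gp_posterior(np.array(xs), np.array(ys), grid)
        ei = expected_improvement(mu, var, min(ys))
        x_next = grid[int(np.argmax(ei))]
        xs.append(float(x_next))
        ys.append(f(float(x_next)))
    i = int(np.argmin(ys))
    return xs[i], ys[i]

# Hypothetical smooth test function with its minimum at x = 0.7.
x_best, y_best = bayes_opt(lambda x: (x - 0.7) ** 2)
```

The acquisition step is the point of the sketch: the GP posterior turns the prior distribution on fitness functions into a cheap surrogate, and maximizing expected improvement trades off exploitation (low posterior mean) against exploration (high posterior variance).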



Acknowledgements

O. Teytaud is grateful to NSC for funding NSC100-2811-E-024-001, to ANR for funding COSINUS program (project EXPLO-RA ANR-08-COSI-004), and to the European FP7 program (European Project Nr. FP7-ICT-247022).

Author information

Corresponding author

Correspondence to Olivier Teytaud.


Copyright information

© 2014 Springer-Verlag Berlin Heidelberg

About this chapter

Cite this chapter

Teytaud, O., Vazquez, E. (2014). Designing an Optimal Search Algorithm with Respect to Prior Information. In: Borenstein, Y., Moraglio, A. (eds) Theory and Principled Methods for the Design of Metaheuristics. Natural Computing Series. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-33206-7_6

  • DOI: https://doi.org/10.1007/978-3-642-33206-7_6

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-33205-0

  • Online ISBN: 978-3-642-33206-7

