Modern Continuous Optimization Algorithms for Tuning Real and Integer Algorithm Parameters

Conference paper

Swarm Intelligence (ANTS 2010)

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 6234)

Included in the conference series: ANTS, the International Conference on Swarm Intelligence

Abstract

To obtain peak performance from optimization algorithms, their parameters must be set appropriately. Frequently, algorithm parameters take values from the set of real numbers or from a large set of integers. For tuning such parameters, it is appealing to apply state-of-the-art continuous optimization algorithms rather than a tedious and error-prone hands-on approach. In this paper, we study the performance of several continuous optimization algorithms on the algorithm parameter tuning task. As case studies, we use a number of optimization algorithms from the swarm intelligence literature.
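The setup studied in the paper can be illustrated with a short, self-contained sketch: the tunable parameters of a target algorithm are encoded as a continuous vector, integer parameters are recovered by rounding, and the tuning objective is the average cost of the target algorithm over a set of training instances, which a derivative-free continuous optimizer then minimizes. The Python code below is only an illustration of this idea, not the paper's implementation: the toy target algorithm, the seeds-as-instances setup, and the simple (1+1)-ES standing in for the continuous optimizers compared in the paper (such as CMA-ES or BOBYQA) are all hypothetical.

```python
import random
import statistics

# Toy stand-in for the algorithm being tuned: a stochastic local search on
# f(x) = x^2 whose performance depends on a real parameter (step_size) and
# an integer parameter (n_restarts). Purely illustrative.
def target_algorithm_cost(step_size, n_restarts, seed):
    rng = random.Random(seed)
    best = float("inf")
    for _ in range(n_restarts):
        x = rng.uniform(-5.0, 5.0)
        for _ in range(50):
            cand = x + rng.gauss(0.0, step_size)
            if cand * cand < x * x:
                x = cand
        best = min(best, x * x)
    return best

# Tuning objective: mean cost of the target algorithm over training
# "instances" (here: random seeds), given a continuous parameter vector.
# The integer parameter is obtained by rounding its continuous encoding.
def tuning_objective(theta, seeds):
    step_size = max(1e-6, theta[0])                # real-valued parameter
    n_restarts = min(20, max(1, round(theta[1])))  # integer parameter, capped for runtime
    return statistics.mean(
        target_algorithm_cost(step_size, n_restarts, s) for s in seeds
    )

# Minimal (1+1)-ES with a multiplicative step-size rule, used here only as a
# placeholder for the derivative-free continuous optimizers compared in the
# paper; any such optimizer could be plugged in at this point.
def one_plus_one_es(objective, theta0, sigma=0.5, budget=60):
    theta, best = list(theta0), objective(theta0)
    for _ in range(budget):
        cand = [t + random.gauss(0.0, sigma) for t in theta]
        cost = objective(cand)
        if cost <= best:
            theta, best = cand, cost
            sigma *= 1.1    # successful step: expand
        else:
            sigma *= 0.9    # unsuccessful step: contract
    return theta, best

if __name__ == "__main__":
    training_seeds = range(5)
    theta, cost = one_plus_one_es(lambda th: tuning_objective(th, training_seeds),
                                  theta0=[1.0, 3.0])
    print("tuned step_size  =", max(1e-6, theta[0]))
    print("tuned n_restarts =", min(20, max(1, round(theta[1]))))
    print("mean training cost =", cost)
```

Rounding the continuous variable to obtain the integer parameter keeps the search space continuous from the tuner's point of view, which is one common way to let continuous optimizers handle integer-valued parameters.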


Copyright information

© 2010 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Yuan, Z., Montes de Oca, M.A., Birattari, M., Stützle, T. (2010). Modern Continuous Optimization Algorithms for Tuning Real and Integer Algorithm Parameters. In: Dorigo, M., et al. (eds.) Swarm Intelligence. ANTS 2010. Lecture Notes in Computer Science, vol. 6234. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-15461-4_18

  • DOI: https://doi.org/10.1007/978-3-642-15461-4_18

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-15460-7

  • Online ISBN: 978-3-642-15461-4

  • eBook Packages: Computer Science (R0)
