Tuning the Performance of the MMAS Heuristic

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 4638)

Abstract

This paper presents an in-depth Design of Experiments (DOE) methodology for the performance analysis of a stochastic heuristic. The heuristic under investigation is Max-Min Ant System (MMAS) for the Travelling Salesperson Problem (TSP). Specifically, Response Surface Methodology (RSM) is used to model and tune MMAS performance with respect to 10 tuning parameters, 2 problem characteristics and 2 performance metrics: solution quality and solution time. The accuracy of the resulting model predictions is methodically verified in a separate series of confirmation experiments. The two conflicting responses are optimised simultaneously using desirability functions, recommendations on optimal parameter settings are made, and these recommended settings are themselves methodically verified. The large number of degrees of freedom in the MMAS design is handled with a Minimum-Run Resolution V design. Publicly available algorithm and problem generator implementations are used throughout. The paper should therefore serve as an illustrative case study of the principled engineering of a stochastic heuristic.
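
The abstract states that the two conflicting responses, solution quality and solution time, are optimised simultaneously with desirability functions (in the style of Derringer and Suich). The Python sketch below, which is not the authors' code, illustrates that combination on hypothetical RSM predictions; the setting names, response ranges and numbers are placeholders chosen only for illustration.

```python
# Minimal sketch of a Derringer-Suich desirability trade-off between two
# smaller-is-better responses. All candidate settings and bounds are
# hypothetical placeholders, not values from the paper.

import numpy as np

def desirability_smaller_is_better(y, low, high, weight=1.0):
    """Desirability for a response to be minimised: 1.0 at or below `low`,
    0.0 at or above `high`, and a power-law ramp in between."""
    y = np.asarray(y, dtype=float)
    d = np.clip((high - y) / (high - low), 0.0, 1.0)
    return d ** weight

def composite_desirability(d_values):
    """Geometric mean of the individual desirabilities."""
    d = np.asarray(d_values, dtype=float)
    return float(np.prod(d) ** (1.0 / len(d)))

# Hypothetical RSM predictions for three candidate MMAS parameter settings:
# (percent excess over the optimal tour length, CPU seconds).
candidates = {
    "setting_A": (1.2, 45.0),
    "setting_B": (0.6, 120.0),
    "setting_C": (0.8, 60.0),
}

best = max(
    candidates.items(),
    key=lambda kv: composite_desirability([
        desirability_smaller_is_better(kv[1][0], low=0.0, high=2.0),    # quality
        desirability_smaller_is_better(kv[1][1], low=30.0, high=180.0), # time
    ]),
)
print("Most desirable setting:", best[0])
```

Because the composite is a geometric mean, any setting that is unacceptable on either response (individual desirability of zero) scores zero overall, which is what makes the approach suitable for reconciling conflicting quality and time objectives.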



Copyright information

© 2007 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Ridge, E., Kudenko, D. (2007). Tuning the Performance of the MMAS Heuristic. In: Stützle, T., Birattari, M., Hoos, H.H. (eds) Engineering Stochastic Local Search Algorithms. Designing, Implementing and Analyzing Effective Heuristics. SLS 2007. Lecture Notes in Computer Science, vol 4638. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-74446-7_4

  • DOI: https://doi.org/10.1007/978-3-540-74446-7_4

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-74445-0

  • Online ISBN: 978-3-540-74446-7

  • eBook Packages: Computer Science (R0)
