Abstract
An important challenge in black-box optimization is understanding the relative performance of different algorithms on problem instances. This has motivated research in exploratory landscape analysis and algorithm selection, leading to a number of frameworks for analysis. However, these procedures often involve significant assumptions or rely on information that is not typically available. In this paper we propose a new, model-based framework for characterizing black-box optimization problems using Gaussian process regression. The framework allows problem instances to be compared in a relatively simple way. The model-based approach also allows us to assess goodness of fit, and Gaussian processes provide an efficient means of model comparison. The implementation of the framework is described and validated on several test sets.
Copyright information
© 2018 Springer Nature Switzerland AG
Cite this paper
Saleem, S., Gallagher, M., Wood, I. (2018). A Model-Based Framework for Black-Box Problem Comparison Using Gaussian Processes. In: Auger, A., Fonseca, C., Lourenço, N., Machado, P., Paquete, L., Whitley, D. (eds) Parallel Problem Solving from Nature – PPSN XV. PPSN 2018. Lecture Notes in Computer Science(), vol 11102. Springer, Cham. https://doi.org/10.1007/978-3-319-99259-4_23
DOI: https://doi.org/10.1007/978-3-319-99259-4_23
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-99258-7
Online ISBN: 978-3-319-99259-4