Abstract
Algorithm Configuration remains an intricate problem, especially in the continuous black-box optimization domain. This paper empirically investigates the relationship between continuous problem features (measuring different problem characteristics) and the best parameter configuration of a given stochastic algorithm over a set of benchmark functions; the case study here is the original version of Differential Evolution over the BBOB test bench. This is achieved by learning an empirical performance model from the problem features and the algorithm parameters. This performance model can then be used to compute an empirically optimal parameter configuration from feature values. The results show that reasonable performance models can indeed be learned, resulting in better parameter configurations than a static parameter setting optimized for robustness over the whole test bench.
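To illustrate the pipeline sketched in the abstract, the following Python snippet shows one possible way to (i) learn an empirical performance model from problem features and algorithm parameters and (ii) query it for an empirically optimal configuration on an unseen problem. It is a minimal sketch, assuming a random-forest regressor as the performance model and a simple grid over three Differential Evolution parameters (F, CR, population size); the feature dimension, training data, and parameter grid are illustrative placeholders, not the paper's actual experimental protocol.

```python
# Minimal sketch of feature-based algorithm configuration (assumptions:
# random-forest performance model, toy feature vectors, illustrative DE
# parameter grid; not the paper's exact experimental setup).
import itertools
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Training data: each row concatenates problem features phi(f_i) with a DE
# parameter configuration theta_j; the target is the observed performance
# (e.g. number of function evaluations needed to reach the target value).
train_X = np.random.rand(200, 5 + 3)   # 5 problem features + 3 DE parameters
train_y = np.random.rand(200)          # measured performance (placeholder)

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(train_X, train_y)

# Candidate DE configurations: scale factor F, crossover rate CR, and a
# normalized population size, all on coarse grids for illustration.
de_parameter_grid = np.array(list(itertools.product(
    np.linspace(0.1, 0.9, 9),    # F
    np.linspace(0.1, 0.9, 9),    # CR
    np.linspace(0.1, 1.0, 5),    # normalized population size
)))

def best_configuration(problem_features):
    """Return the candidate configuration with the lowest predicted
    performance measure for a problem described by its feature vector."""
    rows = np.hstack([np.tile(problem_features, (len(de_parameter_grid), 1)),
                      de_parameter_grid])
    predictions = model.predict(rows)
    return de_parameter_grid[np.argmin(predictions)]

# Query the model for an unseen problem (random features as a stand-in).
print(best_configuration(np.random.rand(5)))
```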
Notes
- 1. The same \(\theta _j\) need not have been tried for all \(f_i\).
- 2. d, the dimension of the search space, can be considered as the only external feature, or the Algorithm Configuration can be conducted anew for each dimension (more in Sect. 5).
- 5. Measured as the number of function evaluations.
- 8. Additional plots are available at https://drive.google.com/open?id=0B9GuQcCjvwtFdkotR1h1N3dlOG8.
Cite this paper
Belkhir, N., Dréo, J., Savéant, P., Schoenauer, M. (2016). Feature Based Algorithm Configuration: A Case Study with Differential Evolution. In: Handl, J., Hart, E., Lewis, P., López-Ibáñez, M., Ochoa, G., Paechter, B. (eds) Parallel Problem Solving from Nature – PPSN XIV. PPSN 2016. Lecture Notes in Computer Science, vol 9921. Springer, Cham. https://doi.org/10.1007/978-3-319-45823-6_15