Abstract
Many optimization problems arising in practice are black-box problems: often little is known about the problem beyond the information obtained via function evaluations. Neither derivatives nor constraints are available, and in the worst case even the characteristics of the fitness function are unknown, e.g., whether it is unimodal or multimodal.
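The black-box setting can be illustrated with a minimal sketch of a (1+1) evolution strategy, one such black-box optimizer: it queries the fitness function only through evaluations, using neither derivatives nor constraint information. Function names and parameter values here are illustrative, not taken from the chapter.

```python
import random

def one_plus_one_es(f, x, sigma=0.5, iterations=1000):
    """Minimize f using only function evaluations (black-box):
    mutate the current solution with Gaussian noise and keep
    the better of parent and offspring."""
    fx = f(x)
    for _ in range(iterations):
        # Gaussian mutation: no derivative information is used.
        y = [xi + random.gauss(0.0, sigma) for xi in x]
        fy = f(y)
        if fy <= fx:  # greedy (1+1) elitist selection
            x, fx = y, fy
    return x, fx

# Sphere function as a stand-in fitness function; to the
# optimizer it is just an opaque callable.
sphere = lambda v: sum(vi * vi for vi in v)

random.seed(0)  # fixed seed for a reproducible run
best, best_f = one_plus_one_es(sphere, [5.0, -3.0])
```

Because selection is elitist, the best fitness found never worsens; with a fixed step size the strategy converges slowly near the optimum, which motivates the step-size adaptation techniques discussed later in the book.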
Copyright information
© 2014 The Author(s)
About this chapter
Cite this chapter
Kramer, O. (2014). Introduction. In: A Brief Introduction to Continuous Evolutionary Optimization. SpringerBriefs in Applied Sciences and Technology. Springer, Cham. https://doi.org/10.1007/978-3-319-03422-5_1
Print ISBN: 978-3-319-03421-8
Online ISBN: 978-3-319-03422-5