
Part of the book series: SpringerBriefs in Applied Sciences and Technology ((BRIEFSINTELL))

Abstract

Many optimization problems that must be solved in practice are black-box problems. Often, little is known about an optimization problem beyond the information obtainable via function evaluations: neither derivatives nor constraints are available. In the worst case, nothing is known even about the characteristics of the fitness function, e.g., whether it is uni- or multimodal.
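The black-box setting described above can be made concrete with a minimal sketch: an optimizer that queries the objective only through function evaluations, with no access to gradients or constraints. The following (1+1)-EA and the `sphere` objective are illustrative assumptions, not the book's implementation.

```python
import random

def sphere(x):
    # Stand-in black-box objective: the optimizer sees only its return value.
    return sum(xi ** 2 for xi in x)

def one_plus_one_ea(f, dim=2, sigma=0.1, iterations=1000, seed=0):
    """Minimal (1+1)-EA: mutate with Gaussian noise, evaluate, keep the better point."""
    rng = random.Random(seed)
    x = [rng.uniform(-5.0, 5.0) for _ in range(dim)]
    fx = f(x)
    for _ in range(iterations):
        y = [xi + rng.gauss(0.0, sigma) for xi in x]
        fy = f(y)
        if fy <= fx:  # plus-selection: accept the offspring if it is not worse
            x, fx = y, fy
    return x, fx
```

Note that the loop uses nothing but function values — exactly the information available in the black-box setting; whether the objective is unimodal like `sphere` or multimodal changes nothing in the interface.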



Author information


Corresponding author

Correspondence to Oliver Kramer.


Copyright information

© 2014 The Author(s)

About this chapter

Cite this chapter

Kramer, O. (2014). Introduction. In: A Brief Introduction to Continuous Evolutionary Optimization. SpringerBriefs in Applied Sciences and Technology. Springer, Cham. https://doi.org/10.1007/978-3-319-03422-5_1


  • DOI: https://doi.org/10.1007/978-3-319-03422-5_1


  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-03421-8

  • Online ISBN: 978-3-319-03422-5

  • eBook Packages: Engineering
