
Extending Selection Learning toward Fixed-Length d-Ary Strings

Conference paper
Artificial Evolution (EA 2001)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 2310)

Abstract

The aim of this paper is to extend selection learning, initially designed for the optimization of real functions over fixed-length binary strings, to fixed-length strings over an arbitrary finite alphabet. We derive selection learning algorithms from clear principles. First, we search for product probability measures over d-ary strings, or equivalently, random variables whose components are statistically independent. Second, these distributions are evaluated relative to the expectation of the fitness function; more precisely, we consider the logarithm of the expectation in order to introduce fitness-proportional and Boltzmann selection. Third, we define two kinds of gradient systems to maximize the expectation: the first drives unbounded parameters, whereas the second directly drives probabilities, à la PBIL. We also introduce composite selection, that is, algorithms which take into account both positively and negatively selected strings. We propose stochastic approximations for the gradient systems and finally apply three of the resulting algorithms to two test functions, One Max and Big Jump, drawing some conclusions on their relative strengths and weaknesses.
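As a rough illustration of the kind of algorithm the abstract describes, the sketch below runs a PBIL-style selection learning loop over fixed-length d-ary strings on One Max. It maintains a product probability measure (one independent categorical distribution per position), samples strings from it, and nudges the probabilities toward the best sampled string and away from the worst one, a crude form of composite selection. The update rule, learning rate, population size, and negative-selection weight are illustrative assumptions; the paper itself derives its updates as stochastic approximations of gradient systems on the logarithm of the expected fitness, which this sketch does not reproduce.

```python
import numpy as np

def one_max(x):
    # Generalized One Max on d-ary strings: count positions equal to a fixed
    # target symbol (here 0).
    return int(np.sum(x == 0))

def selection_learning(fitness, n, d, pop_size=50, lr=0.05, neg_weight=0.5,
                       steps=200, seed=0):
    rng = np.random.default_rng(seed)
    # p[i, a] = probability that position i takes symbol a (product measure).
    p = np.full((n, d), 1.0 / d)
    for _ in range(steps):
        # Sample a population of d-ary strings, each position independently.
        pop = np.array([[rng.choice(d, p=p[i]) for i in range(n)]
                        for _ in range(pop_size)])
        scores = np.array([fitness(x) for x in pop])
        best = pop[np.argmax(scores)]    # positively selected string
        worst = pop[np.argmin(scores)]   # negatively selected string
        for i in range(n):
            toward = np.zeros(d)
            toward[best[i]] = 1.0
            away = np.zeros(d)
            away[worst[i]] = 1.0
            # Move toward the best string and (more weakly) away from the worst.
            p[i] += lr * (toward - p[i]) - lr * neg_weight * (away - p[i])
            # Keep p[i] a valid probability vector.
            p[i] = np.clip(p[i], 1e-3, None)
            p[i] /= p[i].sum()
    return p

if __name__ == "__main__":
    probs = selection_learning(one_max, n=20, d=4)
    # Most probable string under the learned product measure.
    print(probs.argmax(axis=1))
```

The printed per-position argmax is the most probable string under the learned product measure; on One Max as defined here it should approach the all-zeros string for most seeds.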

References

  1. S. Baluja and R. Caruana. Removing the genetics from the standard genetic algorithm. In A. Prieditis and S. Russell, editors, Proceedings of the 12th International Conference on Machine Learning, pages 38–46. Morgan Kaufmann, 1995.

  2. A. Berny. Statistical machine learning and combinatorial optimization. In [7].

  3. A. Berny. An adaptive scheme for real function optimization acting as a selection operator. In X. Yao and D.B. Fogel, editors, First IEEE Symposium on Combinations of Evolutionary Computation and Neural Networks, pages 140–149, San Antonio, May 2000.

  4. A. Berny. Apprentissage et optimisation statistiques, application à la radiotéléphonie mobile. PhD thesis, Université de Nantes, 2000. In French.

  5. A. Berny. Selection and reinforcement learning for combinatorial optimization. In M. Schoenauer et al., editors, Parallel Problem Solving from Nature VI, Lecture Notes in Computer Science, pages 601–610, Paris, September 2000. Springer-Verlag.

  6. A. Johnson and J. Shapiro. The importance of selection mechanisms in distribution estimation algorithms. In Artificial Evolution, Le Creusot, France, October 2001.

  7. L. Kallel, B. Naudts, and A. Rogers, editors. Theoretical Aspects of Evolutionary Computing. Natural Computing Series. Springer-Verlag, 2001.

  8. C.-C. Lo and C.-C. Hsu. An annealing framework with learning memory. IEEE Transactions on Systems, Man, and Cybernetics, Part A, 28(5):648–661, September 1998.

  9. N. Meuleau and M. Dorigo. Ant colony optimization and stochastic gradient descent. Technical report, IRIDIA, December 2000.

  10. H. Mühlenbein. Evolutionary algorithms: from recombination to search distributions. In [7].

  11. A. Ratle and M. Sebag. Avoiding the bloat with stochastic grammar-based genetic programming. In Artificial Evolution, Le Creusot, France, October 2001.

  12. D. Robilliard and C. Fonlupt. A shepherd and a sheepdog to guide evolutionary computation? In C. Fonlupt, J.-K. Hao, E. Lutton, E. Ronald, and M. Schoenauer, editors, Artificial Evolution, Lecture Notes in Computer Science, pages 277–291. Springer-Verlag, 1999.

  13. M. Sebag and M. Schoenauer. A society of hill-climbers. In Proc. IEEE Int. Conf. on Evolutionary Computation, pages 319–324, Indianapolis, April 1997.

  14. M. P. Servais, G. de Jager, and J. R. Greene. Function optimization using multiple-base population based incremental learning. In Proc. Eighth South African Workshop on Pattern Recognition, 1997.

  15. G. Yin, G. Rudolph, and H.-P. Schwefel. Analyzing (1, λ) Evolution Strategy via stochastic approximation methods. Informatica, 3(4):473–489, 1995.

  16. G. Yin, G. Rudolph, and H.-P. Schwefel. Establishing connections between evolutionary algorithms and stochastic approximation. Informatica, 6(1):93–116, 1995.

Copyright information

© 2002 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Berny, A. (2002). Extending Selection Learning toward Fixed-Length d-Ary Strings. In: Collet, P., Fonlupt, C., Hao, JK., Lutton, E., Schoenauer, M. (eds) Artificial Evolution. EA 2001. Lecture Notes in Computer Science, vol 2310. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-46033-0_5

  • DOI: https://doi.org/10.1007/3-540-46033-0_5

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-43544-0

  • Online ISBN: 978-3-540-46033-6
