Speeding Up Evolution through Learning: LEM

  • Ryszard S. Michalski
  • Guido Cervone
  • Kenneth Kaufman
Part of the Advances in Soft Computing book series (AINSC, volume 4)

Abstract

This paper reports briefly on the development of a new approach to evolutionary computation, called the Learnable Evolution Model or LEM. In contrast to conventional Darwinian-type evolutionary algorithms that employ mutation and/or recombination, LEM employs machine learning to generate new populations. At each step of evolution, LEM determines hypotheses explaining why certain individuals in the population are superior to others in performing the designated class of tasks. These hypotheses are then instantiated to create the next generation. In the testing studies described here, we compared a program implementing LEM with selected evolutionary computation algorithms on a range of optimization problems and a filter design problem. In these studies, LEM significantly outperformed the evolutionary computation algorithms, sometimes speeding up the evolution by two or more orders of magnitude in the number of evolutionary steps (births). LEM was also applied to a real-world problem of designing optimized heat exchangers. The resulting designs matched or outperformed the best human designs.
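The hypothesis-driven loop described above can be sketched in a few lines. This is a minimal illustrative sketch only: the paper's LEM uses a symbolic learner (AQ-style attributional rules) to describe the high-performing group, whereas here the "hypothesis" is simply the per-dimension bounding box of the best individuals. The function name `lem_step`, the selection fraction, and the toy fitness function are all assumptions made for illustration, not the authors' implementation.

```python
import random

def lem_step(population, fitness, frac=0.3, rng=random):
    """One hypothesis-driven generation step (illustrative sketch of the LEM idea).

    1. Split the population into high- and low-performers by fitness.
    2. "Learn" a hypothesis describing the high-performers -- here, a crude
       per-dimension interval (the real LEM induces symbolic rules).
    3. Instantiate the hypothesis to generate the next generation.
    """
    ranked = sorted(population, key=fitness, reverse=True)
    k = max(1, int(frac * len(ranked)))
    high = ranked[:k]                      # the superior individuals
    dims = len(high[0])
    lo = [min(x[d] for x in high) for d in range(dims)]
    hi = [max(x[d] for x in high) for d in range(dims)]
    # Instantiate: sample new individuals satisfying the learned description.
    return [[rng.uniform(lo[d], hi[d]) for d in range(dims)]
            for _ in range(len(population))]

if __name__ == "__main__":
    rng = random.Random(0)
    f = lambda x: -(x[0] - 3.0) ** 2 - (x[1] + 1.0) ** 2  # maximum at (3, -1)
    pop = [[rng.uniform(-10, 10), rng.uniform(-10, 10)] for _ in range(50)]
    for _ in range(20):
        pop = lem_step(pop, f, rng=rng)
    print(max(pop, key=f))  # best individual lands near the optimum
```

Because each generation is sampled from a description of the current winners rather than from point mutations, the population can jump directly into promising regions — the mechanism behind the large speedups in births that the abstract reports.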

Keywords

Genetic Algorithm · Evolutionary Computation · Belief Space · Heat Exchanger Design · Cultural Algorithm

Copyright information

© Physica-Verlag Heidelberg 2000

Authors and Affiliations

  • Ryszard S. Michalski (1, 2)
  • Guido Cervone (1)
  • Kenneth Kaufman (1)
  1. Machine Learning and Inference Laboratory, George Mason University, Fairfax, Virginia
  2. Institute of Computer Science, Polish Academy of Sciences, Warsaw, Poland