Problem Solving and Evolutionary Computation

Abstract

Optimization algorithms impose an implicit network structure on fitness landscapes. For a given algorithm A operating on a problem that has a fitness landscape F, connections between solutions are defined by the transitions allowed by A.

Notes

  1. If we have a maximization problem, we can transform it into a minimization problem, e.g., by minimizing the negative of the objective function. In addition, an equality constraint can be represented by two inequalities that hold simultaneously: a “less than or equal” and a “greater than or equal” inequality. A “greater than or equal” inequality can be converted into a “less than or equal” inequality by multiplying both sides by \(-1\). Thus, any optimization problem can be written in the general form given in P1 (a worked example follows these notes).

  2. When the domain of a decision variable is unspecified, it is assumed to be \(\mathbb{R}\).

  3. In the definitions of optimality (given shortly) and in other definitions to follow, we make extensive use of the open-ball concept. This does not mean that these definitions depend on open balls; they remain valid under more general notions of neighborhood. We use open balls merely to strike a balance between illustrative clarity and mathematical accuracy.

  4. Ref. [1] defines the \(O\) notation as follows. Suppose \(f\) and \(g\) are positive functions defined on the positive integers, \(f, g : I^+ \rightarrow R^+\). Then (a worked example follows these notes):

     1. \(f=O(g)\) if \(\exists\) positive constants \(c\) and \(N\) such that \(f(n) \le cg(n)\), \(\forall\) \(n \ge N\);

     2. \(f=\Omega (g)\) if \(\exists\) positive constants \(c\) and \(N\) such that \(f(n) \ge cg(n)\), \(\forall\) \(n \ge N\);

     3. \(f=\Theta (g)\) if \(\exists\) positive constants \(c\), \(d\), and \(N\) such that \(dg(n) \le f(n) \le cg(n)\), \(\forall\) \(n \ge N\).

  5. In optimization theory, a solution is an assignment of a value to each variable in the problem; each value is drawn from the domain of the corresponding variable, and these domains define the decision space. Each solution has an image in the objective space when evaluated with the objective function(s). The use of the term “solution” in optimization theory is thus slightly counterintuitive; it differs from how the word is used in everyday language.

  6. The set of values a variable can take is called the “domain” of the variable.

  7. Remember that an LP problem is defined over a continuous domain; therefore, if the region of feasible solutions is nonempty, it is a continuous region containing an infinite number of solutions.

  8. Note that a genetic “algorithm” is not an algorithm as defined at the beginning of Sect. 3.3.1, because there is no mathematical proof guaranteeing that a GA finds the optimum.
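
Worked example for Note 1 (the objective and constraint below are illustrative, not taken from the text): the problem of maximizing \(f(x)\) subject to \(g(x) = b\) is equivalent, in the general minimization form P1, to minimizing \(-f(x)\) subject to the two inequalities \(g(x) \le b\) and \(-g(x) \le -b\).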
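
Worked example for Note 4 (ours, not from Ref. [1]): let \(f(n) = 3n^2 + 5n\). Then \(f = O(n^2)\) with \(c = 4\) and \(N = 5\), since \(3n^2 + 5n \le 4n^2\) for all \(n \ge 5\); \(f = \Omega(n^2)\) with \(c = 3\) and \(N = 1\), since \(3n^2 + 5n \ge 3n^2\) for all \(n \ge 1\); hence \(f = \Theta(n^2)\) with \(d = 3\), \(c = 4\), and \(N = 5\).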

References

  1. N. Ansari, E. Hou, Computational Intelligence for Optimization (Kluwer Academic Publishers, Dordrecht, 1997)
  2. F. Glover, Tabu search: Part I. ORSA J. Comput. 1(3), 190–206 (1989)
  3. F. Glover, Tabu search: Part II. ORSA J. Comput. 2(1), 4–32 (1990)
  4. S. Kirkpatrick, C.D. Gelatt, M.P. Vecchi, Optimization by simulated annealing. Science 220, 671–680 (1983)
  5. J. Kennedy, R.C. Eberhart et al., Particle swarm optimization, in Proceedings of the IEEE International Conference on Neural Networks, vol. 4 (Perth, Australia, 1995), pp. 1942–1948
  6. H.A. Abbass, MBO: marriage in honey bees optimization – a haplometrosis polygynous swarming approach, in Proceedings of the 2001 Congress on Evolutionary Computation, vol. 1 (IEEE, 2001), pp. 207–214
  7. H.A. Abbass, A single queen single worker honey bees approach to 3-SAT, in Proceedings of the Genetic and Evolutionary Computation Conference (Morgan Kaufmann, San Mateo, CA, 2001)
  8. H.A. Abbass, An agent-based approach to 3-SAT using marriage in honey-bees optimization. International Journal of Knowledge-Based Intelligent Systems 6(2), 64–71 (2002)
  9. M. Dorigo, V. Maniezzo, A. Colorni, Ant system: optimization by a colony of cooperating agents. IEEE Transactions on Systems, Man, and Cybernetics, Part B: Cybernetics 26(1), 29–41 (1996)
  10. P. Larrañaga, J.A. Lozano, Estimation of Distribution Algorithms: A New Tool for Evolutionary Computation (Springer, Amsterdam, 2002)
  11. S. Kirkpatrick, C.D. Gelatt, M.P. Vecchi, Optimization by simulated annealing. Science 220, 671–680 (1983)
  12. R.V.V. Vidal, Applied Simulated Annealing (Springer-Verlag, 1993)
  13. H.J. Bremermann, Optimization through evolution and recombination, in Self-Organizing Systems, ed. by M.C. Yovits et al. (Spartan, Washington, DC, 1962)
  14. R.M. Friedberg, A learning machine: Part I. IBM J. 2(1), 2–13 (1958)
  15. R.M. Friedberg, B. Dunham, J.H. North, A learning machine: Part II. IBM J. 3(7), 282–287 (1959)
  16. G.E.P. Box, Evolutionary operation: a method for increasing industrial productivity. Appl. Statistics VI(2), 81–101 (1957)
  17. T. Bäck, U. Hammel, H.P. Schwefel, Evolutionary computation: comments on the history and current state. IEEE Trans. Evol. Comput. 1(1), 3–17 (1997)
  18. J.H. Holland, Outline for a logical theory of adaptive systems. J. Assoc. Comput. Mach. 3, 297–314 (1962)
  19. I. Rechenberg, Cybernetic solution path of an experimental problem. Royal Aircraft Establishment, Library Translation No. 1122, Farnborough, Hants., UK, 1965
  20. H.P. Schwefel, Projekt MHD-Staustrahlrohr: Experimentelle Optimierung einer Zweiphasendüse, Teil I. Technischer Bericht 11.034/68, 35, AEG Forschungsinstitut, Berlin, Germany, 1968
  21. L.J. Fogel, Autonomous automata. Ind. Res. 4, 14–19 (1962)
  22. Z. Michalewicz, Genetic Algorithms + Data Structures = Evolution Programs, 3rd revised and extended edn. (Springer-Verlag, New York, 1996)
  23. D.B. Fogel, Evolutionary Computation: Toward a New Philosophy of Machine Intelligence, 2nd edn. (Wiley-IEEE Press, New York, 1999)
  24. M. Mitchell, An Introduction to Genetic Algorithms, reprint edn. (MIT Press, Cambridge, MA, 1998)
  25. T. Bäck, Evolutionary Algorithms in Theory and Practice: Evolution Strategies, Evolutionary Programming, Genetic Algorithms (Oxford University Press, Oxford, 1996)
  26. L.D. Whitley, The GENITOR algorithm and selection pressure: why rank-based allocation of reproductive trials is best, in Proc. 3rd Int. Conf. on Genetic Algorithms (Morgan Kaufmann, San Mateo, CA, 1989), pp. 116–121
  27. L.D. Whitley, J. Kauth, GENITOR: a different genetic algorithm, in Proc. Rocky Mountain Conf. on Artificial Intelligence (Denver, CO, 1988), pp. 118–130
  28. K.A. De Jong, J. Sarma, Generation gaps revisited, in Foundations of Genetic Algorithms 2 (Morgan Kaufmann, San Mateo, CA, 1993), pp. 19–28
  29. J.H. Holland, Adaptation in Natural and Artificial Systems (Univ. of Michigan Press, Ann Arbor, MI, 1975)
  30. J.H. Holland, J.S. Reitman, Cognitive systems based on adaptive algorithms, in Pattern-Directed Inference Systems, ed. by D.A. Waterman, F. Hayes-Roth (Academic, New York, 1978)
  31. K.A. De Jong, An analysis of the behavior of a class of genetic adaptive systems. Ph.D. dissertation, Univ. of Michigan, Ann Arbor, 1975
  32. K.A. De Jong, On using genetic algorithms to search program spaces, in Proceedings of the 2nd Int. Conf. on Genetic Algorithms and Their Applications (Lawrence Erlbaum, Hillsdale, NJ, 1987), pp. 210–216
  33. K.A. De Jong, Are genetic algorithms function optimizers?, in Parallel Problem Solving from Nature 2 (Elsevier, Amsterdam, The Netherlands, 1992), pp. 3–13
  34. K.A. De Jong, Genetic algorithms are not function optimizers, in Foundations of Genetic Algorithms 2 (Morgan Kaufmann, San Mateo, CA, 1993), pp. 5–17
  35. D.E. Goldberg, Genetic algorithms and rule learning in dynamic system control, in Proc. 1st Int. Conf. on Genetic Algorithms and Their Applications (Lawrence Erlbaum, Hillsdale, NJ, 1985), pp. 8–15
  36. D.E. Goldberg, Genetic Algorithms in Search, Optimization and Machine Learning (Addison-Wesley, Reading, MA, 1989)
  37. D.E. Goldberg, The theory of virtual alphabets, in Proc. 1st Workshop on Parallel Problem Solving from Nature (Springer, Berlin, Germany, 1991), pp. 13–22
  38. D.E. Goldberg, K. Deb, J.H. Clark, Genetic algorithms, noise, and the sizing of populations. Complex Syst. 6, 333–362 (1992)
  39. D.E. Goldberg, K. Deb, H. Kargupta, G. Harik, Rapid, accurate optimization of difficult problems using fast messy genetic algorithms, in Proc. 5th Int. Conf. on Genetic Algorithms (Morgan Kaufmann, San Mateo, CA, 1993), pp. 56–64
  40. L.J. Fogel, On the organization of intellect. Ph.D. dissertation, University of California, Los Angeles, 1964
  41. G.H. Burgin, On playing two-person zero-sum games against nonminimax players. IEEE Trans. Syst. Sci. Cybern. SSC-5(4), 369–370 (1969)
  42. G.H. Burgin, Systems identification by quasilinearization and evolutionary programming. J. Cybern. 3(2), 56–75 (1973)
  43. J.W. Atmar, Speculation on the evolution of intelligence and its possible realization in machine form. Ph.D. dissertation, New Mexico State Univ., Las Cruces, 1976
  44. L.J. Fogel, A.J. Owens, M.J. Walsh, Artificial Intelligence Through Simulated Evolution (Wiley, New York, 1966)
  45. D.B. Fogel, An evolutionary approach to the traveling salesman problem. Biological Cybern., 139–144 (1988)
  46. D.B. Fogel, Evolving artificial intelligence. Ph.D. dissertation, Univ. of California, San Diego, 1992
  47. I. Rechenberg, Evolutionsstrategie: Optimierung technischer Systeme nach Prinzipien der biologischen Evolution (Frommann-Holzboog, Stuttgart, Germany, 1973)
  48. I. Rechenberg, Evolutionsstrategie ’94. Werkstatt Bionik und Evolutionstechnik (Frommann-Holzboog, Stuttgart, Germany, 1994)
  49. H.P. Schwefel, Evolutionsstrategie und numerische Optimierung. Dissertation, Technische Universität Berlin, Germany, 1975
  50. H.P. Schwefel, Evolution and Optimum Seeking (Wiley, New York, 1995)
  51. M. Herdy, Reproductive isolation as strategy parameter in hierarchically organized evolution strategies, in Parallel Problem Solving from Nature 2 (Elsevier, Amsterdam, The Netherlands, 1992), pp. 207–217
  52. F. Kursawe, A variant of evolution strategies for vector optimization, in Proc. 1st Workshop on Parallel Problem Solving from Nature (Springer, Berlin, Germany, 1991), pp. 193–197
  53. A. Ostermeier, An evolution strategy with momentum adaptation of the random number distribution, in Parallel Problem Solving from Nature 2 (Elsevier, Amsterdam, The Netherlands, 1992), pp. 197–206
  54. A. Ostermeier, A. Gawelczyk, N. Hansen, Step-size adaptation based on nonlocal use of selection information, in Parallel Problem Solving from Nature – PPSN III (Springer, Berlin, Germany, 1994), pp. 189–198
  55. G. Rudolph, Global optimization by means of distributed evolution strategies, in Proc. 1st Workshop on Parallel Problem Solving from Nature (Springer, Berlin, Germany, 1991), pp. 209–213
  56. J. Klockgether, H.P. Schwefel, Two-phase nozzle and hollow core jet experiments, in Proc. 11th Symp. Engineering Aspects of Magnetohydrodynamics, ed. by D.G. Elliott (California Institute of Technology, Pasadena, CA, 1970), pp. 141–148
  57. J.R. Koza, Genetic Programming: On the Programming of Computers by Means of Natural Selection (MIT Press, Cambridge, MA, 1992)
  58. M.L. Cramer, A representation for the adaptive generation of simple sequential programs, in Proc. 1st Int. Conf. on Genetic Algorithms and Their Applications (Lawrence Erlbaum, Hillsdale, NJ, 1985), pp. 183–187
  59. J.R. Koza, D.E. Goldberg, D.B. Fogel, R.L. Riolo, Proc. 1st Annu. Conf. on Genetic Programming (MIT Press, Cambridge, MA, 1996)
  60. K.E. Kinnear, Advances in Genetic Programming (MIT Press, Cambridge, MA, 1994)
  61. F.D. Francone, P. Nordin, W. Banzhaf, Benchmarking the generalization capabilities of a compiling genetic programming system using sparse data sets, in Proc. 1st Annu. Conf. on Genetic Programming (MIT Press, Cambridge, MA, 1996), pp. 72–80
  62. J.H. Holland, Adaptation in Natural and Artificial Systems (University of Michigan Press, Ann Arbor, MI, 1975)
  63. C. Darwin, The Origin of Species by Means of Natural Selection (Penguin Classics, London, 1859)
  64. D. Whitley, A genetic algorithm tutorial. Statistics and Computing 4, 65–85 (1994)
  65. H.P. Schwefel, Numerical Optimization of Computer Models (Wiley, Chichester, 1981)
  66. P. Ross, Genetic Algorithms and Genetic Programming: Lecture Notes (Department of Artificial Intelligence, University of Edinburgh, 1996)
  67. H. Mühlenbein, D. Schlierkamp-Voosen, Predictive models for the breeder genetic algorithm: continuous parameter optimization. Evolutionary Computation 1(1), 25–49 (1993)
  68. H. Mühlenbein, D. Schlierkamp-Voosen, The science of breeding and its application to the breeder genetic algorithm (BGA). Evolutionary Computation 1(4), 335–360 (1994)
  69. K.A. De Jong, An analysis of the behavior of a class of genetic adaptive systems. Ph.D. thesis, University of Michigan, 1975
  70. A. Wetzel, Evaluation of the effectiveness of genetic algorithms in combinatorial optimization. Technical report, University of Pittsburgh, 1983
  71. D. Ackley, A Connectionist Machine for Genetic Hill Climbing (Kluwer Academic Publishers, 1987)
  72. R. Dawkins, The Selfish Gene (Oxford University Press, 1976)
  73. C.J. Lumsden, E.O. Wilson, Genes, Mind, and Culture (Harvard University Press, Cambridge, 1981)
  74. C.J. Lumsden, E.O. Wilson, Promethean Fire (Harvard University Press, Cambridge, 1983)
  75. P. Moscato, On evolution, search, optimization, genetic algorithms and martial arts: towards memetic algorithms. Technical Report 826, California Institute of Technology, Pasadena, California, USA, 1989
  76. P. Moscato, Memetic algorithms: a short introduction, in New Ideas in Optimization, ed. by D. Corne, M. Dorigo, F. Glover (McGraw-Hill, 1999), pp. 219–234
  77. L. Davis, Genetic Algorithms and Simulated Annealing (Pitman, London, 1987)
  78. R.G. Le Riche, C. Knopf-Lenoir, R.T. Haftka, A segregated genetic algorithm for constrained structural optimisation, in Proceedings of the Sixth International Conference on Genetic Algorithms, ed. by L.J. Eshelman (Morgan Kaufmann, San Mateo, California, 1995), pp. 558–565
  79. D. Dasgupta, Z. Michalewicz, Evolutionary Algorithms in Engineering Applications (Springer-Verlag, Berlin, 1997)
  80. J.T. Richardson, M.R. Palmer, G. Liepins, M. Hilliard, Some guidelines for genetic algorithms with penalty functions, in Proceedings of the Third International Conference on Genetic Algorithms, ed. by J.D. Schaffer (Morgan Kaufmann, 1989), pp. 191–197
  81. A. Homaifar, C.X. Qi, S.H. Lai, Constrained optimization via genetic algorithms. Simulation 62, 242–254 (1994)
  82. Z. Michalewicz, N. Attia, Evolutionary optimization of constrained problems, in Proceedings of the 3rd Annual Conference on Evolutionary Programming (1994), pp. 98–108
  83. A.B. Hadj-Alouane, J.C. Bean, A genetic algorithm for the multiple-choice integer program. Technical Report TR-92-50, Department of Industrial and Operations Engineering, The University of Michigan, 1992
  84. A.K. Morales, C.V. Quezada, A universal eclectic genetic algorithm for constrained optimization, in Proceedings of the 6th European Congress on Intelligent Techniques and Soft Computing, EUFIT’98 (1998), pp. 518–522
  85. Z. Michalewicz, G. Nazhiyath, Genocop III: a co-evolutionary algorithm for numerical optimization with nonlinear constraints, in Proceedings of the Second IEEE International Conference on Evolutionary Computation, ed. by D.B. Fogel (IEEE Press, 1995), pp. 647–651
  86. M. Schoenauer, S. Xanthakis, Constrained GA optimization, in Proceedings of the Fourth International Conference on Genetic Algorithms, ICGA93 (1993)
  87. C.A. Coello, Self-adaptive penalties for GA-based optimization, in Congress on Evolutionary Computation, vol. 1 (1999), pp. 573–580
  88. M. Lemaître, G. Verfaillie, An incomplete method for solving distributed valued constraint satisfaction problems, in AAAI-97 Workshop on Constraints and Agents (1997)
  89. S. Minton, M.D. Johnston, A.B. Philips, P. Laird, Minimizing conflicts: a heuristic method for constraint-satisfaction and scheduling problems. Artificial Intelligence 58, 161–205 (1992)
  90. G.E. Liepins, M.D. Vose, Representational issues in genetic optimization. Journal of Experimental and Theoretical Computer Science 2(2), 4–30 (1990)
  91. G.E. Liepins, W.D. Potter, A genetic algorithm approach to multiple-fault diagnosis, in Handbook of Genetic Algorithms, ed. by L. Davis (Van Nostrand Reinhold, 1991), pp. 237–250
  92. H. Mühlenbein, Parallel genetic algorithms in combinatorial optimization, in Computer Science and Operations Research, ed. by O. Balci, R. Sharda, S. Zenios (Pergamon Press, 1992), pp. 441–456
  93. Z. Michalewicz, C.Z. Janikow, Handling constraints in genetic algorithms, in Proceedings of the Fourth International Conference on Genetic Algorithms, ed. by L.B. Booker (Morgan Kaufmann, 1991), pp. 151–157
  94. D. Orvosh, L. Davis, Using a genetic algorithm to optimize problems with feasibility constraints, in Proceedings of the First IEEE Conference on Evolutionary Computation (1994), pp. 548–553
  95. R.G. Le Riche, R.T. Haftka, Improved genetic algorithm for minimum thickness composite laminate design. Composites Engineering 3(1), 121–139 (1994)
  96. J. Xiao, Z. Michalewicz, K. Trojanowski, Adaptive evolutionary planner/navigator for mobile robots. IEEE Transactions on Evolutionary Computation 1(1), 18–28 (1997)
  97. J. Xiao, Z. Michalewicz, L. Zhang, Evolutionary planner/navigator: operator performance and self-tuning, in Proceedings of the 3rd IEEE International Conference on Evolutionary Computation (1996)
  98. D. Whitley, V.S. Gordon, K. Mathias, Lamarckian evolution, the Baldwin effect and function optimization, in Parallel Problem Solving from Nature – PPSN III, ed. by Y. Davidor, H.-P. Schwefel, R. Männer (Springer-Verlag, Berlin, 1994), pp. 6–15
  99. C.R. Houck, J.A. Joines, M.G. Kay, Utilizing Lamarckian evolution and the Baldwin effect in hybrid genetic algorithms. NCSU-IE Technical Report 96-01, 1996
  100. C. Wellock, B.J. Ross, An examination of Lamarckian genetic algorithms, in 2001 Genetic and Evolutionary Computation Conference (GECCO) Late Breaking Papers (2001), pp. 474–481
  101. G.E. Hinton, S.J. Nowlan, How learning can guide evolution. Complex Systems 1, 492–502 (1987)
  102. M.G. Kirley, X. Li, D.G. Green, Investigation of a cellular genetic algorithm that mimics landscape ecology, in Lecture Notes in Computer Science, vol. 1585 (1999), pp. 90–97
  103. M.G. Kirley, A cellular genetic algorithm with disturbances: optimisation using dynamic spatial interactions. Journal of Heuristics 8, 321–342 (2002)
  104. D.G. Green, Fire and stability in the postglacial forests of southwest Nova Scotia. Journal of Biogeography 9, 29–40 (1982)
  105. J.H. Holland, Emergence: From Chaos to Order (Addison-Wesley, Redwood City, California, 1998)
  106. D.H. Wolpert, W.G. Macready, No free lunch theorems for optimization. IEEE Trans. on Evolutionary Computation 1(1), 67–82 (1997)
  107. R.E. Smith, N. Taylor, A framework for evolutionary computation in agent-based systems, in Proceedings of the 1998 International Conference on Intelligent Systems (1998)

Author information

Correspondence to David G. Green.

Copyright information

© 2014 Springer Science+Business Media New York

About this chapter

Green, D.G., Liu, J., Abbass, H.A. (2014). Problem Solving and Evolutionary Computation. In: Dual Phase Evolution. Springer, New York, NY. https://doi.org/10.1007/978-1-4419-8423-4_3

  • DOI: https://doi.org/10.1007/978-1-4419-8423-4_3

  • Publisher Name: Springer, New York, NY

  • Print ISBN: 978-1-4419-8422-7

  • Online ISBN: 978-1-4419-8423-4