Recent Advances in Evolutionary Computation

Journal of Computer Science and Technology

Abstract

Evolutionary computation has experienced tremendous growth in the last decade in both theoretical analyses and industrial applications. Its scope has evolved beyond its original meaning of “biological evolution” toward a wide variety of nature-inspired computational algorithms and techniques, including evolutionary, neural, ecological, social and economic computation, within a unified framework. Many research topics in evolutionary computation nowadays are not necessarily “evolutionary”. This paper provides an overview of some recent advances in evolutionary computation made at CERCIA at the University of Birmingham, UK. It covers a wide range of topics in optimization, learning and design using evolutionary approaches and techniques, as well as theoretical results on the computational time complexity of evolutionary algorithms. Some issues related to the future development of evolutionary computation are also discussed.
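
Among the topics the abstract mentions are theoretical results on the computational time complexity of evolutionary algorithms. As a purely illustrative sketch, not taken from the paper, the following Python snippet implements the classic (1+1) evolutionary algorithm on the OneMax problem, the textbook setting in which such runtime results are usually stated (its expected optimization time on OneMax is O(n log n)). All function names and parameters below are our own.

    # Illustrative sketch only: a (1+1) EA maximizing OneMax (number of ones).
    import random

    def one_max(bits):
        """Fitness of a bit string: the number of ones (maximal at the all-ones string)."""
        return sum(bits)

    def one_plus_one_ea(n=50, max_evals=100_000, seed=0):
        """Run a (1+1) EA with standard bit mutation; return evaluations needed to reach the optimum."""
        rng = random.Random(seed)
        parent = [rng.randint(0, 1) for _ in range(n)]
        best = one_max(parent)
        for evals in range(1, max_evals + 1):
            # Flip each bit independently with probability 1/n (standard bit mutation).
            child = [1 - b if rng.random() < 1.0 / n else b for b in parent]
            f = one_max(child)
            if f >= best:          # elitist selection: keep the child if it is no worse
                parent, best = child, f
            if best == n:          # global optimum found
                return evals
        return None                # optimum not reached within the evaluation budget

    if __name__ == "__main__":
        print("Evaluations to reach the optimum:", one_plus_one_ea())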

Author information

Correspondence to Xin Yao.

Additional information

This work is partially supported by the National Natural Science Foundation of China (Grant No. 60428202) and Advantage West Midlands, UK.

Xin Yao obtained his B.Sc. degree in 1982 from the University of Science and Technology of China (USTC) in Hefei, his M.Sc. degree in 1985 from the North China Institute of Computing Technologies (NCI) in Beijing, and his Ph.D. degree in 1990 from USTC, all in computer science. He joined the University of Birmingham from Australia as a professor of computer science in 1999. He is a Fellow of the IEEE, the editor-in-chief of IEEE Transactions on Evolutionary Computation, an associate editor of several other international journals, and the editor of the book series “Advances in Natural Computation” from World Scientific Publishing Co. He has been an invited keynote or plenary speaker at more than 35 international conferences in 11 different countries and a chair or co-chair of 27 international conferences. He is an IEEE Computational Intelligence Society Distinguished Lecturer and won the prestigious IEEE Donald G. Fink Prize Paper Award (2001). He is currently the Director of the Centre of Excellence for Research in Computational Intelligence and Applications (CERCIA) at the University of Birmingham, UK. He is also a Distinguished Visiting Professor and Cheung Kong Scholar at USTC and a visiting professor at three other universities. He has more than 200 research publications, including 80 refereed journal papers. His major research interests include evolutionary computation and neural network ensembles.

Yong Xu received his M.Sc. degree in applied optics and lasers from Fujian Normal University in 1987 and his Ph.D. degree in physics, specializing in nature-inspired approaches to optical network design, from Xiamen University in 2002. He is a member of the IEEE and is now a professor at Fujian Normal University and a research fellow at the University of Birmingham. From 2001 to 2002 he was a research assistant at the City University of Hong Kong. He is a reviewer for a number of leading international journals and conferences and serves as a program committee member for many international conferences. He was an organizer of a workshop at PPSN04 and a guest editor of a special issue of the International Journal of Computational Intelligence and Applications. He has published more than 50 papers and undertaken many research projects. His research interests include evolutionary computation and related techniques (neural networks, genetic algorithms, tabu search, etc.), global optimization, and the optimal design of telecommunication networks.

About this article

Cite this article

Yao, X., Xu, Y. Recent Advances in Evolutionary Computation. J Comput Sci Technol 21, 1–18 (2006). https://doi.org/10.1007/s11390-006-0001-4
