
Journal of Computer Science and Technology, Volume 21, Issue 1, pp.1–18

Recent Advances in Evolutionary Computation

  • Xin Yao
  • Yong Xu

Abstract

Evolutionary computation has experienced tremendous growth over the last decade, in both theoretical analysis and industrial application. Its scope has evolved beyond its original meaning of “biological evolution” toward a wide variety of nature-inspired computational algorithms and techniques, including evolutionary, neural, ecological, social and economic computation, within a unified framework. Many research topics in evolutionary computation today are not necessarily “evolutionary”. This paper provides an overview of some recent advances in evolutionary computation made at CERCIA, the University of Birmingham, UK. It covers a wide range of topics in optimization, learning and design using evolutionary approaches and techniques, as well as theoretical results on the computational time complexity of evolutionary algorithms. Issues related to the future development of evolutionary computation are also discussed.

Keywords

evolutionary computation; neural network ensemble; prisoner's dilemma; real-world application; computational time complexity
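The computational time complexity results mentioned in the abstract concern simple evolutionary algorithms such as the (1+1) EA, whose expected hitting time on benchmark functions like OneMax is a standard object of study. A minimal sketch (an illustration, not code from the paper): one parent, one offspring per generation, each bit flipped independently with probability 1/n, and the offspring replaces the parent if it is no worse.

```python
import random

def one_max(bits):
    # Fitness: the number of 1-bits (the classic OneMax benchmark).
    return sum(bits)

def one_plus_one_ea(n, max_iters=100000, seed=0):
    """(1+1) EA on OneMax: returns the number of generations
    taken to reach the all-ones optimum (or max_iters)."""
    rng = random.Random(seed)
    parent = [rng.randint(0, 1) for _ in range(n)]
    for t in range(max_iters):
        if one_max(parent) == n:
            return t
        # Standard bit mutation: flip each bit with probability 1/n.
        child = [b ^ (rng.random() < 1.0 / n) for b in parent]
        # Elitist selection: keep the offspring if it is no worse.
        if one_max(child) >= one_max(parent):
            parent = child
    return max_iters

print(one_plus_one_ea(20))
```

Drift-analysis arguments of the kind surveyed in the paper show that this algorithm optimizes OneMax in expected O(n log n) generations.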



Copyright information

© Springer Science + Business Media, Inc. 2006

Authors and Affiliations

  1. Nature Inspired Computation and Applications Laboratory, The University of Science and Technology of China, Hefei, P.R. China
  2. The Centre of Excellence for Research in Computational Intelligence and Applications (CERCIA), School of Computer Science, The University of Birmingham, Edgbaston, Birmingham, UK
