
From Evolutionary Computation to Natural Computation

Conference paper

Abstract

Evolutionary computation has enjoyed tremendous growth over more than a decade, in both its theoretical foundations and its industrial applications. Its scope has gone well beyond its earlier meaning of “genetic evolution”, and many of today’s research topics in evolutionary computation are not “evolutionary” in any strict sense. There is a need to study a wide variety of nature-inspired computational algorithms and techniques, including evolutionary, neural and ecological computation, among others, in a unified framework. This paper gives an overview of some of the work going on in the Natural Computation Group at The University of Birmingham, UK. It covers topics in optimisation, learning and design using nature-inspired algorithms and techniques. Some recent theoretical results on the computational time complexity of evolutionary and neural optimisation algorithms are also mentioned.

Keywords

Evolutionary computation · Neural network ensemble · Drift condition · Gaussian mutation · Negative correlation learning

Copyright information

© Springer-Verlag London 2002

Authors and Affiliations

School of Computer Science, The University of Birmingham, Birmingham, UK
