
Mixing Different Search Biases in Evolutionary Learning Algorithms

  • Kristina Davoian
  • Wolfram-M. Lippe
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5768)

Abstract

This work investigates the benefits of using different distribution functions in evolutionary learning algorithms with respect to the generalization ability of Artificial Neural Networks (ANNs). We examine two modifications of the recently proposed network weight-based evolutionary algorithm (NWEA), which mix mutation strategies based on three distribution functions at the chromosome and gene levels. The use of combined search strategies in ANN training implies that the different step sizes produced by the mixed distributions will direct the evolution towards well-generalizing ANNs.
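
A minimal Python sketch may help to illustrate the difference between the two mixing levels described above. The choice of Gaussian, Cauchy, and Lévy as the three distributions follows the fast-EP literature the authors build on, but it is an assumption here, as are the fixed step-size factor eta and all function names; this is an illustration of the general idea, not the paper's actual NWEA implementation.

    import numpy as np

    rng = np.random.default_rng(seed=0)

    # Three candidate step-size distributions. The abstract does not name
    # them; Gaussian, Cauchy, and Levy are assumed here for illustration.
    def gaussian_step(size):
        return rng.standard_normal(size)

    def cauchy_step(size):
        return rng.standard_cauchy(size)

    def levy_step(size, alpha=1.5):
        # Mantegna's approximation for symmetric alpha-stable (Levy) variates.
        from math import gamma, pi, sin
        sigma = (gamma(1 + alpha) * sin(pi * alpha / 2)
                 / (gamma((1 + alpha) / 2) * alpha
                    * 2 ** ((alpha - 1) / 2))) ** (1 / alpha)
        u = rng.normal(0.0, sigma, size)
        v = np.abs(rng.standard_normal(size))
        return u / v ** (1 / alpha)

    DISTRIBUTIONS = [gaussian_step, cauchy_step, levy_step]

    def mutate_chromosome_level(weights, eta=0.1):
        """Chromosome-level mixing: draw ONE distribution per offspring
        and apply it to every gene (network weight) of the chromosome."""
        dist = DISTRIBUTIONS[rng.integers(len(DISTRIBUTIONS))]
        return weights + eta * dist(weights.shape)

    def mutate_gene_level(weights, eta=0.1):
        """Gene-level mixing: draw a distribution independently for EACH
        gene, so one offspring combines steps from several distributions."""
        offspring = weights.copy()
        for i in range(offspring.size):
            dist = DISTRIBUTIONS[rng.integers(len(DISTRIBUTIONS))]
            offspring.flat[i] += eta * dist(1)[0]
        return offspring

    # Toy usage: mutate a flattened ANN weight vector in both ways.
    parent = rng.normal(size=20)
    child_a = mutate_chromosome_level(parent)
    child_b = mutate_gene_level(parent)

The practical difference is the granularity of the search bias: chromosome-level mixing moves the whole offspring with one distribution's step characteristics, while gene-level mixing lets large (Cauchy/Lévy-like) and small (Gaussian-like) steps coexist within a single offspring.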

Keywords

Artificial Neural Networks · Learning · Evolutionary Algorithms

Copyright information

© Springer-Verlag Berlin Heidelberg 2009

Authors and Affiliations

  • Kristina Davoian¹
  • Wolfram-M. Lippe¹

  1. Department of Mathematics and Computer Science, University of Münster, Münster, Germany
