
A Memetic Pareto Evolutionary Approach to Artificial Neural Networks

  • Conference paper
AI 2001: Advances in Artificial Intelligence (AI 2001)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 2256)

Abstract

Evolutionary Artificial Neural Networks (EANNs) have been a focus of research in the areas of Evolutionary Algorithms (EAs) and Artificial Neural Networks (ANNs) for the last decade. In this paper, we present an EANN approach based on Pareto multi-objective optimization and differential evolution augmented with local search. We call the approach Memetic Pareto Artificial Neural Networks (MPANN). We show empirically that MPANN is able to overcome the slow training of traditional EANNs while achieving equivalent or better generalization.
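The abstract names three ingredients: a Pareto multi-objective formulation, differential evolution over network parameters, and a local-search refinement (the memetic component). The sketch below illustrates how such pieces can fit together on a toy problem; it is not the paper's MPANN algorithm. The network size, the two objectives (training error and a weight-magnitude proxy for complexity), the DE parameters, the finite-difference local searcher, and the dominance-based replacement rule are all illustrative assumptions.

```python
import math
import random

random.seed(1)

# Toy task: XOR, a classic test that a linear model cannot solve.
DATA = [((0.0, 0.0), 0.0), ((0.0, 1.0), 1.0),
        ((1.0, 0.0), 1.0), ((1.0, 1.0), 0.0)]

N_HID = 3
# Genome layout: input->hidden weights, hidden biases,
# hidden->output weights, output bias.
DIM = 2 * N_HID + N_HID + N_HID + 1

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-max(-30.0, min(30.0, z))))

def forward(w, x):
    h = [sigmoid(w[2 * j] * x[0] + w[2 * j + 1] * x[1] + w[2 * N_HID + j])
         for j in range(N_HID)]
    return sigmoid(sum(w[3 * N_HID + j] * h[j] for j in range(N_HID)) + w[-1])

def error(w):         # objective 1: mean squared training error
    return sum((forward(w, x) - y) ** 2 for x, y in DATA) / len(DATA)

def magnitude(w):     # objective 2: weight magnitude, a crude complexity proxy
    return sum(v * v for v in w)

def dominates(a, b):  # Pareto dominance for minimization
    return a[0] <= b[0] and a[1] <= b[1] and a != b

def local_search(w, steps=3, lr=0.5, eps=1e-4):
    # Finite-difference gradient descent on the error objective: the
    # "memetic" refinement applied to each DE trial vector.
    w = list(w)
    for _ in range(steps):
        base = error(w)
        grad = [(error(w[:i] + [w[i] + eps] + w[i + 1:]) - base) / eps
                for i in range(DIM)]
        w = [w[i] - lr * grad[i] for i in range(DIM)]
    return w

POP, GENS, F, CR = 20, 30, 0.8, 0.9
pop = [[random.uniform(-1, 1) for _ in range(DIM)] for _ in range(POP)]
best_before = min(error(w) for w in pop)

for _ in range(GENS):
    for i in range(POP):
        # DE/rand/1 mutation with binomial crossover.
        r1, r2, r3 = random.sample([j for j in range(POP) if j != i], 3)
        trial = [pop[r1][d] + F * (pop[r2][d] - pop[r3][d])
                 if random.random() < CR else pop[i][d]
                 for d in range(DIM)]
        trial = local_search(trial)
        ft = (error(trial), magnitude(trial))
        fp = (error(pop[i]), magnitude(pop[i]))
        # Replace the parent if the trial Pareto-dominates it, or if neither
        # dominates and the trial has lower error (a simple tie-break).
        if dominates(ft, fp) or (not dominates(fp, ft) and ft[0] < fp[0]):
            pop[i] = trial

best_after = min(error(w) for w in pop)
print(f"best training MSE: {best_before:.4f} -> {best_after:.4f}")
```

Because replacement only ever happens when the trial's error is no worse than the parent's, the best training error in the population is non-increasing across generations; the local-search step is what the memetic framework adds on top of plain differential evolution.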

Copyright information

© 2001 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Abbass, H. (2001). A Memetic Pareto Evolutionary Approach to Artificial Neural Networks. In: Stumptner, M., Corbett, D., Brooks, M. (eds) AI 2001: Advances in Artificial Intelligence. AI 2001. Lecture Notes in Computer Science, vol 2256. Springer, Berlin, Heidelberg. https://doi.org/10.1007/3-540-45656-2_1

  • DOI: https://doi.org/10.1007/3-540-45656-2_1

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-42960-9

  • Online ISBN: 978-3-540-45656-8

  • eBook Packages: Springer Book Archive
