
Hybrid Heuristics for Optimal Design of Artificial Neural Networks

  • Conference paper
Developments in Soft Computing

Part of the book series: Advances in Soft Computing (AINSC, volume 9)

Abstract

Designing the architecture and choosing the correct parameters of the learning algorithm are tedious tasks in modeling an optimal Artificial Neural Network (ANN), one that is smaller, faster and has better generalization performance. In this paper we explain how a hybrid algorithm that integrates a Genetic Algorithm (GA), Simulated Annealing (SA) and other heuristic procedures can be applied to the optimal design of an ANN. The paper focuses on current theoretical developments in Evolutionary Artificial Neural Networks (EANNs) using GAs, and on how the proposed hybrid heuristic procedures can be combined to produce an optimal ANN. The proposed meta-heuristic can be regarded as a general framework for adaptive systems, that is, systems that can change their connection weights, architectures and learning rules to suit different environments without human intervention.
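The core idea of such a GA/SA hybrid can be illustrated with a minimal sketch (this is not the authors' implementation): a GA population evolves the weights of a small fixed network, and each mutated child is accepted or rejected with a simulated-annealing criterion under a cooling temperature. The network shape (2-2-1 on XOR), the fitness function and all parameter values below are illustrative assumptions.

```python
import math
import random

random.seed(0)

# Illustrative task: XOR, solved by a tiny fixed 2-2-1 network whose
# 9 weights (6 hidden, 3 output, biases included) form the chromosome.
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def forward(w, x):
    h1 = sigmoid(w[0] * x[0] + w[1] * x[1] + w[2])
    h2 = sigmoid(w[3] * x[0] + w[4] * x[1] + w[5])
    return sigmoid(w[6] * h1 + w[7] * h2 + w[8])

def fitness(w):
    # Negative summed squared error: higher is better, 0 is perfect.
    return -sum((forward(w, x) - y) ** 2 for x, y in XOR)

def evolve(pop_size=20, generations=200, t0=1.0, cooling=0.97):
    pop = [[random.uniform(-2, 2) for _ in range(9)] for _ in range(pop_size)]
    temp = t0
    for _ in range(generations):
        # GA step: rank the population and keep the fitter half (elitism).
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        children = []
        for parent in survivors:
            # GA-style Gaussian mutation of the parent's weights.
            child = [g + random.gauss(0, 0.5) for g in parent]
            # SA-style acceptance: always keep an improvement; keep a
            # worse child with probability exp(delta / T), so the search
            # can escape local optima while the temperature is high.
            delta = fitness(child) - fitness(parent)
            if delta > 0 or random.random() < math.exp(delta / temp):
                children.append(child)
            else:
                children.append(parent)
        pop = survivors + children
        temp *= cooling  # geometric cooling schedule
    return max(pop, key=fitness)

best = evolve()
```

As the temperature decays, the acceptance rule hardens from a near-random walk into pure hill-climbing, which is the usual motivation for combining SA's exploration with a GA's population-based selection. A full EANN framework would additionally evolve the architecture and learning rules, not just the weights.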




Copyright information

© 2001 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Abraham, A., Nath, B. (2001). Hybrid Heuristics for Optimal Design of Artificial Neural Networks. In: John, R., Birkenhead, R. (eds) Developments in Soft Computing. Advances in Soft Computing, vol 9. Physica, Heidelberg. https://doi.org/10.1007/978-3-7908-1829-1_2


  • DOI: https://doi.org/10.1007/978-3-7908-1829-1_2

  • Publisher Name: Physica, Heidelberg

  • Print ISBN: 978-3-7908-1361-6

  • Online ISBN: 978-3-7908-1829-1

