Abstract
Designing the architecture and choosing the right learning-algorithm parameters are tedious tasks when modeling an optimal Artificial Neural Network (ANN), one that is smaller, faster and generalizes better. In this paper we explain how a hybrid algorithm integrating a Genetic Algorithm (GA), Simulated Annealing (SA) and other heuristic procedures can be applied to the optimal design of an ANN. The paper focuses on current theoretical developments in Evolutionary Artificial Neural Networks (EANNs) using GAs and on how the proposed hybrid heuristic procedures can be combined to produce an optimal ANN. The proposed meta-heuristic can be regarded as a general framework for adaptive systems, that is, systems that can change their connection weights, architectures and learning rules according to different environments without human intervention.
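To make the general idea concrete, the following is a minimal illustrative sketch, not the authors' algorithm: a genetic algorithm whose mutation step uses a simulated-annealing (Metropolis) acceptance rule, applied only to the connection weights of a tiny fixed-topology feed-forward network on the XOR problem. The paper's hybrid additionally evolves architectures and learning rules; every function name, parameter value and the network topology below are assumptions chosen for illustration.

import numpy as np

rng = np.random.default_rng(0)

# Toy data set (XOR) and a fixed 2-2-1 network topology (assumed for illustration).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])
N_W = 2 * 2 + 2 + 2 * 1 + 1          # weights + biases of the 2-2-1 net

def forward(w, x):
    """Evaluate the 2-2-1 network encoded by the flat weight vector w."""
    W1 = w[0:4].reshape(2, 2); b1 = w[4:6]
    W2 = w[6:8].reshape(2, 1); b2 = w[8]
    h = np.tanh(x @ W1 + b1)
    return np.tanh(h @ W2 + b2).ravel()

def fitness(w):
    """Negative mean squared error: higher is better."""
    return -np.mean((forward(w, X) - y) ** 2)

def evolve(pop_size=30, generations=200, t0=1.0, cooling=0.97, sigma=0.3):
    """GA over weight vectors with an SA-style acceptance test on mutations."""
    pop = rng.normal(0.0, 1.0, size=(pop_size, N_W))
    temp = t0
    for _ in range(generations):
        fit = np.array([fitness(ind) for ind in pop])
        new_pop = []
        for _ in range(pop_size):
            # Tournament selection of two parents.
            i, j = rng.choice(pop_size, 2, replace=False)
            p1 = pop[i] if fit[i] > fit[j] else pop[j]
            i, j = rng.choice(pop_size, 2, replace=False)
            p2 = pop[i] if fit[i] > fit[j] else pop[j]
            # Uniform crossover followed by Gaussian mutation.
            mask = rng.random(N_W) < 0.5
            child = np.where(mask, p1, p2)
            mutant = child + rng.normal(0.0, sigma, N_W)
            # Simulated-annealing (Metropolis) acceptance of the mutation.
            delta = fitness(mutant) - fitness(child)
            if delta > 0 or rng.random() < np.exp(delta / temp):
                child = mutant
            new_pop.append(child)
        pop = np.array(new_pop)
        temp *= cooling                 # anneal the temperature each generation
    best = max(pop, key=fitness)
    return best, fitness(best)

if __name__ == "__main__":
    w, f = evolve()
    print("best fitness:", f)
    print("network outputs on XOR:", np.round(forward(w, X), 3))

The SA acceptance test lets an occasional fitness-decreasing mutation survive early on (high temperature) while becoming increasingly greedy as the temperature cools, which is one common way of combining the global search of a GA with the local refinement behaviour of SA.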
Copyright information
© 2001 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Abraham, A., Nath, B. (2001). Hybrid Heuristics for Optimal Design of Artificial Neural Networks. In: John, R., Birkenhead, R. (eds) Developments in Soft Computing. Advances in Soft Computing, vol 9. Physica, Heidelberg. https://doi.org/10.1007/978-3-7908-1829-1_2
DOI: https://doi.org/10.1007/978-3-7908-1829-1_2
Publisher Name: Physica, Heidelberg
Print ISBN: 978-3-7908-1361-6
Online ISBN: 978-3-7908-1829-1