Abstract
The majority of algorithms in the field of evolutionary artificial neural networks (EvoANN) rely on the proper choice and implementation of a perturbation function to maintain population diversity from generation to generation. Maintaining diversity is an important factor in the evolution process, since it helps the population of artificial neural networks (ANNs) escape local minima. To determine which perturbation function is ideal for ANN evolution, this paper analyzes the influence of three commonly used functions: Gaussian, Cauchy, and Uniform. Statistical comparisons were conducted to examine their influence on the generalization and training performance of EvoANN. Our simulations on the glass classification problem indicate that for mutation-with-crossover-based EvoANN, generalization performance did not differ significantly among the three perturbation functions. On the other hand, mutation-based EvoANN using Gaussian mutation performed as well as the crossover-based variant, but performed worst when it used either the Uniform or the Cauchy distribution. These observations suggest that the crossover operation becomes significant in systems that employ strong perturbation functions but is less important in systems that use weak or conservative perturbation functions.
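To make the comparison concrete, the following is a minimal illustrative sketch (not the authors' implementation) of how an ANN weight vector can be mutated with each of the three perturbation functions discussed above. The scale parameter `0.1` and the function names are arbitrary choices for demonstration; the Cauchy distribution's heavy tails make it the "strongest" perturbation, Uniform is bounded, and Gaussian is the most conservative of unbounded choices.

```python
import numpy as np

rng = np.random.default_rng(0)

def perturb(weights, kind, scale=0.1):
    """Return a mutated copy of `weights` using the named distribution.

    Illustrative only: the distribution names, `scale`, and this helper
    are assumptions for demonstration, not the paper's exact operators.
    """
    if kind == "gaussian":
        noise = rng.normal(0.0, scale, size=weights.shape)
    elif kind == "cauchy":
        # Heavy-tailed: occasionally produces very large jumps.
        noise = scale * rng.standard_cauchy(size=weights.shape)
    elif kind == "uniform":
        # Bounded perturbation in [-scale, scale].
        noise = rng.uniform(-scale, scale, size=weights.shape)
    else:
        raise ValueError(f"unknown perturbation: {kind}")
    return weights + noise

w = np.zeros(5)
for kind in ("gaussian", "cauchy", "uniform"):
    print(kind, perturb(w, kind))
```

The practical difference shows up in the tails: over many mutations, the Cauchy variant will occasionally move a weight far outside the range the Gaussian or Uniform variants ever reach, which is what makes it a "strong" perturbation in the sense used in the abstract.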
© 2004 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Palmes, P.P., Usui, S. (2004). The Influence of Gaussian, Uniform, and Cauchy Perturbation Functions in the Neural Network Evolution. In: Pal, N.R., Kasabov, N., Mudi, R.K., Pal, S., Parui, S.K. (eds) Neural Information Processing. ICONIP 2004. Lecture Notes in Computer Science, vol 3316. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-540-30499-9_29
DOI: https://doi.org/10.1007/978-3-540-30499-9_29
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-23931-4
Online ISBN: 978-3-540-30499-9
eBook Packages: Springer Book Archive