Evolving Node Transfer Functions in Artificial Neural Networks for Handwritten Digits Recognition

  • Conference paper
  • Conference: Computer Vision and Graphics (ICCVG 2016)
  • Part of the book series: Lecture Notes in Computer Science, vol. 9972

Abstract

Feed-forward Artificial Neural Networks are a popular choice among scientists and engineers for modeling complex real-world problems. One of the most recent research directions in this field is the evolution of Artificial Neural Networks, known as NeuroEvolution. In this paper we investigate the effect of evolving node transfer functions and their parameters, together with the connection weights, in Evolutionary Artificial Neural Networks for the problem of handwritten digit recognition. The results are promising when compared with the traditional approach of a homogeneous Artificial Neural Network with a predefined transfer function.
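
The abstract gives no implementation detail, so the following is a minimal, hedged sketch of the general idea rather than the authors' method: a fixed-topology feed-forward network in which each hidden node's transfer-function choice and a slope parameter are encoded in the genome and mutated together with the connection weights. The candidate function set, the (1+4) evolution strategy, the mutation rates, and the use of scikit-learn's 8x8 digits data as a small stand-in for MNIST are all assumptions made here for illustration.

```python
# Illustrative sketch (not the paper's implementation): evolve each hidden
# node's transfer function and its parameter together with the weights.
import numpy as np
from sklearn.datasets import load_digits  # small 8x8 digits set, stand-in for MNIST

rng = np.random.default_rng(0)

# Assumed candidate transfer functions; the set used in the paper may differ.
TRANSFER = [
    lambda x, a: 1.0 / (1.0 + np.exp(-a * x)),  # parameterised sigmoid
    lambda x, a: np.tanh(a * x),                # parameterised tanh
    lambda x, a: np.exp(-(a * x) ** 2),         # Gaussian
    lambda x, a: np.maximum(0.0, a * x),        # ReLU-like ramp
]

def init_genome(n_in, n_hidden, n_out):
    """Genome: weights plus, per hidden node, a transfer-function index
    and a slope parameter, all of which evolve."""
    return {
        "w1": rng.normal(0.0, 0.5, (n_in, n_hidden)),
        "w2": rng.normal(0.0, 0.5, (n_hidden, n_out)),
        "tf": rng.integers(0, len(TRANSFER), n_hidden),
        "a":  rng.uniform(0.5, 2.0, n_hidden),
    }

def forward(genome, X):
    h = X @ genome["w1"]
    for j in range(h.shape[1]):  # each hidden node applies its own function
        h[:, j] = TRANSFER[genome["tf"][j]](h[:, j], genome["a"][j])
    return h @ genome["w2"]

def fitness(genome, X, y):
    return float(np.mean(np.argmax(forward(genome, X), axis=1) == y))

def mutate(genome, p=0.05):
    child = {k: v.copy() for k, v in genome.items()}
    for w in ("w1", "w2"):                       # perturb a fraction of the weights
        mask = rng.random(child[w].shape) < p
        child[w][mask] += rng.normal(0.0, 0.2, mask.sum())
    flip = rng.random(len(child["tf"])) < p      # occasionally swap a node's function
    child["tf"][flip] = rng.integers(0, len(TRANSFER), flip.sum())
    child["a"] += (rng.random(len(child["a"])) < p) * rng.normal(0.0, 0.1, len(child["a"]))
    return child

# Simple (1+4) evolution strategy; population size and rates are assumptions.
digits = load_digits()
X, y = digits.data / 16.0, digits.target
parent = init_genome(X.shape[1], 20, 10)
best = fitness(parent, X, y)
for generation in range(200):
    for _ in range(4):
        child = mutate(parent)
        f = fitness(child, X, y)
        if f >= best:
            parent, best = child, f
print("training accuracy:", best)
```

The point of the sketch is visible in mutate(): a node's transfer-function index and its parameter are perturbed by the same evolutionary machinery as the weights, so the search explores heterogeneous networks instead of committing to a single predefined activation.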



Author information

Correspondence to Dmytro Vodianyk.

Copyright information

© 2016 Springer International Publishing AG

About this paper

Cite this paper

Vodianyk, D., Rokita, P. (2016). Evolving Node Transfer Functions in Artificial Neural Networks for Handwritten Digits Recognition. In: Chmielewski, L., Datta, A., Kozera, R., Wojciechowski, K. (eds.) Computer Vision and Graphics. ICCVG 2016. Lecture Notes in Computer Science, vol. 9972. Springer, Cham. https://doi.org/10.1007/978-3-319-46418-3_54

  • DOI: https://doi.org/10.1007/978-3-319-46418-3_54

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-46417-6

  • Online ISBN: 978-3-319-46418-3

