
Evolutionary Approach to Deep Learning

  • Hitoshi Iba
Chapter

Abstract

This chapter describes an evolutionary approach to deep learning networks. We first explain the neuroevolution approach, which can adaptively learn a network structure and size appropriate to the task. A typical example of neuroevolution is NEAT, which has demonstrated performance superior to that of conventional methods on a large number of problems. We then describe several studies that apply evolutionary optimization to deep neural networks, such as Genetic CNN, hierarchical feature construction using GP, and the differentiable pattern-producing network (DPPN).
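The core neuroevolution idea summarized above — searching over both the weights and the topology of a network with an evolutionary loop — can be sketched in a few lines. The following is a toy, mutation-only sketch on the XOR task, not NEAT itself (it omits crossover, speciation, and innovation numbers); the function names (`make_net`, `mutate`, `evolve`), the growth rule, and all hyperparameters are illustrative assumptions.

```python
import copy
import math
import random

# XOR: the classic task a network without hidden units cannot solve,
# so structural growth is actually useful here.
XOR = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 0)]

def sigmoid(x):
    x = max(-60.0, min(60.0, x))          # clamp to avoid math.exp overflow
    return 1.0 / (1.0 + math.exp(-x))

def make_net(hidden):
    """A feedforward net with 2 inputs, `hidden` hidden units, 1 output."""
    return {
        "hidden": hidden,
        "w1": [[random.uniform(-2, 2), random.uniform(-2, 2)] for _ in range(hidden)],
        "b1": [random.uniform(-2, 2) for _ in range(hidden)],
        "w2": [random.uniform(-2, 2) for _ in range(hidden)],
        "b2": random.uniform(-2, 2),
    }

def forward(net, x):
    h = [sigmoid(sum(w * xi for w, xi in zip(ws, x)) + b)
         for ws, b in zip(net["w1"], net["b1"])]
    return sigmoid(sum(w * hi for w, hi in zip(net["w2"], h)) + net["b2"])

def fitness(net):
    """Negative squared error over the XOR table (higher is better)."""
    return -sum((forward(net, x) - y) ** 2 for x, y in XOR)

def mutate(net, sigma=0.4, p_add=0.1):
    child = copy.deepcopy(net)
    if random.random() < p_add:           # structural mutation: grow the topology
        child["hidden"] += 1
        child["w1"].append([random.uniform(-2, 2), random.uniform(-2, 2)])
        child["b1"].append(random.uniform(-2, 2))
        child["w2"].append(random.uniform(-2, 2))
    for ws in child["w1"]:                # weight mutation: Gaussian perturbation
        for i in range(len(ws)):
            ws[i] += random.gauss(0, sigma)
    child["b1"] = [b + random.gauss(0, sigma) for b in child["b1"]]
    child["w2"] = [w + random.gauss(0, sigma) for w in child["w2"]]
    child["b2"] += random.gauss(0, sigma)
    return child

def evolve(pop=30, gens=200, seed=0):
    """Truncation selection: keep the top fifth, refill by mutating survivors."""
    random.seed(seed)
    population = [make_net(1) for _ in range(pop)]
    for _ in range(gens):
        population.sort(key=fitness, reverse=True)
        survivors = population[: pop // 5]
        population = survivors + [mutate(random.choice(survivors))
                                  for _ in range(pop - len(survivors))]
    return max(population, key=fitness)
```

Starting every individual with a single hidden unit and letting structural mutations add more mirrors NEAT's "complexification" principle: the search begins with minimal topologies and only grows the network when growth pays off in fitness.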

Keywords

Neuroevolution · Neuroevolution of augmenting topologies (NEAT) · HyperNEAT · L-system · Compositional pattern-producing network (CPPN)


Copyright information

© Springer Nature Singapore Pte Ltd. 2018

Authors and Affiliations

  1. The University of Tokyo, Tokyo, Japan
