
Evolutionary Deep Neural Networks

Chapter
Evolutionary Algorithms and Neural Networks

Part of the book series: Studies in Computational Intelligence (SCI, volume 780)

Abstract

This chapter first employs a kinematic model of the hand to create two datasets of static hand postures. Neural networks with different learning algorithms are then applied to these datasets for classification. The chapter also compares and analyses different evolutionary algorithms for this classification task. Another contribution is finding the best set of features for the dataset using evolutionary algorithms. The results show that, because of the large number of samples and features, back propagation is not effective and stagnates in local optima. Evolutionary algorithms, by contrast, classify the dataset efficiently, with very high accuracy and fast convergence. It was also observed that feature selection is important and that evolutionary algorithms are able to find the optimal set of features for this problem. A minimal illustrative sketch of the chapter's central idea follows.
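The core idea in the abstract, replacing gradient-based back propagation with a population-based search over the network's weights, can be sketched briefly. The snippet below is a minimal, hypothetical example and not the chapter's code: it trains a tiny feed-forward network on a synthetic two-class dataset using a simple (mu, lambda)-style evolutionary loop. The network size, population size, mutation scale, and the toy data are all assumptions made purely for illustration.

    # Minimal sketch (illustrative assumptions, not the chapter's code):
    # evolving the weights of a small feed-forward network instead of
    # training it with back propagation.
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy two-class dataset standing in for the hand-posture features.
    X = rng.normal(size=(200, 8))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)

    N_IN, N_HID, N_OUT = 8, 10, 2
    N_W = N_IN * N_HID + N_HID + N_HID * N_OUT + N_OUT  # weights + biases

    def forward(w, X):
        """Decode a flat weight vector and run the 8-10-2 network."""
        i = 0
        W1 = w[i:i + N_IN * N_HID].reshape(N_IN, N_HID); i += N_IN * N_HID
        b1 = w[i:i + N_HID]; i += N_HID
        W2 = w[i:i + N_HID * N_OUT].reshape(N_HID, N_OUT); i += N_HID * N_OUT
        b2 = w[i:i + N_OUT]
        h = np.tanh(X @ W1 + b1)
        return h @ W2 + b2

    def fitness(w):
        """Classification error rate (to be minimised) on the training set."""
        preds = forward(w, X).argmax(axis=1)
        return np.mean(preds != y)

    # Simple (mu, lambda)-style evolution: keep the best half, mutate to refill.
    POP, GENS, SIGMA = 40, 100, 0.1
    pop = rng.normal(scale=0.5, size=(POP, N_W))
    for gen in range(GENS):
        errs = np.array([fitness(ind) for ind in pop])
        order = np.argsort(errs)
        parents = pop[order[:POP // 2]]
        children = parents + rng.normal(scale=SIGMA, size=parents.shape)
        pop = np.vstack([parents, children])
        if gen % 20 == 0:
            print(f"gen {gen:3d}  best error {errs[order[0]]:.3f}")
    print("final best error:", fitness(pop[0]))

Because the search evaluates complete candidate weight vectors rather than following a single gradient trajectory, a population of this kind is less prone to the local-optima stagnation that the abstract attributes to back propagation on large datasets; the same population-based idea extends naturally to the feature-selection problem by evolving binary masks over the input features instead of (or alongside) the weights.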



Author information

Correspondence to Seyedali Mirjalili.


Copyright information

© 2019 Springer International Publishing AG, part of Springer Nature

About this chapter


Cite this chapter

Mirjalili, S. (2019). Evolutionary Deep Neural Networks. In: Evolutionary Algorithms and Neural Networks. Studies in Computational Intelligence, vol 780. Springer, Cham. https://doi.org/10.1007/978-3-319-93025-1_9

