
An Automated Approach for Developing a Convolutional Neural Network Using a Modified Firefly Algorithm for Image Classification

  • Ahmed I. Sharaf
  • El-Sayed F. Radwan
Chapter
Part of the Springer Tracts in Nature-Inspired Computing book series (STNIC)

Abstract

The convolutional neural network (CNN) is a fundamental class of neural network used in deep learning, with applications across image processing, machine learning, and data analysis. A CNN is a complex network whose number of hidden layers (its depth) and number of neurons can vary widely; the depth is the essential factor that determines how well the network can perform complicated tasks. Designing a CNN, however, demands expertise that non-experts in machine learning typically lack. In this chapter, a fully automated algorithm for designing CNNs is proposed based on firefly optimization. The proposed method can produce a CNN architecture of any depth, without restriction on the depth value, and employs the skip connection as its fundamental building block. A modified firefly algorithm based on a \(k\)-nearest neighbor attraction model is presented to reduce the computational complexity of the standard firefly algorithm. The CIFAR-10 and CIFAR-100 datasets were used to train and validate the proposed method on image classification, where it achieved high accuracy compared with cutting-edge approaches.
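The \(k\)-nearest neighbor attraction model mentioned above can be illustrated with a minimal sketch. In the standard firefly algorithm every firefly compares itself against all brighter fireflies, which costs \(O(n^2)\) per iteration; restricting attraction to each firefly's \(k\) nearest neighbors reduces this to roughly \(O(nk)\) updates. The function below is a hypothetical, simplified illustration of that idea (the function name, parameter defaults, and update details are assumptions for illustration, not the chapter's exact formulation):

```python
import numpy as np

def knn_firefly_step(pos, fitness, k=3, alpha=0.2, beta0=1.0, gamma=1.0, rng=None):
    """One firefly iteration where each firefly is attracted only to
    brighter fireflies among its k nearest neighbors (minimization).

    pos     : (n, d) array of firefly positions
    fitness : (n,) array of objective values (lower = brighter)
    """
    rng = np.random.default_rng() if rng is None else rng
    n, d = pos.shape
    new_pos = pos.copy()
    for i in range(n):
        # Euclidean distances from firefly i to all fireflies
        dist = np.linalg.norm(pos - pos[i], axis=1)
        # indices of the k nearest neighbors, excluding i itself
        neighbors = np.argsort(dist)[1:k + 1]
        for j in neighbors:
            if fitness[j] < fitness[i]:  # neighbor j is brighter
                # attractiveness decays with squared distance
                beta = beta0 * np.exp(-gamma * dist[j] ** 2)
                new_pos[i] += beta * (pos[j] - pos[i]) \
                    + alpha * (rng.random(d) - 0.5)
    return new_pos
```

Note that the brightest firefly has no brighter neighbor and therefore does not move in this sketch; in practice a small random walk is often added for the global best as well.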

Keywords

Convolutional neural network · Firefly optimization algorithm · Evolutionary optimization · Firefly attraction model · Hybrid algorithms


Copyright information

© Springer Nature Singapore Pte Ltd. 2020

Authors and Affiliations

  1. Department of Computer Sciences, Faculty of Computers and Information, El-Mansoura University, Mansoura, Egypt
  2. Deanship of Scientific Research, Umm Al-Qura University, Mecca, Saudi Arabia
