
Self-adaptive Parameters Optimization for Incremental Classification in Big Data Using Swarm Intelligence

  • Saad M. Darwish
  • Akmal I. Saber
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 1153)

Abstract

Nowadays, big data is one of the most significant technical challenges confronting researchers and companies. The main challenge lies in the fact that big data sources usually arrive as continuous data streams. Consequently, much previous research has presented incremental data mining approaches that address the challenges of data streams by adapting traditional machine learning algorithms. The Artificial Neural Network (ANN) is a common technique in this field, and the key difficulty is how to optimize its parameters when huge amounts of data arrive over time. These parameters, which are vital to the performance of a neural network, are called hyperparameters. Most earlier optimization approaches have dealt with static big data repositories rather than big data streams, or have handled big data streams at a high computational cost. This paper proposes an incremental learning process for ANN hyperparameter optimization over data streams that utilizes the Grasshopper Optimisation Algorithm (GOA), a swarm intelligence technique. GOA balances exploration and exploitation to find the set of ANN hyperparameters best suited to the data stream. The experimental results illustrate that the proposed optimization model yields better accuracy with reasonable CPU time.
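
The paper's full method is not reproduced on this page, but the core idea admits a compact sketch: treat each grasshopper as a candidate hyperparameter vector, score it by the accuracy of a network trained incrementally on arriving chunks, and let GOA's shrinking coefficient c shift the swarm from exploration to exploitation. The Python sketch below is a minimal illustration under assumed settings (two hyperparameters, a synthetic stream, a small swarm, and scikit-learn's partial_fit for incremental training); it is not the authors' implementation, and all bounds and constants are illustrative.

```python
# A minimal sketch, not the authors' exact method: the Grasshopper
# Optimisation Algorithm (GOA, Saremi et al. 2017) searching two ANN
# hyperparameters (learning rate, hidden-layer width) for a classifier
# trained incrementally with scikit-learn's partial_fit.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic "stream": earlier samples arrive in chunks for incremental
# training; later samples act as a held-out window for scoring fitness.
X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_fit, y_fit, X_val, y_val = X[:1000], y[:1000], X[1000:], y[1000:]
classes = np.unique(y)

lb = np.array([1e-4, 4.0])   # lower bounds: learning rate, hidden units
ub = np.array([1e-1, 64.0])  # upper bounds (illustrative assumptions)

def fitness(pos):
    """Held-out accuracy of an MLP trained chunk by chunk (incrementally)."""
    lr, hidden = pos[0], int(round(pos[1]))
    clf = MLPClassifier(hidden_layer_sizes=(hidden,), learning_rate_init=lr,
                        random_state=0)
    for idx in np.array_split(np.arange(len(X_fit)), 5):  # arriving chunks
        clf.partial_fit(X_fit[idx], y_fit[idx], classes=classes)
    return clf.score(X_val, y_val)

def s(r, f=0.5, l=1.5):
    """Social force between grasshoppers: attraction at long range,
    repulsion at short range."""
    return f * np.exp(-r / l) - np.exp(-r)

n_grasshoppers, n_iters, dim = 8, 10, 2
pos = lb + rng.random((n_grasshoppers, dim)) * (ub - lb)
fit = np.array([fitness(p) for p in pos])
best, best_fit = pos[fit.argmax()].copy(), fit.max()

for t in range(n_iters):
    # c shrinks each iteration: large c favours exploration, small c
    # pulls the swarm toward the best-known target (exploitation).
    c = 1.0 - t * (1.0 - 1e-4) / n_iters
    new_pos = np.empty_like(pos)
    for i in range(n_grasshoppers):
        social = np.zeros(dim)
        for j in range(n_grasshoppers):
            if i == j:
                continue
            d = np.linalg.norm(pos[j] - pos[i]) + 1e-12
            # Fold the raw distance into [2, 4) before applying s(), as in
            # the GOA reference implementation, so forces stay non-trivial.
            social += c * (ub - lb) / 2 * s(2.0 + d % 2.0) * (pos[j] - pos[i]) / d
        new_pos[i] = np.clip(c * social + best, lb, ub)
    pos = new_pos
    fit = np.array([fitness(p) for p in pos])
    if fit.max() > best_fit:
        best, best_fit = pos[fit.argmax()].copy(), fit.max()

print(f"best learning rate = {best[0]:.4f}, "
      f"hidden units = {int(round(best[1]))}, "
      f"held-out accuracy = {best_fit:.3f}")
```

The linearly decaying coefficient c is what the abstract describes as balancing exploration and exploitation: in early iterations the inter-grasshopper forces dominate and the swarm roams the hyperparameter space, while in later iterations the swarm contracts around the best hyperparameter vector found so far.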

Keywords

Big data · Incremental classification · Hyperparameters optimization · Grasshopper algorithm

Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. Department of Information Technology, Institute of Graduate Studies and Research, Alexandria University, Alexandria, Egypt
