MLP-LOA: a metaheuristic approach to design an optimal multilayer perceptron
Designing an artificial neural network (ANN) is a complex task, as its performance depends heavily on both the network architecture and the training algorithm used to select proper synaptic weights and biases. Choosing an optimal design leads to greater accuracy when the ANN is used for classification. In this paper, we propose an approach, multilayer perceptron-lion optimization algorithm (MLP-LOA), that uses the lion optimization algorithm (LOA) to find an optimal multilayer perceptron (MLP) architecture for a given classification problem. MLP-LOA uses back-propagation (BP) to train candidate networks during the optimization process, and it also optimizes the learning rate and momentum, as both play a significant role when training an MLP with BP. LOA is a population-based metaheuristic algorithm inspired by the lifestyle of lions and their cooperative behavior. Unlike many other metaheuristics, LOA employs several distinct search strategies, performs a strong local search, and helps the population escape from poor solutions. A new fitness function is proposed that evaluates an MLP on both its generalization ability and its network complexity; penalizing complexity discourages dense architectures, which increase the risk of overfitting. The proposed approach is tested on classification problems selected from the University of California Irvine (UCI) repository and compared with existing state-of-the-art techniques in terms of test-phase accuracy. Experimental results show that MLP-LOA outperforms the existing state-of-the-art techniques.
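The fitness function described above combines generalization ability with a complexity penalty. The paper does not give its exact form here, so the sketch below is a hypothetical illustration: a candidate architecture is scored by a weighted sum of its validation error and its normalized parameter count, with lower scores being better. The weights `alpha`, `beta`, the normalization cap `max_params`, and the function names are all assumptions for illustration, not the authors' definitions.

```python
def param_count(layer_sizes):
    """Number of trainable parameters (weights + biases) in a fully
    connected MLP whose layer widths are given in order."""
    return sum((n_in + 1) * n_out
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

def fitness(val_error, layer_sizes, alpha=0.9, beta=0.1, max_params=10_000):
    """Illustrative fitness: lower is better. Blends validation error
    (generalization) with normalized complexity so that, at equal
    accuracy, sparser architectures win. alpha/beta are assumed weights."""
    complexity = min(param_count(layer_sizes) / max_params, 1.0)
    return alpha * val_error + beta * complexity

# At the same validation error, the denser network scores worse,
# steering the search away from overfitting-prone architectures.
sparse = fitness(0.10, [4, 8, 3])
dense = fitness(0.10, [4, 64, 64, 3])
```

In an LOA-style loop, each lion would encode hidden-layer sizes plus the learning rate and momentum, be trained briefly with BP, and then be ranked by a score of this kind.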
Keywords: Multilayer perceptron · Lion optimization algorithm · Classification · Back-propagation
Compliance with ethical standards
Conflict of interest
Priti Bansal, Shakshi Gupta, Sumit Kumar, Shubham Sharma and Shreshth Sharma declare that they have no conflict of interest.
This article does not contain any studies with human participants or animals performed by any of the authors.