Empirical Assessment of Human Learning Principles Inspired PSO Algorithms on Continuous Black-Box Optimization Testbed
This paper benchmarks one of the recent research directions in improving the particle swarm optimization (PSO) algorithm: PSO variants inspired by human learning principles. It discusses and compares the performance of nine such PSO variants. The Comparing Continuous Optimizers (COCO) methodology is adopted to compare these variants on the noiseless BBOB testbed, providing useful insight into their relative efficiency and effectiveness. The study offers the research community a comprehensive account of the suitability of each PSO variant for solving selected classes of problems under different budget settings. In addition, certain rectifications and extensions are suggested for the selected PSO variants for possible performance enhancement. Overall, SL-PSO and MePSO are found to be most suited to expensive and moderate budget settings, respectively, while iSRPSO and TPLPSO provide better solutions under cheap budget settings, with iSRPSO showing robust behaviour (better solutions across dimensions). We hope this paper serves as a milestone in assessing human learning principles inspired PSO algorithms and as a baseline for future performance comparisons.
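As context for the variants compared above, the following is a minimal sketch of the canonical PSO update (Kennedy and Eberhart, 1995) on the sphere function, one of the simplest functions in the BBOB family. The function name `pso_sphere`, the parameter values (inertia weight `w`, acceleration coefficients `c1`, `c2`), and the budget are illustrative choices of ours, not settings used by any of the surveyed human-learning-inspired variants.

```python
import random

def pso_sphere(dim=5, n_particles=20, iters=200, seed=0):
    """Minimal canonical PSO minimizing the sphere function f(x) = sum(x_i^2).

    Illustrative sketch only; parameter values are common textbook choices,
    not those of SL-PSO, MePSO, iSRPSO, or TPLPSO.
    """
    rng = random.Random(seed)
    w, c1, c2 = 0.729, 1.494, 1.494     # common inertia / acceleration settings
    f = lambda x: sum(v * v for v in x)  # sphere objective

    # Random initial positions in [-5, 5]^dim, zero initial velocities.
    X = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    V = [[0.0] * dim for _ in range(n_particles)]

    # Personal bests start at the initial positions.
    P = [x[:] for x in X]
    pbest = [f(x) for x in X]
    g = min(range(n_particles), key=lambda i: pbest[i])
    gbest, gx = pbest[g], P[g][:]        # global best value and position

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                # Velocity update: inertia + cognitive + social components.
                V[i][d] = (w * V[i][d]
                           + c1 * r1 * (P[i][d] - X[i][d])
                           + c2 * r2 * (gx[d] - X[i][d]))
                X[i][d] += V[i][d]
            fx = f(X[i])
            if fx < pbest[i]:
                pbest[i], P[i] = fx, X[i][:]
                if fx < gbest:
                    gbest, gx = fx, X[i][:]
    return gbest
```

The human-learning-inspired variants benchmarked in this paper modify pieces of this loop, e.g. how the exemplar replacing `gx` is chosen or how the coefficients adapt per particle, rather than the overall structure.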
Keywords: PSO · Human learning principles inspired PSO variants · COCO methodology · Black-box optimization
The authors wish to thank ATMRI:2014-R8, Singapore, for providing financial support for this study.