Parallel Particle Swarm Optimization Using Message Passing Interface

  • Guang-Wei Zhang
  • Zhi-Hui Zhan
  • Ke-Jing Du
  • Ying Lin
  • Wei-Neng Chen
  • Jing-Jing Li
  • Jun Zhang
Part of the Proceedings in Adaptation, Learning and Optimization book series (PALO, volume 1)

Abstract

Parallel computation is an efficient way to combine the advantages of different computation paradigms and obtain promising solutions. To analyze how parallel computation techniques perform when applied to the particle swarm optimization (PSO) algorithm, a parallel particle swarm optimization (PPSO) is proposed in this paper. According to the "no free lunch" theorem, no single optimization algorithm can tackle all problems perfectly. The PPSO therefore provides a paradigm for combining different PSO variants through the Message Passing Interface (MPI), so that the advantages of diverse PSO algorithms can be exploited. The PPSO divides the whole evolution process into several stages. At the interval between two successive stages, the PSO variants exchange the achievements of their evolution and then continue with the next stage. By merging the global model PSO (GPSO), the local model PSO (LPSO), the bare bone PSO (BPSO), and the comprehensive learning PSO (CLPSO), the PPSO achieves higher solution quality than the serial versions of these four PSO algorithms, according to simulation results on benchmark functions.
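
To make the stage-wise exchange concrete, the following C/MPI sketch shows one way such a scheme could be organized: each process runs one PSO variant, and at every stage boundary the globally best solution is located with a MINLOC reduction and broadcast to all swarms. This is only an illustrative sketch, not code from the paper; the routines run_pso_stage and inject_solution and the constants DIM and STAGES are hypothetical placeholders.

    /* Illustrative sketch (not from the paper): stage-wise PPSO exchange with MPI.
     * Each MPI rank runs one PSO variant; run_pso_stage() and inject_solution()
     * are hypothetical stubs standing in for the variant-specific routines. */
    #include <mpi.h>
    #include <stdio.h>

    #define DIM    30   /* problem dimensionality (assumed) */
    #define STAGES 10   /* number of evolution stages (assumed) */

    /* Hypothetical stub: run one stage of the chosen variant (0=GPSO, 1=LPSO,
     * 2=BPSO, 3=CLPSO), write its best position into best[], return its fitness. */
    static double run_pso_stage(int variant, double best[DIM]) {
        for (int d = 0; d < DIM; ++d) best[d] = 0.0;
        return (double)variant;                  /* dummy fitness value */
    }

    /* Hypothetical stub: insert an external solution into the local swarm,
     * e.g. by replacing the worst particle. */
    static void inject_solution(int variant, const double sol[DIM]) {
        (void)variant; (void)sol;
    }

    int main(int argc, char **argv) {
        int rank, size;
        MPI_Init(&argc, &argv);
        MPI_Comm_rank(MPI_COMM_WORLD, &rank);
        MPI_Comm_size(MPI_COMM_WORLD, &size);

        int variant = rank % 4;                  /* one PSO variant per process */
        double best[DIM];

        for (int stage = 0; stage < STAGES; ++stage) {
            /* 1. Each variant evolves independently for one stage. */
            double fit = run_pso_stage(variant, best);

            /* 2. Find which rank holds the globally best (minimal) fitness. */
            struct { double value; int rank; } local = { fit, rank }, global;
            MPI_Allreduce(&local, &global, 1, MPI_DOUBLE_INT, MPI_MINLOC,
                          MPI_COMM_WORLD);

            /* 3. Share that solution with every swarm before the next stage. */
            MPI_Bcast(best, DIM, MPI_DOUBLE, global.rank, MPI_COMM_WORLD);
            if (rank != global.rank) inject_solution(variant, best);

            if (rank == 0)
                printf("stage %d: best fitness %g from rank %d of %d\n",
                       stage, global.value, global.rank, size);
        }

        MPI_Finalize();
        return 0;
    }

The collectives MPI_Allreduce (with MPI_MINLOC) and MPI_Bcast are used here because they express the "exchange, then continue" pattern with a single synchronization point per stage; the exact exchange mechanism in the paper may differ.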

Keywords

Parallel particle swarm optimization (PPSO) · Evolutionary algorithm · Evolution stage · Message Passing Interface (MPI)

Copyright information

© Springer International Publishing Switzerland 2015

Authors and Affiliations

  • Guang-Wei Zhang 1, 2, 3
  • Zhi-Hui Zhan 1, 2, 3
  • Ke-Jing Du 4
  • Ying Lin 2, 3, 5
  • Wei-Neng Chen 2, 3, 4
  • Jing-Jing Li 6
  • Jun Zhang 1, 2, 3, 4
  1. Department of Computer Science, Sun Yat-sen University, Guangzhou, China
  2. Key Lab. Machine Intelligence and Advanced Computing, Ministry of Education, Beijing, China
  3. Engineering Research Center of Supercomputing Engineering Software, MOE, Beijing, China
  4. School of Advanced Computing, Sun Yat-sen University, Guangzhou, China
  5. Department of Psychology, Sun Yat-sen University, Guangzhou, China
  6. School of Computer Science, South China Normal University, Guangzhou, China
