Abstract
Most global optimization algorithms offer a trade-off: they solve one class of problems better at the price of solving another class worse. This is to be expected in light of theoretical results such as the No Free Lunch theorem. It is therefore desirable to have an automatic method for constructing algorithms tuned to specific problems and classes of problems. We offer a highly tunable variant of the Fully-Informed Particle Swarm Optimization algorithm. We show how to use meta-optimization to optimize its neighbourhood space and influence function, adapting it to various test problems. The optimized neighbourhood configurations and influence functions also give insight into what it takes for a Particle Swarm Optimization algorithm to solve a problem successfully. These configurations are often contrary to what one would design by intuition, which means that meta-optimization can serve here as a tool for scientific exploration as well as for practical utility.
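To make the setting concrete, the following is a minimal sketch of a fully-informed particle swarm step in the spirit of Mendes et al. (2004). It is not the chapter's tuned variant: the uniform weighting over all particles below is exactly the fixed influence scheme that the chapter's meta-optimized neighbourhood distances and influence functions would replace, and the function name, bounds, and constants are illustrative assumptions (the constriction values χ ≈ 0.7298, φ = 4.1 are the standard ones from Clerc and Kennedy).

```python
import numpy as np

rng = np.random.default_rng(0)

def sphere(x):
    """Illustrative test objective: minimum 0 at the origin."""
    return float(np.sum(x ** 2))

def fips_minimize(f, dim=5, n=20, iters=200, chi=0.7298, phi=4.1):
    """Sketch of fully-informed PSO with a global topology.

    Every particle is informed by the personal bests of all particles,
    each weighted equally; a tuned variant would restrict or reweight
    this set via an optimized neighbourhood/influence function.
    """
    x = rng.uniform(-5.0, 5.0, (n, dim))   # positions
    v = np.zeros((n, dim))                 # velocities
    pbest = x.copy()                       # personal best positions
    pval = np.array([f(p) for p in pbest]) # personal best values
    for _ in range(iters):
        for i in range(n):
            acc = np.zeros(dim)
            for k in range(n):
                # Each informant contributes a random coefficient
                # drawn from U(0, phi / |neighbourhood|).
                phi_k = rng.uniform(0.0, phi / n, dim)
                acc += phi_k * (pbest[k] - x[i])
            v[i] = chi * (v[i] + acc)      # constricted velocity update
            x[i] = x[i] + v[i]
            fx = f(x[i])
            if fx < pval[i]:
                pval[i], pbest[i] = fx, x[i].copy()
    best = int(np.argmin(pval))
    return pbest[best], float(pval[best])

best_x, best_val = fips_minimize(sphere)
```

On a unimodal objective like the sphere function this fully-connected scheme converges readily; the chapter's point is that for harder problems the best-performing neighbourhood structures found by meta-optimization look quite different from such intuitive defaults.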
References
de Oca, M.A.M., Stützle, T., Birattari, M., Dorigo, M.: A comparison of particle swarm optimization algorithms based on run-length distributions. In: Dorigo, M., Gambardella, L.M., Birattari, M., Martinoli, A., Poli, R., Stützle, T. (eds.) ANTS 2006. LNCS, vol. 4150, pp. 1–12. Springer, Heidelberg (2006)
Wolpert, D., Macready, W.: No free lunch theorems for optimization. IEEE Transactions on Evolutionary Computation 1(1), 67–82 (1997)
Pedersen, M.E.H.: Good parameters for particle swarm optimization. Technical Report HL1001, Hvass Laboratories (2010)
Meissner, M., Schmuker, M., Schneider, G.: Optimized particle swarm optimization (OPSO) and its application to artificial neural network training. BMC Bioinformatics 7, 125 (2006)
Poli, R., Langdon, W.B., Holland, O.: Extending particle swarm optimisation via genetic programming. In: Keijzer, M., Tettamanzi, A.G.B., Collet, P., van Hemert, J., Tomassini, M. (eds.) EuroGP 2005. LNCS, vol. 3447, pp. 291–300. Springer, Heidelberg (2005)
Kennedy, J., Eberhart, R.C.: Particle swarm optimization. In: Proceedings of the IEEE International Conference on Neural Networks, vol. 4, pp. 1942–1948 (1995)
Clerc, M., Kennedy, J.: The particle swarm - explosion, stability, and convergence in a multidimensional complex space. IEEE Transactions on Evolutionary Computation 6(1), 58–73 (2002)
Mendes, R., Kennedy, J., Neves, J.: The fully informed particle swarm: simpler, maybe better. IEEE Transactions on Evolutionary Computation 8(3), 204–210 (2004)
Poli, R., Kennedy, J., Blackwell, T.: Particle swarm optimization. Swarm Intelligence 1(1), 33–57 (2007)
Bäck, T.: Evolutionary algorithms in theory and practice: evolution strategies, evolutionary programming, genetic algorithms. Oxford University Press, Oxford (1996)
Oldenhuis, R.: Many test functions for global optimizers (February 2009)
© 2014 Springer International Publishing Switzerland
Cite this chapter
Jančauskas, V. (2014). Optimizing Neighbourhood Distances for a Variant of Fully-Informed Particle Swarm Algorithm. In: Terrazas, G., Otero, F., Masegosa, A. (eds) Nature Inspired Cooperative Strategies for Optimization (NICSO 2013). Studies in Computational Intelligence, vol 512. Springer, Cham. https://doi.org/10.1007/978-3-319-01692-4_17
DOI: https://doi.org/10.1007/978-3-319-01692-4_17
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-01691-7
Online ISBN: 978-3-319-01692-4
eBook Packages: Engineering (R0)