A Framework for Knowledge Integrated Evolutionary Algorithms
One of the main reasons for the success of Evolutionary Algorithms (EAs) is their general-purpose nature, i.e., the fact that they can be applied in a straightforward manner to a broad range of optimization problems, without any specific prior knowledge. On the other hand, it has been shown that incorporating a priori knowledge, such as expert knowledge or empirical findings, can significantly improve the performance of an EA. However, integrating knowledge into EAs poses numerous challenges. It is often the case that the features of the search space are unknown, hence any knowledge associated with the search space properties can hardly be used. In addition, a priori knowledge is typically problem-specific and hard to generalize. In this paper, we propose a framework, called Knowledge Integrated Evolutionary Algorithm (KIEA), which facilitates the integration of existing knowledge into EAs. Notably, the KIEA framework is EA-agnostic, i.e., it works with any evolutionary algorithm; problem-independent, i.e., it is not dedicated to a specific type of problem; and expandable, i.e., its knowledge base can grow over time. Furthermore, the framework integrates knowledge while the EA is running, thus optimizing the consumption of computational power. In the preliminary experiments shown here, we observe that the KIEA framework yields, in the worst case, an 80% improvement in convergence time with respect to the corresponding "knowledge-free" EA counterpart.
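To make the central idea concrete, the following is a minimal sketch (not the authors' actual implementation) of how knowledge can be injected into a running EA through an EA-agnostic hook: the optimizer accepts an optional `inject` callback that may contribute knowledge-derived candidate solutions each generation, leaving the underlying evolutionary loop untouched. The EA itself, the sphere benchmark, and the `prior` hint are all illustrative assumptions.

```python
import random

def evolve(fitness, dim, pop_size=20, generations=100, inject=None, seed=42):
    """Minimal elitist (mu+lambda)-style EA minimizing `fitness`.

    `inject`, if provided, is called once per generation and may return a
    list of knowledge-derived candidate solutions, which are merged into
    the population before selection. This keeps the knowledge-integration
    mechanism decoupled from the evolutionary loop (EA-agnostic).
    """
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    for gen in range(generations):
        if inject is not None:
            pop.extend(inject(gen))  # knowledge integration hook
        # Gaussian mutation: one child per individual
        offspring = [[x + rng.gauss(0, 0.3) for x in parent] for parent in pop]
        # Elitist truncation selection over parents + offspring
        pop = sorted(pop + offspring, key=fitness)[:pop_size]
    return min(pop, key=fitness)

# Example: sphere function; the "knowledge" here is a hypothetical a priori
# hint about the region of the optimum, injected in the first generation.
def sphere(x):
    return sum(v * v for v in x)

def prior(gen):
    return [[0.5, -0.5]] if gen == 0 else []  # hypothetical expert hint

best = evolve(sphere, dim=2, inject=prior)
```

Because selection is elitist, an injected solution that is better than the current population immediately lowers the best-so-far fitness, which is the intuition behind the convergence-time improvements reported in the abstract.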
Keywords: Evolutionary algorithms · Knowledge incorporation · Landscape analysis · Evolutionary algorithm fingerprint
This project has received funding from the European Union's Horizon 2020 research and innovation program under grant agreement No. 665347. We also gratefully acknowledge the computational resources provided by the RWTH Compute Cluster of RWTH Aachen University under project RWTH0118.