A Generalized Approach to Construct Benchmark Problems for Dynamic Optimization

  • Changhe Li
  • Shengxiang Yang
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5361)

Abstract

There has been a growing interest in studying evolutionary algorithms in dynamic environments in recent years due to their importance in real-world applications. However, different dynamic test problems have been used in different studies to test and compare the performance of algorithms. This paper proposes a generalized dynamic benchmark generator (GDBG) that can be instantiated in the binary space, the real space, and the combinatorial space. By tuning a set of control parameters, the generator can present different dynamic properties against which algorithms can be tested. Experiments are carried out in the real space to study the performance of the generator.
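To make the idea of a parameter-controlled dynamic benchmark concrete, the sketch below shows a toy dynamic fitness landscape in the real space whose peaks drift at each environmental change. It is only an illustration of the general concept stated in the abstract, not the GDBG construction from the paper: the class name, the peak-based landscape, and the severity parameter are assumptions made for this example.

    import numpy as np

    # Illustrative sketch only: a peak-based dynamic landscape in the real space.
    # The update rule and parameter names (num_peaks, severity, ...) are assumed
    # for this example and are NOT the GDBG definitions from the paper.
    class DynamicPeaksBenchmark:
        def __init__(self, dim=2, num_peaks=5, severity=0.5, seed=0):
            self.rng = np.random.default_rng(seed)
            self.severity = severity  # control parameter: magnitude of each change
            self.positions = self.rng.uniform(-5.0, 5.0, (num_peaks, dim))
            self.heights = self.rng.uniform(10.0, 100.0, num_peaks)

        def evaluate(self, x):
            """Fitness at x = value of the highest peak seen from x."""
            dists = np.linalg.norm(self.positions - x, axis=1)
            return float(np.max(self.heights - dists))

        def change(self):
            """Apply one environmental change of the chosen severity."""
            self.positions += self.severity * self.rng.standard_normal(self.positions.shape)
            self.heights += self.severity * self.rng.standard_normal(self.heights.shape)

    # Usage: evaluate a candidate solution, trigger a change, and re-evaluate.
    bench = DynamicPeaksBenchmark()
    x = np.zeros(2)
    print(bench.evaluate(x))
    bench.change()
    print(bench.evaluate(x))

In the same spirit, one could keep the same control parameters (change severity, change frequency) and replace the update step with binary-space or combinatorial-space changes, which is the kind of generalization the abstract describes.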

Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Changhe Li (1)
  • Shengxiang Yang (1)
  1. Department of Computer Science, University of Leicester, Leicester, UK