Constrained Optimisation with the Fuzzy Clustering Evolution Strategy

Conference paper


In previously published work [1], [2], [3] we have demonstrated that, by using a variant of the fuzzy clustering algorithm to provide multi-parent selection within the well-known Evolution Strategy paradigm, the Fuzzy Clustering Evolution Strategy (FCES) can help to preserve population diversity and maintain high convergence rates in fitness landscapes with “difficult” topological characteristics.
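To make the idea of clustering-based multi-parent selection concrete, the following Python sketch shows one way such a scheme can be organised: fuzzy c-means memberships group the parent population into overlapping clusters, and each offspring is recombined from several parents drawn from a single cluster. This is a minimal illustration under stated assumptions, not the FCES implementation of [1]-[3]; the function names, cluster count and recombination arity (rho) are hypothetical choices.

import numpy as np

def fuzzy_c_means(X, n_clusters, m=2.0, n_iter=50, seed=0):
    # Plain fuzzy c-means: returns a membership matrix U of shape
    # (n_points, n_clusters) whose rows sum to one.
    rng = np.random.default_rng(seed)
    U = rng.random((len(X), n_clusters))
    U /= U.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        centres = (U ** m).T @ X / (U ** m).sum(axis=0)[:, None]
        dist = np.linalg.norm(X[:, None, :] - centres[None, :, :], axis=2) + 1e-12
        U = 1.0 / dist ** (2.0 / (m - 1.0))
        U /= U.sum(axis=1, keepdims=True)
    return U

def clustered_multiparent_offspring(parents, n_offspring, n_clusters=3, rho=3, seed=0):
    # Each offspring is the intermediate recombination of rho parents drawn
    # from one fuzzy cluster, using memberships as sampling weights; keeping
    # recombination "local" in this way tends to preserve separate niches.
    rng = np.random.default_rng(seed)
    U = fuzzy_c_means(parents, n_clusters, seed=seed)
    offspring = []
    for _ in range(n_offspring):
        c = rng.integers(n_clusters)                  # pick one cluster
        weights = U[:, c] / U[:, c].sum()             # membership-based weights
        idx = rng.choice(len(parents), size=rho, p=weights)
        offspring.append(parents[idx].mean(axis=0))   # intermediate recombination
    return np.array(offspring)

In a (μ, λ) loop, the offspring produced this way would then be mutated and the best μ of the λ children selected as the next parent population.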

This paper describes an adaptation of the FCES to handle constrained optimisation problems. The constraint-handling method employed here is a version of the Behavioral Memory algorithm [4], chosen for its minimal requirements for problem-dependent a priori knowledge and its simplicity of implementation. An important requirement of this method is a search algorithm that enables a thorough exploration of the entire feasible region, even when this is not a connected domain. Schoenauer and Xanthakis [4] used a fitness-sharing algorithm to achieve this; here we demonstrate that our fuzzy-clustering approach is capable of promoting the required search characteristics without adversely affecting the rapid convergence for which the (μ, λ)-Evolution Strategy is renowned.
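As a rough illustration of the staged, Behavioral Memory style of constraint handling described above, the sketch below evolves the population against one constraint at a time, eliminating individuals that violate constraints handled in earlier stages, before a final stage optimises the true objective over the feasible region. The helper names and the evolve callback are hypothetical; this is a sketch of the general scheme, not the implementation of [4] or of the present paper.

def staged_fitness(x, objective, constraints, stage):
    # Fitness used during stage `stage` (0 .. len(constraints)).
    # Constraints are expressed as g(x) <= 0.
    for g in constraints[:stage]:
        if g(x) > 0:                   # violates an already-handled constraint
            return float("-inf")       # effectively removed from selection
    if stage < len(constraints):
        return -max(constraints[stage](x), 0.0)   # reward satisfying the current constraint
    return objective(x)                # final stage: the real objective

def behavioural_memory_run(evolve, objective, constraints, population, gens_per_stage=50):
    # One evolution stage per constraint, then a final stage on the objective.
    # evolve(population, fitness, generations) can be any (mu, lambda)-ES step,
    # e.g. one using the clustered recombination sketched earlier.
    for stage in range(len(constraints) + 1):
        fitness = lambda x, s=stage: staged_fitness(x, objective, constraints, s)
        population = evolve(population, fitness, gens_per_stage)
    return population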


Keywords: Fuzzy Cluster · Fitness Landscape · Bump Function · Constraint Surface · High Convergence Rate




References

[1] J.C.W. Sullivan and A.G. Pipe. An evolution strategy for on-line optimisation of dynamic objective functions. In Voigt et al. [19], pages 781–790.
[2] J.C.W. Sullivan and A.G. Pipe. Path planning for redundant robot manipulators: a global optimization approach using evolutionary search. In Proc. IEEE Int. Conf. on Systems, Man and Cybernetics, pages 2396–2399, San Diego, CA, 1998.
[3] J.C.W. Sullivan, B. Carse, and A.G. Pipe. A fuzzy clustering evolution strategy and its application to optimisation of robot manipulator movement. In I.C. Parmee, editor, Evolutionary Design and Manufacture, Proc. ACDM 2000, pages 187–198, Berlin, 2000. Springer-Verlag.
[4] M. Schoenauer and S. Xanthakis. Constrained GA optimization. In S. Forrest, editor, Proc. 5th International Conference on Genetic Algorithms, pages 573–580, San Mateo, CA, 1993. Morgan Kaufmann.
[5] A.A. Törn and A. Zilinskas. Global Optimization. Number 350 in Lecture Notes in Computer Science. Springer-Verlag, Berlin, 1989.
[6] J.C. Bezdek. Fuzzy Mathematics in Pattern Classification. PhD thesis, Cornell University, 1973.
[7] A.E. Eiben, P.-E. Raué, and Zs. Ruttkay. Genetic algorithms with multi-parent recombination. In Y. Davidor, H.-P. Schwefel, and R. Männer, editors, Parallel Problem Solving from Nature — PPSN III, pages 78–87, Jerusalem, 1994. Springer, Berlin.
[8] H.-G. Beyer. Towards a theory of evolution strategies: On the benefits of sex — the (μ/μ, λ) theory. Evolutionary Computation, 3(1):81–111, 1995.
[9] A.J. Keane. Experiences with optimizers in structural design. In I.C. Parmee, editor, Proceedings of Adaptive Computing in Engineering Design and Control ACEDC '94, pages 14–27, Plymouth, UK, 1994.
[10] Z. Michalewicz and M. Schoenauer. Evolutionary algorithms for constrained parameter optimisation problems. Evolutionary Computation, 4(1):1–32, 1996.
[11] A.J. Keane. A brief comparison of some evolutionary optimization methods. In V. Rayward-Smith, I. Osman, C. Reeves, and G.D. Smith, editors, Modern Heuristic Search Methods, pages 255–272. Wiley, 1996.
[12] M. Schoenauer and Z. Michalewicz. Evolutionary computation at the edge of feasibility. In Voigt et al. [19], pages 245–254.
[13] X. Yin and N. Germay. A fast genetic algorithm with sharing scheme using cluster analysis methods in multimodal function optimization. In Proc. Int. Conf. Artificial Neural Networks and Genetic Algorithms, pages 450–457, Berlin, 1993. Springer-Verlag.
[14] A.J. Keane. A hard (?) problem. Genetic Algorithms Digest, 8(6), May 1994.
[15] Z. Michalewicz. Genetic Algorithms + Data Structures = Evolution Programs. Springer, Berlin, 3rd edition, 1996.
[16] M.R. Ghasemi, E. Hinton, and S. Bulman. Performance of genetic algorithms for optimization of frame structures. In I.C. Parmee, editor, Proceedings of Adaptive Computing in Design and Manufacture ACDM '98, pages 287–299, Plymouth, UK, 1998.
[17] A.V. Fiacco and G.P. McCormick. The sequential unconstrained minimization technique for nonlinear programming, a primal-dual method. Management Sci., 10(2):360–366, 1964.
[18] A. Ostermeier, A. Gawelczyk, and N. Hansen. A derandomized approach to self-adaptation of evolution strategies. Evolutionary Computation, 2(4):369–380, 1995.
[19] H.-M. Voigt, W. Ebeling, I. Rechenberg, and H.-P. Schwefel, editors. Parallel Problem Solving from Nature — PPSN IV, volume 1141 of Lecture Notes in Computer Science, Berlin, September 22–26, 1996. Springer, Berlin.

Copyright information

© Springer-Verlag London 2002

Authors and Affiliations

1. Faculty of Computing, Engineering and Mathematical Sciences, University of the West of England, Bristol, UK
