Discussion of Search Strategy for Multi-objective Genetic Algorithm with Consideration of Accuracy and Broadness of Pareto Optimal Solutions

  • Tomoyuki Hiroyasu
  • Masashi Nishioka
  • Mitsunori Miki
  • Hisatake Yokouchi
Part of the Lecture Notes in Computer Science book series (LNCS, volume 5361)

Abstract

In multi-objective optimization, it is important that the obtained solutions are of high quality with respect to accuracy, uniform distribution, and broadness. Of these qualities, we focused on the accuracy and broadness of the solutions and proposed a corresponding search strategy. Because it is difficult to improve both the convergence and the broadness of the solutions at the same time in a multi-objective GA search, the proposed strategy divides the search into two stages: the solutions are first converged and then broadened. The first stage improves the convergence of the solutions, using a reference point specified by a decision maker to guide the search. In the second stage, the solutions are broadened using the Distributed Cooperation Scheme. The results of the numerical experiments show that the proposed search strategy derives broader solutions than a conventional multi-objective GA while maintaining equivalent accuracy.
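
The converge-then-broaden ordering can be illustrated concretely. The sketch below is a minimal, hypothetical Python rendering of the two-stage idea, assuming toy two-objective functions, Gaussian mutation, and a simplified stand-in for the Distributed Cooperation Scheme (two cooperating single-objective subpopulations); the names, operators, and parameters are illustrative assumptions, not the authors' implementation.

import random

# Hypothetical illustration of the two-stage strategy described in the
# abstract; objectives, operators, and parameters are assumptions only.

def evaluate(x):
    # Toy two-objective minimization problem.
    f1 = sum(xi ** 2 for xi in x)
    f2 = sum((xi - 2.0) ** 2 for xi in x)
    return (f1, f2)

def distance_to_reference(objs, ref):
    # Stage 1 ranks individuals by closeness to the decision maker's
    # reference point in objective space.
    return sum((o - r) ** 2 for o, r in zip(objs, ref)) ** 0.5

def mutate(x, sigma=0.1):
    return [xi + random.gauss(0.0, sigma) for xi in x]

def stage1_converge(pop, ref, generations=50):
    # First stage: pull the population toward the reference point.
    for _ in range(generations):
        offspring = [mutate(random.choice(pop)) for _ in pop]
        merged = pop + offspring
        merged.sort(key=lambda x: distance_to_reference(evaluate(x), ref))
        pop = merged[: len(pop)]
    return pop

def stage2_broaden(pop, generations=50):
    # Second stage: broaden the converged set. Here each "island"
    # optimizes a single objective, a crude stand-in for the cooperation
    # between subpopulations in the Distributed Cooperation Scheme.
    islands = [list(pop), list(pop)]
    for _ in range(generations):
        for k, island in enumerate(islands):
            offspring = [mutate(random.choice(island)) for _ in island]
            merged = island + offspring
            merged.sort(key=lambda x: evaluate(x)[k])
            islands[k] = merged[: len(island)]
    return islands[0] + islands[1]

if __name__ == "__main__":
    random.seed(0)
    population = [[random.uniform(-5, 5) for _ in range(3)] for _ in range(20)]
    reference = (1.0, 1.0)  # supplied by the decision maker
    population = stage1_converge(population, reference)
    final = stage2_broaden(population)
    print(min(evaluate(x)[0] for x in final), min(evaluate(x)[1] for x in final))

In the actual method, the first stage would run a full multi-objective GA biased toward the decision maker's reference point and the second stage would apply the Distributed Cooperation Scheme; the sketch only mirrors the ordering of the two stages.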

Keywords

Pareto front · Knapsack problem · Pareto optimal solution · Pareto optimal front · Multiobjective evolutionary algorithm

Copyright information

© Springer-Verlag Berlin Heidelberg 2008

Authors and Affiliations

  • Tomoyuki Hiroyasu (1)
  • Masashi Nishioka (2)
  • Mitsunori Miki (3)
  • Hisatake Yokouchi (1)
  1. Faculty of Life and Medical Sciences, Doshisha University, Kyoto, Japan
  2. Graduate School of Engineering, Doshisha University, Japan
  3. Faculty of Science and Engineering, Doshisha University, Japan