Nonmonotone line searches for unconstrained multiobjective optimization problems

  • Kanako Mita
  • Ellen H. Fukuda
  • Nobuo Yamashita


In the last two decades, many descent methods for multiobjective optimization problems have been proposed. In particular, the steepest descent and the Newton methods have been studied for the unconstrained case. In both methods, the search directions are computed by solving convex subproblems, and the stepsizes are obtained by an Armijo-type line search. As a consequence, the objective function values decrease at each iteration of the algorithms. In this work, we consider nonmonotone line searches, i.e., we allow the objective function values to increase in some iterations. Two well-known types of nonmonotone line searches are considered here: one that takes the maximum of recent function values, and one that takes their average. We also propose a new nonmonotone technique designed specifically for multiobjective problems. Under reasonable assumptions, we prove that every accumulation point of the sequence produced by the nonmonotone versions of the steepest descent and Newton methods is Pareto critical. Moreover, we present some numerical experiments showing that the nonmonotone technique is also efficient in the multiobjective case.
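To make the max-type idea concrete, the following is a minimal single-objective sketch of a nonmonotone Armijo backtracking search in the style of Grippo, Lampariello, and Lucidi: a step is accepted when the new function value lies below the *maximum* of the last M function values (rather than the current one) plus the usual sufficient-decrease term. This is an illustration of the general technique only, not the multiobjective algorithm of the paper; the function names and parameter values (`beta`, `sigma`, `M`) are illustrative choices.

```python
import numpy as np

def nonmonotone_armijo(f, grad_f, x, d, history, beta=0.5, sigma=1e-4):
    """Max-type nonmonotone Armijo backtracking (Grippo-Lampariello-Lucidi style).

    Accepts a stepsize t once
        f(x + t*d) <= max(recent f values) + sigma * t * grad_f(x)^T d,
    where `history` holds the last few objective values. Using the maximum of
    recent values (instead of f(x)) allows occasional increases of f.
    """
    f_ref = max(history)            # nonmonotone reference value
    g_dot_d = grad_f(x) @ d         # directional derivative (negative for descent d)
    t = 1.0
    while f(x + t * d) > f_ref + sigma * t * g_dot_d:
        t *= beta                   # backtrack
    return t

# Usage sketch: steepest descent on f(x) = ||x||^2 with a memory of M values.
f = lambda x: float(x @ x)
grad = lambda x: 2.0 * x
x = np.array([2.0, -1.0])
M = 5
history = [f(x)]
for _ in range(20):
    d = -grad(x)                    # steepest descent direction
    t = nonmonotone_armijo(f, grad, x, d, history)
    x = x + t * d
    history.append(f(x))
    history = history[-M:]          # keep only the last M function values
```

The average-type variant of Zhang and Hager replaces `max(history)` with a weighted average of past function values; the paper's new multiobjective technique modifies this reference value per objective.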


Keywords: Multiobjective optimization · Steepest descent method · Newton method · Nonmonotone line search · Pareto optimality



We would like to thank the anonymous referees for their suggestions, which improved the original version of the paper.



Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2019

Authors and Affiliations

  1. Department of Applied Mathematics and Physics, Graduate School of Informatics, Kyoto University, Kyoto, Japan
