Safe global optimization of expensive noisy black-box functions in the \(\delta \)-Lipschitz framework

Abstract

In this paper, the problem of safe global maximization (not to be confused with robust optimization) of expensive noisy black-box functions satisfying the Lipschitz condition is considered. The term "safe" means that during optimization the objective function f(x) must not violate a "safety" threshold, for instance, a certain a priori given value h in a maximization problem. Thus, any new function evaluation (possibly corrupted by noise) must be performed at "safe points" only, namely, at points y for which it is known that the objective function \(f(y) > h\). The main difficulty here is that the optimization algorithm must ensure that the safety constraint is satisfied at a point y before the evaluation of f(y) is executed. Thus, it is required both to determine the safe region \(\varOmega \) within the search domain D and to find the global maximum within \(\varOmega \). An additional difficulty is that both problems must be solved in the presence of noise. The paper starts with a theoretical study of the problem, and it is shown that, even though the objective function f(x) satisfies the Lipschitz condition, traditional Lipschitz minorants and majorants cannot be used due to the presence of noise. Then, a \(\delta \)-Lipschitz framework and two algorithms using it are proposed to solve the safe global maximization problem. The first method determines the safe area within the search domain, and the second one executes the global maximization over the found safe region. For both methods, a number of theoretical results related to their functioning and convergence are established. Finally, numerical experiments confirming the reliability of the proposed procedures are presented.
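The safety test described above can be illustrated with a minimal sketch (this is not the authors' actual procedure, and all names and parameters below are illustrative): assuming the noise magnitude is bounded by a known delta, the Lipschitz condition yields the certified lower bound \(f(y) \ge z_i - \delta - L|y - x_i|\) for any previously evaluated point \(x_i\) with observed value \(z_i\), so a candidate y is provably safe whenever this bound exceeds the threshold h for some i.

```python
def provably_safe(y, evals, L, delta, h):
    """Certify that a candidate point y is safe, i.e. f(y) > h.

    evals: list of (x_i, z_i) pairs, where z_i = f(x_i) + noise
    and the noise magnitude is assumed bounded by delta.
    Since f is L-Lipschitz, f(y) >= f(x_i) - L*|y - x_i|, and
    f(x_i) >= z_i - delta, hence for every i:
        f(y) >= z_i - delta - L*|y - x_i|.
    """
    return any(z - delta - L * abs(y - x) > h for x, z in evals)

# One evaluation at x = 0.5 with observed value 2.0:
evals = [(0.5, 2.0)]
print(provably_safe(0.7, evals, L=1.0, delta=0.1, h=1.0))  # True: bound 1.7 > 1.0
print(provably_safe(2.5, evals, L=1.0, delta=0.1, h=1.0))  # False: bound -0.1
```

Note that the certificate is one-sided: a point failing the test is not necessarily unsafe; it is merely not yet provably safe given the current evaluations.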



Notes

  1.

    Clearly, there can exist several constants \({\tilde{L}}\) such that, if any of them is placed in (1) instead of L, the inequality still holds. Without loss of generality, we suppose hereinafter that \(L>\min {\tilde{L}}\).
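For intuition only (an illustrative sketch, not part of the paper): with noise-free evaluations, any constant satisfying the Lipschitz inequality must be at least the largest divided difference observed over the sample pairs, which is why a whole half-line of constants above this minimum also satisfies the inequality.

```python
def min_lipschitz_estimate(evals):
    """Lower bound on any valid Lipschitz constant from noise-free
    samples (x_i, f(x_i)): L must be at least the largest observed
    slope |f(x_j) - f(x_i)| / |x_j - x_i| over all sample pairs."""
    return max(
        abs(z2 - z1) / abs(x2 - x1)
        for i, (x1, z1) in enumerate(evals)
        for (x2, z2) in evals[i + 1:]
    )

print(min_lipschitz_estimate([(0.0, 0.0), (1.0, 2.0), (2.0, 3.0)]))  # 2.0
```

With noisy observations this quantity can both over- and underestimate the true minimal constant, which is one reason the paper replaces classical minorants and majorants with the \(\delta \)-Lipschitz constructions.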


Author information


Corresponding author

Correspondence to Yaroslav D. Sergeyev.

Ethics declarations

Conflict of interest

The authors declare that they have no conflict of interest.

Ethical approval

This article does not contain any studies with human participants or animals performed by any of the authors.

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Communicated by Yaroslav D. Sergeyev.

Appendix

Figures a–d (not reproduced).


About this article


Cite this article

Sergeyev, Y.D., Candelieri, A., Kvasov, D.E. et al. Safe global optimization of expensive noisy black-box functions in the \(\delta \)-Lipschitz framework. Soft Comput (2020). https://doi.org/10.1007/s00500-020-05030-3


Keywords

  • Safe global optimization
  • Expensive black-box functions
  • Noise
  • Lipschitz condition
  • Machine learning