
SOP-Hybrid: A Parallel Surrogate-Based Candidate Search Algorithm for Expensive Optimization on Large Parallel Clusters

  • Taimoor Akhtar
  • Christine A. Shoemaker
Conference paper
Part of the Advances in Intelligent Systems and Computing book series (AISC, volume 991)

Abstract

Efficient parallel algorithm designs and surrogate models are powerful tools that can significantly increase the efficiency of stochastic metaheuristics applied to computationally expensive optimization problems. This paper introduces SOP-Hybrid, a synchronous parallel surrogate-based global optimization algorithm designed for computationally expensive problems. SOP-Hybrid is a modification of the Surrogate Optimization with Pareto center selection (SOP) algorithm, designed to achieve better synchronous parallel optimization efficiency when a large number of cores is available. The original SOP was built on the idea of framing the exploration-exploitation trade-off of iterative surrogate optimization as a multi-objective problem, and was experimentally effective for up to 32 processors. SOP-Hybrid modifies SOP by addressing the exploration-exploitation trade-off at two levels: (i) at the global level, as a multi-objective problem (as in SOP), and (ii) at the local level, via an acquisition function. Both SOP and SOP-Hybrid use Radial Basis Functions (RBFs) as surrogates. Results on test problems indicate that SOP-Hybrid is more efficient than SOP with 48 simultaneous synchronous evaluations. SOP was previously shown to be more efficient than Parallel Stochastic RBF and ESGRBF with 32 simultaneous synchronous evaluations.
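The surrogate-assisted candidate search described above can be illustrated with a short sketch. The following Python code is a minimal illustration under stated assumptions, not the authors' SOP-Hybrid implementation: all function names, the cubic RBF kernel choice, and parameter values (batch size, candidate count, acquisition weight) are assumptions made here. It fits an RBF surrogate to the evaluated points and scores randomly perturbed candidates with a weighted combination of predicted objective value (exploitation) and minimum distance to previously evaluated points (exploration), returning one candidate per available core for synchronous parallel evaluation.

```python
# Minimal sketch of one synchronous batch step of surrogate-assisted candidate
# search with a cubic RBF surrogate and a distance-weighted acquisition score.
# Illustrative only; names and parameters are assumptions, not from the paper.
import numpy as np

def fit_cubic_rbf(X, y):
    """Fit a cubic RBF interpolant with a linear polynomial tail to (X, y)."""
    n, d = X.shape
    dists = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
    Phi = dists ** 3                        # cubic kernel matrix
    P = np.hstack([np.ones((n, 1)), X])     # linear tail
    A = np.block([[Phi, P], [P.T, np.zeros((d + 1, d + 1))]])
    rhs = np.concatenate([y, np.zeros(d + 1)])
    coef = np.linalg.solve(A, rhs)
    return coef[:n], coef[n:]               # RBF weights, polynomial coefficients

def rbf_predict(X, lam, c, Xnew):
    """Evaluate the fitted surrogate at new points Xnew."""
    dists = np.linalg.norm(Xnew[:, None, :] - X[None, :, :], axis=2)
    return dists ** 3 @ lam + np.hstack([np.ones((len(Xnew), 1)), Xnew]) @ c

def select_batch(X, y, lb, ub, batch_size=4, n_cand=1000, weight=0.7, sigma=0.2, seed=None):
    """Pick a batch of candidates for synchronous parallel evaluation.

    Candidates are random perturbations of the current best point; each is scored by
    weight * (scaled surrogate value) + (1 - weight) * (scaled distance penalty),
    so a smaller weight favours exploration over exploitation.
    """
    rng = np.random.default_rng(seed)
    lam, c = fit_cubic_rbf(X, y)
    xbest = X[np.argmin(y)]
    cand = np.clip(xbest + sigma * (ub - lb) * rng.standard_normal((n_cand, X.shape[1])), lb, ub)
    s = rbf_predict(X, lam, c, cand)
    dmin = np.min(np.linalg.norm(cand[:, None, :] - X[None, :, :], axis=2), axis=1)
    v_score = (s - s.min()) / (np.ptp(s) + 1e-12)           # low predicted value is good
    d_score = (dmin.max() - dmin) / (np.ptp(dmin) + 1e-12)  # large distance is good
    score = weight * v_score + (1.0 - weight) * d_score
    return cand[np.argsort(score)[:batch_size]]              # one point per available core

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    f = lambda x: np.sum(x ** 2, axis=-1)                    # cheap stand-in for an expensive simulation
    lb, ub = -5.0 * np.ones(5), 5.0 * np.ones(5)
    X = rng.uniform(lb, ub, size=(20, 5))
    y = f(X)
    batch = select_batch(X, y, lb, ub, batch_size=4)
    print(batch.shape)                                       # (4, 5): evaluate these in parallel
```

In an actual synchronous parallel run, the returned batch would be evaluated simultaneously on the available cores, the new (point, value) pairs appended to (X, y), and the surrogate refit before the next batch is selected.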

Keywords

Expensive functions · Meta-models · Parallel optimization

References

  1. Deb, K.: Multi-Objective Optimization Using Evolutionary Algorithms. Wiley, New York (2001)
  2. Eriksson, D., Bindel, D., Shoemaker, C.: Surrogate optimization toolbox (pySOT). https://github.com/dme65/pySOT (2015)
  3. Hansen, N., Finck, S., Ros, R., Auger, A.: Real-parameter black-box optimization benchmarking 2009: noiseless functions definitions. Technical Report RR-6829, INRIA (2009)
  4. Hutter, F., Hoos, H.H., Leyton-Brown, K.: Sequential model-based optimization for general algorithm configuration. In: Proceedings of the 5th International Conference on Learning and Intelligent Optimization, pp. 507–523. Springer-Verlag, Heidelberg (2011)
  5. Ilievski, I., Akhtar, T., Feng, J., Shoemaker, C.: Efficient hyperparameter optimization for deep learning algorithms using deterministic RBF surrogates. In: AAAI Conference on Artificial Intelligence (2017)
  6. Jones, D.R., Schonlau, M., Welch, W.J.: Efficient global optimization of expensive black-box functions. J. Global Optim. 13(4), 455–492 (1998)
  7. Krityakierne, T., Akhtar, T., Shoemaker, C.A.: SOP: parallel surrogate global optimization with Pareto center selection for computationally expensive single objective problems. J. Global Optim. 66(3), 417–437 (2016)
  8. Pintér, J.D.: Global Optimization in Action. Springer, New York (1996)
  9. Regis, R.G., Shoemaker, C.A.: A stochastic radial basis function method for the global optimization of expensive functions. INFORMS J. Comput. 19(4), 497–509 (2007)
  10. Regis, R.G., Shoemaker, C.A.: Parallel stochastic global optimization using radial basis functions. INFORMS J. Comput. 21(3), 411–426 (2009)
  11. Regis, R.G., Shoemaker, C.A.: Combining radial basis function surrogates and dynamic coordinate search in high-dimensional expensive black-box optimization. Eng. Optim. 45(5), 529–555 (2013)
  12. Sergeyev, Y.D., Kvasov, D.E., Mukhametzhanov, M.S.: On the efficiency of nature-inspired metaheuristics in expensive global optimization with limited budget. Sci. Rep. 8, 453 (2018)
  13. Sergeyev, Y.D., Kvasov, D.E.: Deterministic Global Optimization: An Introduction to the Diagonal Approach, 1st edn. Springer, New York (2017)
  14. Snoek, J., Larochelle, H., Adams, R.P.: Practical Bayesian optimization of machine learning algorithms. In: Pereira, F., Burges, C.J.C., Bottou, L., Weinberger, K.Q. (eds.) Advances in Neural Information Processing Systems, vol. 25, pp. 2951–2959. Curran Associates, Inc. (2012)

Copyright information

© Springer Nature Switzerland AG 2020

Authors and Affiliations

  1. Environmental Research Institute, National University of Singapore, Singapore, Singapore
  2. Department of Industrial Systems Engineering and Management, National University of Singapore, Singapore, Singapore
  3. Department of Civil and Environmental Engineering, National University of Singapore, Singapore, Singapore