
Numerical Methods for Computing Plausibility and Belief Distributions of Consequences of a Subjective Model of Object of Research

  • D. A. Balakin

Abstract

Numerical methods for computing the plausibility and belief distributions of the consequences of a subjective model are considered. More precisely, the related constrained optimization problems are studied. Error estimates for the proposed algorithms are obtained. Techniques that use information about the consequence available to the researcher to improve the accuracy of the computations are also discussed.
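
To make the underlying optimization problems concrete, the following sketch assumes the possibility-theoretic setting of subjective modeling: the researcher's judgments are encoded by a plausibility distribution g(x) on a set X of model parameters, and a consequence is a function f(x); the symbols g, f, X, and y are illustrative and are not taken from the article. Under this assumption, the plausibility and belief distributions of the consequence lead to constrained global optimization problems of the form

$$
\mathrm{Pl}(y) = \sup_{x \in X,\; f(x) = y} g(x), \qquad
\mathrm{Bel}(y) = 1 - \sup_{x \in X,\; f(x) \neq y} g(x).
$$

A minimal numerical sketch of one such computation, using a derivative-free global solver with placeholder g, f, bounds, and constraint tolerance (all assumptions, not the article's algorithm), might look as follows:

import numpy as np
from scipy.optimize import differential_evolution, NonlinearConstraint

# Illustrative (assumed) plausibility distribution of the model parameters.
def g(x):
    return np.exp(-np.sum((x - 1.0) ** 2))

# Illustrative (assumed) consequence of the model: a scalar function of the parameters.
def f(x):
    return x[0] ** 2 + np.sin(x[1])

y0 = 1.5                              # consequence value whose plausibility is estimated
tol = 1e-3                            # numerical relaxation of the constraint f(x) = y0
bounds = [(-3.0, 3.0), (-3.0, 3.0)]   # assumed parameter domain X

# Pl(y0) = sup{ g(x) : f(x) = y0 }: maximize g by minimizing -g subject to
# |f(x) - y0| <= tol, using a derivative-free global optimizer.
constraint = NonlinearConstraint(f, y0 - tol, y0 + tol)
result = differential_evolution(lambda x: -g(x), bounds,
                                constraints=(constraint,), seed=0)
print("estimated Pl(y0):", -result.fun)

Each plausibility or belief value requires a separate constrained global maximization of this kind, so the accuracy of the optimizer directly determines the accuracy of the computed distributions.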

Keywords

constrained optimization; global optimization; subjective modeling; plausibility; belief



Copyright information

© Pleiades Publishing, Ltd. 2018

Authors and Affiliations

  1. Department of Physics, Moscow State University, Moscow, Russia
