
Adaptive Rejection Sampling Methods

Chapter in the book Independent Random Sampling Methods

Part of the book series: Statistics and Computing (SCO)

Abstract

This chapter is devoted to the class of adaptive rejection sampling (ARS) schemes. These (theoretically) universal methods are very efficient samplers that update the proposal density whenever a generated sample is rejected in the RS test. In this way, they can produce i.i.d. samples from the target with an acceptance rate that increases and can converge to 1. As a by-product, these techniques also generate a sequence of proposal pdfs that converges to the true shape of the target density. Another advantage of ARS samplers is that, when they can be applied, the user only has to select a set of initial conditions. After initialization, they are completely automatic, self-tuning algorithms (i.e., no parameters need to be adjusted by the user), regardless of the specific target density. However, the need to construct a suitable sequence of proposal densities restricts the practical applicability of this methodology, so ARS schemes are often tailored to specific classes of target distributions. Indeed, the construction of the proposal is particularly hard in multidimensional spaces; hence, ARS algorithms are usually designed only for drawing from univariate densities.

In this chapter we discuss the basic adaptive structure shared by all ARS algorithms. Then we examine the performance of the method, characterized by the acceptance probability (which increases as the proposal is adapted), and describe various extensions of the standard ARS approach aimed either at improving the efficiency of the method or at covering a broader class of target pdfs. Finally, we consider a hybrid method that combines the ARS and Metropolis-Hastings schemes.
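The adaptive structure just described can be made concrete with a minimal sketch of the classical tangent-based ARS of Gilks and Wild [9] for a differentiable, strictly log-concave target: the upper hull is the minimum of the tangents to the log-density at the current support points, each rejected proposal is added as a new support point, and the acceptance rate climbs as the hull tightens. The function and variable names below are our own choices, and the sketch assumes an unbounded support whose end tangents slope toward the tails (as for the Gaussian in the example); it is an illustration under those assumptions, not the book's reference implementation.

```python
import math
import random

def ars(log_p, dlog_p, x_init, n_samples, seed=0):
    """Adaptive rejection sampling for a strictly log-concave density.

    log_p: unnormalized log-density; dlog_p: its derivative.
    Returns (samples, overall acceptance rate)."""
    rng = random.Random(seed)
    # Support points kept sorted as (x, log_p(x), dlog_p(x)).
    pts = sorted((x, log_p(x), dlog_p(x)) for x in x_init)
    samples, n_prop = [], 0
    while len(samples) < n_samples:
        # Tangent at (x_i, h_i) with slope d_i: t_i(x) = (h_i - d_i*x_i) + d_i*x.
        cs = [h - d * x for x, h, d in pts]
        ds = [d for _, _, d in pts]
        # Consecutive tangents intersect at z_i; these bound the hull segments.
        z = [(cs[i + 1] - cs[i]) / (ds[i] - ds[i + 1]) for i in range(len(pts) - 1)]
        bounds = [-math.inf] + z + [math.inf]
        # Mass of exp(tangent_i) over its segment (finite because the leftmost
        # tangent slopes up and the rightmost slopes down for this target).
        masses = []
        for i in range(len(pts)):
            a, b = bounds[i], bounds[i + 1]
            if ds[i] == 0:
                masses.append(math.exp(cs[i]) * (b - a))
                continue
            ea = math.exp(cs[i] + ds[i] * a) if a > -math.inf else 0.0
            eb = math.exp(cs[i] + ds[i] * b) if b < math.inf else 0.0
            masses.append((eb - ea) / ds[i])
        # Draw a segment proportionally to its mass, then invert the
        # truncated-exponential CDF inside that segment.
        u = rng.random() * sum(masses)
        i = 0
        while u > masses[i] and i < len(masses) - 1:
            u -= masses[i]
            i += 1
        a = bounds[i]
        if ds[i] == 0:
            x = a + u / math.exp(cs[i])
        else:
            ea = math.exp(cs[i] + ds[i] * a) if a > -math.inf else 0.0
            x = (math.log(ea + u * ds[i]) - cs[i]) / ds[i]
        n_prop += 1
        # RS test against the hull; on rejection, refine the hull at x.
        if math.log(rng.random()) <= log_p(x) - (cs[i] + ds[i] * x):
            samples.append(x)
        else:
            pts = sorted(pts + [(x, log_p(x), dlog_p(x))])
    return samples, len(samples) / n_prop

if __name__ == "__main__":
    # Standard normal (unnormalized): log p(x) = -x^2/2, (log p)'(x) = -x.
    xs, acc = ars(lambda x: -0.5 * x * x, lambda x: -x, [-2.0, 2.0], 2000)
    print(f"acceptance rate: {acc:.3f}")  # typically well above 0.9 after adaptation
```

Because every rejection adds a support point exactly where the hull was loose, rejections become rarer as sampling proceeds, which is the increasing acceptance rate noted above.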


Notes

  1. We assign a name to the function V(x) to ease the treatment, so that we can refer to it directly. The name potential function recalls a physical interpretation. In statistical mechanics, for instance, the potential energy function V is central to the evaluation of many thermodynamic properties, where V plays the role of a negative log-density [31]. More specifically, the estimation of thermodynamic properties demands the computation of integrals involving the function \(\exp {(-V)}\). This interpretation of the log-pdf as a potential is also evoked explicitly in Hamiltonian MCMC techniques [32].
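    As a concrete instance (our example, not drawn from the text): the standard Gaussian corresponds to the quadratic potential \(V(x) = x^{2}/2\), since \(p(x) \propto \exp (-V(x)) = e^{-x^{2}/2}\), and a thermodynamic average then takes the form \(\int f(x)\exp (-V(x))\,dx\), up to normalization.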

  2. Note that a simple inspection always reveals an interval containing the mode (for instance, by considering the signs of the slopes of the secant lines passing through the support points).
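This inspection can be sketched in a few lines; the helper name `bracket_mode` and the example targets below are ours, and the sketch assumes a unimodal log-density whose mode lies within the range of the support points:

```python
def bracket_mode(log_p, points):
    """Bracket the mode of a unimodal log-density from the signs of the
    secant slopes through consecutive support points. Assumes the mode
    lies inside [min(points), max(points)]."""
    pts = sorted(points)
    slopes = [(log_p(b) - log_p(a)) / (b - a) for a, b in zip(pts, pts[1:])]
    for i, s in enumerate(slopes):
        if s < 0:
            # First decreasing secant: the mode sits left of pts[i + 1]
            # and (if i > 0) right of pts[i - 1].
            return (pts[i - 1] if i > 0 else pts[0], pts[i + 1])
    return (pts[-2], pts[-1])  # all secants increasing

# Example: Gaussian log-density with mode at x = 1, support points -2, 0, 3, 5.
lo, hi = bracket_mode(lambda x: -0.5 * (x - 1) ** 2, [-2.0, 0.0, 3.0, 5.0])
# Returns the interval (-2.0, 3.0), which contains the mode at x = 1.
```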

  3. Table 4.2 in Sect. 4.3.5 summarizes the other three possible cases where the technique is applicable: \(T^{-1}(z)\) increasing and g(x) convex, \(T^{-1}(z)\) decreasing and g(x) concave, and \(T^{-1}(z)\) decreasing and g(x) convex. Note that in all four cases \(T^{-1}(z)\) is a monotonic function.

  4. The case when the set of simple estimates is empty is similar to the case \(\mu \to \pm \infty \); hence the construction of the linear functions is also similar, but it requires special care. For further details, see [23].

  5. Note that \([\bar {\pi }_t(x)- p(x)]\geq 0\) for all \(x\in \mathcal {D}\).

References

  1. M.S. Arulampalam, S. Maskell, N. Gordon, T. Clapp, A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking. IEEE Trans. Signal Process. 50(2), 174–188 (2002)

  2. J. Besag, P.J. Green, Spatial statistics and Bayesian computation. J. R. Stat. Soc. Ser. B 55(1), 25–37 (1993)

  3. C. Botts, A modified adaptive accept-reject algorithm for univariate densities with bounded support. Technical Report, http://williams.edu/Mathematics/cbotts/Research/paper3.pdf (2010)

  4. G.E.P. Box, G.C. Tiao, Bayesian Inference in Statistical Analysis (Wiley, New York, 1973)

  5. L. Devroye, Non-Uniform Random Variate Generation (Springer, New York, 1986)

  6. M. Evans, T. Swartz, Random variate generation using concavity properties of transformed densities. J. Comput. Graph. Stat. 7(4), 514–528 (1998)

  7. J. Geweke, Bayesian inference in econometric models using Monte Carlo integration. Econometrica 57(6), 1317–1339 (1989)

  8. W.R. Gilks, Derivative-free adaptive rejection sampling for Gibbs sampling. Bayesian Stat. 4, 641–649 (1992)

  9. W.R. Gilks, P. Wild, Adaptive rejection sampling for Gibbs sampling. Appl. Stat. 41(2), 337–348 (1992)

  10. W.R. Gilks, N.G. Best, K.K.C. Tan, Adaptive rejection Metropolis sampling within Gibbs sampling. Appl. Stat. 44(4), 455–472 (1995)

  11. D. Görür, Y.W. Teh, Concave-convex adaptive rejection sampling. J. Comput. Graph. Stat. 20(3), 670–691 (2011)

  12. T.L. Griffiths, Z. Ghahramani, The Indian buffet process: an introduction and review. J. Mach. Learn. Res. 12, 1185–1224 (2011)

  13. H. Hirose, A. Todoroki, Random number generation for the generalized normal distribution using the modified adaptive rejection method. Int. Inf. Inst. 8(6), 829–836 (2005)

  14. L. Holden, Adaptive chains. Technical Report, Norwegian Computing Center (1998)

  15. L. Holden, R. Hauge, M. Holden, Adaptive independent Metropolis-Hastings. Ann. Appl. Probab. 19(1), 395–413 (2009)

  16. W. Hörmann, A rejection technique for sampling from T-concave distributions. ACM Trans. Math. Softw. 21(2), 182–193 (1995)

  17. W. Hörmann, J. Leydold, G. Derflinger, Automatic Nonuniform Random Variate Generation (Springer, New York, 2003)

  18. W. Hörmann, J. Leydold, G. Derflinger, Inverse transformed density rejection for unbounded monotone densities. Research Report Series, Department of Statistics and Mathematics, Vienna University of Economics and Business (2007)

  19. Y. Huang, J. Zhang, P.M. Djurić, Bayesian detection for BLAST. IEEE Trans. Signal Process. 53(3), 1086–1096 (2005)

  20. M.C. Jones, Distributions generated by transformation of scale using an extended Cauchy-Schlömilch transformation. Indian J. Stat. 72-A(2), 359–375 (2010)

  21. J. Leydold, J. Janka, W. Hörmann, Variants of transformed density rejection and correlation induction, in Monte Carlo and Quasi-Monte Carlo Methods 2000 (Springer, Heidelberg, 2002), pp. 345–356

  22. F. Liang, C. Liu, R. Carroll, Advanced Markov Chain Monte Carlo Methods: Learning from Past Samples. Wiley Series in Computational Statistics (Wiley, Chichester, 2010)

  23. L. Martino, Novel Schemes for Adaptive Rejection Sampling (Universidad Carlos III, Madrid, 2011)

  24. L. Martino, Parsimonious adaptive rejection sampling. IET Electron. Lett. 53(16), 1115–1117 (2017)

  25. L. Martino, F. Louzada, Adaptive rejection sampling with fixed number of nodes. Commun. Stat. Simul. Comput., 1–11 (2017, to appear)

  26. L. Martino, J. Míguez, A generalization of the adaptive rejection sampling algorithm. Stat. Comput. 21, 633–647 (2010). https://doi.org/10.1007/s11222-010-9197-9

  27. L. Martino, J. Míguez, Generalized rejection sampling schemes and applications in signal processing. Signal Process. 90(11), 2981–2995 (2010)

  28. L. Martino, J. Read, D. Luengo, Independent doubly adaptive rejection Metropolis sampling within Gibbs sampling. IEEE Trans. Signal Process. 63(12), 3123–3138 (2015)

  29. L. Martino, R. Casarin, F. Leisen, D. Luengo, Adaptive independent sticky MCMC algorithms. EURASIP J. Adv. Signal Process. (2018, to appear)

  30. R. Meyer, B. Cai, F. Perron, Adaptive rejection Metropolis sampling using Lagrange interpolation polynomials of degree 2. Comput. Stat. Data Anal. 52(7), 3408–3423 (2008)

  31. J. Michel, The use of free energy simulations as scoring functions. PhD Thesis, University of Southampton (2006)

  32. R. Neal, MCMC using Hamiltonian dynamics, Chap. 5 in Handbook of Markov Chain Monte Carlo, ed. by S. Brooks, A. Gelman, G. Jones, X.-L. Meng (Chapman and Hall/CRC Press, Boca Raton, 2011)

  33. J.G. Proakis, Digital Communications, 4th edn. (McGraw-Hill, Singapore, 2000)

  34. Y.W. Teh, D. Görür, Z. Ghahramani, Stick-breaking construction for the Indian buffet process, in Proceedings of the International Conference on Artificial Intelligence and Statistics (2007)

  35. L. Tierney, Exploring posterior distributions using Markov chains, in Computer Science and Statistics: Proceedings of IEEE 23rd Symposium on the Interface (1991), pp. 563–570


Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this chapter


Cite this chapter

Martino, L., Luengo, D., Míguez, J. (2018). Adaptive Rejection Sampling Methods. In: Independent Random Sampling Methods. Statistics and Computing. Springer, Cham. https://doi.org/10.1007/978-3-319-72634-2_4
