Abstract
This chapter is devoted to the class of adaptive rejection sampling (ARS) schemes. These (theoretically) universal methods are very efficient samplers that update the proposal density whenever a generated sample is rejected in the RS test. In this way, they can produce i.i.d. samples from the target with an increasing acceptance rate that can converge to 1. As a by-product, these techniques also generate a sequence of proposal pdfs converging to the true shape of the target density. Another advantage of ARS samplers is that, when they can be applied, the user only has to select a set of initial conditions. After the initialization, they are completely automatic, self-tuning algorithms (i.e., no parameters need to be adjusted by the user), regardless of the specific target density. However, the need to construct a suitable sequence of proposal densities restricts the practical applicability of this methodology. As a consequence, ARS schemes are often tailored to specific classes of target distributions. Indeed, the construction of the proposal is particularly hard in multidimensional spaces. Hence, ARS algorithms are usually designed only for drawing from univariate densities.
In this chapter we discuss the basic adaptive structure shared by all ARS algorithms. Then we look into the performance of the method, characterized by the acceptance probability (which increases as the proposal is adapted), and describe various extensions of the standard ARS approach which are aimed either at improving the efficiency of the method or at covering a broader class of target pdfs. Finally, we consider a hybrid method that combines the ARS and Metropolis-Hastings schemes.
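As a minimal illustration of the adaptive mechanism summarized above, the following Python sketch implements a tangent-based ARS sampler for a log-concave target (here an unnormalized standard Gaussian). The piecewise-linear upper hull of the log-density is built from tangent lines at the support points, samples are drawn from the resulting piecewise-exponential envelope, and each rejected sample is added as a new support point, tightening the envelope. The function names, initial support points, and target are our own illustrative choices, not taken from the chapter.

```python
import math
import random

def ars_sample(logf, dlogf, init_pts, n_samples, seed=1):
    """Sketch of tangent-based ARS for a log-concave (unnormalized) density.
    init_pts must contain points on both sides of the mode, so that the
    first tangent slope is positive and the last one is negative."""
    rng = random.Random(seed)
    pts = sorted(init_pts)
    samples = []
    while len(samples) < n_samples:
        h = [logf(x) for x in pts]
        d = [dlogf(x) for x in pts]
        # intersections of consecutive tangent lines -> envelope breakpoints
        z = [(h[j+1] - h[j] + d[j]*pts[j] - d[j+1]*pts[j+1]) / (d[j] - d[j+1])
             for j in range(len(pts) - 1)]
        lo, hi = [-math.inf] + z, z + [math.inf]
        # mass of each piecewise-exponential segment exp(c_j + b_j * x)
        masses = []
        for j in range(len(pts)):
            b, c = d[j], h[j] - d[j]*pts[j]
            masses.append((math.exp(c + b*hi[j]) - math.exp(c + b*lo[j])) / b)
        # pick a segment proportionally to its mass, then invert its CDF
        u = rng.random() * sum(masses)
        j = 0
        while j < len(masses) - 1 and u > masses[j]:
            u -= masses[j]
            j += 1
        b, c = d[j], h[j] - d[j]*pts[j]
        x = (math.log(b*u + math.exp(c + b*lo[j])) - c) / b
        # rejection test against the envelope; rejected points refine it
        if math.log(rng.random()) <= logf(x) - (c + b*x):
            samples.append(x)
        else:
            pts = sorted(pts + [x])
    return samples
```

Each rejection adds one support point, so the envelope converges toward the log-target and the acceptance rate grows toward 1, as described in the abstract. The recomputation of the full hull at every iteration keeps the sketch short; a practical implementation would update the hull incrementally.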
Notes
- 1.
We assign a name to the function V(x) to simplify the exposition, so that we can refer to it directly. The name potential function recalls a physical interpretation. In statistical mechanics, for instance, the potential energy function V is central to the evaluation of many thermodynamic properties, where V plays the role of a negative log-density [31]. To be specific, the estimation of the thermodynamic properties demands the computation of integrals involving the function \(\exp {(-V)}\). This interpretation of the log-pdf as a potential is also evoked explicitly in the Hamiltonian MCMC techniques [32].
- 2.
Note that simple inspections always suffice to identify an interval containing the mode (for instance, by considering the signs of the slopes of the secant lines through the support points).
- 3.
- 4.
The case when the set of simple estimates is empty is similar to the case μ →±∞, hence the construction of the linear functions is also similar, but it requires special care. For further details, see [23].
- 5.
Note that \([\bar {\pi }_t(x)- p(x)]\geq 0\) for all \(x\in \mathcal {D}\).
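The secant-slope inspection mentioned in note 2 can be sketched as follows: for a unimodal log-density evaluated at sorted support points, the mode lies where the secant slopes switch from positive to non-positive. The function name and return convention are our own illustrative choices, assuming the support points bracket the mode.

```python
def mode_bracket(logp, pts):
    """Locate an interval containing the mode of a unimodal log-density
    by inspecting the signs of the secant slopes between support points."""
    pts = sorted(pts)
    vals = [logp(x) for x in pts]
    slopes = [(vals[i+1] - vals[i]) / (pts[i+1] - pts[i])
              for i in range(len(pts) - 1)]
    for i, s in enumerate(slopes):
        if s <= 0:  # log-density stopped increasing: the mode was passed
            return (pts[max(i - 1, 0)], pts[i + 1])
    return (pts[-2], pts[-1])  # all secants increasing: mode in last interval
```

For example, with the log-density `lambda x: -(x - 0.7)**2` and support points `[-2, -1, 0, 1, 2]`, the returned interval contains the mode at 0.7.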
References
M.S. Arulampalam, S. Maskell, N. Gordon, T. Clapp, A tutorial on particle filters for online nonlinear/non-Gaussian Bayesian tracking. IEEE Trans. Signal Process. 50(2), 174–188 (2002)
J. Besag, P.J. Green, Spatial statistics and Bayesian computation. J. R. Stat. Soc. Ser. B 55(1), 25–37 (1993)
C. Botts, A modified adaptive accept-reject algorithm for univariate densities with bounded support. Technical Report, http://williams.edu/Mathematics/cbotts/Research/paper3.pdf (2010)
G.E.P. Box, G.C. Tiao, Bayesian Inference in Statistical Analysis (Wiley, New York, 1973)
L. Devroye, Non-uniform Random Variate Generation (Springer, New York, 1986)
M. Evans, T. Swartz, Random variate generation using concavity properties of transformed densities. J. Comput. Graph. Stat. 7(4), 514–528 (1998)
J. Geweke, Bayesian inference in econometric models using Monte Carlo integration. Econometrica 57(6), 1317–1339 (1989)
W.R. Gilks, Derivative-free adaptive rejection sampling for Gibbs sampling. Bayesian Stat. 4, 641–649 (1992)
W.R. Gilks, P. Wild, Adaptive rejection sampling for Gibbs sampling. Appl. Stat. 41(2), 337–348 (1992)
W.R. Gilks, N.G. Best, K.K.C. Tan, Adaptive rejection Metropolis sampling within Gibbs sampling. Appl. Stat. 44(4), 455–472 (1995)
D. Görür, Y.W. Teh, Concave convex adaptive rejection sampling. J. Comput. Graph. Stat. 20(3), 670–691 (2011)
T.L. Griffiths, Z. Ghahramani, The Indian buffet process: an introduction and review. J. Mach. Learn. Res. 12, 1185–1224 (2011)
H. Hirose, A. Todoroki, Random number generation for the generalized normal distribution using the modified adaptive rejection method. Int. Inf. Inst. 8(6), 829–836 (2005)
L. Holden, Adaptive chains. Technical Report Norwegian Computing Center (1998)
L. Holden, R. Hauge, M. Holden, Adaptive independent Metropolis-Hastings. Ann. Appl. Probab. 19(1), 395–413 (2009)
W. Hörmann, A rejection technique for sampling from T-concave distributions. ACM Trans. Math. Softw. 21(2), 182–193 (1995)
W. Hörmann, J. Leydold, G. Derflinger, Automatic Nonuniform Random Variate Generation (Springer, New York, 2003)
W. Hörmann, J. Leydold, G. Derflinger, Inverse transformed density rejection for unbounded monotone densities. Research Report Series, Department of Statistics and Mathematics (Economy and Business), Vienna University (2007)
Y. Huang, J. Zhang, P.M. Djurić, Bayesian detection for BLAST. IEEE Trans. Signal Process. 53(3), 1086–1096 (2005)
M.C. Jones, Distributions generated by transformation of scale using an extended Cauchy-Schlömilch transformation. Indian J. Stat. 72-A(2), 359–375 (2010)
J. Leydold, J. Janka, W. Hörmann, Variants of transformed density rejection and correlation induction, in Monte Carlo and Quasi-Monte Carlo Methods 2000 (Springer, Heidelberg, 2002), pp. 345–356
F. Liang, C. Liu, R. Carroll, Advanced Markov Chain Monte Carlo Methods: Learning from Past Samples. Wiley Series in Computational Statistics (Wiley, Chichester, 2010)
L. Martino, Novel Schemes for Adaptive Rejection Sampling (Universidad Carlos III, Madrid, 2011)
L. Martino, Parsimonious adaptive rejection sampling. IET Electron. Lett. 53(16), 1115–1117 (2017)
L. Martino, F. Louzada, Adaptive rejection sampling with fixed number of nodes. Commun. Stat. Simul. Comput., 1–11 (2017, to appear)
L. Martino, J. Míguez, A generalization of the adaptive rejection sampling algorithm. Stat. Comput. 21, 633–647 (2010). doi: https://doi.org/10.1007/s11222-010-9197-9
L. Martino, J. Míguez, Generalized rejection sampling schemes and applications in signal processing. Signal Process. 90(11), 2981–2995 (2010)
L. Martino, J. Read, D. Luengo, Independent doubly adaptive rejection Metropolis sampling within Gibbs sampling. IEEE Trans. Signal Process. 63(12), 3123–3138 (2015)
L. Martino, R. Casarin, F. Leisen, D. Luengo, Adaptive independent sticky MCMC algorithms. EURASIP J. Adv. Signal Process. (2018, to appear)
R. Meyer, B. Cai, F. Perron, Adaptive rejection Metropolis sampling using Lagrange interpolation polynomials of degree 2. Comput. Stat. Data Anal. 52(7), 3408–3423 (2008)
J. Michel, The use of free energy simulations as scoring functions. PhD Thesis, University of Southampton (2006)
R. Neal, MCMC using Hamiltonian dynamics, Chap. 5, in Handbook of Markov Chain Monte Carlo ed. by S. Brooks, A. Gelman, G. Jones, X.-L. Meng (Chapman and Hall/CRC Press, Boca Raton, 2011)
J.G. Proakis, Digital Communications, 4th edn. (McGraw-Hill, Singapore, 2000)
Y.W. Teh, D. Görür, Z. Ghahramani, Stick-breaking construction for the Indian buffet process, in Proceedings of the International Conference on Artificial Intelligence and Statistics (2007)
L. Tierney, Exploring posterior distributions using Markov Chains, in Computer Science and Statistics: Proceedings of IEEE 23rd Symposium on the Interface (1991), pp. 563–570
Copyright information
© 2018 Springer International Publishing AG, part of Springer Nature
About this chapter
Cite this chapter
Martino, L., Luengo, D., Míguez, J. (2018). Adaptive Rejection Sampling Methods. In: Independent Random Sampling Methods. Statistics and Computing. Springer, Cham. https://doi.org/10.1007/978-3-319-72634-2_4
Print ISBN: 978-3-319-72633-5
Online ISBN: 978-3-319-72634-2
eBook Packages: Mathematics and Statistics (R0)