Gibbs Sampling


One attractive method for constructing an MCMC algorithm is Gibbs sampling, introduced in Chapter 6. To slightly generalize our earlier discussion, suppose that we partition the parameter vector of interest into p components θ = (θ1, ..., θp), where each θk may itself be a vector of parameters. The MCMC algorithm is implemented by sampling in turn from the p conditional posterior distributions, where the kth conditional distribution is that of θk given the data and the current values of the remaining components.
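As a minimal illustration of this cycling scheme (not an example from the text), consider a bivariate standard normal target with correlation rho, for which both full conditional distributions are univariate normals. The R sketch below draws θ1 given θ2 and then θ2 given θ1 at each iteration; the function name and arguments are illustrative assumptions.

```r
# Minimal Gibbs sampler sketch for a bivariate standard normal target
# with correlation rho: theta1 | theta2 ~ N(rho * theta2, 1 - rho^2),
# and symmetrically for theta2 | theta1.
gibbs_bvn <- function(m, rho, start = c(0, 0)) {
  draws <- matrix(0, nrow = m, ncol = 2)
  theta1 <- start[1]
  theta2 <- start[2]
  sd_cond <- sqrt(1 - rho^2)   # conditional standard deviation
  for (i in 1:m) {
    # sample theta1 from its conditional distribution given theta2
    theta1 <- rnorm(1, mean = rho * theta2, sd = sd_cond)
    # sample theta2 from its conditional distribution given theta1
    theta2 <- rnorm(1, mean = rho * theta1, sd = sd_cond)
    draws[i, ] <- c(theta1, theta2)
  }
  draws
}

# Example use: 5000 draws with rho = 0.6; the sample means, variances,
# and correlation should be close to the target values (0, 1, 0.6).
sims <- gibbs_bvn(5000, rho = 0.6)
colMeans(sims)
cor(sims[, 1], sims[, 2])
```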





