Gibbs Sampling

  • Jim Albert

One attractive method for constructing an MCMC algorithm is Gibbs sampling, introduced in Chapter 6. To slightly generalize our earlier discussion, suppose that we partition the parameter vector of interest into \(p\) components \(\theta = (\theta_1, \ldots, \theta_p)\), where each \(\theta_k\) may itself be a vector of parameters. The MCMC algorithm is implemented by sampling in turn from the \(p\) conditional posterior distributions, each component being drawn conditional on the current values of the remaining components.
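To make the cycling-through-conditionals idea concrete, here is a minimal sketch (not from the chapter) for the standard illustrative case of a bivariate normal target with zero means, unit variances, and correlation \(\rho\), where both full conditional distributions are univariate normals. The function name and settings are hypothetical.

```python
import numpy as np

def gibbs_bivariate_normal(rho, n_iter=5000, seed=0):
    """Gibbs sampler for a bivariate normal target with zero means, unit
    variances, and correlation rho: each full conditional is N(rho * other,
    1 - rho^2), so each update is a single univariate normal draw."""
    rng = np.random.default_rng(seed)
    theta1, theta2 = 0.0, 0.0          # arbitrary starting values
    cond_sd = np.sqrt(1.0 - rho**2)    # sd of each full conditional
    draws = np.empty((n_iter, 2))
    for t in range(n_iter):
        # Draw theta1 | theta2 ~ N(rho * theta2, 1 - rho^2)
        theta1 = rng.normal(rho * theta2, cond_sd)
        # Draw theta2 | theta1 ~ N(rho * theta1, 1 - rho^2)
        theta2 = rng.normal(rho * theta1, cond_sd)
        draws[t] = theta1, theta2
    return draws

# Example run: with strong correlation the chain mixes slowly,
# but the long-run moments still match the target distribution.
samples = gibbs_bivariate_normal(rho=0.9)
print(samples.mean(axis=0), np.corrcoef(samples.T)[0, 1])
```

The same pattern extends to \(p\) components: one sweep of the sampler visits each \(\theta_k\) once, drawing it from its conditional posterior given the most recent values of the other components.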


Keywords

Posterior distribution · Grade point average · Gibbs sampling · Posterior density · Order restriction


Copyright information

© Springer-Verlag New York 2009

Authors and Affiliations

  • Jim Albert
  1. Bowling Green State University, Bowling Green, USA