Pseudo-Gibbs sampler for discrete conditional distributions

  • Kun-Lin Kuo
  • Yuchung J. Wang

Abstract

Conditionally specified models offer a higher level of flexibility than the joint approach; regression switching in multiple imputation is a typical example. However, reasonable-seeming conditional models are generally not coherent with one another. A Gibbs sampler based on incompatible conditionals is called a pseudo-Gibbs sampler, whose properties are mostly unknown. This article investigates the richness of, and commonalities among, their stationary distributions. We show that the Gibbs sampler replaces the conditional distributions iteratively but keeps the marginal distributions invariant, and that in the process it minimizes the Kullback–Leibler divergence. Next, we prove that systematic pseudo-Gibbs projections converge for every scan order, and that the stationary distributions share marginal distributions in a circular fashion. Therefore, regardless of compatibility, univariate consistency is guaranteed when the orders of imputation are circularly related. Moreover, a conditional model and its pseudo-Gibbs distributions have an equal number of parameters. The study of the pseudo-Gibbs sampler provides a fresh perspective for understanding the original Gibbs sampler.
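To make the object of study concrete, the following is a minimal sketch of a systematic-scan pseudo-Gibbs sampler for two binary variables. The conditional tables here are hypothetical and deliberately chosen so that no joint distribution has both as its conditionals (i.e., they are incompatible); the sampler nevertheless has a stationary distribution, which is estimated empirically. This is an illustration only, not the article's algorithm or proofs.

```python
import random
from collections import Counter

# Hypothetical, deliberately incompatible conditional specifications:
# P(X = 0 | Y = y) and P(Y = 0 | X = x) for binary X, Y.
P_X0_GIVEN_Y = {0: 0.7, 1: 0.4}  # P(X = 0 | Y = y)
P_Y0_GIVEN_X = {0: 0.2, 1: 0.5}  # P(Y = 0 | X = x)

def pseudo_gibbs(n_iter=200_000, seed=0):
    """Systematic-scan pseudo-Gibbs sampler: each sweep draws X from
    P(X | Y), then Y from P(Y | X), regardless of compatibility."""
    rng = random.Random(seed)
    x, y = 0, 0
    counts = Counter()
    for _ in range(n_iter):
        x = 0 if rng.random() < P_X0_GIVEN_Y[y] else 1
        y = 0 if rng.random() < P_Y0_GIVEN_X[x] else 1
        counts[(x, y)] += 1
    total = sum(counts.values())
    return {state: c / total for state, c in counts.items()}

joint = pseudo_gibbs()  # empirical stationary distribution over (X, Y)
```

Reversing the scan order (drawing Y before X) generally yields a different stationary joint when the conditionals are incompatible, which is the scan-order dependence the abstract refers to.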

Keywords

Incompatibility · Iterative conditional replacement · Kullback–Leibler information divergence · Multiple imputation · Scan order · Stationary distribution


Acknowledgements

This work was supported in part by the Ministry of Science and Technology, Taiwan (MOST 104-2118-M-390-001 and MOST 105-2118-M-390-002). The authors thank two referees and one Associate Editor for their comments.


Copyright information

© The Institute of Statistical Mathematics, Tokyo 2017

Authors and Affiliations

  1. Institute of Statistics, National University of Kaohsiung, Kaohsiung, Taiwan
  2. Department of Mathematical Sciences, Rutgers University, Camden, USA
