Parallel Gibbs Sampler for Wavelet-Based Bayesian Compressive Sensing with High Reconstruction Accuracy


Bayesian compressive sensing (BCS) addresses ill-posed signal recovery problems within the Bayesian estimation framework. Gibbs sampling, a core technique in Bayesian estimation, iteratively draws samples from conditional posterior distributions and is therefore inherently sequential. In this work, we propose a two-stage parallel coefficient update scheme for wavelet-based BCS, in which the first stage approximates the true distributions of the wavelet coefficients and the second stage computes the final estimate of the coefficients. In the first stage, the parallel computing units share information with each other; in the second stage, the parallel units work independently. Even with information sharing, when the number of computing units is large, the process deviates from the sequential Gibbs sampler, resulting in large reconstruction error. We propose two new coefficient re-computation schemes that reduce the reconstruction error at the cost of longer computation time. We also propose a new coefficient update scheme that updates coefficients in both stages based on data generated a few rounds earlier; this relaxes the timing constraints on communication in the first stage and on computation in the second stage. We design the corresponding parallel architecture and synthesize it in a 7 nm technology node. For a system with 8 computing units, the proposed algorithm reduces execution time by up to 6.8× compared to the sequential implementation.
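To illustrate why Gibbs sampling is inherently sequential, the following minimal sketch samples from a zero-mean bivariate Gaussian with correlation rho by alternately drawing each coordinate from its conditional posterior. This is a generic textbook example, not the paper's wavelet-based BCS model: each draw of x2 depends on the x1 drawn in the same iteration, which is exactly the data dependency that a parallel update scheme must break.

```python
import numpy as np

def gibbs_bivariate_gaussian(rho, n_iters=5000, seed=0):
    """Gibbs sampler for a zero-mean bivariate Gaussian with correlation rho.

    Each coordinate is drawn from its full conditional given the other, so
    the two updates within one iteration cannot run in parallel: x2's draw
    uses the x1 value just sampled.
    """
    rng = np.random.default_rng(seed)
    sd = np.sqrt(1.0 - rho**2)  # conditional standard deviation
    x1, x2 = 0.0, 0.0
    samples = np.empty((n_iters, 2))
    for t in range(n_iters):
        x1 = rng.normal(rho * x2, sd)  # x1 | x2 ~ N(rho*x2, 1 - rho^2)
        x2 = rng.normal(rho * x1, sd)  # x2 | x1 ~ N(rho*x1, 1 - rho^2)
        samples[t] = (x1, x2)
    return samples

samples = gibbs_bivariate_gaussian(0.8)
print(np.corrcoef(samples[1000:].T)[0, 1])  # ≈ 0.8 after burn-in
```

Updating x1 and x2 simultaneously from the previous iteration's values (a "Hogwild"-style relaxation, as in reference [11]) would remove this dependency but changes the stationary distribution, which is the accuracy/parallelism trade-off the paper studies.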


[Figures 1–14 appear in the full text.]


  1.

    The brand-new sections are Sections … and 5.2. We also revise Sections … and 5.4.




Author information



Corresponding author

Correspondence to Chaitali Chakrabarti.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Zhou, J., Papandreou-Suppappola, A. & Chakrabarti, C. Parallel Gibbs Sampler for Wavelet-Based Bayesian Compressive Sensing with High Reconstruction Accuracy. J Sign Process Syst (2020).



Keywords

  • Gibbs sampling
  • Parallel implementation
  • Relaxed computing
  • Bayesian compressive sensing