Computational Geosciences, Volume 17, Issue 1, pp 99–116

Stochastic simulation of patterns using Bayesian pattern modeling

Original Paper


Abstract

In this paper, a Bayesian framework is introduced for pattern modeling and multiple point statistics simulation. The method presented here is a generalized clustering-based method in which the patterns in each cluster can live on a hyperplane of very low dimensionality. This generalization allows a remarkable increase in the variability of the model and a significant reduction in the number of clusters needed for pattern modeling, which makes the method computationally more efficient than existing clustering-based methods. The Bayesian model employed here is a nonlinear model composed of a mixture of linear models; it is therefore more expressive than a single linear model and computationally cheaper than fully nonlinear models. Furthermore, the model allows features to be extracted from incomplete patterns and patterns to be compared in feature space instead of the spatial domain. Because the feature space has much lower dimensionality, comparison in feature space is also more computationally efficient. Unlike most previously employed methods, the feature extraction filters used here are customized for each training image (TI), which makes the extracted features more informative. Being fully Bayesian, the method does not require extensive parameter setting and tunes its parameters in a principled manner. Extensive experiments on different TIs (both continuous and categorical) show that the proposed method reproduces complex geostatistical patterns better than other clustering-based methods while using a very limited number of clusters.


Keywords: Multiple point statistics · Geostatistics · Stochastic simulation · Bayesian pattern modeling · Mixture modeling
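The Python sketch below illustrates the general idea described in the abstract, not the authors' actual algorithm: training-image patterns are clustered, each cluster is given its own low-dimensional linear subspace, and an incomplete data event is compared with stored patterns in that feature space rather than in the spatial domain. The mixture of linear models is approximated here by a per-cluster PCA, missing nodes are naively imputed with the cluster mean (a crude stand-in for the paper's Bayesian feature extraction from incomplete patterns), and all names and parameter values (`build_pattern_model`, `select_pattern`, `n_clusters`, `n_features`) are illustrative assumptions rather than quantities from the paper.

```python
# Minimal sketch of clustering-based pattern modeling with per-cluster
# low-dimensional (linear) feature spaces. Not the authors' implementation.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.decomposition import PCA
from sklearn.feature_extraction.image import extract_patches_2d


def build_pattern_model(ti, patch_size=(9, 9), n_clusters=20, n_features=5, seed=0):
    """Cluster TI patterns and fit one low-dimensional PCA subspace per cluster."""
    patches = extract_patches_2d(ti, patch_size, max_patches=2000, random_state=seed)
    X = patches.reshape(len(patches), -1)                 # flatten patches to vectors
    labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit_predict(X)
    model = []
    for k in range(n_clusters):
        Xk = X[labels == k]
        if len(Xk) < 2:                                   # skip degenerate clusters
            continue
        pca = PCA(n_components=min(n_features, len(Xk) - 1)).fit(Xk)
        model.append((Xk, pca))                           # stored patterns + local subspace
    return model


def select_pattern(model, data_event, mask):
    """Pick a stored pattern compatible with an incomplete data event.

    `data_event` is a flattened patch with unknown nodes; `mask` is True at the
    informed nodes. Comparison happens in the low-dimensional feature space.
    """
    best_dist, best_pattern = np.inf, None
    for Xk, pca in model:
        filled = np.where(mask, data_event, pca.mean_)    # impute unknowns with cluster mean
        z = pca.transform(filled[None, :])                # feature-space coordinates of event
        Zk = pca.transform(Xk)                            # feature-space coordinates of patterns
        d = np.linalg.norm(Zk - z, axis=1)
        j = int(np.argmin(d))
        if d[j] < best_dist:
            best_dist, best_pattern = d[j], Xk[j]
    return best_pattern
```

Because distances are computed on a handful of feature-space coordinates rather than on all patch pixels, the comparison step is cheap even when the number of clusters is kept very small, which is the computational advantage the abstract highlights.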





Copyright information

© Springer Science+Business Media Dordrecht 2012

Authors and Affiliations

Electrical Engineering Department, Amirkabir University of Technology, Tehran, Iran
