
On Approximations of the Beta Process in Latent Feature Models: Point Processes Approach


Abstract

In recent years, the beta process has been widely used as a nonparametric prior for different models in machine learning, including latent feature models. In this paper, we prove the asymptotic consistency of the finite dimensional approximation of the beta process due to Paisley and Carin (2009). In particular, we show that this finite approximation converges in distribution to the Ferguson and Klass representation of the beta process. We use this approximation to derive asymptotic properties of functionals of the finite dimensional beta process. In addition, we derive an almost sure approximation of the beta process. This new approximation provides a direct method to efficiently simulate the beta process. A simulated example, illustrating how the method works and comparing its performance to that of several existing algorithms, is also included.
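
The finite approximation discussed in the abstract represents the beta process by an N-atom random measure B_N = sum_{k=1}^N pi_k delta_{omega_k}. The sketch below is only illustrative and is not the authors' implementation: it assumes the weight distribution pi_k ~ Beta(a/N, b(N-1)/N) commonly associated with the Paisley and Carin (2009) construction, and, purely for concreteness, takes the base measure H0 to be a standard normal distribution.

```python
# Illustrative sketch (not the paper's code) of an N-atom finite approximation
# of the beta process.  Assumed form: weights pi_k ~ Beta(a/N, b(N-1)/N) and
# atom locations omega_k drawn i.i.d. from a base distribution H0, here N(0, 1).
import numpy as np


def finite_beta_process(N, a=1.0, b=1.0, rng=None):
    """Draw one realization of B_N = sum_k pi_k * delta_{omega_k} (requires N >= 2)."""
    rng = np.random.default_rng() if rng is None else rng
    weights = rng.beta(a / N, b * (N - 1) / N, size=N)  # jump sizes pi_k
    atoms = rng.standard_normal(N)                      # omega_k ~ H0 = N(0, 1)
    return atoms, weights


def measure_of_set(atoms, weights, indicator):
    """Evaluate the random measure on a set A, given the indicator function of A."""
    return weights[indicator(atoms)].sum()


if __name__ == "__main__":
    atoms, weights = finite_beta_process(N=1000, rng=np.random.default_rng(0))
    print("B_N((-inf, 0]) =", measure_of_set(atoms, weights, lambda x: x <= 0))
    print("B_N(R)         =", weights.sum())
```

Functionals of B_N, such as its value on a fixed set as computed above, are the kind of quantities whose asymptotic behaviour (as N grows) the paper studies.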

References

  • Abramowitz M. and Stegun I. A. (1972). Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables, 10th printing. Dover, New York.

  • Al Labadi L. and Zarepour M. (2013a). On asymptotic properties and almost sure approximation of the normalized inverse-Gaussian process. Bayesian Anal. 8, 3, 553–568.

  • Al Labadi L. and Zarepour M. (2013b). A Bayesian nonparametric goodness of fit test for right censored data based on approximate samples from the beta-Stacy process. Canad. J. Statist. 41, 3, 466–487.

  • Al Labadi L. and Zarepour M. (2014a). On Simulations from the Two-Parameter Poisson-Dirichlet Process and the Normalized Inverse-Gaussian Process. Sankhya A 76, 158–176.

  • Al Labadi L. and Zarepour M. (2014b). Goodness of fit tests based on the distance between the Dirichlet process and its base measure. J. Nonparametr. Stat. 26, 341–357.

  • Billingsley P. (1999). Convergence of Probability Measures, second edition. John Wiley and Sons, Inc.

  • Broderick T., Jordan M. I. and Pitman J. (2012). Beta processes, stick-breaking, and power laws. Bayesian Anal. 7, 439–476.

  • Chen B., Paisley J. and Carin L. (2010). Sparse linear regression with beta process priors. International Conference on Acoustics, Speech and Signal Processing, Dallas, TX.

  • Chen B., Chen M., Paisley J., Zaas A., Woods C., Ginsburg G. S., Hero A., Lucas J., Dunson D. and Carin L. (2010). Bayesian inference of the number of factors in gene-expression analysis: Application to human virus challenge studies. BMC Bioinformatics 11, 552.

  • Damien P., Laud P. and Smith A. F. M. (1995). Approximate random variate generation from infinitely divisible distributions with applications to Bayesian inference. J. R. Stat. Soc. Ser. B Stat. Methodol. 57, 547–563.

  • Ferguson T. S. (1973). A Bayesian analysis of some nonparametric problems. Ann. Statist. 1, 209–230.

  • Ferguson T. S. and Klass M. J. (1972). A representation of independent increment processes without Gaussian components. Ann. Math. Statist. 43, 1634–1643.

  • Fox E., Sudderth E., Jordan M. and Willsky A. (2009). Sharing features among dynamic systems with beta processes. Adv. Neural Inf. Process. Syst. 22, 549–557.

  • Griffiths T. and Ghahramani Z. (2006). Infinite latent feature models and the Indian buffet process. Adv. Neural Inf. Process. Syst. 18, 475–482.

  • Hjort N. L. (1990). Nonparametric Bayes estimators based on Beta processes in models for life history data. Ann. Statist. 18, 1259–1294.

  • Ishwaran H. and Zarepour M. (2009). Series Representations for Multivariate Generalized Gamma Processes via a Scale Invariance Principle. Statist. Sinica 19, 1665–1682.

  • Ishwaran H. and Zarepour M. (2002). Exact and approximate sum representations for the Dirichlet process. Canad. J. Statist. 30, 269–283.

  • Kallenberg O. (1983). Random Measures, third edition. Akademie-Verlag, Berlin.

  • Kim Y. (1999). Nonparametric Bayesian estimators for counting processes. Ann. Statist. 27, 562–588.

  • Kim Y., James L. and Weißbach R. (2012). Bayesian analysis of multistate event history data: Beta-Dirichlet process prior. Biometrika 99, 127–140.

  • Kingman J. F. C. (1967). Completely random measures. Pacific J. Math. 21, 59–78.

  • Lee J. (2007). Sampling methods for neutral to the right processes. J. Comput. Graph. Statist. 16, 656–671.

  • Lee J. and Kim Y. (2004). A new algorithm to generate beta processes. Comput. Statist. Data Anal. 25, 401–405.

  • Miller K. T. (2011). Bayesian Nonparametric Latent Feature Models. PhD thesis, University of California, Berkeley.

  • Paisley J. (2010). Machine Learning with Dirichlet and Beta Process Priors: Theory and Applications. PhD thesis, Duke University.

  • Paisley J., Blei D. and Jordan M. (2012). Stick-breaking beta processes and the Poisson process. Proceedings of the International Conference on Artificial Intelligence and Statistics (AISTATS).

  • Paisley J. and Carin L. (2009). Nonparametric factor analysis with beta process priors. Proceedings of the International Conference on Machine Learning (ICML).

  • Paisley J., Zaas A., Woods C. W., Ginsburg G. S. and Carin L. (2010). A stick-breaking construction of the beta process. Proceedings of the International Conference on Machine Learning (ICML).

  • Resnick S. I. (1987). Extreme Values, Regular Variation, and Point Processes. Springer, New York.

  • Sethuraman J. (1994). A constructive definition of Dirichlet priors. Statist. Sinica 4, 639–650.

  • Teh Y. W., Görür D. and Ghahramani Z. (2007). Stick-breaking construction for the Indian buffet process. Proceedings of the International Conference on Artificial Intelligence and Statistics (AISTATS).

  • Thibaux R. and Jordan M. I. (2007). Hierarchical beta processes and the Indian buffet process. Proceedings of the International Conference on Artificial Intelligence and Statistics (AISTATS).

  • Walker S. and Muliere P. (1997). Beta-Stacy processes and a generalisation of the Pólya-urn scheme. Ann. Statist. 25, 1762–1780.

  • West M. (2003). Bayesian factor regression models in the "large p, small n" paradigm. Bayesian Statistics 7, 723–732.

  • Wolpert R. L. and Ickstadt K. (1998). Simulation of Lévy random fields. In Practical Nonparametric and Semiparametric Bayesian Statistics, Dey D., Müller P. and Sinha D. (eds.), Springer, New York, p. 237–242.

  • Zarepour M. and Al Labadi L. (2012). On a rapid simulation of the Dirichlet process. Statist. Probab. Lett. 82, 916–924.

  • Zhou M., Yang H., Sapiro G., Dunson D. and Carin L. (2011). Dependent hierarchical beta process for image interpolation and denoising. Proceedings of the International Conference on Artificial Intelligence and Statistics (AISTATS).

  • Zhou M., Chen H., Paisley J., Ren L., Li L., Xing Z., Dunson D., Sapiro G. and Carin L. (2012). Nonparametric Bayesian dictionary learning for analysis of noisy and incomplete images. IEEE Trans. Image Process. 21, 130–144.

Corresponding author

Correspondence to Luai Al Labadi.


Cite this article

Al Labadi, L., Zarepour, M. On Approximations of the Beta Process in Latent Feature Models: Point Processes Approach. Sankhya A 80, 59–79 (2018). https://doi.org/10.1007/s13171-017-0103-9
