Closed-Form EM for Sparse Coding and Its Application to Source Separation

  • Conference paper

Part of the book series: Lecture Notes in Computer Science (LNTCS, volume 7191)

Abstract

We define and discuss a novel sparse coding algorithm based on closed-form EM updates and continuous latent variables. The underlying generative model consists of a standard ‘spike-and-slab’ prior and a Gaussian noise model. Closed-form solutions for the E- and M-step equations are derived by generalizing probabilistic PCA. The resulting EM algorithm can take all modes of a potentially multimodal posterior into account. The computational cost of the algorithm scales exponentially with the number of hidden dimensions; with current computational resources, however, model parameters can still be learned efficiently for medium-scale problems. The algorithm can therefore be applied to the typical range of source separation tasks. In numerical experiments on artificial data we verify likelihood maximization and show that the derived algorithm recovers the sparse directions of standard sparse coding distributions. On source separation benchmarks composed of realistic data we show that the algorithm is competitive with other recent methods.
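The generative model and the source of the exponential cost mentioned in the abstract can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' implementation: the sizes `H`, `D` and the parameters `pi`, `sigma` are arbitrary choices. It samples one data point from a spike-and-slab model and then computes the exact posterior over all 2^H binary activation patterns, integrating out the continuous ‘slab’ variables analytically via the Gaussian marginal.

```python
import itertools
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: H hidden sources, D observed dimensions.
H, D = 4, 6
pi = 0.3      # probability that a source is active (the 'spike')
sigma = 0.1   # standard deviation of the Gaussian observation noise

# Random mixing matrix ('dictionary') W in R^{D x H}.
W = rng.normal(size=(D, H))

# Spike-and-slab sources: s_h = b_h * z_h with b_h ~ Bernoulli(pi)
# and z_h ~ N(0, 1); observation x = W s + Gaussian noise.
b = (rng.random(H) < pi).astype(float)
s = b * rng.normal(size=H)
x = W @ s + sigma * rng.normal(size=D)

# Exact posterior over binary states: for each of the 2^H activation
# patterns the slab variables integrate out analytically, giving the
# Gaussian marginal x ~ N(0, W_b W_b^T + sigma^2 I), where W_b holds
# the columns of the active sources.
def log_joint(x, active):
    Wb = W[:, active]                       # shape (D, |active|)
    C = Wb @ Wb.T + sigma**2 * np.eye(D)    # marginal covariance
    _, logdet = np.linalg.slogdet(C)
    log_lik = -0.5 * (D * np.log(2 * np.pi) + logdet
                      + x @ np.linalg.solve(C, x))
    k = len(active)
    log_prior = k * np.log(pi) + (H - k) * np.log(1 - pi)
    return log_lik + log_prior

patterns = list(itertools.product([0, 1], repeat=H))
logs = np.array([log_joint(x, [h for h in range(H) if p[h]])
                 for p in patterns])

# Normalize via log-sum-exp to obtain exact posterior weights over
# all 2^H modes -- this enumeration is the step whose cost grows
# exponentially with the number of hidden dimensions H.
post = np.exp(logs - logs.max())
post /= post.sum()
```

Because the posterior is represented exactly over every binary pattern, all modes of a multimodal posterior are retained, at the price of the 2^H enumeration that restricts this approach to medium-scale H.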




Editor information

Fabian Theis, Andrzej Cichocki, Arie Yeredor, Michael Zibulevsky


Copyright information

© 2012 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Lücke, J., Sheikh, AS. (2012). Closed-Form EM for Sparse Coding and Its Application to Source Separation. In: Theis, F., Cichocki, A., Yeredor, A., Zibulevsky, M. (eds) Latent Variable Analysis and Signal Separation. LVA/ICA 2012. Lecture Notes in Computer Science, vol 7191. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-28551-6_27


  • DOI: https://doi.org/10.1007/978-3-642-28551-6_27

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-28550-9

  • Online ISBN: 978-3-642-28551-6

  • eBook Packages: Computer Science, Computer Science (R0)
