ICA with Sparse Connections: Revisited

  • Conference paper
Independent Component Analysis and Signal Separation (ICA 2009)

Part of the book series: Lecture Notes in Computer Science (LNCS, volume 5441)

Abstract

When applying independent component analysis (ICA), we sometimes expect the connections between the observed mixtures and the recovered independent components (or the original sources) to be sparse, to make the interpretation easier or to reduce random effects in the results. In this paper we propose two methods to tackle this problem. One is based on the adaptive Lasso, which exploits the L1 penalty with data-adaptive weights; we show the relationship between this method and classic information criteria such as BIC and AIC. The other is based on the optimal brain surgeon, and we show how its stopping criterion is related to the information criteria. This method produces the solution path of the transformation matrix, with different numbers of zero entries. Both methods have low computational cost. Moreover, in each method the parameter controlling the sparsity level of the transformation matrix has a clear interpretation: by setting such parameters to certain values, the results of the proposed methods are consistent with those produced by classic information criteria.
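The adaptive-Lasso idea summarized above can be illustrated with a short sketch. This is not the paper's exact algorithm; the function name, the penalty weight λ, and the exponent γ are illustrative assumptions. The key point is that each entry of the (de)mixing matrix gets its own threshold λ/|Ŵ_ij|^γ, so entries that the unpenalized ML estimate already finds small are penalized heavily and driven exactly to zero, while large entries are shrunk very little:

```python
import numpy as np

def adaptive_lasso_shrink(W_hat, lam=0.1, gamma=1.0):
    """One soft-thresholding (proximal) step with data-adaptive weights.

    W_hat : unpenalized estimate of the transformation matrix.
    Each entry's threshold is lam / |W_hat_ij|**gamma, so small entries
    are zeroed out while large entries remain nearly unbiased -- the
    'oracle' behavior of the adaptive Lasso.
    """
    weights = 1.0 / (np.abs(W_hat) ** gamma + 1e-12)  # data-adaptive weights
    thresh = lam * weights                            # per-entry thresholds
    return np.sign(W_hat) * np.maximum(np.abs(W_hat) - thresh, 0.0)

# Hypothetical 2x2 estimate with two near-zero connections.
W_hat = np.array([[1.00, 0.02],
                  [0.03, 0.90]])
W_sparse = adaptive_lasso_shrink(W_hat)
# The 0.02 and 0.03 entries are zeroed; 1.00 and 0.90 are only mildly shrunk.
```

In a full method the thresholded matrix would be re-estimated under the ICA likelihood rather than produced by a single shrinkage step; the sketch only shows why the data-adaptive weights yield sparsity with little bias on the large entries.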


References

  1. Akaike, H.: Information theory and an extension of the maximum likelihood principle. In: Proc. 2nd Int. Symp. on Information Theory, pp. 267–281 (1973)
  2. Fan, J., Li, R.: Variable selection via nonconcave penalized likelihood and its oracle properties. J. Amer. Statist. Assoc. 96, 1348–1360 (2001)
  3. Hassibi, B., Stork, D.G.: Second order derivatives for network pruning: Optimal brain surgeon. In: NIPS 5, pp. 164–171. Morgan Kaufmann, San Francisco (1993)
  4. Hyvärinen, A., Karhunen, J., Oja, E.: Independent Component Analysis. John Wiley & Sons, Inc. (2001)
  5. Hyvärinen, A., Karthikesh, R.: Imposing sparsity on the mixing matrix in independent component analysis. Neurocomputing 49, 151–162 (2002)
  6. Pham, D.T., Garat, P.: Blind separation of mixture of independent sources through a quasi-maximum likelihood approach. IEEE Trans. on Signal Processing 45(7), 1712–1725 (1997)
  7. Schwarz, G.: Estimating the dimension of a model. The Annals of Statistics 6, 461–464 (1978)
  8. Shimizu, S., Hoyer, P.O., Hyvärinen, A., Kerminen, A.J.: A linear non-Gaussian acyclic model for causal discovery. JMLR 7, 2003–2030 (2006)
  9. Silva, F.M., Almeida, L.B.: Acceleration techniques for the backpropagation algorithm. In: Neural Networks, pp. 110–119. Springer, Heidelberg (1990)
  10. Tibshirani, R.: Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society, Series B 58(1), 267–288 (1996)
  11. Zhang, K., Chan, L.-W.: ICA with sparse connections. In: Corchado, E., Yin, H., Botti, V., Fyfe, C. (eds.) IDEAL 2006. LNCS, vol. 4224, pp. 530–537. Springer, Heidelberg (2006)
  12. Zhang, K., Chan, L.: Minimal nonlinear distortion principle for nonlinear independent component analysis. JMLR 9, 2455–2487 (2008)
  13. Zhao, P., Yu, B.: On model selection consistency of lasso. JMLR 7, 2541–2563 (2006)
  14. Zou, H.: The adaptive lasso and its oracle properties. Journal of the American Statistical Association 101(476), 1417–1429 (2006)

Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Zhang, K., Peng, H., Chan, L., Hyvärinen, A. (2009). ICA with Sparse Connections: Revisited. In: Adali, T., Jutten, C., Romano, J.M.T., Barros, A.K. (eds) Independent Component Analysis and Signal Separation. ICA 2009. Lecture Notes in Computer Science, vol 5441. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-00599-2_25

  • DOI: https://doi.org/10.1007/978-3-642-00599-2_25

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-00598-5

  • Online ISBN: 978-3-642-00599-2

  • eBook Packages: Computer Science (R0)
