
A Robust Objective Function of Joint Approximate Diagonalization

  • Yoshitatsu Matsuda
  • Kazunori Yamaguchi
Part of the Lecture Notes in Computer Science book series (LNCS, volume 7553)

Abstract

Joint approximate diagonalization (JAD) is a method for solving blind source separation, which can extract non-Gaussian sources without any other prior knowledge. However, it is not robust when the sample size is small, because JAD relies on a purely algebraic objective function. In this paper, a new robust objective function for JAD is derived by an information-theoretic approach. Previous works have shown that the “true” probability distribution of the non-diagonal elements of the approximately diagonalized cumulant matrices in JAD is Gaussian with a fixed variance. Here, the distribution of the diagonal elements is also approximated as Gaussian, with the variance treated as an adjustable parameter. The new objective function is then defined as the likelihood of this distribution. Numerical experiments verify that the new objective function is effective when the sample size is small.
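As a rough illustration of the kind of likelihood-based objective the abstract describes (not the paper's exact formula), one can write a sketch under the following assumptions: an orthogonal separating matrix W, sample cumulant matrices C_k, a fixed variance σ₀² for the non-diagonal elements, and an adjustable variance σ² with, for illustration only, a zero-mean Gaussian model for the diagonal elements; all symbols here are illustrative and not taken from the paper.

\[
L(W, \sigma^2) \;=\; -\sum_{k} \left[ \sum_{i \neq j} \frac{\left[ W C_k W^{\top} \right]_{ij}^{2}}{2\sigma_0^{2}}
\;+\; \sum_{i} \left( \frac{\left[ W C_k W^{\top} \right]_{ii}^{2}}{2\sigma^{2}} + \frac{1}{2}\log \sigma^{2} \right) \right] + \mathrm{const.}
\]

In such a sketch, the objective is maximized jointly over W and the adjustable variance σ², whereas keeping only the off-diagonal term roughly corresponds to the standard algebraic JAD criterion of minimizing the sum of squared off-diagonal elements.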

Keywords

blind source separation, independent component analysis, joint approximate diagonalization, information-theoretic approach


References

  1. Amari, S., Cichocki, A.: A new learning algorithm for blind signal separation. In: Touretzky, D., Mozer, M., Hasselmo, M. (eds.) Advances in Neural Information Processing Systems 8, pp. 757–763. MIT Press, Cambridge (1996)
  2. Cardoso, J.F.: High-order contrasts for independent component analysis. Neural Computation 11(1), 157–192 (1999)
  3. Cardoso, J.F., Souloumiac, A.: Blind beamforming for non-Gaussian signals. IEE Proceedings-F 140(6), 362–370 (1993)
  4. Cichocki, A., Amari, S.: Adaptive Blind Signal and Image Processing: Learning Algorithms and Applications. Wiley (2002)
  5. Hyvärinen, A., Karhunen, J., Oja, E.: Independent Component Analysis. Wiley (2001)
  6. Lee, T.W., Girolami, M., Sejnowski, T.J.: Independent component analysis using an extended infomax algorithm for mixed subgaussian and supergaussian sources. Neural Computation 11(2), 417–441 (1999)
  7. Matsuda, Y., Yamaguchi, K.: An adaptive threshold in joint approximate diagonalization by assuming exponentially distributed errors. Neurocomputing 74, 1994–2001 (2011)
  8. Matsuda, Y., Yamaguchi, K.: An information theoretic approach to joint approximate diagonalization. In: Lu, B.-L., Zhang, L., Kwok, J. (eds.) ICONIP 2011, Part I. LNCS, vol. 7062, pp. 20–27. Springer, Heidelberg (2011)

Copyright information

© Springer-Verlag Berlin Heidelberg 2012

Authors and Affiliations

  • Yoshitatsu Matsuda: Department of Integrated Information Technology, Aoyama Gakuin University, Sagamihara-shi, Japan
  • Kazunori Yamaguchi: Department of General Systems Studies, Graduate School of Arts and Sciences, The University of Tokyo, Meguro-ku, Japan
