
Toward Robust and Fast Two-Dimensional Linear Discriminant Analysis

  • Tetsuya Yoshida
  • Yuu Yamada
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8210)

Abstract

This paper presents an approach toward robust and fast Two-Dimensional Linear Discriminant Analysis (2DLDA). 2DLDA is an extension of Linear Discriminant Analysis (LDA) to two-dimensional objects such as images. In 2DLDA, the linear transformation matrices are calculated iteratively from the eigenvectors of asymmetric matrices, and this repeated eigendecomposition of asymmetric matrices can lead to unstable performance. We propose to use simultaneous diagonalization of the scatter matrices so that the eigenvectors can be calculated stably. Furthermore, for fast calculation, we propose an approximate decomposition of a scatter matrix based on its leading eigenvectors. Preliminary experiments are conducted to investigate the effectiveness of our approach. The results are encouraging and indicate that our approach achieves performance comparable to the original 2DLDA with reduced computation time.
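The abstract outlines two ingredients: replacing the asymmetric eigenproblem with a simultaneous diagonalization of the scatter matrices, and approximating a scatter matrix by its leading eigenvectors. The sketch below (NumPy; the names S_w, S_b and the rank parameter k are illustrative assumptions, and this is not the authors' exact 2DLDA procedure) shows the general technique: whiten the within-class scatter so that only symmetric eigenproblems are solved, optionally keeping only the k leading eigenvectors for speed.

    import numpy as np

    def simultaneous_diagonalization(S_w, S_b, k=None, eps=1e-10):
        """Return W with W.T @ S_w @ W = I and W.T @ S_b @ W diagonal.

        S_w, S_b : symmetric positive semi-definite scatter matrices (d x d)
        k        : if given, keep only the k leading eigenvectors of S_w
                   (approximate decomposition for faster computation)
        """
        # Symmetric eigendecomposition of the within-class scatter.
        evals, evecs = np.linalg.eigh(S_w)            # ascending eigenvalues
        order = np.argsort(evals)[::-1]               # reorder descending
        evals, evecs = evals[order], evecs[:, order]

        if k is not None:                             # low-rank approximation
            evals, evecs = evals[:k], evecs[:, :k]

        evals = np.maximum(evals, eps)                # guard against singularity
        T = evecs / np.sqrt(evals)                    # whitening: T.T @ S_w @ T = I

        # The whitened between-class scatter is symmetric, so its eigenvectors
        # are obtained stably with eigh instead of an asymmetric eigensolver.
        M = T.T @ S_b @ T
        _, V = np.linalg.eigh(M)
        return T @ V                                  # columns jointly diagonalize S_w and S_b

For random positive semi-definite inputs, W = simultaneous_diagonalization(S_w, S_b) yields W.T @ S_w @ W close to the identity and W.T @ S_b @ W close to diagonal; passing a small k trades a controlled approximation of S_w for lower computation time, mirroring the speed-accuracy trade-off described in the abstract.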

Keywords

Linear Discriminant Analysis, Class Number, Normalized Mutual Information, Scatter Matrix, Label Information



Copyright information

© Springer International Publishing Switzerland 2013

Authors and Affiliations

  • Tetsuya Yoshida¹
  • Yuu Yamada¹
  1. Graduate School of Information Science and Technology, Hokkaido University, Sapporo, Japan
