Graph Pattern Spaces from Laplacian Spectral Polynomials

  • Bin Luo
  • Richard C. Wilson
  • Edwin R. Hancock
Part of the Lecture Notes in Computer Science book series (LNCS, volume 3211)

Abstract

Graph structures have proved computationally cumbersome for pattern analysis. The reason is that before graphs can be converted to pattern vectors, correspondences must be established between the nodes of structures that are potentially of different size. To overcome this problem, in this paper we turn to the spectral decomposition of the Laplacian matrix. We show how the elements of the spectral matrix for the Laplacian can be used to construct symmetric polynomials that are permutation invariant. The coefficients of these polynomials can be used as graph features and encoded in vectorial form. We explore whether the resulting vectors of invariants can be embedded in a low-dimensional space using a number of alternative strategies, including principal components analysis (PCA), multidimensional scaling (MDS) and locality preserving projection (LPP).
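
As a concrete illustration of the construction outlined above, the Python sketch below computes a permutation-invariant feature vector from the Laplacian spectral matrix of a graph. It is a minimal sketch, not the authors' exact recipe: the function names, the square-root eigenvalue scaling of the eigenvectors, the choice of elementary symmetric polynomials evaluated via Newton's identities, and the truncation order k are all illustrative assumptions.

    import numpy as np

    def laplacian(adj):
        # Graph Laplacian L = D - A for a symmetric adjacency matrix.
        return np.diag(adj.sum(axis=1)) - adj

    def spectral_features(adj, k=4):
        # Eigendecomposition L = U diag(w) U^T; eigh returns eigenvalues in
        # ascending order, giving a node-order-independent mode ordering.
        w, U = np.linalg.eigh(laplacian(adj))
        # Spectral matrix: column j of U scaled by sqrt(w_j) (an assumed
        # scaling). A node permutation only permutes the rows of phi.
        phi = np.sqrt(np.maximum(w, 0.0)) * U
        feats = []
        for col in phi.T:  # entries of each column are permuted together
            # Power sums p_r = sum_i col_i^r for r = 1..k.
            p = [np.sum(col ** r) for r in range(1, k + 1)]
            # Newton's identities recover the elementary symmetric
            # polynomials e_1..e_k from the power sums; their values are
            # invariant to the order of the entries, hence to node relabelling.
            e = [1.0]  # e_0 = 1
            for r in range(1, k + 1):
                s = sum((-1) ** (j - 1) * e[r - j] * p[j - 1]
                        for j in range(1, r + 1))
                e.append(s / r)
            feats.extend(e[1:])
        return np.array(feats)

Stacking such vectors over a corpus of graphs, a low-dimensional pattern space could then be obtained by, for example, PCA on the centred feature matrix. Note one practical caveat: the sign of each eigenvector is arbitrary, so odd-degree invariants can flip sign, and some sign-fixing convention would be needed in practice.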

Keywords

Multidimensional Scaling · Independent Component Analysis · Laplacian Matrix · Symmetric Polynomial · Pattern Space

Copyright information

© Springer-Verlag Berlin Heidelberg 2004

Authors and Affiliations

  • Bin Luo¹
  • Richard C. Wilson¹
  • Edwin R. Hancock¹

  1. Department of Computer Science, University of York, York, UK