
Practical Algorithms of Spectral Clustering: Toward Large-Scale Vision-Based Motion Analysis

Chapter
Machine Learning for Vision-Based Motion Analysis

Part of the book series: Advances in Pattern Recognition (ACVPR)

Abstract

This chapter presents practical algorithms of spectral clustering for large-scale data. Spectral clustering is a kernel-based method for grouping data that lie on separate nonlinear manifolds. Reducing its computational expense without a critical loss of accuracy contributes to its practical use, especially in vision-based applications. The present algorithms exploit random projection and subsampling techniques to reduce the dimensionality of the data and the cost of evaluating pairwise similarities. The computation time is quasilinear in the data cardinality, and in some appearance-based applications it can be independent of the data dimensionality. The efficiency of the algorithms is demonstrated in appearance-based image/video segmentation.
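As a rough illustration of the pipeline the abstract describes, the following sketch runs spectral clustering after a Johnson–Lindenstrauss-style random projection. It is only a minimal sketch, not the chapter's algorithm: the Gaussian affinity, the Ng–Jordan–Weiss normalization, the projected dimension rp_dim, and the function name spectral_clustering_rp are illustrative assumptions, and the full n×n affinity matrix is formed here instead of being avoided by the subsampling the chapter uses to reach quasilinear time.

```python
# Hypothetical illustration only: random projection + NJW-style spectral clustering.
import numpy as np
from scipy.sparse.linalg import eigsh
from sklearn.cluster import KMeans

def spectral_clustering_rp(X, n_clusters, rp_dim=50, sigma=1.0, seed=0):
    rng = np.random.default_rng(seed)
    n, d = X.shape
    # Johnson-Lindenstrauss-style random projection: pairwise distances are
    # approximately preserved, so similarities can be evaluated in rp_dim
    # dimensions instead of the original d.
    R = rng.standard_normal((d, rp_dim)) / np.sqrt(rp_dim)
    Y = X @ R
    # Gaussian affinities on the projected data (dense here for simplicity;
    # the chapter's point is to avoid forming the full n x n matrix).
    sq = np.sum(Y ** 2, axis=1)
    W = np.exp(-np.maximum(sq[:, None] + sq[None, :] - 2.0 * Y @ Y.T, 0.0)
               / (2.0 * sigma ** 2))
    np.fill_diagonal(W, 0.0)
    # Leading eigenvectors of the normalized affinity D^{-1/2} W D^{-1/2}.
    d_inv_sqrt = 1.0 / np.sqrt(W.sum(axis=1) + 1e-12)
    L_sym = d_inv_sqrt[:, None] * W * d_inv_sqrt[None, :]
    _, U = eigsh(L_sym, k=n_clusters, which='LA')
    # Row-normalize the spectral embedding and cluster it with k-means.
    U = U / (np.linalg.norm(U, axis=1, keepdims=True) + 1e-12)
    return KMeans(n_clusters=n_clusters, n_init=10, random_state=seed).fit_predict(U)
```

For appearance-based data one would call, for example, spectral_clustering_rp(frames.reshape(n, -1), n_clusters=3); after the projection, the cost of the affinity computation depends on rp_dim rather than on the original pixel dimensionality.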


Notes

  1. C can be a forward circulant matrix. We prefer the back-circulant matrix simply because it is symmetric (see the sketch below).
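The symmetry remark in the note can be checked with a few lines of NumPy. The snippet below is only an illustration (the constructions and function names are ours, and the role the matrix C plays in the chapter's algorithm is not reproduced): it builds a forward circulant and a back-circulant matrix from the same vector and confirms that only the latter is symmetric.

```python
# Illustrative sketch (not from the chapter): forward- vs back-circulant
# matrices built from the same vector c. The back-circulant matrix
# C[i, j] = c[(i + j) mod n] is symmetric by construction, whereas the
# forward circulant C[i, j] = c[(j - i) mod n] generally is not.
import numpy as np

def forward_circulant(c):
    n = len(c)
    idx = (np.arange(n)[None, :] - np.arange(n)[:, None]) % n  # (j - i) mod n
    return np.asarray(c)[idx]

def back_circulant(c):
    n = len(c)
    idx = (np.arange(n)[None, :] + np.arange(n)[:, None]) % n  # (i + j) mod n
    return np.asarray(c)[idx]

c = np.array([1.0, 2.0, 3.0, 4.0])
print(np.allclose(forward_circulant(c), forward_circulant(c).T))  # False
print(np.allclose(back_circulant(c), back_circulant(c).T))        # True
```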

References

  1. Achlioptas, D.: Database-friendly random projections: Johnson–Lindenstrauss with binary coins. J. Comput. Syst. Sci. 66, 671–687 (2003)


  2. Ali, S., Shah, M.: A Lagrangian particle dynamics approach for crowd flow segmentation and stability analysis. In: CVPR (2007)


  3. Berry, M.W.: Large scale sparse singular value computations. Int. J. Supercomput. Appl. 6, 13–49 (1992)


  4. Bingham, E., Mannila, H.: Random projection in dimensionality reduction: applications to image and text data. In: ACM SIGKDD ICKDDM, pp. 245–250. ACM, New York (2001)


  5. Carnegie Mellon University Informedia Project: Mountain Skywater, segment 11 of 12. http://www.open-video.org/ (1996)

  6. Cour, T., Benezit, F., Shi, J.: Spectral segmentation with multiscale graph decomposition. In: CVPR Proc., vol. 2, pp. 1124–1131. IEEE Comput. Soc., Washington (2005)


  7. Dasgupta, S., Gupta, A.: An elementary proof of the Johnson–Lindenstrauss lemma. Technical Report, UC Berkeley (1999)


  8. Dhillon, I.S.: Co-clustering documents and words using bipartite spectral graph partitioning. In: ACM SIGKDD, pp. 269–274. ACM, New York (2001)


  9. Ding, C.H.Q., He, X., Zha, H., Gu, M., Simon, H.D.: A min-max cut algorithm for graph partitioning and data clustering. In: Proc. of ICDM 2001, pp. 107–114 (2001)


  10. Drineas, P., Mahoney, M.W.: On the Nyström method for approximating a Gram matrix for improved kernel-based learning. J. Mach. Learn. Res. 6, 2153–2175 (2005)


  11. Eibl, G., Brändle, N.: Evaluation of clustering methods for finding dominant optical flow fields in crowded scenes. In: ICPR08 (2008)


  12. Fiedler, M.: A property of eigenvectors of nonnegative symmetric matrices and its application to graph theory. Czechoslov. Math. J. 25, 619–633 (1975)


  13. Fowlkes, C., Belongie, S., Chung, F., Malik, J.: Spectral grouping using the Nyström method. IEEE Trans. Pattern Anal. Mach. Intell. 26(2), 214–225 (2004)


  14. Fradkin, D., Madigan, D.: Experiments with random projections for machine learning. In: ACM SIGKDD ICKDDM, pp. 517–522. ACM, New York (2003)


  15. Freitas, N.D., Wang, Y., Mahdaviani, M., Lang, D.: Fast Krylov methods for N-body learning. In: Advances in Neural Information Processing Systems, vol. 18, pp. 251–258. MIT Press, Cambridge (2006)


  16. Golub, G.H., Van Loan, C.F.: Matrix Computations, 3rd edn. Johns Hopkins University Press, Baltimore (1996)


  17. Gu, M., Eisenstat, S.C.: A stable and fast algorithm for updating the singular value decomposition. Technical Report YALEU/DCS/RR-966, Yale University (1994)


  18. Hagen, L., Kahng, A.: New spectral methods for ratio cut partitioning and clustering. IEEE Trans. Comput. Aided Des. 11(9), 1074–1085 (1992)


  19. IEICE Information and System Society: PRMU algorithm contest 2006: Shot boundary detection from image sequence, sample data. http://www-sens.sys.es.osaka-u.ac.jp/alcon/data/level3.avi (2006) (in Japanese)

  20. Johnson, W., Lindenstrauss, J.: Extensions of Lipschitz mappings into a Hilbert space. Contemp. Math. 26, 189–206 (1984)


  21. Mahadevan, S.: Fast spectral learning using Lanczos eigenspace projections. In: AAAI, pp. 1472–1475 (2008)


  22. Ng, A.Y., Jordan, M.I., Weiss, Y.: On spectral clustering: analysis and an algorithm. In: Advances in Neural Information Processing Systems, vol. 14, pp. 849–856. MIT Press, Cambridge (2002)


  23. Sakai, T.: Monte Carlo subspace method: an incremental approach to high-dimensional data classification. In: International Conference on Pattern Recognition (2008)


  24. Scott, G.L., Longuet-Higgins, H.C.: Feature grouping by relocalisation of eigenvectors of the proximity matrix. In: British Machine Vision Conference, pp. 103–108 (1990)


  25. Shi, J., Malik, J.: Normalized cuts and image segmentation. IEEE Trans. Pattern Anal. Mach. Intell. 22(8), 888–905 (2000)


  26. Song, Y., Chen, W.Y., Bai, H., Lin, C.J., Chang, E.Y.: Parallel spectral clustering. In: ECML PKDD. Lecture Notes in Computer Science, vol. 5212, pp. 374–389. Springer, Berlin (2008)


  27. Strehl, A., Ghosh, J.: Cluster ensembles—a knowledge reuse framework for combining multiple partitions. J. Mach. Learn. Res. 3, 583–617 (2002)


  28. Vempala, S.S.: The Random Projection Method. Series in Discrete Mathematics and Theoretical Computer Science, vol. 65. American Mathematical Society, Providence (2004)


  29. Von Luxburg, U.: A tutorial on spectral clustering. Stat. Comput. 17(4), 395–416 (2007)


  30. Watanabe, T., Takimoto, E., Amano, K., Maruoka, A.: Random projection and its application to learning. In: Proc. 2005 Workshop on Randomness and Computation, pp. 3–4 (2005)


  31. Weiss, Y.: Segmentation using eigenvectors: a unifying view. In: International Conference on Computer Vision, pp. 975–982 (1999)


  32. Williams, C.K.I., Seeger, M.: Using the Nyström method to speed up kernel machines. In: Advances in Neural Information Processing Systems, vol. 13, pp. 682–688. MIT Press, Cambridge (2001)


  33. Yu, S.X., Shi, J.: Multiclass spectral clustering. In: International Conference on Computer Vision, pp. 313–319 (2003)


  34. Zach, C., Pock, T., Bischof, H.: A duality based approach for realtime TV-L1 optical flow. In: Pattern Recognition, Proc. DAGM, Heidelberg, Germany, pp. 214–223 (2007)


  35. Zelnik-Manor, L., Perona, P.: Self-tuning spectral clustering. In: Advances in Neural Information Processing Systems, vol. 17, pp. 1601–1608. MIT Press, Cambridge (2004)


  36. Zhang, K., Kwok, J.T.: Density-weighted Nyström method for computing large kernel eigensystems. Neural Comput. 21(1), 121–146 (2009). doi:10.1162/neco.2009.11-07-651



Acknowledgement

The first author was partially supported by the Grant-in-Aid for Young Scientists from the Ministry of Education, Culture, Sports, Science and Technology of Japan under MEXT KAKEN 22700163.

Author information

Corresponding author

Correspondence to Tomoya Sakai.


Appendix: Clustering Scores

The conditional entropy (CE) and the normalized mutual information (NMI) [27] are defined as follows.

$$
\mathrm{CE} = \sum_{i=1}^{k} \frac{|C_i|}{-n\log k} \sum_{j=1}^{k} \frac{|X_{ij}|}{|C_i|} \log\frac{|X_{ij}|}{|C_i|},
\qquad
\mathrm{NMI} = \frac{\displaystyle\sum_{i=1}^{k}\sum_{j=1}^{k} \frac{|X_{ij}|}{n} \log\frac{n\,|X_{ij}|}{|C_i|\,|A_j|}}
{\sqrt{\left(\displaystyle\sum_{i=1}^{k} \frac{|C_i|}{n}\log\frac{|C_i|}{n}\right)\left(\displaystyle\sum_{j=1}^{k} \frac{|A_j|}{n}\log\frac{|A_j|}{n}\right)}}.
$$

Here, |C_i| and |A_j| are the numbers of samples in the estimated cluster C_i and the optimal cluster A_j, respectively, X_{ij} = C_i ∩ A_j is the set of samples common to both, n is the total number of samples, and k is the number of clusters. The smaller the CE, or the larger the NMI, the better the clustering result. The NMI takes values between 0 and 1.
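As a sanity check, both scores can be computed directly from two label vectors. The function below is a minimal sketch following the definitions above; the name clustering_scores and the label-vector interface are our own, and degenerate cases (e.g. k = 1) are not handled.

```python
# Sketch of the clustering scores defined above, computed from two label
# vectors: `est` (estimated clusters C_i) and `opt` (optimal clusters A_j).
# Function name and interface are ours, not the chapter's.
import numpy as np

def clustering_scores(est, opt):
    est, opt = np.asarray(est), np.asarray(opt)
    n = est.size
    C = np.unique(est)   # estimated cluster labels
    A = np.unique(opt)   # ground-truth cluster labels
    k = C.size
    # Contingency counts |X_ij| = |C_i intersect A_j|
    X = np.array([[np.sum((est == ci) & (opt == aj)) for aj in A] for ci in C],
                 dtype=float)
    nC = X.sum(axis=1)   # |C_i|
    nA = X.sum(axis=0)   # |A_j|
    with np.errstate(divide='ignore', invalid='ignore'):
        # Conditional entropy, normalized by log k (0 is best).
        ce_terms = np.where(X > 0, (X / nC[:, None]) * np.log(X / nC[:, None]), 0.0)
        CE = -(nC[:, None] * ce_terms).sum() / (n * np.log(k))
        # Normalized mutual information (1 is best).
        mi_terms = np.where(X > 0,
                            (X / n) * np.log(n * X / (nC[:, None] * nA[None, :])),
                            0.0)
        HC = -np.sum((nC / n) * np.log(nC / n))
        HA = -np.sum((nA / n) * np.log(nA / n))
        NMI = mi_terms.sum() / np.sqrt(HC * HA)
    return CE, NMI

# A perfect clustering up to label permutation gives CE = 0 and NMI = 1:
print(clustering_scores([0, 0, 1, 1, 2, 2], [1, 1, 2, 2, 0, 0]))
```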

Rights and permissions

Reprints and permissions

Copyright information

© 2011 Springer-Verlag London Limited

About this chapter

Cite this chapter

Sakai, T., Imiya, A. (2011). Practical Algorithms of Spectral Clustering: Toward Large-Scale Vision-Based Motion Analysis. In: Wang, L., Zhao, G., Cheng, L., Pietikäinen, M. (eds) Machine Learning for Vision-Based Motion Analysis. Advances in Pattern Recognition. Springer, London. https://doi.org/10.1007/978-0-85729-057-1_1


  • DOI: https://doi.org/10.1007/978-0-85729-057-1_1

  • Publisher Name: Springer, London

  • Print ISBN: 978-0-85729-056-4

  • Online ISBN: 978-0-85729-057-1

  • eBook Packages: Computer Science (R0)
