Clustering with Hypergraphs: The Case for Large Hyperedges

  • Pulak Purkait
  • Tat-Jun Chin
  • Hanno Ackermann
  • David Suter
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8692)


The extension of conventional clustering to hypergraph clustering, which uses higher-order similarities instead of pairwise similarities, is gaining increasing attention in computer vision. This is because many grouping problems require an affinity measure defined over a subset of data of size greater than two, i.e., a hyperedge. Almost all previous works, however, have considered only the smallest possible hyperedge size, owing to a lack of study into the potential benefits of large hyperedges and of effective algorithms to generate them. In this paper, we show that large hyperedges are better from both theoretical and empirical standpoints. We then propose a novel guided sampling strategy for large hyperedges, based on the concept of random cluster models. Our method can generate pure large hyperedges that significantly improve grouping accuracy without exponential increases in sampling costs. In the important applications of face clustering and motion segmentation, our method demonstrates substantially better accuracy and efficiency.
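The abstract's central idea — growing large hyperedges by guided sampling so that the sampled subsets stay "pure" (drawn from a single structure) — can be illustrated with a minimal sketch. This is not the authors' algorithm; it is an illustrative toy under assumed conditions: synthetic two-line data, a simple line-through-origin model, and hypothetical names (`guided_hyperedge`, `line_residuals`, `purity`) and parameters (`beta`) chosen for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for multi-structure data: points on two 1-D
# subspaces (lines through the origin) in 2-D, with small noise.
n = 50
X = np.vstack([
    np.outer(rng.uniform(-1, 1, n), [1.0, 0.2]),   # structure 0
    np.outer(rng.uniform(-1, 1, n), [0.2, 1.0]),   # structure 1
]) + 0.01 * rng.standard_normal((2 * n, 2))
labels = np.repeat([0, 1], n)

def line_residuals(pts, sample):
    """Fit a line through the origin to pts[sample]; return all residuals."""
    _, _, vt = np.linalg.svd(pts[sample], full_matrices=False)
    d = vt[0]                              # dominant direction of the sample
    return np.linalg.norm(pts - np.outer(pts @ d, d), axis=1)

def guided_hyperedge(pts, k, beta=200.0):
    """Grow one size-k hyperedge: uniform seed, then add points with
    probability decaying in their residual to the current model fit."""
    edge = [int(rng.integers(len(pts)))]
    while len(edge) < k:
        w = np.exp(-beta * line_residuals(pts, edge) ** 2) + 1e-12
        w[edge] = 0.0                      # no repeated members
        edge.append(int(rng.choice(len(pts), p=w / w.sum())))
    return edge

def purity(edge):
    """Fraction of the hyperedge belonging to its dominant structure."""
    counts = np.bincount(labels[edge])
    return counts.max() / len(edge)

edges = [guided_hyperedge(X, k=10) for _ in range(200)]
print("mean purity of guided size-10 hyperedges:",
      round(float(np.mean([purity(e) for e in edges])), 2))
```

Uniform sampling of a size-10 subset from two equal structures is almost never pure, which is why naive large hyperedges fail; the guided growth above biases each added point toward the structure of the current members, keeping purity high without exhaustive enumeration.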


Keywords: Hypergraph clustering · Model fitting · Guided sampling





Copyright information

© Springer International Publishing Switzerland 2014

Authors and Affiliations

  • Pulak Purkait¹
  • Tat-Jun Chin¹
  • Hanno Ackermann²
  • David Suter¹
  1. The University of Adelaide, Australia
  2. Leibniz Universität Hannover, Germany
