Highly Sparse Reductions to Kernel Spectral Clustering

  • Raghvendra Mall
  • Rocco Langone
  • Johan A. K. Suykens
Part of the Lecture Notes in Computer Science book series (LNCS, volume 8251)


Kernel spectral clustering is a model-based spectral clustering method formulated in a primal-dual framework. It has a powerful out-of-sample extension property and a model selection procedure based on the balanced line fit criterion. This paper improves upon previous work that sparsified the kernel spectral clustering method by exploiting the line structure of the data projections in the eigenspace. However, that method works only for well-formed and well-separated clusters, since in other cases the line structure is lost. In this paper we propose two highly sparse extensions of kernel spectral clustering that overcome these limitations. For the selection of the reduced set we use the concept of angles between the data projections in the eigenspace. We demonstrate the effectiveness of the proposed methods, and the amount of sparsity they achieve, on several synthetic and real-world datasets.
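The angle-based selection mentioned above can be illustrated with a minimal sketch. The function below is a hypothetical greedy procedure, not the paper's exact algorithm: it assumes each data point has a projection vector in the eigenspace, and it keeps a point for the reduced set only when its projection direction differs from all previously selected directions by more than a chosen angle threshold.

```python
import numpy as np

def select_reduced_set(projections, angle_threshold_deg=15.0):
    """Greedy angle-based reduced-set selection (illustrative sketch;
    the criterion in the paper may differ in its details).

    projections: (n_points, k) array of eigenspace projections.
    Returns indices whose projection directions are mutually
    separated by more than angle_threshold_deg degrees.
    """
    # Normalize each projection to a unit direction vector.
    norms = np.linalg.norm(projections, axis=1, keepdims=True)
    directions = projections / np.clip(norms, 1e-12, None)

    # angle > threshold  <=>  |cos(angle)| < cos(threshold)
    cos_thresh = np.cos(np.deg2rad(angle_threshold_deg))
    selected = []
    for i in range(len(directions)):
        # Keep point i only if it is not nearly collinear with any
        # already-selected direction (abs() treats opposite rays as
        # the same line, matching the line structure of KSC scores).
        if all(abs(directions[i] @ directions[j]) < cos_thresh
               for j in selected):
            selected.append(i)
    return selected

# Toy usage: 40 points scattered tightly around two cluster directions
# collapse to a reduced set of one representative per direction.
rng = np.random.default_rng(0)
d1, d2 = np.array([1.0, 0.0]), np.array([0.0, 1.0])
proj = np.vstack(
    [d1 * s + rng.normal(0, 0.01, 2) for s in np.linspace(1, 2, 20)]
    + [d2 * s + rng.normal(0, 0.01, 2) for s in np.linspace(1, 2, 20)]
)
reduced = select_reduced_set(proj, angle_threshold_deg=15.0)
print(len(reduced))
```

The sketch captures why the approach yields high sparsity: points in a well-separated cluster project along a common direction, so each direction needs only one representative in the reduced set, regardless of cluster size.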





Copyright information

© Springer-Verlag Berlin Heidelberg 2013

Authors and Affiliations

  • Raghvendra Mall (1)
  • Rocco Langone (1)
  • Johan A. K. Suykens (1)
  1. Department of Electrical Engineering, ESAT-SCD, Katholieke Universiteit Leuven, Leuven, Belgium
