The importance of Markov chains in modeling diverse systems, including biological, physical, social, and economic systems, has long been recognized and is well documented. More recently, Markov chains have proven effective in internet search engines such as Google’s PageRank model [7], and in data mining applications in which data trends are sought. It is on this type of Markov chain application that we focus our research efforts. Our starting point is the work of Fiedler, who in the early 1970s developed a spectral partitioning method to obtain the minimum cut of an undirected graph (a symmetric system). The vector that results from the spectral decomposition, called the Fiedler vector, allows the nodes of the graph to be partitioned into two subsets. At about the same time that Fiedler proposed his spectral approach, Stewart proposed a method based on the dominant eigenvectors of a Markov chain, a method that applies more broadly to nonsymmetric systems. Combining these two somewhat orthogonal results, we show that spectral partitioning can be viewed in the framework of state clustering on Markov chains. Our research results to date are twofold. First, we prove that the second eigenvector of the signless Laplacian provides a heuristic solution to the NP-complete state clustering problem, which is the dual of the minimum cut problem. Second, we propose two clustering techniques for Markov chains based on two different clustering measures.
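The Fiedler-vector partitioning mentioned above can be sketched in a few lines. The sketch below is illustrative only: it uses the classical graph Laplacian L = D - A (not the paper's signless-Laplacian heuristic), and the example graph, two triangles joined by a single bridge edge, is an assumption chosen so the minimum cut is obvious. Nodes are split by the sign of the eigenvector belonging to the second-smallest eigenvalue of L.

```python
import numpy as np

def fiedler_partition(A):
    """Split an undirected graph into two subsets via the Fiedler vector.

    A is a symmetric 0/1 adjacency matrix. Illustrative sketch of
    Fiedler's spectral partitioning; not the signless-Laplacian method.
    """
    degrees = A.sum(axis=1)
    L = np.diag(degrees) - A               # graph Laplacian L = D - A
    eigvals, eigvecs = np.linalg.eigh(L)   # eigenvalues in ascending order
    fiedler = eigvecs[:, 1]                # second-smallest eigenvalue's vector
    return fiedler >= 0                    # sign pattern defines the two subsets

# Example graph (assumed for illustration): two triangles {0,1,2} and
# {3,4,5} joined by the single edge (2,3), so the minimum cut is that edge.
A = np.zeros((6, 6))
for i, j in [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]:
    A[i, j] = A[j, i] = 1

mask = fiedler_partition(A)
```

The sign of the Fiedler vector is not determined (eigenvectors are defined up to a factor of -1), so the result should be read as "nodes with equal sign fall in the same subset": here one subset is {0, 1, 2} and the other {3, 4, 5}, matching the minimum cut.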


Keywords: spectral clustering, graph partitioning, Markov chains, eigenvector


  1. Brand, M., Huang, K.: A unifying theorem for spectral embedding and clustering. In: 9th International Conference on Artificial Intelligence and Statistics (2003)
  2. Cvetkovic, D., Rowlinson, P., Simic, S.K.: Signless Laplacians of finite graphs. Linear Algebra and its Applications 423, 155–171 (2007)
  3. Fiedler, M.: Algebraic connectivity of graphs. Czechoslovak Mathematical Journal 23, 298–305 (1973)
  4. Filippone, M., Camastra, F., Masulli, F., Rovetta, S.: A survey of kernel and spectral methods for clustering. Pattern Recognition 41(1), 176–190 (2008)
  5. Golub, G.H., Van Loan, C.F.: Matrix Computations, 3rd edn. Johns Hopkins University Press (1996)
  6. Higham, D.J., Kibble, M.: A unified view of spectral clustering. University of Strathclyde Mathematics Research Report (2004)
  7. Hilgers, P.V., Langville, A.N.: The five greatest applications of Markov chains (2006)
  8. Meila, M., Shi, J.: Learning segmentation by random walks. In: NIPS, pp. 873–879 (2000)
  9. Meila, M., Shi, J.: A random walks view of spectral segmentation (2001)
  10. Meyer, C.D.: Matrix Analysis and Applied Linear Algebra. SIAM (2000)
  11. Pothen, A., Simon, H., Liou, K.-P.: Partitioning sparse matrices with eigenvectors of graphs. SIAM Journal on Matrix Analysis and Applications 11(3), 430–452 (1990)
  12. Shi, J., Malik, J.: Normalized cuts and image segmentation. IEEE Transactions on Pattern Analysis and Machine Intelligence 22, 888–905 (2000)
  13. Stewart, W.J.: Introduction to the Numerical Solution of Markov Chains. Princeton University Press (1994)

Copyright information

© IFIP International Federation for Information Processing 2011

Authors and Affiliations

  • Ning Liu¹
  • William J. Stewart¹
  1. Department of Computer Science, North Carolina State University, Raleigh, USA
