Quantum annealing for combinatorial clustering

  • Vaibhaw Kumar
  • Gideon Bass
  • Casey Tomlin
  • Joseph Dulny III


Clustering is a powerful machine learning technique that groups “similar” data points based on their characteristics. Many clustering algorithms work by approximately minimizing an objective function, namely the sum of within-cluster distances between points. The straightforward approach examines every possible assignment of points to clusters; it guarantees a global minimum, but the number of possible assignments grows so quickly with the number of data points that it becomes computationally intractable even for very small datasets. To circumvent this issue, cost-function minima are instead sought with popular local-search heuristics such as k-means and hierarchical clustering. Due to their greedy nature, such techniques do not guarantee that a global minimum will be found and can lead to sub-optimal clustering assignments. Other classes of global-search techniques, such as simulated annealing, tabu search, and genetic algorithms, may offer better-quality results but can be prohibitively time-consuming. In this work, we describe how quantum annealing can be used to carry out clustering. We map the clustering objective to a quadratic binary optimization (QUBO) problem and discuss two clustering algorithms, which are then implemented on commercially available quantum annealing hardware as well as on the purely classical solver “qbsolv.” The first algorithm assigns N data points to K clusters, and the second can be used to perform binary clustering in a hierarchical manner. We present our results in the form of benchmarks against the well-known k-means algorithm and discuss the advantages and disadvantages of the proposed techniques.
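The abstract's mapping of the clustering objective to a QUBO can be illustrated with a minimal sketch. The standard one-hot encoding (which the paper's first algorithm follows in spirit) uses a binary variable x[i*K + c] = 1 when point i is assigned to cluster c; the objective sums within-cluster pairwise distances, and a quadratic penalty enforces that each point lands in exactly one cluster. Function names and the penalty value below are illustrative assumptions, not the paper's code, and a brute-force enumerator stands in for the annealer so the sketch is self-contained:

```python
import itertools
import numpy as np

def clustering_qubo(points, k, penalty):
    """Build an upper-triangular QUBO matrix for assigning N points to k clusters.

    Variable x[i*k + c] = 1 means point i belongs to cluster c.
    Objective: sum of pairwise distances within each cluster, plus
    penalty * (sum_c x_ic - 1)^2 per point to enforce one-hot assignment.
    """
    n = len(points)
    # Pairwise Euclidean distance matrix.
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    q = np.zeros((n * k, n * k))
    # Distance term: d(i, j) is paid when i and j share cluster c.
    for i in range(n):
        for j in range(i + 1, n):
            for c in range(k):
                q[i * k + c, j * k + c] += d[i, j]
    # One-hot penalty, expanded: -penalty on the diagonal,
    # +2*penalty between two cluster slots of the same point.
    for i in range(n):
        for c in range(k):
            q[i * k + c, i * k + c] -= penalty
            for c2 in range(c + 1, k):
                q[i * k + c, i * k + c2] += 2 * penalty
    return q

def brute_force_solve(q):
    """Exhaustively minimize x^T Q x over binary vectors (small sizes only)."""
    nvar = q.shape[0]
    best_x, best_e = None, float("inf")
    for bits in itertools.product((0, 1), repeat=nvar):
        x = np.array(bits)
        e = float(x @ q @ x)
        if e < best_e:
            best_e, best_x = e, x
    return best_x, best_e

# Two well-separated pairs of points, k = 2: the minimum groups each pair.
pts = np.array([[0.0, 0.0], [0.0, 1.0], [10.0, 10.0], [10.0, 11.0]])
x, e = brute_force_solve(clustering_qubo(pts, k=2, penalty=50.0))
labels = x.reshape(4, 2).argmax(axis=1)
```

On annealing hardware (or qbsolv) the same matrix `q` would be submitted as the QUBO, replacing the exhaustive loop; the exponential cost of that loop is exactly the intractability the abstract describes.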


Keywords: Quantum annealing · Clustering · Machine learning · k-means



We acknowledge the support of the Universities Space Research Association, Quantum AI Lab Research Opportunity Program, Cycle 2.



Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2018

Authors and Affiliations

  1. Booz Allen Hamilton, McLean, USA
