Principles and Guidelines for Quantum Performance Analysis

Catherine C. McGeoch
Conference paper
Part of the Lecture Notes in Computer Science book series (LNCS, volume 11413)


Expanding access to practical quantum computers prompts a widespread need to evaluate their performance. Principles and guidelines for carrying out sound empirical work on quantum computing systems are proposed. The guidelines draw heavily on classical experience in experimental algorithmics and computer systems performance analysis, with some adjustments to address issues in quantum computing. The focus is on issues related to quantum annealing processors, although much of the discussion applies to more general scenarios.
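One quantity central to the kind of performance analysis the paper discusses is time-to-solution (TTS), the standard metric in the quantum-annealing benchmarking literature: the expected total anneal time needed to observe an optimal sample at least once with a target probability (conventionally 99%), given the empirically measured per-anneal success probability. A minimal sketch of the computation follows; the function name and parameters are illustrative, not taken from any particular library.

```python
import math

def time_to_solution(anneal_time_us, success_prob, target=0.99):
    """Expected total anneal time (microseconds) to see at least one
    optimal sample with probability `target`, given the per-anneal
    success probability `success_prob`."""
    if not 0.0 < success_prob <= 1.0:
        raise ValueError("success_prob must be in (0, 1]")
    if success_prob >= target:
        return anneal_time_us  # a single anneal already suffices
    # Number of independent repetitions R satisfying
    # 1 - (1 - p)^R >= target  =>  R = ln(1 - target) / ln(1 - p)
    repeats = math.log(1.0 - target) / math.log(1.0 - success_prob)
    return anneal_time_us * repeats

# Example: a 20 us anneal that returns the optimum on 5% of reads
tts = time_to_solution(20.0, 0.05)
```

Because the per-anneal success probability is itself an estimate from a finite sample, a careful study would report TTS with confidence intervals (e.g. via bootstrapping over reads) rather than as a point value.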


Keywords: Quantum computing · Experimental methodology



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. D-Wave Systems, Burnaby, Canada
