Use of Simulation in High-Stakes Summative Assessments in Surgery

  • Sandra de Montbrun
  • Ajit K. Sachdeva
Part of the Comprehensive Healthcare Simulation book series (CHS)


The recent emphasis on competency-based education and training, the focus on improving credentialing and privileging processes through objective data, and evolving Maintenance of Certification requirements have underscored the use of simulation in high-stakes summative assessments in surgery. This chapter reviews key issues relating to the use of simulation in such assessments, addresses validity evidence as it pertains to high-stakes assessments, defines competence for high-stakes decisions, and emphasizes the necessity of standard setting to establish passing scores that may help affirm competence and proficiency. The chapter also highlights specific simulation-based examinations that have been developed and used to make high-stakes summative decisions, with examples from both surgical and nonsurgical domains. Lastly, it reviews the advantages and disadvantages of using simulation in high-stakes summative assessments that may be linked to certification, credentialing, and privileging, and offers insight into the future of simulation for such assessments.


Keywords: Simulation · Certification · Licensure · Credentialing · Privileging · Summative assessment of surgical skills



Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Department of Surgery, University of Toronto, St. Michael’s Hospital, Toronto, Canada
  2. Division of Education, American College of Surgeons, Chicago, USA
