
The Future of Medical Education: Simulation-Based Assessment in a Competency-by-Design Curriculum

  • Mitchell G. Goldenberg
  • Teodor P. Grantcharov
Chapter

Abstract

Competency-Based Medical Education (CBME) represents the most significant change to medical education since Halsted proposed his apprenticeship model over 100 years ago. This paradigm has been crafted over nearly four decades (McGaghie 1978) and is built on the principle that trainee physicians and surgeons must demonstrate core competencies, spanning professionalism to technical aptitude, before entering independent practice (Frank et al. 2015). The shift to CBME as the underpinning framework of medical education has brought increased demand for assessment of trainee performance. Simulation has been identified as a means of increasing trainee exposure to and experience with clinical tasks without increasing the risk of patient harm (Griswold et al. 2012; Holmboe et al. 2010).

Evidence suggests that both mastery learning and iterative simulation training can move physicians along their learning curves and improve both educational and clinical outcomes (Zendejas et al. 2011; Brydges et al. 2015). Additionally, simulation-based assessments (SBA) have accumulated multiple sources of validity evidence from studies in the medical and surgical literature (Cook et al. 2014). Yet despite widening access to simulation at medical institutions, many questions remain about how best to integrate simulation into the CBME framework. Failing to take a methodical, evidence-based approach to implementing simulation training and SBA within CBME would be a disservice to medical trainees and the public alike.

In this chapter, we will explore issues surrounding the use of simulation for the training and assessment of resident physicians in the new era of CBME. We will assemble the relevant literature on simulation-based educational interventions that have already been implemented, taking an analytical approach to probe the validity evidence underlying simulation as a tool for trainees to meet the competency requirements set out by stakeholders in medical education. We will also discuss future roles of the surgical boot camp, beyond simply introducing technical and non-technical skills to new trainees.

Keywords

Simulation · Assessment · CBME · CBD

References

  1. Aghazadeh MA, et al. Performance of robotic simulated skills tasks is positively associated with clinical robotic surgical performance. BJU Int. 2016;118(3):475–81.
  2. Albanese MA, et al. Defining characteristics of educational competencies. Med Educ. 2008;42(3):248–55.
  3. Barrows HS, Williams RG, Moy RH. A comprehensive performance-based assessment of fourth-year students’ clinical skills. J Med Educ. 1987;62(10):805–9.
  4. Barsuk JH, et al. Long-term retention of central venous catheter insertion skills after simulation-based mastery learning. Acad Med. 2010;85:S9–S12.
  5. Barsuk JH, Cohen ER, Caprio T, et al. Simulation-based education with mastery learning improves residents’ lumbar puncture skills. Neurology. 2012a;79(2):132–7.
  6. Barsuk JH, Cohen ER, Vozenilek JA, et al. Simulation-based education with mastery learning improves paracentesis skills. J Grad Med Educ. 2012b;4(1):23–7.
  7. Birkmeyer JD, et al. Surgical skill and complication rates after bariatric surgery. N Engl J Med. 2013;369(15):1434–42.
  8. Brown C, et al. Money makes the (medical assessment) world go round: the cost of components of a summative final year objective structured clinical examination (OSCE). Med Teach. 2015;37(7):653–9.
  9. Brydges R, et al. Linking simulation-based educational assessments and patient-related outcomes. Acad Med. 2015;90(2):246–56.
  10. Cleland J, et al. Simulation-based education: understanding the socio-cultural complexity of a surgical training “boot camp”. Med Educ. 2016;50(8):829–41.
  11. Cook DA, et al. Mastery learning for health professionals using technology-enhanced simulation: a systematic review and meta-analysis. Acad Med. 2013a;88(8):1178–86.
  12. Cook DA, et al. Technology-enhanced simulation to assess health professionals: a systematic review of validity evidence, research methods, and reporting quality. Acad Med. 2013b;88(6):872–83.
  13. Cook DA, et al. What counts as validity evidence? Examples and prevalence in a systematic review of simulation-based assessment. Adv Health Sci Educ Theory Pract. 2014;19(2):233–50.
  14. Cook DA, et al. A contemporary approach to validity arguments: a practical guide to Kane’s framework. Med Educ. 2015;49(6):560–75.
  15. Culligan P, et al. Predictive validity of a training protocol using a robotic surgery simulator. Female Pelvic Med Reconstr Surg. 2014;20(1):48–51.
  16. de Montbrun S, Satterthwaite L, Grantcharov TP. Setting pass scores for assessment of technical performance by surgical trainees. Br J Surg. 2015;103(3):300–6.
  17. de Montbrun S, et al. Implementing and evaluating a National Certification Technical Skills Examination. Ann Surg. 2016;264:1–6.
  18. Dedy NJ, et al. Teaching nontechnical skills in surgical residency: a systematic review of current approaches and outcomes. Surgery. 2013;154(5):1000–8.
  19. Dreyfus SE. The five-stage model of adult skill acquisition. Bull Sci Technol Soc. 2004;24(3):177–81.
  20. Frank JR, et al. Competency-based medical education: theory to practice. Med Teach. 2010;32(8):638–45.
  21. Frank JR, Snell L, Sherbino J. CanMEDS 2015 physician competency framework. Ottawa: Royal College of Physicians and Surgeons of Canada; 2015.
  22. Fraser SA, et al. Evaluating laparoscopic skills. Surg Endosc. 2003;17(6):964–7.
  23. Goldenberg MG, et al. Systematic review to establish absolute standards for technical performance in surgery. Br J Surg. 2017a;104(1):13–21.
  24. Goldenberg MG, et al. Surgeon performance predicts early continence after robot-assisted radical prostatectomy. J Endourol. 2017b;31(9):858–63.
  25. Greenberg CC, et al. Patterns of communication breakdowns resulting in injury to surgical patients. J Am Coll Surg. 2007;204(4):533–40.
  26. Griswold S, et al. The emerging role of simulation education to achieve patient safety: translating deliberate practice and debriefing to save lives. Pediatr Clin N Am. 2012;59(6):1329–40.
  27. Hamilton NA, et al. Video review using a reliable evaluation metric improves team function in high-fidelity simulated trauma resuscitation. J Surg Educ. 2012;69(3):428–31.
  28. Harden RM, Gleeson FA. Assessment of clinical competence using an objective structured clinical examination (OSCE). Med Educ. 1979;13(1):39–54.
  29. Harden RM, et al. Assessment of clinical competence using objective structured examination. Br Med J. 1975;1(5955):447–51.
  30. Hatala R, et al. Constructing a validity argument for the objective structured assessment of technical skills (OSATS): a systematic review of validity evidence. Adv Health Sci Educ. 2015;20(5):1–27.
  31. Heskin L, et al. The impact of a surgical boot camp on early acquisition of technical and nontechnical skills by novice surgical trainees. Am J Surg. 2015;210(3):570–7.
  32. Hodges B, et al. OSCE checklists do not capture increasing levels of expertise. Acad Med. 1999;74(10):1129.
  33. Hogg ME, et al. Grading of surgeon technical performance predicts postoperative pancreatic fistula for pancreaticoduodenectomy independent of patient-related variables. Ann Surg. 2016;264(3):482–91.
  34. Holmboe ES, et al. The role of assessment in competency-based medical education. Med Teach. 2010;32(8):676–82.
  35. Louridas M, Szasz P, de Montbrun S, et al. Can we predict technical aptitude? A systematic review. Ann Surg. 2015a;263:1–21.
  36. Louridas M, Szasz P, Fecso AB, et al. Novel approach to stratifying learning potential of technical skills in novice trainees before entry into surgical training. J Am Coll Surg. 2015b;221(4 Suppl 1):S46.
  37. Matsumoto ED, Hamstra SJ, Radomski SB. A novel approach to endourological training: training at the Surgical Skills Center. J Urol. 2001;166(4):1261–6.
  38. McGaghie WC, et al. Competency-based curriculum development in medical education: an introduction. Public Health Papers No. 68. Geneva: World Health Organization; 1978.
  39. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(9 Suppl):S63–7.
  40. Mishra A, Catchpole K, McCulloch P. The Oxford NOTECHS System: reliability and validity of a tool for measuring teamwork behaviour in the operating theatre. Qual Saf Health Care. 2009;18(2):104–8.
  41. Mudumbai SC, et al. External validation of simulation-based assessments with other performance measures of third-year anesthesiology residents. Simul Healthc. 2012;7(2):73–80.
  42. Norcini JJ, Hancock EW, Webster GD. A criterion-referenced examination of physician competence. Eval Health Prof. 1988;11(1):98–112.
  43. Page GG, Fielding DW. Performance on PMPs and performance in practice: are they related? Acad Med. 1980;55(6):529.
  44. Palter VN, Grantcharov TP. Individualized deliberate practice on a virtual reality simulator improves technical performance of surgical novices in the operating room. Ann Surg. 2014;259(3):443–8.
  45. Ram P, et al. Assessment in general practice: the predictive value of written-knowledge tests and a multiple-station examination for actual medical performance in daily practice. Med Educ. 1999;33(3):197–203.
  46. Ramani S, Krackov SK. Twelve tips for giving feedback effectively in the clinical environment. Med Teach. 2012;34(10):787–91.
  47. Raza SJ, et al. Construct validation of the key components of Fundamental Skills of Robotic Surgery (FSRS) curriculum: a multi-institution prospective study. J Surg Educ. 2014;71(3):316–24.
  48. Regehr G, et al. Comparing the psychometric properties of checklists and global rating scales for assessing performance on an OSCE-format examination. Acad Med. 1998;73(9):993.
  49. Rethans JJ, et al. Does competence of general practitioners predict their performance? Comparison between examination setting and actual practice. BMJ. 1991;303(6814):1377–80.
  50. Rethans JJ, et al. The relationship between competence and performance: implications for assessing practice performance. Med Educ. 2002;36(10):901–9.
  51. Reznick R, et al. Testing technical skill via an innovative “bench station” examination. Am J Surg. 1997;173(3):226–30.
  52. Riley W, et al. Didactic and simulation nontechnical skills team training to improve perinatal patient outcomes in a community hospital. Jt Comm J Qual Patient Saf. 2011;37(8):357–64.
  53. Sroka G, et al. Fundamentals of laparoscopic surgery simulator training to proficiency improves laparoscopic performance in the operating room: a randomized controlled trial. Am J Surg. 2010;199(1):115–20.
  54. Stefanidis D, et al. Simulation in surgery: what’s needed next? Ann Surg. 2015;261(5):846–53.
  55. Szasz P, et al. Assessing technical competence in surgical trainees: a systematic review. Ann Surg. 2014;261(6):1046–55.
  56. Szasz P, Grantcharov TP, Sweet RM, Korndorffer JR, Pedowitz RA, Roberts PL, Sachdeva AK, et al. Simulation-based summative assessments in surgery. Surgery. 2016;160(3):528–35.
  57. Tavares W, et al. Simulation-based assessment of paramedics and performance in real clinical contexts. Prehosp Emerg Care. 2013;18(1):116–22.
  58. Thomsen ASS, et al. Simulation-based certification for cataract surgery. Acta Ophthalmol. 2015;93(5):416–21.
  59. Undre S, et al. Observational teamwork assessment for surgery (OTAS): refinement and application in urological surgery. World J Surg. 2007;31(7):1373–81.
  60. Yule S, et al. Surgeons’ non-technical skills in the operating room: reliability testing of the NOTSS behavior rating system. World J Surg. 2008;32(4):548–56.
  61. Zendejas B, et al. Simulation-based mastery learning improves patient outcomes in laparoscopic inguinal hernia repair. Ann Surg. 2011;254(3):502–11.

Copyright information

© Springer International Publishing AG, part of Springer Nature 2018

Authors and Affiliations

  • Mitchell G. Goldenberg (1)
  • Teodor P. Grantcharov (2)
  1. Division of Urology, University of Toronto, Toronto, Canada
  2. Li Ka Shing Knowledge Institute, St. Michael’s Hospital, University of Toronto, Toronto, Canada
