Abstract
Assessment is an essential feature of the competency-based educational model, because only through evaluation can we verify that specified learning outcomes have been achieved. This is especially important in health professions education, where the competencies of interest affect the well-being of patients. Therefore, just as with planning the instructional component of a curriculum, development of an assessment system must start with specification of the desired learning outcomes: the knowledge, skills, and attitudes expected of trainees or practitioners in order to provide safe and effective patient care.
Criteria for judging the quality of assessment methods include the reliability of the data an assessment generates, the validity of decisions based on test results, the educational impact on examinees and other stakeholders, and the feasibility of implementing the assessment system. Beyond these criteria and the particular competencies to be evaluated, the choice among the many available testing methods should weigh several dimensions: the appropriate level of assessment, the stage of learner development, and, most importantly, the overall purpose and context of the assessment. Ultimately, no single method can assess all aspects of professional competence, but familiarity with the strengths and limitations of the various modalities can guide development of an appropriate assessment system. Strengths of simulation-based methods for assessment include the ability to evaluate actual performance of psychomotor skills, as well as demonstration of nontechnical professional competencies, in environments that safely and authentically mirror real practice settings. In addition, the programmability of simulations permits on-demand testing of rare but important clinical situations and consistent presentation of evaluation problems to multiple examinees; this reproducibility becomes especially important when high-stakes decisions hinge on the assessment results.
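As a toy illustration of the reliability criterion discussed above, inter-rater agreement on a simulation-based checklist is often summarized with a chance-corrected statistic such as Cohen's kappa. The sketch below is not from the chapter itself; the examiner ratings are entirely hypothetical, and the hand-rolled function is a minimal two-rater implementation of the standard kappa formula.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Chance-corrected agreement between two raters scoring the same items."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    # Observed proportion of items on which the raters agree
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Agreement expected by chance, from each rater's marginal frequencies
    freq_a = Counter(rater_a)
    freq_b = Counter(rater_b)
    p_e = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical pass/fail ratings by two examiners on ten OSCE checklist items
examiner_1 = [1, 1, 0, 1, 1, 0, 1, 1, 1, 0]
examiner_2 = [1, 1, 0, 1, 0, 0, 1, 1, 1, 1]
print(round(cohens_kappa(examiner_1, examiner_2), 2))  # prints 0.52
```

Here the examiners agree on 8 of 10 items (80%), yet kappa is only about 0.52, because much of that raw agreement is expected by chance alone; this is why reports of simulation-based assessments typically quote chance-corrected indices rather than simple percent agreement.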
© 2013 Springer Science+Business Media New York
Scalese, R.J., Hatala, R. (2013). Competency Assessment. In: Levine, A.I., DeMaria, S., Schwartz, A.D., Sim, A.J. (eds) The Comprehensive Textbook of Healthcare Simulation. Springer, New York, NY. https://doi.org/10.1007/978-1-4614-5993-4_11
Publisher Name: Springer, New York, NY
Print ISBN: 978-1-4614-5992-7
Online ISBN: 978-1-4614-5993-4