An analysis of core EPAs reveals a gap between curricular expectations and medical school graduates’ self-perceived level of competence

Abstract

Background

Entrustable Professional Activities (EPAs) are being implemented worldwide as a means to promote competency-based medical education. In Switzerland, the new EPA-based curriculum for undergraduate medical education will be implemented in 2021. The aim of our study was to analyze the perceived, self-reported competence of graduates in 2019. The data represent a pre-implementation baseline and will provide guidance for curriculum developers.

Methods

Two hundred eighty-one graduates of the Master of Human Medicine program of the University of Zurich who had passed the Federal Licensing Exam in September 2019 were invited to complete an online survey. They were asked to rate the level of supervision they still needed (“observe only”, “direct, proactive supervision”, “indirect, reactive supervision”) for 46 selected EPAs. We compared this perceived competence with the competence expected by the new curriculum.

Results

The response rate was 54%. The need for supervision expressed by graduates varied considerably by EPA. The proportion of graduates rating themselves at the expected level was high for “history taking”, “physical examination” and “documentation”; medium for “prioritizing differential diagnoses”, “interpreting results” and “developing and communicating a management plan”; low for “practical skills”; and very low for EPAs related to “urgent and emergency care”.

Conclusions

Currently, there are significant gaps between the expectations of curriculum developers and the perceived competences of students. This is most obvious for practical skills and emergency situations. The new curriculum will either need to fill this gap or expectations might need to be revised.


Background

Experts regularly claim that medical education is outdated and does not adequately prepare students for their profession as physicians [1]. This criticism has triggered a change towards competency-based medical education (CBME) incorporating outcome-based frameworks [2]. To implement competency-based curricula into clinical teaching, new teaching and assessment tools are needed [3]. Ten Cate’s idea of entrustment, first published in 2005 [4], has become popular and Entrustable Professional Activities (EPAs) have been adopted by many medical specialties. Initially used predominantly in postgraduate programs, there are sound arguments for their application in undergraduate medical education [5].

In 2017, Switzerland introduced a completely revised version of its national catalog of learning objectives for undergraduate medical training. The document is entitled “Principal Relevant Objectives and Framework for Integrated Learning and Education in Switzerland” (PROFILES) [6]. PROFILES is a prerequisite for the accreditation of undergraduate medical curricula in Switzerland and will define the content of the federal licensing exam as of 2021. The new catalog focuses on competency-based objectives. It is based on three pillars: general objectives related to the different roles of medical doctors (inspired by the Canadian Medical Education Directives for Specialists [CanMEDS] roles [7]), situations as starting points, and EPAs. The Swiss initiative to implement end-of-training EPAs is derived from the Core EPAs for Entering Residency published by the Association of American Medical Colleges (AAMC) [8].

All Swiss universities providing undergraduate medical training are currently adapting their curricula to meet the new accreditation requirements. Among other things, PROFILES specifies that EPAs are “reflecting the main medical tasks that a physician must be able to perform autonomously on the first day of his residency” [6]. While such capabilities have always been the goal of undergraduate medical education, they are defined explicitly for the first time in Switzerland. The expected competency level is defined as Level 3: distant supervision.

We carried out an online survey with the 2019 graduates of the University of Zurich. These young physicians were trained following a pre-revision curriculum and passed the Federal Licensing Exam in accordance with the requirements of the predecessor of PROFILES, which is still in place. In this study, the participants evaluated their own levels of autonomy regarding EPAs as defined by PROFILES. In doing so, this study assesses how confident graduates feel in handling standard clinical situations and thus how well prepared they feel for clinical practice as defined by the new standard.

As the curricular reforms are currently being shaped and implemented, we aim to provide guidance to curriculum developers by clarifying which EPAs are already covered adequately by current curricula and which need to be the focus of change. Furthermore, our data may serve as a baseline for examining future changes.

Methods

In this section we describe our study design, the sampling and the data collection method.

Assessing EPAs in a survey study

PROFILES lists nine main EPAs (see Table 1) derived from the Association of American Medical Colleges (AAMC) Core EPAs in the United States [8]. These are further specified into tasks (nested EPAs) and descriptors. This results in 161 items.

Table 1 Main EPAs as defined by PROFILES

Designing a survey to evaluate students’ self-perception of the EPAs defined by PROFILES comes with several challenges. While the sheer length of the catalog is among them, the main concern is its heterogeneity. In accordance with a recent study by Meyer et al. [9], we found that some of the EPAs do not meet the standards of the EPA framework as measured by the EQual rubric [10]. In this study, we aimed to include only high-quality EPAs. The list of EPAs was therefore edited for the purpose of this study. All authors are EPA experts. Three of the authors (AM, SF, SZ) hold a master’s degree in medical education and have been working with EPAs for several years. HBE is an educational expert and has gained extensive knowledge of the theory of EPAs while working on the implementation of PROFILES.

Figure 1 illustrates the selection and editing process. Each selection step was done by each of the authors separately; the resulting lists were then compared and discussed until consensus was reached. As a first step, items that were merely descriptors of tasks and items that did not meet current EPA criteria [10] were discarded (e.g. all of EPA 9). As a second step, items that included more than one task were either split or simplified. Furthermore, the wording of the remaining tasks was edited to comply with the EPA framework; this resulted in 96 nested EPAs. As a final step, we compared our list with a prioritization previously done by a curriculum development committee of the University of Zurich. In a modified Delphi process (not part of this study), the goal of this committee (consisting of two medical students and 13 medical experts and faculty members) was to identify the core tasks/EPAs within PROFILES. By deleting the EPAs that were not prioritized, a list of 46 items was generated.

Fig. 1 Selection and editing process for EPAs to be included in the survey

When working with EPAs in medical education, competence is typically assessed using an entrustment-supervision scale [11]. However, PROFILES only includes the expected level: graduates are expected to handle all EPAs in the catalog “at least under distant, on-demand supervision on the first day of their residency” (p. 16) [12]. This corresponds to Level 3: “indirect, reactive supervision”.

Table 2 presents the wording used in our survey. We phrased the scale using the first person singular to make it more actionable for the participants. As we are focusing on undergraduate training, only Levels 1–3 are provided in our survey. The resulting questionnaire is included in the supplements.

Table 2 Entrustment levels used in the survey (derived from Ten Cate [11, 13])

Sampling and data collection

An online survey was compiled using SurveyMonkey®. To minimize missing values, we used a forced-choice format for all questions concerning EPAs. The survey was revised based on a pilot test with medical students and physicians working at the authors’ departments.

All 281 graduates of the Master of Human Medicine program of the University of Zurich who had passed the Federal Licensing Exam in September 2019 were invited to participate in the study by means of an email in October 2019. A reminder was sent after 13 and 24 days. Study participation was completely anonymous since IP addresses were not recorded and the software did not allow tracking of completed questionnaires to the originating email addresses. Participation in the study was voluntary and no financial or other incentive was offered. Once data collection was completed, the data were extracted from SurveyMonkey® and recorded in Microsoft Excel™ on a password-secured computer.
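As an illustration only (this is not the authors’ actual analysis pipeline), per-EPA supervision-level proportions of the kind reported in the Results can be tabulated from such an export with a few lines of code. All names and the mini-dataset below are hypothetical:

```python
# Illustrative sketch, NOT the study's actual analysis: tabulate the
# proportion of respondents choosing each supervision level per EPA,
# given survey responses exported as one dict per respondent.
# All identifiers and the sample data are hypothetical.

from collections import Counter

LEVELS = ["observe only",
          "direct, proactive supervision",
          "indirect, reactive supervision"]

def level_proportions(responses, epa_ids):
    """Return {epa_id: {level: whole-number percentage}}."""
    table = {}
    for epa in epa_ids:
        counts = Counter(r[epa] for r in responses)
        n = sum(counts.values())
        # forced-choice format means every respondent picked one level
        table[epa] = {lvl: round(100 * counts[lvl] / n) for lvl in LEVELS}
    return table

# Hypothetical mini-dataset: 4 respondents, 2 EPAs
responses = [
    {"EPA1a": LEVELS[2], "EPA6b": LEVELS[1]},
    {"EPA1a": LEVELS[2], "EPA6b": LEVELS[0]},
    {"EPA1a": LEVELS[2], "EPA6b": LEVELS[1]},
    {"EPA1a": LEVELS[1], "EPA6b": LEVELS[2]},
]
props = level_proportions(responses, ["EPA1a", "EPA6b"])
print(props["EPA1a"]["indirect, reactive supervision"])  # 75
```

Because the forced-choice format rules out missing values, the three percentages for each EPA sum to 100 (up to rounding), which matches how Table 3 is structured.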

Ethical approval: the Ethics Committee of the Canton of Zurich considered the project not to fall within the scope of the Human Research Act (BASEC-Nr. Req-2019-00754).

Results

Of 281 medical students who graduated in 2019, 152 completed the survey, resulting in a response rate of 54%. All analyses are based on these cases only. Average completion time was 8 min and 44 s. Mean age of participants was 26.7 years and 63% of the participants were female.

The need for supervision expressed by graduates varied considerably by EPA (the data are displayed in Table 3). For history taking (EPA 1) and documentation (EPA 8), the proportion of graduates rating themselves at Level 3 (indirect supervision) was high. For history taking, 99% of the graduates reported Level 3; for documentation in a patient’s chart and for providing oral presentations of patient encounters, the corresponding numbers were 84 and 86% respectively. For physical examination (EPA 2), the number of graduates rating themselves at Level 3 was high for general physical examination procedures but considerably lower for specific procedures, e.g. ophthalmological, dermatological and psychiatric examinations. Prioritizing differential diagnoses and interpreting results (EPAs 3 and 4) were regarded as feasible at Level 3 by 61 and 65% of our respondents respectively. Developing and communicating a management plan (EPA 7) was deemed possible with indirect supervision by half of the respondents.

Table 3 Levels of autonomy reported by graduates for various tasks (percentages and absolute numbers)

The responses for the practical skills listed in EPA 5 varied considerably by skill. Taking a patient’s temperature was regarded as possible with indirect supervision by 97% of the graduates, whereas the corresponding figure was 21% for managing parenteral nutrition. The lowest levels of autonomy were reported for EPAs related to urgent and emergency care. Although the responses within EPA 6 varied widely, only one of its eight items was regarded as feasible with indirect supervision by the majority of the graduates; the proportions for all other items ranged between 7 and 45%.

Discussion

Our data show that graduates’ confidence in their abilities to perform the EPAs listed in the Swiss catalog of learning objectives (PROFILES) varies considerably by EPA. For some EPAs, most graduates regarded themselves as capable of performing the tasks at the expected level of autonomy, i.e. with indirect, reactive supervision. For other EPAs, only a few graduates felt confident to act with indirect supervision only.

Most participants felt confident to perform history taking, physical examination and documentation (EPAs 1, 2 and 8) with distant supervision. These are the mainstay of the medical profession and are taught early on. The importance of these skills is evident and the need to teach them at medical school has been emphasized in the literature [14,15,16]. Moreover, much effort has recently been put into assessment modalities for these clinical skills – the emergence of OSCEs (Objective Structured Clinical Examinations) being a good example [17]. These EPAs can also be taught, performed and assessed repeatedly in many clinical situations at low risk to patients.

As expected, graduates feel less confident regarding the specific physical examination techniques of smaller specialties. One example is EPA 2j: “Assessment of eye movements, recognition and description of nystagmus”. This might be due to the few contact hours these specialties are allocated throughout the curriculum. Ophthalmology, for example, occupies only 16 of the total 580 h of practical training in the Zurich curriculum. Only a few students acquire additional ophthalmological experience during electives. Moreover, some of these skills are difficult to practice on models or in a simulation center.

The majority of graduates rated documentation (EPA 8) as feasible with indirect supervision. Again, these tasks are regularly performed by students during clerkships. However, specific teaching activities for documentation and handover skills do not (yet) exist in the Zurich medical curriculum. These results are therefore somewhat surprising, as documentation, reporting and handovers are more complex than commonly perceived [18] and require formal teaching [19].

Students felt less confident in their capabilities for clinical reasoning without direct supervision, i.e. prioritizing differential diagnoses (EPA 3: 61%), recommending and interpreting tests (EPA 4: 65%) and developing management plans (EPA 7: 50%). On a positive note, very few participants suggested they should only observe (1–5%, depending on the EPA in question). This variance might be due to variation in the clinical situation respondents had in mind while working on the questionnaire. More importantly, these EPAs are very complex as they rely on the ability to integrate clinical data and subsequently make sound choices. As studies have shown, clinical reasoning is not yet well developed at the end of medical school; it grows with experience [20]. Therefore, the restrained confidence of our graduates is reasonable: experience is gained through repeated exposure to clinical situations and thus with time. Nendaz et al. described the crucial elements that foster clinical reasoning: activating prior knowledge and transferring it between different cases; using both analytical and non-analytical skills; and emphasizing the systematic collection of clinical information [21]. Even though these tasks are part of a lifelong learning process, curricular reform should aim at integrating new teaching modalities that facilitate these elements: flipped classroom models, problem-based learning or specific case-based teaching techniques, to name a few [22]. More complex instructional design models such as the 4C/ID method (https://www.4cid.org/about-4cid) show how clinical reasoning can be integrated into medical school curricula, but simple and short teaching elements such as “the one-minute preceptor” [23] are equally valuable tools in teachers’ hands.

The results regarding procedural skills (EPA 5) are heterogeneous. This is not altogether surprising, since the complexity of the assessed skills varies considerably. Tasks for which the respondents indicated particularly high levels of autonomy were taking a patient’s body temperature and performing a pregnancy test. Both tasks are simple and doable even for most of the general population without specific medical knowledge. Other EPAs pertain to classical medical procedural skills such as intravenous injection or insertion of a peripheral intravenous line. These skills are typical of the medical profession and are taught during medical school. Many medical schools, Zurich among them, use simulation models in skills labs to teach these objectives. There are sound data that such training leads to immediate improvements when assessed in a simulated environment, but transferability to clinical practice and retention of skills are less well understood [24]. The fact that less than 60% of our graduates regard themselves as capable of performing the tasks mentioned under indirect supervision suggests that transferability is lacking and needs to be improved. The literature offers no established solution for how this is best achieved, and further studies are much needed. As there is evidence that the amount of practice matters [25, 26], curriculum developers and researchers need to consider the frequency of training opportunities.

Few surgical tasks are incorporated in the list of EPAs and, for these, only a few graduates rated themselves as competent to perform them with indirect supervision: 43 and 36% of graduates declared themselves comfortable with indirect supervision for wound cleaning and applying wound dressings respectively. A lack of surgical experience in undergraduate medical education has been previously reported [27]. Diminished exposure to certain specialties additionally poses a threat to the workforce as students may then not develop an interest in these specialties [28, 29].

EPAs covering emergency situations (EPA 6) reveal the largest gap between the self-assessed competences of graduates and the expectations put forth by PROFILES. Teaching and learning the correct management of medical emergencies are considered to be among the most challenging issues even in postgraduate education [30,31,32]. The use of cognitive aids has been described as a means of increasing performance in these stressful situations [33]. In only one of the eight acute care EPAs summarized in Category 6 did more than 50% of graduates rate themselves as ready for indirect supervision, i.e. managing a patient with “uncomplicated trauma such as a fall or minor traffic injury”. This EPA differs from the others in Category 6 by being well-defined, without possible variation in complexity. “Managing a patient with severe shock or acute blood loss” is considerably more complex. Less than 10% of the graduates rated themselves competent to take care of such patients with indirect supervision, while more than one in three graduates preferred to act with direct supervision. This might not necessarily imply a lack of competence; rather, it might denote an awareness of the risk and complexity inherent in these possibly life-threatening situations. Students usually have limited exposure to emergency situations during undergraduate training, and when they do, they rarely find themselves in the role of the team leader because, in emergencies, patient safety is usually prioritized over teaching. Therefore, we believe the self-assessments of our graduates to be adequate. This is further supported by data showing that self-assessments of these skills agree well between trainees and supervisors [34]. The PROFILES expectation that graduates will be able to handle these emergency situations alone for 30 min is rather high. Our data indicate that graduates fall far short of the expected degree of autonomy.
If emergency situations are a priority, universities might want to increase learning opportunities, applying tools such as virtual reality, high-fidelity simulation and use of cognitive aids. Alternatively, national authorities need to lower the expected level of supervision for this section.

In summary, our study suggests that curriculum developers need to focus on clinical reasoning and clinical decision-making skills; they need to make sure rare but important procedural skills are not underrepresented and that students are frequently exposed to emergency situations even if they are simulated. Graduates feel adequately prepared for history taking, general clinical examination and documentation. Nevertheless, teaching these tasks should not be neglected.

There are some limitations to our study. First, the response rate raises the possibility of non-responder bias. Graduates willing to participate in this survey might belong to a specific subgroup, and this self-selection might confound the data. As Saleh et al. summarized, response rates have always been a limitation of email surveys as a method of data acquisition [35]. In the early 1990s, rates averaging 50% were regarded as satisfactory. Since then, they have declined greatly, to figures as low as 19% [36], which is believed to be due to the loss of novelty and to survey fatigue. In comparison, our response rate of 54% is high. Nevertheless, any curriculum developer following our recommendations needs to be aware of this bias.

Second, our data are subjective as they represent the self-assessments of graduates. Adding assessments by supervisors might have increased the validity of the dataset since some authors have shown that students overestimate their competence in practical clinical skills [37]. On the other hand, there is evidence that self-assessment using entrustment-supervision scales seems to be accurate [34].

Third, the wording of the EPAs leaves room for interpretation. The more specific the title of an EPA, the easier it is for trainees to rate themselves. Many EPA titles in PROFILES are not specific, and trainees’ perceptions of them (e.g. of their complexity) might differ. This might lead to differing self-assessments.

Fourth, generalizability might be limited because we have only evaluated data from one university and one cohort. However, the results are not surprising and are in agreement with previous research. We are therefore confident that the data are meaningful for curriculum developers at other institutions as well.

Lastly, we only focused on self-evaluations of undergraduate medical students. Since PROFILES and EPAs are yet to be implemented, no evaluations by clinical supervisors were available. To us, it was important to give students a voice. We are planning a follow-up study that will compare supervisors’ and medical students’ evaluations of EPAs. Studies that include faculty assessments of new residents are necessary to objectively evaluate the effects of the curriculum change.

Conclusion

As defined in PROFILES, all EPAs should be mastered at Level 3, i.e. with indirect supervision. However, our study reveals a substantial gap between this expectation and the self-reported level of competence of current graduates. Graduates indicated that they were well-prepared for low-risk tasks. On the other hand, they felt less prepared for high-risk tasks, such as performing procedural skills or handling emergency situations. This study provides important information for curriculum reforms: it reveals areas where reform is much needed and areas already well-covered by the current curriculum in medical school. Depending on the political commitment to provide resources for medical education, medical schools need to decide on how to allocate funding and teaching time.

Availability of data and materials

The datasets used and/or analysed during the current study are available from the corresponding author on reasonable request.

Abbreviations

AAMC:

Association of American Medical Colleges

CanMEDS:

Canadian Medical Education Directives for Specialists

CBME:

Competency-Based Medical Education

EPA:

Entrustable Professional Activity

OSCE:

Objective Structured Clinical Examination

PROFILES:

Principal Relevant Objectives and Framework for Integrated Learning and Education in Switzerland

References

1. Holmboe ES, Ward DS, Reznick RK, Katsufrakis PJ, Leslie KM, Patel VL, Ray DD, Nelson EA. Faculty development in assessment: the missing link in competency-based medical education. Acad Med. 2011;86(4):460–7.

2. Frank JR, Danoff D. The CanMEDS initiative: implementing an outcomes-based framework of physician competencies. Med Teach. 2007;29(7):642–7.

3. Mulder H, Ten Cate O, Daalder R, Berkvens J. Building a competency-based workplace curriculum around entrustable professional activities: the case of physician assistant training. Med Teach. 2010;32(10):e453–9.

4. Ten Cate O. Entrustability of professional activities and competency-based training. Med Educ. 2005;39:1176–7.

5. Chen HC, van den Broek WS, ten Cate O. The case for use of entrustable professional activities in undergraduate medical education. Acad Med. 2015;90(4):431–6.

6. Michaud P-A, Jucker-Kupper P. The “PROFILES” document: a modern revision of the objectives of undergraduate medical studies in Switzerland. Swiss Med Wkly. 2016;146:w14270.

7. Frank J, Snell L, Sherbino J. The draft CanMEDS 2015 physician competency framework – series IV. Ottawa: The Royal College of Physicians and Surgeons of Canada; 2014.

8. AAMC. Core Entrustable Professional Activities for Entering Residency. https://www.aamc.org/system/files/c/2/484778-epa13toolkit.pdf.

9. Meyer EG, Taylor DR, Uijtdehaage S, Durning SJ. EQual rubric evaluation of the Association of American Medical Colleges’ Core Entrustable Professional Activities for Entering Residency. Acad Med. 2020;95(11):1755–62.

10. Taylor DR, Park YS, Egan R, Chan M-K, Karpinski J, Touchie C, Snell LS, Tekian A. EQual, a novel rubric to evaluate entrustable professional activities for quality and structure. Acad Med. 2017;92(11S):S110–7.

11. Ten Cate O, Schwartz A, Chen HC. Assessing trainees and making entrustment decisions: on the nature and use of entrustment-supervision scales. Acad Med. 2020;95(11):1662–9.

12. Michaud P, Jucker-Kupper P. PROFILES; principal objectives and framework for integrated learning and education in Switzerland. Bern: Joint Commission of the Swiss Medical Schools; 2017.

13. Ten Cate O, Chen HC, Hoff RG, Peters H, Bok H, van der Schaaf M. Curriculum development for the workplace using entrustable professional activities (EPAs): AMEE guide no. 99. Med Teach. 2015;37(11):983–1002.

14. Aloia JF, Jonas E. Skills in history-taking and physical examination. Acad Med. 1976;51(5):410–5.

15. Feddock CA. The lost art of clinical skills. Am J Med. 2007;120(4):374–8.

16. McGlynn TJ, Sayre A, Kennedy D. Physical diagnosis courses—a question of emphasis. J Fam Pract. 1978;6(3):565–71.

17. Khan KZ, Ramachandran S, Gaunt K, Pushkar P. The objective structured clinical examination (OSCE): AMEE guide no. 81. Part I: an historical and theoretical perspective. Med Teach. 2013;35(9):e1437–46.

18. Riesenberg LA, Leitzsch J, Massucci JL, Jaeger J, Rosenfeld JC, Patow C, Padmore JS, Karpovich KP. Residents’ and attending physicians’ handoffs: a systematic review of the literature. Acad Med. 2009;84(12):1775–87.

19. Isoardi J, Spencer L, Sinnott M, Eley R. Impact of formal teaching on medical documentation by interns in an emergency department in a Queensland teaching hospital. Emerg Med Australas. 2015;27(1):6–10.

20. Wilkerson L, Lee M. Assessing physical examination skills of senior medical students: knowing how versus knowing when. Acad Med. 2003;78(10):S30–2.

21. Nendaz M, Charlin B, Leblanc V, Bordage G. Le raisonnement clinique: données issues de la recherche et implications pour l’enseignement [Clinical reasoning: research findings and implications for teaching]. Pédagogie Méd. 2005;6(4):235–54.

22. Furney SL, Orsini AN, Orsetti KE, Stern DT, Gruppen LD, Irby DM. Teaching the one-minute preceptor: a randomized controlled trial. J Gen Intern Med. 2001;16(9):620–4.

23. Neher JO, Gordon KC, Meyer B, Stevens N. A five-step “microskills” model of clinical teaching. J Am Board Fam Pract. 1992;5(4):419–24.

24. Lynagh M, Burton R, Sanson-Fisher R. A systematic review of medical skills laboratory training: where to from here? Med Educ. 2007;41(9):879–87.

25. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(10):S70–81.

26. Moulaert V, Verwijnen MG, Rikers R, Scherpbier AJ. The effects of deliberate practice in undergraduate medical education. Med Educ. 2004;38(10):1044–52.

27. Zundel S, Blumenstock G, Zipfel S, Herrmann-Werner A, Holderried F. Portfolios enhance clinical activity in surgical clerks. J Surg Educ. 2015;72(5):927–35.

28. Kumar K, Xie F. Expanding horizons: increasing undergraduate exposure to tomorrow’s specialties. Med Teach. 2014;36(12):1088–9.

29. Shah S. Are curriculum changes the ideal method for increasing undergraduate exposure to tomorrow’s specialties? Adv Med Educ Pract. 2015;6:153.

30. Iirola T, Lund V, Katila A, Mattila-Vuori A, Pälve H. Teaching hospital physicians’ skills and knowledge of resuscitation algorithms are deficient. Acta Anaesthesiol Scand. 2002;46(9):1150–4.

31. Kohn LT, Corrigan J, Donaldson MS. To err is human: building a safer health system, vol. 6. Washington, DC: National Academy Press; 2000.

32. Remes V, Sinisaari I, Harjula A, Helenius I. Emergency procedure skills of graduating medical doctors. Med Teach. 2003;25(2):149–54.

33. Goldhaber-Fiebert SN, Howard SK. Implementing emergency manuals: can cognitive aids help translate best practices for patient care during acute events? Anesth Analg. 2013;117(5):1149–61.

34. Marty AP, Schmelzer S, Thomasin RA, Braun J, Zalunardo MP, Spahn DR, Breckwoldt J. Agreement between trainees and supervisors on first-year entrustable professional activities for anaesthesia training. Br J Anaesth. 2020;125(1):98–103.

35. Saleh A, Bista K. Examining factors impacting online survey response rates in educational research: perceptions of graduate students. Online Submission. 2017;13(2):63–74.

36. Sheehan KB. E-mail survey response rates: a review. J Comput Mediated Commun. 2001;6(2):JCMC621.

37. Störmann S, Stankiewicz M, Raes P, Berchtold C, Kosanke Y, Illes G, Loose P, Angstwurm MW. How well do final year undergraduate medical students master practical clinical skills? GMS J Med Educ. 2016;33(4). https://doi.org/10.3205/zma001057.


Acknowledgements

We thank the Dean’s office of the Faculty of Medicine at University of Zurich for its support and assistance. We thank James Disley, Oxford Academic Editing, for his service.

Funding

Not applicable.

Author information


Contributions

APM conceptualized the study, developed the survey, analysed and interpreted the data, was a major contributor in writing the manuscript. SF conceptualized the study, developed the survey, analysed and interpreted the data, was a major contributor in writing the manuscript. HBE developed the survey, analysed and interpreted the data, was a major contributor in writing the manuscript. SZ conceptualized the study, developed the survey, analysed and interpreted the data, was a major contributor in writing the manuscript. All authors read and approved the final manuscript.

Authors’ information

Not applicable.

Corresponding author

Correspondence to Sabine Zundel.

Ethics declarations

Ethics approval and consent to participate

The Ethics Committee of the Canton of Zurich, Switzerland, determined that this research project does not fall within the scope of the Human Research Act (HRA) and therefore waived formal ethics approval. (BASEC-Nr. Req 2019–00754).

Consent for publication

Not applicable.

Competing interests

The authors declare that they have no competing interests.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary Information

Additional file 1.

Supplement 1: Questionnaire.

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.


About this article


Cite this article

Marty, A., Frick, S., Bruderer Enzler, H. et al. An analysis of core EPAs reveals a gap between curricular expectations and medical school graduates’ self-perceived level of competence. BMC Med Educ 21, 105 (2021). https://doi.org/10.1186/s12909-021-02534-w


Keywords

  • Competency-based medical education
  • Entrustable professional activities
  • Undergraduate medical education
  • Self-assessment