INTRODUCTION

Over the last decade, quality improvement (QI) and patient safety (PS) have become major foci of the United States’ healthcare system. The Accreditation Council for Graduate Medical Education (ACGME) requires the incorporation of QI/PS into medical education.1,2,3 Many residency programs, however, have struggled to meet this requirement effectively. Commonly cited barriers include a lack of faculty with QI expertise, minimal exposure to QI in medical school, already-crowded curricula, and the demands of patient care.4, 5 Not surprisingly, factors associated with successful implementation of QI/PS curricula include buy-in from learners and faculty members, combining didactic and experiential teaching methods, scheduling curricula to optimize QI project completion, and having a supportive institutional culture.6

Prior to our interventions, we had no formal QI education or dedicated project time. Projects did not follow standard QI methodology and had minimal faculty oversight. This disorganized structure left residents and faculty frustrated, feeling that QI had no meaning beyond checking a box to meet a requirement.

We addressed these challenges by analyzing our barriers to effective QI education and using our program’s adoption of an “X + Y” training schedule (inpatient rotation + ambulatory block, described further below) to develop a new curriculum. We met with a faculty development expert who provided instruction on the Kemp model for curriculum design.7 This model provides a framework for curriculum planning by outlining goals, learner characteristics, task analysis, objectives, content, instructional strategies, and evaluation methods. The model emphasizes harmony and alignment among all of these elements. Each area was outlined prior to initiation of the curriculum and helped us identify our problem areas.

Curricular Objectives

  1. Create an experiential learning environment that equips residents to become QI leaders.

  2. Demonstrate improvement in resident knowledge of QI principles through the use of the QI Knowledge Application Tool-Revised (QIKAT-R).8

  3. Increase resident QI-specific scholarship.

SETTING AND PARTICIPANTS

Our internal medicine (IM) residency program, located in a community-based, large tertiary-care center in the Midwestern United States, has 45 categorical residents and 12 core faculty members. One faculty member oversees QI education and QI projects. Our hospital’s Graduate Medical Education department recently established a quality and safety (Q/S) fellowship, and the current fellow had completed an IM residency in this program. The IM program recently changed its block rotation schedule to a “4 + 2” design; residents rotate through 4 weeks of inpatient medicine followed by a 2-week ambulatory block.

PROGRAM DESCRIPTION

Curriculum Design

Instructional sessions occurred during conference time on residents’ ambulatory blocks. Eight content areas were covered over 16 sessions. During the first week of each ambulatory block, a content area was introduced; the following week, this content was reinforced with a focus on application (Table 1).

Table 1 QI Content Areas and Associated Application Session

The revised Quality Improvement Knowledge Application Tool (QIKAT-R) and a survey regarding attitudes, knowledge, and comfort were given to the residents at the beginning and end of the year.

We identified five problem areas that required specific attention (Table 2):

Table 2 Overview of Five Specific Problem Areas
  • Problem area #1: resident engagement

To incorporate QI into a tight schedule, we used existing conference times and protected administrative hours during residents’ ambulatory blocks. We engaged residents by encouraging them to list improvement opportunities on a board with sections titled “things that annoy me” and “possible QI project ideas.” We also highlighted potential venues for submitting QI scholarly activity.

We made each session interactive through the use of educational games, such as Mr. Potato Head to teach plan-do-study-act (PDSA) cycles and the marshmallow challenge9, 10 to highlight collaboration, innovation, and prototyping when developing interventions. In the last session, residents pitched their projects in a “Shark Tank”11 competition.

An end-of-year QI party was held, at which residents voted on awards for the most sustainable, impactful, and interdisciplinary projects.

  • Problem area #2: project design

Prior projects were often not aimed at addressing a specific performance gap. Measures were typically limited to pre- and post-intervention surveys; data abstraction via chart review was rarely used. These interventions had minimal impact and sustainability.

We addressed this problem in three ways. First, the residents’ educational sessions emphasized the step-wise approach to implementing a QI project based on the Model for Improvement from the Institute for Healthcare Improvement.12 Second, we developed standardized project progress reports to guide residents through project steps (Appendix 1, online). Lastly, we built reports in our electronic medical record (EMR) and taught residents and faculty how to run them, allowing clinical information to be easily accessed.

  • Problem area #3: project management

Previously, no standard existed for how often project mentors would meet with resident teams. We addressed this problem by requiring a progress report every 3 months. The QI mentor used a tracking form to note completed tasks and describe next steps.

At the end of the academic year, each group of PGY1 residents who rotate together in the ambulatory clinic selects a project, resulting in three new projects per year and six active projects within the program at any given time. This is considerably fewer than the 16+ projects that one faculty member previously attempted to oversee, allowing for closer supervision of each project.

  • Problem area #4: assessment

We had not been assessing our residents’ QI knowledge. We therefore included an evaluation of their QI knowledge as described in the “PROGRAM EVALUATION” section. Additionally, we identified relevant ACGME milestones and added them to the mentor project tracking form (Appendix 2, online), which is then reviewed at the program’s clinical competency committee.

  • Problem area #5: limited faculty

As mentioned, our IM program has just one faculty member trained in QI. In our program, the faculty are encouraged to attend resident didactic sessions to both stay current on topics and provide their own insights. The teaching faculty are often viewed as role models by residents, and if they show a lack of engagement in QI, it may send a message that the topic is not important.13 To address this problem, we tracked the attendance of non-QI-trained faculty at the QI didactics. We met with core faculty prior to launching the curriculum. When we noticed limited attendance, we sent specific invitations to sessions where they could be more active participants, such as being a judge for our “Shark Tank” competition.

PROGRAM EVALUATION

We developed outcome measures consistent with our curricular goals, along with process measures to ensure that we remained on track to reach those goals throughout the curriculum. Process measurement was done weekly, with results displayed visually on a run chart:

  1. Quality of sessions: Residents were asked at the end of each session to place colored chips in buckets labeled “presentation was definitely useful & I can apply the information,” “presentation was somewhat useful but not very applicable,” and “presentation was not useful; I didn’t learn anything today.” We used this feedback to modify future instructional sessions.

  2. Non-QI faculty attendance at sessions (addressing the problem of limited QI faculty).
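The chip-in-bucket tallies described above translate directly into the weekly proportion plotted on the run chart. A minimal sketch of that arithmetic follows; the bucket names and weekly counts are hypothetical, chosen only to illustrate the computation:

```python
# Hypothetical weekly chip counts per feedback bucket (illustrative only)
weekly_chips = [
    {"definitely": 11, "somewhat": 3, "not_useful": 1},
    {"definitely": 9,  "somewhat": 5, "not_useful": 0},
    {"definitely": 13, "somewhat": 2, "not_useful": 0},
]

# Run-chart series: fraction of chips in the "definitely useful" bucket each week
series = [
    round(week["definitely"] / sum(week.values()), 2)
    for week in weekly_chips
]
print(series)  # → [0.73, 0.64, 0.87]
```

Plotting this series week by week against its median is what makes shifts in session quality visible between ambulatory blocks.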

Outcome measures were:

  1. Changes in mean QIKAT-R scores (addressing assessment of learner knowledge).

  2. Residents’ perceived confidence to perform QI and their attitudes toward QI (reflective of resident engagement).

  3. The number of QI projects accepted for publication and/or conference presentation (reflective of project design and management).

Results

We used the QIKAT-R to assess resident knowledge pre- and post-curriculum. The Q/S fellow scored the QIKAT-R after developing a common understanding of the grading criteria with the course director. The maximum possible QIKAT-R score is 27 points. At baseline, 39 of 45 (87%) residents completed the QIKAT-R, with a mean score of 7 (SD 2.9, 95% CI 6.07 to 7.97). After participating in the curriculum, 41 of 45 (91%) residents completed the QIKAT-R, with a mean score of 16.6 (SD 4.7, 95% CI 15.13 to 18.08) (Fig. 1). On the paired t test, mean QIKAT-R scores improved significantly (mean difference 9.6, SD 4.6, 95% CI 8.14 to 11.2, n = 37 pairs, p = 0.043).
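As a sketch of the arithmetic behind the paired comparison, the t statistic and confidence interval can be reconstructed from the reported summary values (mean difference of 9.6, SD of differences 4.6, n = 37 pairs); the critical value of 2.028 for 36 degrees of freedom is assumed from standard t tables:

```python
import math

# Reported summary statistics for the paired pre/post QIKAT-R comparison
mean_diff = 9.6   # mean of the paired (post - pre) score differences
sd_diff = 4.6     # standard deviation of the paired differences
n_pairs = 37      # residents with both pre and post scores

# Standard error of the mean difference
se = sd_diff / math.sqrt(n_pairs)

# Paired t statistic: mean difference divided by its standard error
t_stat = mean_diff / se

# 95% CI using the two-sided critical value for df = 36 (~2.028, t tables)
t_crit = 2.028
ci_low = mean_diff - t_crit * se
ci_high = mean_diff + t_crit * se

print(f"t = {t_stat:.2f}, 95% CI ({ci_low:.2f}, {ci_high:.2f})")
# → t = 12.69, 95% CI (8.07, 11.13)
```

The reconstructed interval differs slightly from the published one (8.14 to 11.2) because the summary inputs above are themselves rounded.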

Figure 1 Pre- and post-curriculum mean QIKAT scores.

We developed our anonymous resident survey based on QI curricula reported in the literature14,15,16 and our curriculum objectives. Forty-three of 45 residents (96%) completed the baseline survey, and 42 of 45 residents (93%) completed it post-curriculum (Fig. 2). Residents’ confidence to implement a QI project increased from 37% (16/43) before the curriculum to 69% (29/42) after its completion, and residents’ perceived confidence in their QI skills significantly improved. The proportion of residents who reported having personally submitted an adverse event report increased from 44% (19/43) at baseline to 90% (28/31) post-curriculum (Fig. 3).

Figure 2 Pre- and post-curriculum responses for survey questions.

Figure 3 Pre- and post-curriculum number of adverse events reported.

Sixty percent (25/42) of residents agreed that the curriculum had improved their attitude toward QI, 83% (35/42) felt more prepared to implement QI initiatives, and 93% (25/27) of PGY2s and PGY3s “agreed” or “strongly agreed” that they wished they had had this training before starting their QI project.

The new curriculum has inspired dissemination of project results by our residents. In the year before the new curriculum, there was only one poster presentation, at a regional conference. This past year, a total of seven presentations were given at local, regional, and national conferences.

Regarding our process measures, the “presentation was definitely useful” bucket consistently received the most chips for each didactic presentation (Fig. 4). Because each didactic is presented three times (once for each group of 15 residents on the outpatient rotation), we were able to make rapid adjustments between sessions based on this feedback, such as modifying case scenarios that residents found confusing. Non-QI faculty attended 24% of instructional sessions (8 of 33 available sessions), with most attendance occurring in response to a specific invitation; 5% (3/25) of attendance was unsolicited.

Figure 4 Results of presentation feedback on educational sessions.

DISCUSSION

Implementing QI curricula within busy residency schedules has proved challenging for many residency programs. We believe this curriculum’s success stemmed from the careful attention given to planning and implementing it. By using the Kemp model of curricular design, we were able to incorporate our learners’ needs and biases thoughtfully into the curriculum’s content, and by analyzing our problem areas and establishing plans to address them, we could prepare proactively for anticipated challenges. Despite this planning, we experienced some unanticipated difficulties: residents were sometimes pulled from sessions to complete other program requirements, leaving too few participants to complete an activity, and while some residents enjoyed the use of games, others did not.

Several studies have used objective scoring measures in combination with self-assessment surveys to demonstrate curriculum success.14,15,17,18,19,20,21 To our knowledge, only two studies20,21 have used the QIKAT-R, which was developed to address the subjectivity and inconsistent reliability noted in the original version.8 Our results show a larger improvement from pre- to post-curriculum scores, with a difference of 9.6 compared with 2 and 3.3 in those studies; however, our mean post-curriculum score of 16.6 falls between theirs (15.3 and 19.1). Low baseline scores may reflect how little exposure to QI methodology our residents had before this curriculum. Our study differs from other curriculum descriptions in both its use of a curriculum design model and its detailed account of how we addressed our program’s barriers to success, many of which are common to other IM programs.4,5,6 Key factors in this curriculum’s success included the regularity of QI educational sessions, integration with longitudinal QI projects, the celebration of project results within the program, and the creation of an optimal learning environment. The 4 + 2 program structure with protected time allows for regular application of PDSA cycles in QI projects.

We built this curriculum to fit our residency program’s structure and challenges. We received a tremendous amount of program support and didactic time, which may not be feasible at other institutions. Similarly, having a Q/S fellowship program is a unique resource: it afforded the fellow time for curricular development, opportunities to gain additional EMR training in report building, and access to the hospital’s QI department resources. We would not advocate adopting this curriculum wholesale in a different setting, as it is unlikely that, in its entirety, it would adequately address another program’s unique structure and challenges. We would, however, strongly advocate the use of a curricular design tool and a structured plan for addressing program-specific barriers. We hope that those processes, along with specific elements of our curriculum, may prove useful to other programs.