The protocol-guided rapid evaluation of veterans experiencing new transient neurological symptoms (PREVENT) quality improvement program: rationale and methods
Transient ischemic attack (TIA) patients are at high risk of recurrent vascular events; timely management can reduce that risk by 70%. The Protocol-guided Rapid Evaluation of Veterans Experiencing New Transient Neurological Symptoms (PREVENT) developed, implemented, and evaluated a TIA quality improvement (QI) intervention aligned with Learning Healthcare System principles.
This stepped-wedge trial developed, implemented and evaluated a provider-facing, multi-component intervention to improve TIA care at six facilities. The unit of analysis was the medical center. The intervention was developed based on benchmarking data, staff interviews, literature, and electronic quality measures and included: performance data, clinical protocols, professional education, electronic health record tools, and QI support. The effectiveness outcome was the without-fail rate: the proportion of patients who receive all processes of care for which they are eligible among seven processes. The implementation outcomes were the number of implementation activities completed and final team organization level. The intervention effects on the without-fail rate were analyzed using generalized mixed-effects models with multilevel hierarchical random effects. Mixed methods were used to assess implementation, user satisfaction, and sustainability.
PREVENT advanced three aspects of a Learning Healthcare System. Learning from Data: teams examined and interacted with their performance data to explore hypotheses, plan QI activities, and evaluate change over time. Learning from Each Other: Teams participated in monthly virtual collaborative calls. Sharing Best Practices: Teams shared tools and best practices. The approach used to design and implement PREVENT may be generalizable to other clinical conditions where time-sensitive care spans clinical settings and medical disciplines.
clinicaltrials.gov: NCT02769338 [May 11, 2016].
Keywords: Cerebrovascular disease; Transient ischemic attack; Learning healthcare system; Quality of care; Implementation science; Audit and feedback; Systems redesign
Abbreviations
CFIR: Consolidated Framework for Implementation Research
FAST: Fast Analysis and Synthesis Template
GO Score: Group Organization Score
PREVENT: Protocol-guided Rapid Evaluation of Veterans Experiencing New Transient Neurological Symptoms
SQUIRE: Standards for Quality Improvement Reporting Excellence
TIA: Transient ischemic attack
VA: Department of Veterans Affairs
With the proliferation of electronic health records and increased emphasis on Learning Healthcare Systems, healthcare teams are being tasked with responding to data-driven quality problems. Teams may deploy a variety of quality improvement (QI) strategies and systems redesign approaches to improve performance, depending on the complexity and scope of the problem. This description of the rationale, implementation strategy, and evaluation plan of the Protocol-guided Rapid Evaluation of Veterans Experiencing New Transient Neurological Symptoms (PREVENT) trial details an approach to developing and evaluating a multi-component QI intervention for a complex, time-sensitive clinical problem that involves several clinical disciplines and is consistent with the principles of the Learning Healthcare System model. This report adheres to the Revised Standards for Quality Improvement Reporting Excellence (SQUIRE 2.0) [2, 3].
The problem being addressed
Approximately 8500 Veterans with transient ischemic attack (TIA) or ischemic stroke are cared for in a Department of Veterans Affairs (VA) Emergency Department (ED) or inpatient ward annually in the United States. Patients with TIA generally present with transient neurological symptoms of presumed ischemic etiology. TIA patients are at high risk of recurrent vascular events [6, 7, 8]; however, interventions that deliver timely TIA care can reduce that risk by up to 70% [9, 10, 11, 12]. Despite the known benefits of timely TIA care, data from both selected private-sector United States hospitals (i.e., facilities that have implemented stroke quality improvement programs) and from the VA healthcare system have identified gaps in TIA care quality. For example, only 51% of Veterans who were eligible received carotid imaging as part of their TIA care. Moreover, the majority of VA facilities do not have a TIA-specific protocol.
The objective of the PREVENT trial was to develop, implement, and evaluate a multi-component QI intervention to improve the quality of care for Veterans with TIA that could be scaled to serve the full spectrum of VA medical centers, ranging from small facilities with few specialist resources to the most complex and well-resourced facilities with access to comprehensive academic medical centers. The Consolidated Framework for Implementation Research (CFIR) guided the development of the PREVENT intervention, its accompanying implementation strategies, and its evaluation plan [16, 17]. Our approach contributed to the development of a Learning Healthcare System and may be generalizable to QI interventions that target healthcare teams.
Within the VA, quality measurement and systems redesign are integrated into the healthcare system within administration and clinical operations [19, 20]. Although stroke care quality metrics are reported, there is currently no VA system-wide focus on TIA care quality. TIA is a clinical condition that is relatively common and for which there is a time-sensitive imperative to provide diagnostic and management processes of care. However, there is no existing VA quality measurement or “top-down” mandate for QI related to TIA care. Nevertheless, because of the demonstrable gaps in the quality of TIA care for Veterans, VA leadership, namely in neurology and emergency medicine, provided robust support for a TIA quality improvement program.
Quality improvement intervention development
The development of the PREVENT intervention [21, 22, 23, 24] was based on a systematic assessment of TIA care performance at VA facilities nationwide, as well as critical barriers and facilitators of TIA care performance, using four sources of information: baseline quality of care data, staff interviews, existing literature [25, 26, 27, 28], and validated electronic quality measures.
Baseline quality of care data
The first national benchmarking study of TIA care quality in the VA included patients cared for in any VA ED or inpatient setting during federal fiscal year 2014. Among N = 8201 patients in 129 facilities, performance varied across elements of care, from brain imaging within 2 days of presentation (88.9%) to high/moderate potency statin within 7 days post-discharge (47.2%). Performance also varied substantially across facilities. Performance was higher for admitted patients than for patients cared for only in EDs, with the greatest disparity for carotid artery imaging: 75.6% versus 25.3% (p < 0.0001). These data provided justification for developing a QI project to improve TIA care quality.
Several studies have demonstrated that providing timely diagnosis and management improves care and outcomes for patients with TIA [9, 10, 11, 12, 27, 28]. For example, three effectiveness studies included algorithms or protocols that facilitated the timely delivery of care for patients with TIA. Based on this research, PREVENT included algorithms and protocols to promote timely delivery of the guideline-concordant processes of care that have been associated with improved outcomes.
Validated electronic quality measures
Electronic quality measures were developed using electronic health record data and were validated against chart review. A random sample of 763 TIA or minor ischemic stroke patients cared for in 45 VA facilities was used to construct electronic versions of 31 existing quality measures. The measures with the most robust performance against chart review became the PREVENT measures.
Quality improvement intervention description
The PREVENT QI intervention targeted facility providers, not individual patients. External facilitation was provided by the study team, which included a nurse (with quality management and clinical nursing experience), a general internist (with QI and stroke clinical care experience), implementation scientists (from diverse backgrounds including health psychology, education, and medical anthropology), and a senior data scientist. The participating facility teams were diverse but generally included members from neurology, emergency medicine, nursing, pharmacy, and radiology; some teams also included hospitalists, primary care staff, education staff, telehealth staff, ophthalmologists, or systems redesign staff. The primary site champion was the person designated as responsible for stroke care quality at the participating facility. Therefore, at the majority of sites the champion was a neurologist, but at one site the champion was an ED nurse, and at another site the role of champion was shared by staff from neurology and pharmacy. The PREVENT QI intervention included five components: a quality of care reporting system, clinical programs, professional education, electronic health record tools, and QI support including a virtual collaborative (Fig. 1).
Quality of care reporting system: audit and feedback
Clinical programs
Several clinical programs were developed and shared on the PREVENT Hub. For example, a pharmacist-based TIA medication management protocol was developed to improve medication-related processes of TIA care (e.g., hypertension and hyperlipidemia management). The pharmacy protocol utilized existing VA pharmacy staff in the inpatient or ED settings with hand-offs to pharmacists embedded in the primary care teams. In addition, a templated note and checklist were created for VA primary care nurses. PREVENT site teams developed ED-based protocols for TIA patients, which were also shared on the PREVENT Hub.
Professional education
The PREVENT staff education materials were diverse, including: slide sets (with speaker notes) designed specifically for physicians and residents, pharmacists, and nurses; guidelines and article reprints; videos (one described the importance of providing timely TIA care and one demonstrated a clinical team reflecting on quality of care data, evaluating progress toward goals, and planning QI activities in response to data); as well as pocket cards and posters. Locally generated educational materials were also shared on the Hub.
Electronic health record tools
A variety of electronic health record tools were available for PREVENT sites to adapt including: order menus, note templates, and a patient identification tool. The note templates were developed using reminder dialogues to enable teams to monitor when templates were used. The patient identification tool was developed to identify individual TIA patients who were seen in a facility in the ED or inpatient ward so that the site teams could ensure that highest quality care was being delivered in real time (as opposed to waiting for retrospective data).
Quality improvement support & virtual collaborative
Active implementation of PREVENT began with a full-day kickoff meeting. The kickoff included all relevant staff members at a participating site as well as study team members; some participated in person and others via videoconference. The kickoff was designed to be fun, engaging, educational, and productive. The PREVENT study team explicitly developed the agenda with the belief that the most important resource for the kickoff was the time and attention of the participating staff members, with the event providing a crucial opportunity for team formation (at many sites, team members were meeting each other for the first time at the kickoff).
The kickoff began with presentations, videos, and activities to create a sense of excitement and empowerment about improving care and outcomes for patients with TIA. The facility team used the PREVENT Hub to explore their facility-specific quality of care data and identify processes of care with the largest gaps in quality for the greatest number of patients. Using approaches from systems redesign, facility team members brainstormed about barriers to providing highest quality of care, identified solutions to address barriers, ranked solutions on an impact-effort matrix, and developed a site-specific action plan that included high-impact/low-effort activities in the short-term plan and high-impact/high-effort activities in the long-term plan. Throughout the kickoff, the facility team was introduced to PREVENT components (e.g., videos from the education program and the pharmacy clinical protocol) as well as strategies for engaging in key QI activities such as reflecting and evaluating, goal setting, and planning.
Local QI plans were entered into the PREVENT Hub, and metrics were tracked allowing teams to monitor performance over time. PREVENT site teams could learn from the overall community by identifying which QI activities either did or did not achieve improvement in metrics at other sites.
During the one-year active implementation period, the teams joined monthly PREVENT collaborative conferences, which served as a forum for facility team members to share progress on action plans, articulate goals for the next month, and review any new evidence or tools. The monthly collaborative conferences were conducted via a shared meeting platform that allowed for screen sharing and instant messaging; videoconferencing was also occasionally used. During each collaborative conference, invited speakers with expertise in cerebrovascular risk factor management, VA healthcare administration, or systems redesign reviewed topics of interest, using cases to stimulate discussion, identify barriers, and brainstorm solutions. Participants received continuing education credits. At the end of the one-year active implementation period, the collaborative call was conducted via videoconference and was used to acknowledge the implementation accomplishments of the site being promoted from active implementation to sustainability. Facility leadership was invited to celebrate the successes of the local team.
Primary effectiveness outcome
The primary effectiveness outcome was the "without-fail" rate, defined as the proportion of Veterans with TIA who received all of the processes of care for which they were eligible from among seven processes of care: brain imaging, carotid artery imaging, neurology consultation, hypertension control, anticoagulation for atrial fibrillation, antithrombotics, and high/moderate potency statins. These seven measures were included in the without-fail rate because they are both guideline-recommended processes of care and have been associated with improvements in TIA patient outcomes. The without-fail rate is sometimes also referred to as "defect-free" care [38, 39]. It is an all-or-none measure of quality, which assesses for an individual patient whether they either did ("passes" the without-fail measure) or did not ("fails" the without-fail measure) receive all of the elements of care for which they were eligible. The without-fail rate was calculated at the facility level based on electronic health record data using validated algorithms.
The secondary effectiveness outcomes included: the seven individual processes of care that were included in the without-fail measure, the consolidated measure of quality which describes the proportion of care patients received among the processes for which they were eligible (e.g., for a patient who received two processes of care but who was eligible for four processes of care, their consolidated quality measure would be 50%, whereas their without-fail rate would be 0%), and patient outcomes (i.e., 90-day recurrent stroke and 90-day all-cause mortality).
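The distinction between the all-or-none without-fail measure and the consolidated measure can be made concrete with a short sketch. This is an illustrative reimplementation of the definitions above, not the study's validated algorithms; the function name and data shape are hypothetical.

```python
def quality_measures(eligible, received):
    """Compute patient-level quality measures from the set of processes of
    care the patient was eligible for and the set actually received.

    Returns (without_fail, consolidated):
      without_fail  - 1 only if EVERY eligible process was received, else 0
      consolidated  - fraction of eligible processes that were received
    """
    eligible = set(eligible)
    # Only care the patient was actually eligible for counts toward the measure.
    received = set(received) & eligible
    if not eligible:
        return None, None  # no eligible processes: excluded from the rate
    without_fail = int(received == eligible)
    consolidated = len(received) / len(eligible)
    return without_fail, consolidated

# The worked example from the text: eligible for four processes, received two.
wf, cons = quality_measures(
    eligible={"brain imaging", "carotid imaging",
              "neurology consult", "antithrombotics"},
    received={"brain imaging", "antithrombotics"},
)
# wf == 0 (fails the all-or-none measure); cons == 0.5 (consolidated 50%)
```

The all-or-none form makes the without-fail rate a stricter target than the consolidated measure: a patient receiving six of seven eligible processes still "fails."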
Quantitative analysis plan: effectiveness assessment
Generalized mixed-effects models at the patient level with random effects for sites were used to analyze the PREVENT intervention effects on the without-fail rate during the active implementation period compared with the baseline period. For the primary effectiveness analysis, the main comparison was the mean facility without-fail rate across the six sites during the baseline data period versus the active implementation data period, adjusting for wave and site variations. The primary analysis included the first TIA event per patient. In sensitivity analyses, we included all TIA events, and we also excluded patients ≥90 years old (because care for such patients may appropriately not include all of the processes of care included in the without-fail rate).
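The data preparation behind the primary comparison can be sketched as follows. This is a simplified, hypothetical aggregation (first event per patient, then facility-level rates by period), not the study's multilevel model, which additionally handles wave and site as random effects.

```python
from collections import defaultdict

def facility_rates(events):
    """events: list of dicts with keys patient_id, site, period ('baseline'
    or 'active'), date (sortable), and without_fail (0/1).

    Keeps only the first TIA event per patient (the primary analysis), then
    returns the mean facility without-fail rate within each period."""
    # Retain the earliest event per patient.
    first = {}
    for e in sorted(events, key=lambda e: e["date"]):
        first.setdefault(e["patient_id"], e)

    # Tally passes and eligible patients per (site, period).
    tallies = defaultdict(lambda: [0, 0])  # (site, period) -> [passes, n]
    for e in first.values():
        key = (e["site"], e["period"])
        tallies[key][0] += e["without_fail"]
        tallies[key][1] += 1

    # Average the facility-level rates within each period.
    by_period = defaultdict(list)
    for (site, period), (passes, n) in tallies.items():
        by_period[period].append(passes / n)
    return {p: sum(rates) / len(rates) for p, rates in by_period.items()}
```

In the trial itself these period means are compared within a generalized mixed-effects model rather than directly, so the sketch shows only the unit of analysis, not the inference.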
Several secondary effectiveness analyses were pre-specified, including: an examination of how the without-fail rate changed in the PREVENT sites compared with VA facilities matched on TIA patient volume, facility complexity (i.e., teaching status, intensive care unit level), and baseline without-fail rate (with six controls for each intervention site), which allowed for consideration of temporal changes in care; an examination of individual processes of care across the six sites from the baseline period to the active implementation period (e.g., how did receipt of high or moderate potency statins change from baseline to active implementation); an assessment of change in the consolidated measure of quality from baseline to active implementation; and a comparison of the 90-day recurrent stroke rate and the 90-day all-cause mortality rate before versus after active implementation. For each of these secondary analyses, the multivariable models included adjustment for wave, site variations, and baseline comorbidities. Specifically, individual risk-adjustment models were created for each process of care and for each patient outcome. The individual processes of care, the consolidated measure of quality, the 90-day recurrent stroke rate, and the 90-day mortality rate were considered secondary outcomes because the stepped-wedge study was designed to have adequate power (see Power & Sample Size section below) to identify differences in the primary effectiveness outcome (the without-fail rate), not the secondary outcomes.
Mixed methods evaluation plan: user satisfaction, implementation and sustainability assessments
Qualitative Data Collection Plan

| Method | Focus of inquiry | Timing | Data sources & participants |
| --- | --- | --- | --- |
| Formal, semi-structured, qualitative interviews | Structure: TIA protocol; TIA providers. Process: how clinical teams use data to improve quality; local context | 6 and 12 months into active implementation | Audio-recorded & transcribed interviews; providers who care for and support patients with TIA |
| Observations of team kickoffs for active implementation | Structure: team composition. Process: team formation; impact evaluation; action planning | After baseline, at the start of active implementation | |
| Observations of virtual collaborative calls | Structure: clinical providers' attendance and participation. Process: community of care interactions; implementation progress | Monthly one-hour calls | |
| Observations of facility visits | Structure: local front-line providers involved in TIA care. Process: team dynamics; implementation progress; use of data | Post-visit debriefings | Audio-recorded & transcribed interviews & field notes |
| FAST template (a rapid, systematic method for capturing key concepts across data sources) | Structure: role and service of key informants. Process: use of implementation strategy; implementation progress | | |
| External Facilitation Tracking Sheet | Structure: facility team members engaged in quality improvement. Process: facilitation contents and dose | | FAST template and facilitator notes; providers who locally adapt PREVENT to improve quality of TIA care |
Users’ assessment of the program
The assessment of satisfaction with the PREVENT program was evaluated using interview data, ARS, and survey data. Satisfaction was defined as program acceptability, the perception among front-line implementers that PREVENT was palatable or satisfactory based on content, complexity, or comfort. We derived the users’ assessment of the intervention using the intervention characteristics domain from CFIR. We sought to identify the components of the intervention that were most useful or most important to the facility team members.
Implementation outcomes and evaluation
PREVENT employed three primary implementation strategies: team activation via audit and feedback, reflecting and evaluating, planning, and goal setting; external facilitation; and building a community of practice. In addition, PREVENT allowed for local adaptation of the intervention components and took advantage of peer pressure while providing facilitation support to the site champion. The two primary implementation outcomes were the number of implementation activities completed during the one-year active implementation period and the final level of team organization (defined as the Group Organization [GO] Score) [45, 46] for improving TIA care at the end of the 12-month active implementation period. The number of implementation activities completed was scored for each site by the research team using a rubric designed for PREVENT. The GO Score [45, 46] measured team activation for improving TIA care on a 1–10 scale based on specified provider practices. Scores of 1–3 denoted a beginning level of organization with no facility-wide approach, 4–5 reflected a developing approach, 6–7 denoted basic proficiency, 8 indicated intermediate proficiency, and 9–10 reflected a TIA system that was implemented facility-wide and could sustain key personnel turnover.
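The GO Score rubric above can be summarized as a simple lookup. The labels are paraphrased from the text; this is a sketch of the rubric's banding, not the published scoring instrument.

```python
def go_score_level(score: int) -> str:
    """Map a 1-10 Group Organization (GO) Score to its rubric level."""
    if not 1 <= score <= 10:
        raise ValueError("GO Score must be between 1 and 10")
    if score <= 3:
        return "beginning: no facility-wide approach"
    if score <= 5:
        return "developing approach"
    if score <= 7:
        return "basic proficiency"
    if score == 8:
        return "intermediate proficiency"
    return "facility-wide system able to sustain key personnel turnover"
```

Note that the bands are uneven (three levels for 1–3, a single level at 8), so an ordered chain of comparisons is the natural encoding.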
Using a mixed-methods approach grounded in the CFIR, we examined and evaluated the degree to which the sites engaged in the three primary implementation strategies; the association between implementation strategies and implementation success; contextual factors associated with implementation success; the association between implementation strategies and the without-fail rate; and the association between implementation outcomes and the without-fail rate. In addition, we described the dose, type, and temporal trends in external facilitation that was provided to each site during active implementation.
The sustainability analysis included both a comparison of the change in the without-fail rate from the baseline data period to the sustainability period and from the active implementation period to the sustainability period. We constructed mixed-effects models accounting for random effects for sites as described above for the effectiveness evaluation and explored whether sites with the greatest use of their own quality data demonstrated the greatest program sustainability.
Sites were invited to participate on the basis of demonstrated gaps in quality of care; specifically, if they had baseline without-fail rates of < 50%. All VA acute care facilities with at least ten eligible TIA patients per year were rank ordered in terms of the without-fail rate. Invitations were sent via email beginning with facilities with the greatest opportunity for improvement. Recruitment continued until six facilities agreed to participate. Although some stepped-wedge trials randomly assign facilities to waves (for example in a cluster randomized controlled trial design), PREVENT sites were allocated to waves pragmatically based on the ability to schedule baseline and kickoff meetings.
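The recruitment procedure above amounts to a filter-rank-and-invite loop. The following is a minimal sketch under an assumed data shape (name, annual TIA patient volume, baseline without-fail rate); the thresholds come from the text.

```python
def recruitment_order(facilities, min_patients=10, max_rate=0.50):
    """facilities: iterable of (name, annual_tia_patients, without_fail_rate).

    Returns facility names eligible for invitation (at least min_patients
    eligible TIA patients per year and a baseline without-fail rate below
    max_rate), ordered so that the greatest opportunity for improvement
    (lowest without-fail rate) is invited first."""
    eligible = [
        f for f in facilities
        if f[1] >= min_patients and f[2] < max_rate
    ]
    # Lowest without-fail rate first: greatest opportunity for improvement.
    return [name for name, _, _ in sorted(eligible, key=lambda f: f[2])]
```

In the trial, invitations continued down this ordering until six facilities agreed to participate.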
Power & Sample Size
Key strengths of the approach to developing this QI program included grounding the program in data from multiple sources: interview data to understand the needs of front-line providers across a diverse set of facilities and disciplines; validation evidence identifying processes of care that could be obtained as electronic quality measures, which facilitates ongoing performance measurement and scalability; benchmarking data identifying the gaps in care that should serve as targets for quality improvement, especially processes with large opportunities for improvement for large numbers of potentially eligible patients; and evidence from the existing literature about the processes of care most robustly associated with improved patient outcomes [25, 26, 27, 28]. The strengths of the evaluation plan included both the grounding in the CFIR model and the explicit evaluation of implementation strategies across diverse local contexts.
The PREVENT program was positively aligned with the model of the Learning Healthcare System. In the Institute of Medicine's book Best Care at Lower Cost, the Learning Healthcare System was described as an approach in which "clinical informatics, incentives, and culture are aligned to promote continuous improvement and innovation, with best practices seamlessly embedded in the delivery process and new knowledge captured as an integral by-product of the delivery experience." Already recognized as an example of a stand-out organization that harnesses the power of data to improve the health of the populations it serves [48, 49], the VA was the first federal agency to endorse the Learning Healthcare System's core values. The design of PREVENT advanced three aspects of a Learning Healthcare System. If PREVENT successfully improves TIA care quality, then we will work with our partners in VA central office to disseminate the program to all VA facilities.
Learning from data
The PREVENT Hub, unlike static performance dashboards, allowed teams to examine and interact with their performance data to explore hypotheses, plan QI activities, and evaluate change over time. Although audit and feedback has been demonstrated to be effective in QI, we have little insight into how teams use data to improve quality. The PREVENT study provided an opportunity to learn how teams use data to inform QI activities. The patient identification tool provided teams with patient-level, actionable information to identify patients in real time to ensure that every patient received all the care they needed; this tool is generalizable to other time-sensitive clinical conditions where patients seek care in the ED or inpatient settings.
Learning from each other
Site teams participated in monthly collaborative calls to learn about relevant topics, share strategies for overcoming challenges to providing highest care quality, and cultivate a sense of community. PREVENT teams were multidisciplinary, providing opportunities to learn across disciplines. For example, although the role of pharmacist-delivered care is well recognized for many clinical conditions, it has been underutilized for the care of patients with stroke or TIA. Given that many TIA processes of care involve medication management, collaboration with pharmacy staff offers great promise for delivering guideline-concordant care.
Sharing best practices
Facility-based teams shared tools and best practices in a rich and growing library of diverse resources.
Several limitations of the PREVENT program merit description. The primary limitation was implementation only within VA hospitals, which have the benefit of a unified electronic health record. If this program is found to be effective, then future research should evaluate its implementation in non-VA settings. Second, because several implementation strategies were deployed, it may be difficult to disentangle the unique effects of each strategy. However, we designed multiple data collection sources to capture the effects of each implementation strategy on implementation success using rigorous evaluation methodology. Third, making a diagnosis of TIA can be clinically challenging, and some patients who receive a diagnosis code for TIA may well have an alternative diagnosis. Although we know that some of the patients coded as having a TIA did not actually have a TIA, we have not observed differential misclassification either across facilities or across time. In other words, potential TIA miscoding is likely to exist across all of the sites and will likely exist during the baseline, active implementation, and sustainability phases. Therefore, it is unlikely that differential TIA miscoding will bias the examination of the effect of the intervention. If, however, the TIA miscoding rate were unexpectedly high, and patients were not getting TIA processes of care because they did not actually have a TIA, then the without-fail rate would be appropriately low. In this case, our ability to detect a change in the without-fail rate would be impaired. Fourth, although a six-site sample was sufficient to provide adequate power, future studies might include a larger number of facilities. Fifth, the PREVENT program targeted clinical teams at the participating sites; the clinicians were the subjects of the implementation and satisfaction evaluations. Future studies should consider how best to include patients' perspectives in implementation evaluations.
Finally, although we plan to deploy the program nationwide if the effectiveness analyses indicate that PREVENT improves TIA care quality, an assessment of scalability during national deployment is beyond the scope of the planned PREVENT research activities.
The promise of Learning Healthcare Systems involves the development of QI programs that are data-driven, meet the needs of stakeholders, and dynamically adapt to changes in performance and context. As illustrated by the PREVENT trial, that promise should likewise extend to program development and evaluation to assess not only whether a program works but also how and why it works.
All authors participated in the revision of the manuscript and have read and approved the final manuscript. All authors have agreed both to be personally accountable for their own contributions and to ensure that questions related to the accuracy or integrity of any part of the work, even ones in which the author was not personally involved, are appropriately investigated, resolved, and the resolution documented in the literature. DMB: obtained funding and was responsible for the overall design and conduct of the study including: intervention development; implementation plans; data collection; quantitative, qualitative, and mixed methods data analyses; interpretation of the results; and drafting and revising the manuscript. LJM: instrumental in the design of the intervention, implementation plans, quantitative data collection, quantitative data analysis, interpretation of the results, and revising the manuscript. EJM, NR, TD: instrumental in the design of the intervention, implementation plans and evaluation, qualitative and mixed methods data collection, qualitative and mixed methods data analysis, interpretation of the results, and revising the manuscript. BH: instrumental in the design of the intervention; participated in the qualitative data collection; participated in the quantitative, qualitative, and mixed methods data analysis; and revised the manuscript. AJP, YZ: instrumental in the development of the analysis plan, in the conduct of the quantitative and mixed methods data analysis, in the interpretation of the results, and revising the manuscript. JM: participated in obtaining funding, qualitative data collection, qualitative data analysis, and revising the manuscript. LM, JF: participated in intervention development; qualitative, quantitative, and mixed methods data analysis; and revising the manuscript. AJC: participated in the qualitative and mixed methods data analysis, and revising the manuscript.
BG, MK: instrumental in intervention development, the interpretation of the qualitative and mixed methods analyses, and revising the manuscript. EC, DAL, JJS, MW: participated in study design; interpretation of quantitative, qualitative and mixed methods results; and revising the manuscript.
This work was supported by the Department of Veterans Affairs (VA), Health Services Research & Development Service (HSRD), Precision Monitoring to Transform Care (PRISM) Quality Enhancement Research Initiative (QUERI) (QUE 15–280). Support for VA/Centers for Medicare and Medicaid Service (CMS) data is provided by the VA Information Resource Center (SDR 02–237 and 98–004). The funding agencies had no role in the design of the study, data collection, analysis, interpretation, or in the writing of this manuscript.
Ethics approval and consent to participate
PREVENT received human subjects review and approval from the Indiana University School of Medicine Institutional Review Board (IRB) and the Richard L. Roudebush VA Medical Center Research and Development (R&D) committee. Staff members who participated in interviews provided oral informed consent consistent with the approval of the IRB.
Consent for publication
This manuscript does not include any individual person’s data.
Competing interests
The authors declare that they have no competing interests.
Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.