BMC Health Services Research, 18:209

Defining the external implementation context: an integrative systematic literature review

  • Dennis P. Watson
  • Erin L. Adams
  • Sarah Shue
  • Heather Coates
  • Alan McGuire
  • Jeremy Chesher
  • Joanna Jackson
  • Ogbonnaya I. Omenka
Open Access
Research article
Part of the following topical collections:
  1. Organization, structure and delivery of healthcare

Abstract

Background

Proper implementation of evidence-based interventions is necessary for their full impact to be realized. However, the majority of research to date has overlooked facilitators and barriers existing outside the boundaries of the implementing organization(s). Better understanding and measurement of the external implementation context would be particularly beneficial in light of complex health interventions that extend into and interact with the larger environment within which they are embedded. We conducted an integrative systematic literature review to identify external context constructs likely to impact implementation of complex evidence-based interventions.

Methods

The review process was iterative because our goal was to inductively develop the identified constructs. Data collection occurred in four primary stages: (1) an initial set of key literature across disciplines was identified and used to inform (2) journal and (3) author searches that, in turn, informed the design of the final (4) database search. Additionally, (5) we conducted citation searches of relevant literature reviews identified in each stage. We carried out an inductive thematic content analysis with the goal of developing homogeneous, well-defined, and mutually exclusive categories.

Results

We identified eight external context constructs: (1) professional influences, (2) political support, (3) social climate, (4) local infrastructure, (5) policy and legal climate, (6) relational climate, (7) target population, and (8) funding and economic climate.

Conclusions

This is the first study to our knowledge to use a systematic review process to identify empirically observed external context factors documented to impact implementation. Comparison with four widely utilized implementation frameworks supports the exhaustiveness of our review process. Future work should focus on the development of more stringent operationalization and measurement of these external constructs.

Keywords

Implementation context; External context; Local context; Outer setting; Integrative review; Systematic review

Abbreviations

CFIR

Consolidated Framework for Implementation Research

EBI

Evidence-based intervention

EPIS

Exploration, Preparation, Implementation, Sustainment framework

MLF

Multi-level Framework Predicting Implementation Outcomes

Background

Considerable scientific effort has focused on identifying factors affecting translation of evidence-based interventions (EBIs) from research to practice. While conceptual models and empirical studies emphasize the importance of the external implementation context—i.e., factors existing outside the boundaries of the entity or entities leading the implementation of one or more EBIs—to the translation process, its current conceptualization has notable limitations, including the absence of a comprehensive measure of the external context and its constituent factors. Moreover, current frameworks that broadly define the external context are rooted in theoretical conceptualizations rather than observed instances of external factors affecting implementation. The current paper highlights limitations of extant conceptualizations of the external implementation setting and then describes an integrative systematic literature review aimed at developing an inductive conceptualization of the external context.

While there is an extensive body of literature focused on the identification of facilitators and barriers organizations encounter when implementing EBIs [1, 2], empirical research in this area has focused largely on influences internal to implementing organizations. This is a considerable gap: both internal and external factors are recognized influences on the implementation process, and external factors are often antecedents of organizational readiness, drive organizational-level policy and processes, and are more difficult to address than internal factors because they typically lie beyond any single organization’s control [3, 4, 5, 6, 7, 8, 9]. Instrument reviews by both Clinton-McHarg et al. [10] and Lewis et al. [11] have demonstrated a lack of consideration of the external context among validated implementation measures. While some instruments identified in these reviews measure selected aspects of the external context, no single instrument focuses explicitly on it. This is problematic because it leaves researchers with one or a combination of the following options when seeking to understand the impact of the external context on the implementation process: (1) combine single items from multiple instruments; (2) use each identified instrument in its entirety; or (3) create untested, home-grown measures. All of these options cause obvious problems related to consistency, replicability, and comparability across implementation studies [11].

While defined within some existing implementation frameworks (e.g., [3, 4, 5, 6]), there are inconsistencies regarding what these theoretical guides consider the external context to comprise. For instance, while some frameworks include social climate factors related to the larger community within which the intervention is embedded (e.g., [5]), others do not (e.g., [3, 4]). Moreover, frameworks sometimes include important constructs but do not provide detailed operationalizations for them (e.g., [5, 6]). Apart from existing implementation frameworks, Birken et al. [8] have pointed to organizational theory as an area where researchers can look to better understand the external context. They specifically highlight how “[o]rganizational theories describe, explain, and predict the complex interaction between organizations and their external environments” (p. 2) and how these interactions influence organizational decisions and behavior [12, 13, 14, 15]. While useful, organizational theories are not implementation-specific and likely lack important insights necessary for understanding the noted impact of the broader implementation context on different aspects or stages of the implementation process [16]. Given the above stated issues, a stronger and more consistent operationalization of the external context would greatly benefit the field.

Clearer definitions of the external context and its constituent parts would be particularly beneficial for studies of complex interventions made up of multiple interrelating components that extend into and interact with the larger systems and communities within which they are embedded [17, 18, 19, 20]. As an example, Housing First, a model for serving chronically homeless individuals with serious mental illness and substance use disorder [13], is a highly complex EBI because it requires interaction between multiple individuals (e.g., providers, case managers, landlords), organizations (e.g., government funders, non-profit service providers, property management), and systems (e.g., housing, medical, mental health, substance abuse) to be successful [19, 21, 22, 23]. As such, it requires significant relational coordination with external entities. Previous research has also demonstrated how external factors such as community stigma and broader politics often result in 'not in my back yard' attitudes that can negatively impact model implementation [24, 25]. While external factors such as these are captured to some extent through concepts like 'cosmopolitanism' [3], 'interorganizational networks' [4, 6], 'interorganizational relationships' [6], 'sociopolitical' [4], 'social climate' [5], and 'political climate' [5] found in existing implementation frameworks, no one framework captures them all.

Our goal in the current literature review was to identify a more exhaustive list of external context factors impacting the implementation of complex interventions than what is explicated in current models and frameworks produced through synthesis of pre-existing theory [3, 4, 5, 6, 26]. We employed an integrative review process that aimed to inductively develop a taxonomy of external context constructs based in empirical observations existing in the identified literature.

Methods

We conducted an integrative literature review because of its usefulness for generating theory and classifications of constructs [27, 28, 29]. The standard process for an integrative review includes the following steps: (1) problem formulation, (2) data collection, (3) evaluation of data, (4) data analysis, and (5) interpretation and presentation of the results [29]. The primary problem motivating this review was the lack of a tool to measure the external implementation context for complex interventions. As such, our primary research question was: What external context factors have been demonstrated to impact the implementation of complex health interventions (and social service interventions with health implications) within the empirical literature? Because of the appropriateness of inductive processes for theory and construct development, we conducted an iterative literature review where sampling at each stage was informed by literature identified in prior stages [27, 30].

Data collection process (July 2014–July 2015)

To be included for initial screening, articles were required to: (1) be written in English; (2) describe empirically observed external context factors (i.e., facilitators or barriers existing outside the boundaries of a particular organization or organizations implementing the intervention) affecting the implementation of a complex intervention or interventions, i.e., “interventions that contain several interacting components” [18] (p. 1); and (3) describe an intervention with impact (or reasonably ascertainable impact) on client- or population-level outcomes. We excluded any article that (4) only discussed external context factors as theoretical barriers or facilitators or (5) focused on interventions we understood to only impact organizational- or staff-level outcomes. Qualitative, quantitative, and mixed method articles were all included. Additionally, if a review article was encountered at any stage in the search process, we screened the original articles discussed within the review for inclusion into our analysis. All identified articles were processed using Zotero bibliographic management software [31]. When an article met basic inclusion criteria or when there was not enough information to tell from the title or abstract, we loaded the article into MAXQDA qualitative analysis software for further review and potential coding [32].

We collected data in four stages: (1) identification of known literature, (2) journal search, (3) author search, and (4) database search (Fig. 1 presents an overview of this process). These stages were identified after consultation with our university’s public health librarian (HC), who developed a strategy to start with a review of recent literature narrowly defined by our initial problem definition and broaden in scope with each stage. (Engagement of librarians in systematic review design is demonstrated to lead to more comprehensive search strategies utilizing more exhaustive techniques than typically followed [33].)
Fig. 1

Flow diagram with detailed overview of the literature identification and screening process. Inclusion criteria for search: (1) written in English; (2) describe empirically observed external context factors affecting the implementation of a complex intervention or interventions; and (3) describe an intervention with impact on client- or population-level outcomes. Search exclusion criteria: (1) discussed external context factors as theoretical barriers or facilitators; (2) focused on interventions we understood to only impact organizational- or staff-level outcomes

The need for a multi-stage process was evidenced by the librarian’s initial searches that suggested the literature was too widely dispersed across disciplines for a keyword search to be effective. As such, the iterative process developed was aimed at identifying different strands of literature using different terminology to express similar concepts, which could inform a final database search (e.g., journal and author specific keywords and time parameters). Therefore, our search followed a logic similar to that of snowball sampling frequently used in qualitative research [34]. Reflecting our iterative approach, coding and category development started in Stage 1 and continued throughout the data collection process.

The goal of Stage 1 (July through November 2014) was to identify articles discussing issues related to implementation in the external context to serve as a foundation for the search. To accomplish this, the Principal Investigator (DPW) first identified 22 documents discussing external context issues in Housing First programming (an intervention he has expert knowledge of) and articles discussing more general issues related to the external implementation context that he was already familiar with. Three of these articles were literature reviews, from which we identified an additional 139 articles (n = 161). Only 10 of these articles met inclusion criteria to continue to the next stage.

The goal of Stage 2 (November 2014) was to identify communities studying and conversing about the implementation of complex interventions using articles identified in Stage 1 as a guide. We identified 16 journals likely to publish literature on the external implementation context from the Stage 1 key literature and DPW’s expert knowledge of the topic. Because our list of journals was interdisciplinary and terminology is often discipline specific, HC created a list of search terms specific to each journal based on keywords from articles published within them that were identified in Stage 1 (see Additional file 1). To keep the search narrow at this stage, we randomly chose to search issues published in the years 2008, 2009, 2012, and 2013. For those journals without keywords, we manually searched the table of contents.

In Stage 3 (December 2014 through January 2015), we identified the 25 most relevant authors for our search based on the frequency of their publications in Stage 1 and the relevance of their work to the review (see Additional file 1). Only first and senior authors of articles were considered, as they were the most likely of all included authors to have a strong body of research related to the goals of the review. We next identified the complete publication history for selected authors using Scopus (Elsevier citation database) and a search of their curricula vitae, which assisted us in our final refinement of parameters for Stage 4 (i.e., the database search).

In Stage 4 (July 2015), we used database-specific controlled vocabulary and keywords used by authors identified in the previous stages to design the database search. We ran a series of searches in the following databases: PubMed, PsycINFO, CINAHL, and Academic Search Premier. We used the previously generated keywords and phrases to identify appropriate controlled vocabulary for each database (see Additional file 1). Both controlled vocabulary terms and keywords or phrases were combined with filters (e.g., English language, publication date, peer-reviewed articles) to focus the search results. A second search of PubMed was also conducted using the specific name of the At Home/Chez Soi Housing First project, as we were aware individuals studying the program were publishing a considerable amount of new implementation literature on this EBI at the time, which we wanted to ensure we captured since previously used search terms were not picking it up. As we had already coded articles from previous stages and were approaching saturation of themes, we decided to limit the number of database results we would review. We first separated these into one of four categories based on the primary discipline of the journal they were published in (i.e., medicine, public health, interdisciplinary, other). We then randomly selected 20% of the articles in each category to be coded. If duplicate articles from previous stages or foreign language articles were found, we replaced the article with another randomly selected one from the same category and database. Additionally, to avoid redundancy and oversaturation, we did not code any articles for which we had already identified five or more articles from that author in earlier stages.
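The Stage 4 sampling procedure can be sketched in code. This is a minimal illustration only, not part of the original study's workflow; the record fields (`category`, `duplicate`, `foreign_language`) and the `sample_stage4` helper are hypothetical names chosen for the example:

```python
import random
from collections import defaultdict

def sample_stage4(articles, fraction=0.2, seed=42):
    """Randomly select a fraction of articles within each discipline
    category; duplicates from earlier stages and foreign-language
    articles are skipped, so the next randomly ordered article from
    the same category takes their place."""
    rng = random.Random(seed)

    # Group articles by the primary discipline of their journal.
    by_category = defaultdict(list)
    for art in articles:
        by_category[art["category"]].append(art)

    selected = []
    for pool in by_category.values():
        rng.shuffle(pool)                      # random order within category
        n_needed = round(len(pool) * fraction)  # 20% of the category
        kept = []
        for candidate in pool:
            if len(kept) >= n_needed:
                break
            # Replace excluded articles with the next random pick.
            if candidate.get("duplicate") or candidate.get("foreign_language"):
                continue
            kept.append(candidate)
        selected.extend(kept)
    return selected
```

Seeding the random number generator is a convenience for reproducibility in this sketch; the original review does not report how the random selection was performed.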

A detailed flow chart outlining the article screening process is depicted in Fig. 1 (a simplified PRISMA flow diagram is included in Additional file 2). Through this entire process, we identified a total of 14,432 articles. Of these, 2676 were duplicate articles; 4427 (including those marked as non-empirical and foreign language) were rejected for not meeting inclusion criteria; 27 were review articles that helped us identify original works; and 12 did not have full text versions we could locate. There were an additional 6990 articles identified in Stage 4 that we did not screen because we were reaching saturation of themes at that point (see Data Analysis section below). This left a total of 217 unique articles, which were fully coded. Due to this large number of articles, we chose to focus only on the 61 describing EBIs published after 2009 because: (1) there is a potential difference in factors affecting implementation of interventions demonstrated to be effective (i.e., EBIs); and (2) it is reasonable to expect attention to external factors would be more salient after 2009, when the first article describing the Consolidated Framework for Implementation Research (CFIR), an implementation framework that includes an 'outer setting' domain, was published; this framework has had considerable influence over the operationalization of constructs affecting the implementation process. Due to heterogeneity of criteria depending on discipline of origin (i.e., medicine, behavioral health, public health, criminal justice), we defined an intervention as an EBI if: (1) the article stated it was an EBI or that there was prior evidence of its efficacy/effectiveness; (2) we were familiar with the intervention as an EBI; or (3) we were able to find evidence of its status as an EBI through an online search.

Data analysis

We conducted an inductive thematic content analysis with the goal of establishing homogeneous, well-defined, and mutually exclusive categories from which to develop our constructs [27, 35, 36, 37]. First, two researchers (DPW and ELA) developed a list of preliminary codes after reviewing the articles identified in Stage 1. Next, three research assistants (JC, JJ, and IO) and ELA coded instances where external context factors were discussed in the 217 unique articles (see above) using our preliminary codes. When a coder encountered a facilitator or barrier not fitting the initial list, we discussed it as a group and developed a new code if warranted. Approximately every 2 months during the process, we used 10% of the articles to establish interrater agreement by examining the degree of overlap in codes in MAXQDA [38]. We did not move coding forward until interrater agreement reached 80%. Evaluation of the data was conducted during the coding process: only one article was excluded at this point because it did not explain external context factors in sufficient detail to warrant its inclusion in the review.
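The agreement check described above can be approximated as follows. This is a simplified sketch and not MAXQDA's actual calculation (which can weight by the percentage of overlapping segment text); here agreement is reduced to matching (segment, code) assignments, and the `percent_agreement` name is a hypothetical illustration:

```python
def percent_agreement(coder_a, coder_b):
    """Simplified interrater agreement: the proportion of (segment, code)
    assignments on which both coders agree, out of all assignments either
    coder made. Returns a value between 0.0 and 1.0."""
    a, b = set(coder_a), set(coder_b)
    if not (a | b):          # neither coder assigned anything
        return 1.0
    return len(a & b) / len(a | b)

# Per the review's procedure, coding proceeds only once agreement
# on the 10% reliability subsample reaches the 80% threshold.
THRESHOLD = 0.80
```

With this simplification, two coders who assign identical codes to identical segments score 1.0, and any unilateral assignment lowers the score, mirroring the "degree of overlap" logic.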

DPW conducted a second round of analysis in which he reviewed segments coded by the other researchers to develop more exact and thorough categories, focusing on the 61 EBI articles published after 2009. This activity also served as a quality check on the previous round of coding, and all instances where he identified a passage of text that may have been inappropriately coded were discussed as a team and recoded if discussion warranted. This process resulted in the development of eight overarching constructs representing the external context, which are described in detail below.

Once coding was completed, we created a matrix crossing each code with every other code to ensure there was no substantial overlap between them that would require collapsing or redefining of categories. We also counted the number of articles each category appeared in to understand how strongly represented it was within the sample (we did not count the frequency of times the code appeared, as multiple coding instances of the same category within a document might reflect a single issue).
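The document-level counting rule above (count distinct articles per code, not raw code occurrences) can be sketched as below; the `document_counts` function and the (article_id, code) pair representation are illustrative assumptions, not the study's actual tooling:

```python
from collections import defaultdict

def document_counts(codings):
    """Count the number of distinct articles each code appears in,
    ignoring repeated occurrences of the same code within one article
    (repeats may reflect a single underlying issue).
    `codings` is an iterable of (article_id, code) pairs."""
    docs_per_code = defaultdict(set)
    for article_id, code in codings:
        docs_per_code[code].add(article_id)   # sets deduplicate repeats
    return {code: len(ids) for code, ids in docs_per_code.items()}
```

For example, a code mentioned three times in one article and once in another yields a document count of 2, matching the rationale given in Fig. 2's caption that document count better indicates how widely a construct was discussed.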

Results

Of the 61 articles included in the analysis, 43 [39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81] were research articles and 18 [82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99] were non-research articles (e.g., practice articles, policy updates, issue pieces, case studies, and commentaries). Additionally, 48 articles [39, 40, 41, 43, 44, 48, 49, 50, 51, 52, 53, 55, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 82, 83, 86, 87, 88, 89, 90, 91, 92, 93, 96, 97, 98, 99] discussed a single intervention; 6 [47, 54, 81, 84, 85, 94] discussed multiple interventions related to the same health problem; and 7 [42, 45, 46, 56, 69, 70, 95] discussed issues related to the implementation of interventions associated with a specific health problem more generally. Regarding the issues interventions sought to address: 27 [42, 43, 44, 47, 48, 49, 50, 52, 54, 55, 58, 59, 60, 61, 63, 68, 69, 73, 82, 86, 87, 91, 93, 94, 95, 97] discussed behavioral health, mental health, or substance use interventions; 16 [45, 51, 56, 57, 62, 64, 67, 72, 76, 79, 81, 85, 88, 89, 96, 99] discussed public health or prevention interventions; 7 [53, 65, 74, 75, 83, 90, 92] discussed homelessness interventions (with aspects having overlap with both public health and behavioral health); 6 [41, 46, 66, 71, 78, 80] discussed medical, primary, or integrated care interventions; 4 [39, 40, 77, 84] were interventions in the area of parenting and/or child welfare; and 1 [98] was a criminal justice intervention.

We identified a total of eight constructs, listed in Fig. 2. The figure also includes the number of articles each construct was mentioned in and the number of times the construct was coded as a barrier or facilitator in these articles. We define each construct below and provide examples from the sampled articles.
Fig. 2

Taxonomy of external context constructs identified, their definitions, and frequency of coding in sample. Barrier and facilitator counts refer to the total number of times the issue was mentioned within the sample and do not consider the coding of multiple mentions of the same barrier or facilitator within a single article. Therefore, the document count is a better indicator of the extent to which the construct was discussed within the sample

Professional influences

Five articles identified professional influences as impacting EBI implementation. We define professional influences as formal or informal norms, rules, policies, or standards guiding the profession or professionalization of individuals involved in the implementation. In their study of Assertive Community Treatment and Motivational Interviewing, McGraw et al. [94] provide an example of informal professional influences as a barrier to implementation when they discuss difficulties recruiting psychiatrists willing to work in a community setting because the work requires a “different mindset” than most psychiatrists have and it “takes a long time to find [a psychiatrist] willing to work outside of their comfort zone” (p. 202). Winickoff et al. [79] provide the only example of a formal professional influence when they discuss how an amendment to an American Medical Association policy recommending “clinicians treat people who smoke with the available tobacco dependence treatments regardless of the clinical context” (p. 114) facilitated the implementation of a nicotine replacement therapy intervention.

Political support

Political support, identified in 7 articles, refers to the extent of backing from public officials or special interest groups (e.g., lobbyists or representatives of an occupational group). Political actors were demonstrated to have either a negative or positive influence on implementation depending on whether they supported or opposed the intervention(s) in question. For instance, Menear et al. [63] described how advocacy for supported employment by “academics and foreign experts” (p. 1034) increased the intervention’s professionalization (i.e., the process a trade or occupation goes through to become a true profession), thus having a positive impact on implementation. Demonstrating how political support can act as a barrier, Knutagard and Kristiansen [92] described how support for traditional housing services by public figures and organizations frustrated implementation of Housing First programming in Sweden: “In some cases, we were told that municipal representatives thought that Housing First would compete with their existing services, which they believed worked in a satisfactory manner” (p. 103).

Social climate

The social climate refers to beliefs, values, customs, and practices of the larger community or system within which the intervention is set. Ten articles identified issues affecting implementation that were part of the social environment. Benjaminson [83] and Nelson et al. [65] both noted how a commitment to traditional treatment models within the local communities they studied led to difficulties implementing Housing First programming because it resulted in resistance to the intervention's underlying paradigm. Negative attitudes and stigma toward particular groups (e.g., minorities, people with mental illness, ex-offenders) were also demonstrated implementation barriers. Such was the case with Hasson et al.’s [55] study in which they reported “s[k]epticism…about people with mental illness working” (p. 339) to be a barrier to the implementation of supported employment in Sweden. Demonstrating how social climate factors can arise from the larger systems within which an intervention is embedded, Chamberlain et al. [84] discussed how conflict arose when implementing an American-designed child welfare intervention in England due to “cultural differences” (p. 282) that impacted attitudes toward evidence-based practices.

Local infrastructure

Fourteen documents discussed local infrastructure, i.e., physical, technical, or service structures or resources in the larger service system or community, as impacting EBI implementation. Both Glisson et al. [52] and Nelson et al. [65] identified lack of local transportation as problematic for implementation of the interventions they respectively studied. Glisson et al. noted how lack of public transportation interfered with clients’ ability to access supportive services, while Nelson et al. discussed how it led to social isolation among clients who did not have alternate modes of transportation to interact with family and friends. In Glisson’s case, implementation was facilitated after the program took measures to address lack of public transportation in the community. A number of housing intervention studies discussed the importance of available housing in the community as integral to successful implementation [53, 65, 74, 83, 92]. Similarly, Schneider and Akhtar [97] noted lack of available jobs in the community to be a barrier to implementation of supported employment. Amodeo et al. [42] point to both system and community barriers to the implementation of substance abuse treatment services:

…the theme of “lack of concrete services” emerged, with respondent comments including the following: “for homeless population, lack of housing is a barrier,” “for co-occurring psychiatric and physical disorders, very limited resources,” “lack of resources in the county (no bus system, few jobs),” and “project had limited access to substance abuse treatment.” (p. 386)

Lack of infrastructure and resources for training were noted to stall implementation. For instance, “[l]ack availability of training facilities” (p. 1301) within the larger service system was a noted barrier in van Bodegom-Vos et al.’s [78] study of a rheumatoid arthritis intervention, and Hyder et al.’s [89] study of a road traffic safety intervention identified lack of local capacity for providing training and technical assistance as a problem.

Policy and legal climate

The policy and legal climate, referring to external regulations in the form of rules, policies, and laws demonstrated to impact EBI implementation, was discussed in 20 articles. Overly complicated, strict, burdensome, and unclear policies, including “unnecessary red tape within service systems” [52] (p. 3), were demonstrated to interfere with implementation of both mental health and medical interventions [41, 49, 52, 78, 82]. Some policies were demonstrated to specifically prevent funding of services key to the intervention, as in the case of Collins et al.’s study of an HIV intervention that was prevented from receiving U.S. Centers for Disease Control and Prevention funding due to policies within the government organization. Alexander et al.’s study demonstrated how “misalignment between current payment systems [rules] and [patient-centered medical home] goals” (p. 149) interfered with implementation of patient-centered medical homes.

Conflicting/competing demands related to policies of different government agencies and/or multiple program funders were a noted barrier, as in the case of McGraw et al.’s [94] study when they state “competing demands of multiple funding sources and the requirement to collaborate with local agencies…complicated the implementation” (p. 208) of EBIs for mental health. In the case of supported employment, policies were demonstrated to impair intervention acceptability for supported employment clients when obtaining a job could lead to a loss of benefits they considered to be important [44, 55].

In some cases, the policy and legal context was demonstrated to be a coercive external force that could facilitate EBI implementation. One such example was in Greenwood et al.’s [53] study where the “legal duty [of]…local authorities to rehouse all homeless people” (p. 308) helped facilitate implementation of a Scottish Housing First program. In a second example, employment authorities applied sufficient pressure through contracts and funding schemes to move several employment agencies toward a supported employment model [63].

Relational climate

Twenty-six articles discussed the external relational climate, or the degree and/or quality of relationships with external entities (e.g., referral sources, partner organizations, regulating agencies, etc.) not involved in implementation but key to successful intervention delivery, as impacting implementation. Various aspects of the relational climate were discussed. Having buy-in or support from influential organizations was demonstrated to have a positive impact on implementation, as in the case of Lloyd et al.’s [93] study of Foundations of Learning, where gaining support of Head Start agencies was a noted facilitator. Lack of buy-in from homeless service providers who favored alternative, non-evidence-based approaches was demonstrated to be a barrier in Knutagard and Kristiansen’s [92] and Greenwood et al.’s [53] studies of Housing First programming.

Strong partnerships with outside entities were demonstrated to facilitate EBI implementation, while poor or tarnished relationships with partners were a demonstrated barrier. For instance, in their study of Housing First, Nelson et al. [65] pointed to “partnerships with government agencies and departments [enhancing] the project’s ability to secure access to housing”, while tarnished relationships with landlords led to a loss of housing options as landlords chose to leave the program. Likewise, Robinson et al. [96] demonstrate the importance of partnerships for providing referral and recruitment opportunities in their study of Community PROMISE for HIV when they state “[p]artnering with stakeholders…was important in helping to overcome agency deficiencies in their connections with the [target] community” (p. 215).

Target population

Factors associated with the intervention’s target population, i.e., those individuals the intervention was designed to serve or impact, were discussed in 30 articles, with the needs of the target population being the most mentioned issue related to this topic. Needs related to reading comprehension [67], developmental stage [67, 83], transportation [42, 67], mental health [59, 98], finances [83], and scheduling (largely related to work and childcare) [40, 62, 88] were all demonstrated to negatively impact target population members’ capacity to engage in a wide range of interventions. Parker et al. [67] provide an example of needs related to developmental stage in their study of a Positive Prevention program adapted for youth living with HIV/AIDS, noting “the study team was not prepared for the degree to which the youth were delayed in their ability to read and write, potentially as a result of cognitive development deficiencies due to HIV” (p. 144).

The target population’s ability to access the intervention was demonstrated to be important. Intervention access had overlap with scheduling needs, as conflicts with work, school, or public transportation schedules were all noted implementation barriers. Such was the case with Perlick et al.’s [68] study of multifamily group treatment in which a number of veterans refused participation due to “work- or school-related scheduling conflict or feeling too busy” (p. 536). Lack of sufficient health insurance and social benefits were also demonstrated implementation barriers, as in the case of El-Mallakh et al.’s [49] study of MedMAP, a psychotropic medication management intervention, in which participants “were unable to afford costly medications” (p. 521) and Benjaminson et al.’s [83] discussion of Housing First, in which they noted lack of cash benefits as a barrier to finding affordable housing for young adults.

The culture of the target population was demonstrated to be important, as in the case of Robinson et al.’s [96] study of Community PROMISE. Robinson and colleagues noted how the location of the program for African American clients (whose culture is stigmatizing of homosexuality) in an area known to provide services to the gay community “presented a barrier for some [clients] who did not identify as gay or bisexual or were not open about their sexual experiences” (p. 215). Additionally, Stergiopoulos et al. [75] noted how language, an aspect of culture, was a barrier during the implementation of a Housing First program serving a diverse group of clients.

Several factors related to the target population’s motivation to engage with the intervention were identified. For instance, Fox et al. [50] noted that a number of clients did “not see the value of participating in short-term follow-up evaluation” (p. 608) as part of the behavioral health intervention known as Parenting Young Children. Other issues mentioned that could negatively impact the target population’s motivation included stigma [62], mistrust of the system an intervention is embedded in [53], and feeling as though they did not have enough time to participate fully [68]. Related to and possibly underlying motivation in some instances, the preferences and beliefs of the target population were also demonstrated to be important. Target population preferences were a facilitator in Benjaminson et al.’s [83] study of a Housing First intervention, which found youth participants preferred temporary accommodations provided by the program to “emergency shelters or random couch surfing” (p. 126). Hasson et al. [55] demonstrated how target population members’ beliefs that intervention participation might lead to discontinuation of government benefits, or that they were unprepared for or incapable of work in a competitive marketplace, prevented some of them from participating in supported employment.

Finally, an important barrier associated with the target population arose when individuals in that population were unavailable or difficult to locate. An interesting example of this was Silva et al.’s [72] study of a breast cancer screening intervention implemented in Brazil, which had difficulty locating patients due to the transient nature of migrant workers’ lives. In the case of some youth interventions, difficulty locating parents could negatively impact the ability to engage the target population, as was the case in Langley et al.’s [60] study of Cognitive Behavioral Intervention for Trauma in Schools.

Funding and economic climate

Discussed within 35 articles, aspects of the funding and economic climate, i.e., the character of the national, regional, or local economy and the availability of funding, were the most frequently mentioned issues affecting intervention implementation. Issues with the labor market, such as a lack of skilled and experienced workers, the high cost of skilled workers when they could be located, and high turnover, were all noted barriers to implementation. In one example, Alexander et al. [41] discuss staffing difficulties that arose during implementation of a patient-centered medical home intervention: “availability of primary care physicians was a major threat…because of increasing differentials in income and working conditions, fewer medical students were opting to go into primary care” (p. 150).

The availability of a stable funding source aligned with the intervention and organizational processes was important. For clinical interventions, “insufficient [numbers] of patients with reimbursement coverage” [73] (p. 10) was demonstrated to be problematic. Problems related to funding availability were also impacted by changes in policies or larger economic shifts: “The severity of the economic crisis has contributed several obstacles…as funding for mental health continues to decline, providers have to locate areas to cut” [95] (p. 464). Incentive and reimbursement structures misaligned with the intervention were also demonstrated to negatively impact implementation. In the article by Sanchez et al. [71], the authors describe how reimbursement issues negatively impacted implementation of an integrated care program:

…financial issues present substantial obstacles to integrated health care…respondents predominately identified lack of reimbursement for clinical care management and paraprofessional services, followed by lack of reimbursement for screening services and consultation between primary care and behavioral health providers. (p. 31)

Discussion

To our knowledge, this is the first study to use a systematic review of empirical literature to identify external context factors documented to impact EBI implementation. While the large number of articles meeting our inclusion criteria suggests there has been a focus on the external context within the literature, the reality is that the majority of external context findings we identified were the result of passive or exploratory endeavors rather than purposeful research questions seeking to understand external context factors.

As demonstrated in Table 1, the external context constructs identified overlap somewhat, but not completely, with existing theoretical models and frameworks including: the CFIR [3], Exploration, Preparation, Implementation, Sustainment (EPIS) [4], Integrated Promoting Action on Research Implementation in Health Services (i-PARiHS; revised version of the original PARiHS) [6], and the Multi-level framework (MLF) predicting implementation outcomes [5]. We have chosen to focus on these frameworks in our discussion because they are highly cited and/or applied widely within the implementation literature.
Table 1

Comparison of constructs evidenced through literature review with external factor constructs in existing frameworks

| Construct identified in review | Consolidated Framework for Implementation Research (CFIR) [3] | Exploration, Preparation, Implementation, Sustainment (EPIS) [4]a | Integrated Promoting Action on Research Implementation in Health Services (i-PARiHS) [6]b | Multi-level framework (MLF) predicting implementation outcomes [5] |
|---|---|---|---|---|
| Professional influences | -- | Interorganizational networks | -- | -- |
| Political support | -- | Sociopolitical; Client advocacy | -- | Political or social climate |
| Social climate | -- | -- | -- | Political or social climate |
| Local infrastructure | -- | -- | -- | Infrastructure |
| Policy & legal climate | External policies and incentives | -- | Policy drivers & priorities; incentives & mandates; regulatory frameworks | Public policies |
| Relational climate | Cosmopolitanism | Interorganizational networks | Interorganizational networks & relationships | -- |
| Target population | Patient needs and resources | Client advocacy | -- | -- |
| Economic & funding climate | -- | Funding | -- | Economic climate |
| No directly comparable construct or too broad to directly parallel to identified constructs | Peer pressure | Intervention developers; Leadership | Environmental (in)stability (definition unclear) | Physical environment |

'--' = No directly comparable construct

a Only the active implementation phase of the EPIS framework is considered here, since this was the focus of the current literature review

b We focus on the revised version of the PARiHS, as the original did not address the external context; the i-PARiHS is limited in its conceptualization of the external context, as it only considers the external health system

While no single one of these four frameworks fully encapsulates the constructs identified through our review, each construct is represented in one way or another when the frameworks are considered in combination. As such, using only one of these frameworks as a guide to understanding the external context raises the risk that a key aspect of it might be overlooked. One potential reason for the inconsistent representation of our constructs across the existing frameworks is the contexts in which these frameworks were developed. For instance, the CFIR, PARiHS, and MLF were developed to explain implementation in healthcare-specific settings, while EPIS was developed within the context of more social service-oriented programming [4, 5, 26, 100]. Furthermore, the revised i-PARiHS (the original version of which did not consider the external context [26]) focuses explicitly on the health system (as opposed to the broader community) as its external context [6]. Our approach, however, was broader in that it included multiple types of health-related interventions regardless of their setting (e.g., healthcare or social service).

Just as these individual frameworks did not completely capture our constructs, our review did not find evidence of all aspects of the outer context identified within them. While we did not come across any instances where external ‘peer pressure’, an aspect of the CFIR’s ‘outer setting’, was discussed as having an impact on implementation, it is important to note this is considered a substantial motivator for the adoption of new practices within the organizational literature [13]. The reason the EPIS constructs ‘leadership’ and ‘intervention developers’ did not stand out as themes within our analysis is that we largely considered these to be aspects of the internal context. This is because (even if they are originally external to the organization) we understand their involvement with the implementation process to place them in a role that can be considered part of the organization. It is important to note the CFIR also does not consider leadership and intervention developers to be part of the external context, placing them instead in its ‘inner setting’ and ‘intervention characteristics’ domains, respectively [101].

Additionally, the physical environment, a part of the MLF that includes such aspects as weather, topography, and the condition of the built environment [5], was not evidenced in our review. While distance between the intervention and the individual was sometimes cited as impacting access, this was discussed more as an issue of transportation than of the physical environment. It is possible the physical environment would have stood out as a construct in its own right had we not restricted our review to interventions we were able to identify as EBIs. For instance, an article by Zarrett et al. [102], which was excluded from our review, discussed how weather variations posed problems for a school-based physical activity intervention that was largely facilitated outdoors. Likewise, Colon et al. [103] discussed the quality of sidewalks as a barrier to a walking intervention targeting underserved African American communities. Restricting our review to EBIs also excluded a number of articles looking at interventions in developing countries, where the physical environment may have played a more salient role during the implementation process. Broadening our criteria to look at other phases of implementation may have also impacted our results. The literature on scaling of interventions, for instance, tends to consider the implementation context more robustly due to the need to develop partnerships and engage political support [104]; a synthesis of models and frameworks for scaling of public health interventions conducted by Milat et al. [105] identified active engagement of community members and political will as important elements of the scaling-up process. The lack of explicit focus on the external context within the literature also impacted the quality of the data available for our analysis, as it often resulted in incomplete explanations of how external context factors impacted implementation. This was particularly the case with the quantitative studies in our sample, which would often list an external context factor such as 'policies' or 'funding' without any additional context or explanation.

Another difficulty we encountered during the analysis process was clearly defining the often-fuzzy boundaries between the external and internal context. As Damschroder et al. [3] note:

…the line between inner and outer setting is not always clear and the interface is dynamic and sometimes precarious. The specific factors considered ‘in’ or ‘out’ will depend on the context of the implementation effort. For example, outlying clinics may be part of the outer setting in one study, but part of the inner setting in another study. (p. 5)

Therefore, we had to make subjective value judgments as to where these boundaries lie given the information available within the article. Additionally, while we do point to the frequency of times a construct was mentioned as a facilitator or barrier, it is difficult to predict in which instances a construct will act as one or the other due to the contextual factors influencing their particular effects on an intervention [106]. For instance, political support may be desired when it is received from well-regarded individuals and better avoided when it comes from unpopular sources.

Despite our goal of establishing completely mutually exclusive categories, there is some minor overlap between constructs that could not be avoided. For instance, lack of public transportation is both a ‘local infrastructure’ issue and it creates a need for transportation assistance within the ‘target population’. Issues related to funding were noted within both the ‘policy and legal climate’ and ‘economic and funding climate’, as policies often dictate what funding can be used for and create cumbersome processes attached to it. Furthermore, the ‘policy and legal climate’ can at times have direct impacts on ‘professional influences’ through regulations and ‘local infrastructure’ by prioritizing or de-emphasizing resources for development or sustainability. Finally, stigma existing in the larger ‘social climate’ can be internalized by members of the ‘target population’, thus leading to perceptions and beliefs that can impact acceptability of an intervention [107].

In addition to the above-noted challenges, our choice to limit the timespan of articles analyzed for this paper to post-CFIR literature and to exclude gray literature may have also limited our results. However, the degree of saturation in our themes (i.e., the extent to which adding new articles yielded no new information) and the extent of their overlap with the frameworks considered in Table 1 support the exhaustiveness of our review process [34]. Our choice to conduct a review of empirical literature also has strengths over the development of similar constructs, as the well-known and previously discussed frameworks were developed based on reviews of other models and theories [3, 4, 5], which may not have been based in empirical evidence themselves. Due to limitations in the studies reviewed, we were unable to determine any causal links between constructs and outcomes. Furthermore, our findings do not preclude the existence of other factors that may have impacted outcomes in these or other studies. The link between specific contextual factors and outcomes is yet to be established in implementation science and is of utmost importance when seeking to develop and test effective implementation strategies [108]. It is our hope that the findings presented in this paper will be a first step in developing the stronger operationalizations necessary to establish the impact of external context factors on implementation outcomes. Finally, it is possible our iterative approach failed to capture some relevant articles. However, we do not consider this problematic, as our goal was the development of comprehensive themes through qualitative saturation rather than an exhaustive identification of articles.

Conclusion

In summary, we identified eight constructs representing the external context through our integrative systematic review process. This list of constructs is more exhaustive than that proposed in any single one of the four implementation frameworks we compared it to. The incomplete representation of the external context within these frameworks is not meant to invalidate the usefulness of pre-existing theory. Indeed, existing implementation frameworks are useful guides that can be applied in combination with the external context constructs identified in this paper, depending on the needs of the individual study. Future work should seek to further operationalize the constructs identified for stronger measurement of the external context in an effort to better understand how and to what degree they impact the implementation process. Finally, additional work focusing on the external context as it relates to implementation of non-EBIs and in low-resource settings has the potential to evidence additional constructs.

Footnotes

  1. There was a large gap in time between this and the previous data collection stage because we were conducting preliminary analysis to inform our next steps.

Notes

Acknowledgements

We would like to thank our design interns Emily Abrams (designed Fig. 1) and Madison Anderson (designed icons used in Fig. 2), and design student Megan Catellier (designed Fig. 2). We would also like to thank Dr. Joshua Vest for his support and advice on this project.

Funding

Funding from the National Institute on Drug Abuse (NIDA; R34 DA036001) supported refinement of the study design, data collection, analysis, and interpretation, and writing of this manuscript. Opinions expressed in this paper are those of the authors and do not necessarily represent the position of the funding organization.

Availability of data and materials

Data analyzed in this article are publicly available in the databases discussed in our methods. A list of all articles screened in this review is available from the first author upon request.

Authors’ contributions

DPW is principal investigator for the larger study and developed the research questions guiding this review, provided oversight of all stages of the review, and wrote the majority of the manuscript. ELA supervised the literature review process and assisted with the final analysis of the data. HC guided the development of the literature review method. SS and AM critically reviewed the manuscript for intellectual content, and SS also assisted in critically comparing our findings with existing frameworks to guide the discussion section. ELA, JJ, OIO, and JC conducted the majority of screening and coding of articles and provided critical input into the development of the final constructs during this process. All authors reviewed and approved the manuscript prior to submission.

Ethics approval and consent to participate

Ethics approval was not required for this study, as no human subjects data were collected or used.

Competing interests

The authors declare that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Supplementary material

12913_2018_3046_MOESM1_ESM.pdf (253 kb)
Additional file 1: Detailed search information related to specific stages. This file includes tables containing detailed information related to Stage 2, Stage 3, and Stage 4 of the search. (PDF 253 kb)
12913_2018_3046_MOESM2_ESM.pdf (226 kb)
Additional file 2: PRISMA 2009 flow diagram. This file contains a flow chart conforming to PRISMA guidelines that describes the article screening process. (PDF 226 kb)

References

  1. 1.
    Solomons NM, Spross JA. Evidence-based practice barriers and facilitators from a continuous quality improvement perspective: an integrative review. J Nurs Manag. 2011;19:109–20.CrossRefPubMedGoogle Scholar
  2. 2.
    Proctor EK, Landsverk J, Aarons G, Chambers D, Glisson C, Mittman B. Implementation research in mental health services: an emerging science with conceptual, methodological, and training challenges. Adm Policy Ment Health Ment Health Serv Res. 2009;36:24–34.CrossRefGoogle Scholar
  3. 3.
    Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4:50.CrossRefPubMedPubMedCentralGoogle Scholar
  4. 4.
    Aarons GA, Hurlburt M, Horwitz SM. Advancing a conceptual model of evidence-based practice implementation in public service sectors. Adm Policy Ment Health Ment Health Serv Res. 2011;38:4–23.CrossRefGoogle Scholar
  5. 5.
    Chaudoir SR, Dugan AG, Barr CH. Measuring factors affecting implementation of health innovations: a systematic review of structural, organizational, provider, patient, and innovation level measures. Implement Sci. 2013;8:22.CrossRefPubMedPubMedCentralGoogle Scholar
  6. 6.
    Harvey G, Kitson A. PARIHS revisited: from heuristic to integrated framework for the successful implementation of knowledge into practice. Implement Sci. 2016;11:33.CrossRefPubMedPubMedCentralGoogle Scholar
  7. 7.
    Baumann AA, Cabassa LJ, Wiltsey SS. Adaptation in dissemination and implementation science. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and implementation research in health: translating science to practice. 2nd ed. New York: Oxford University Press; 2018. p. 286–300.Google Scholar
  8. 8.
    Birken SA, Bunger AC, Powell BJ, Turner K, Clary AS, Klaman SL, et al. Organizational theory for dissemination and implementation research. Implement Sci. 2017;12:62.CrossRefPubMedPubMedCentralGoogle Scholar
  9. 9.
    Boaz A, Baeza J, Fraser A. Does the ‘diffusion of innovations’ model enrich understanding of research use? Case studies of the implementation of thrombolysis services for stroke. J Health Serv Res Policy. 2016;21:229–34.CrossRefPubMedGoogle Scholar
  10. 10.
    Clinton-McHarg T, Yoong SL, Tzelepis F, Regan T, Fielding A, Skelton E, et al. Psychometric properties of implementation measures for public health and community settings and mapping of constructs against the consolidated framework for implementation research: a systematic review. Implement Sci. 2016;11:148.CrossRefPubMedPubMedCentralGoogle Scholar
  11. 11.
    Lewis CC, Stanick CF, Martinez RG, Weiner BJ, Kim M, Barwick M, et al. The Society for Implementation Research Collaboration Instrument Review Project: a methodology to promote rigorous evaluation. Implement Sci. 2015;10:2.CrossRefPubMedPubMedCentralGoogle Scholar
  12. 12.
    Williamson OE. The economics of organization: the transaction cost approach. Am J Sociol. 1981;87:548–77.CrossRefGoogle Scholar
  13. 13.
    DiMaggio PJ, Powell WW. The Iron cage revisited: institutional isomorphism and collective rationality in organizational fields. Am Sociol Rev. 1983;48:147–60.CrossRefGoogle Scholar
  14. 14.
    Morgan G. Images of organization. Updated edition. Thousand Oaks: SAGE Publications, Inc; 2006.Google Scholar
  15. 15.
    Hillman AJ, Withers MC, Collins BJ. Resource dependence theory: a review. J Manag. 2009;35:1404–27.Google Scholar
  16. 16.
    Palinkas LA, Saldana L, Chou C-P, Chamberlain P. Use of research evidence and implementation of evidence-based practices in youth-serving systems. Child Youth Serv Rev. 2017;83(Supplement C):242–7.CrossRefPubMedGoogle Scholar
  17. 17.
    Angeles RN, Dolovich L, Kaczorowski J, Thabane L. Developing a theoretical framework for complex community-based interventions. Health Promot Pract. 2014;15:100–8.CrossRefPubMedGoogle Scholar
  18. 18.
    Craig P, Dieppe P, Macintyre S, Michie S, Nazareth I, Petticrew M, et al. Developing and evaluating complex interventions: the new Medical Research Council guidance. BMJ. 2008;337.Google Scholar
  19. 19.
    May C. A rational model for assessing and evaluating complex interventions in health care. BMC Health Serv Res. 2006;6:86.CrossRefPubMedPubMedCentralGoogle Scholar
  20. 20.
    Plsek PE, Greenhalgh T. The challenge of complexity in health care. BMJ. 2001;323:625–8.CrossRefPubMedPubMedCentralGoogle Scholar
  21. 21.
    Dickson-Gomez J, Convey M, Hilario H, Corbett AM, Weeks M. Unofficial policy: access to housing, housing information and social services among homeless drug users in Hartford, Connecticut. Subst Abuse Treat Prev Policy. 2007;2:8.CrossRefPubMedPubMedCentralGoogle Scholar
  22. 22.
    Durlak JA, DuPre EP. Implementation matters: a review of research on the influence of implementation on program outcomes and the factors affecting implementation. Am J Community Psychol. 2008;41:327–50.CrossRefPubMedGoogle Scholar
  23. 23.
    Greenhalgh T, Robert G, MacFarlane F, Bate P, Kyriakidou O. Diffusion of innovations in service organizations: systematic review and recommendations. Milbank Q. 2004;82:581–629.CrossRefPubMedPubMedCentralGoogle Scholar
  24. 24.
    Watson DP, Shuman V, Kowalsky J, Golembiewski E, Brown M. Housing first and harm reduction: a rapid review and document analysis of the US and Canadian open-access literature. Harm Reduct J. 2017;14:30.CrossRefPubMedPubMedCentralGoogle Scholar
  25. 25.
    Tsemberis S. Housing first: ending homelessness, promoting recovery, and reducing costs. In: Brody KD, Hall Jamieson K, Taylor SE, editors. How to house the homeless. New York: Russel Sage Foundation; 2010. p. 37–56.Google Scholar
  26. 26.
    Kitson A, Harvey G, McCormack B. Enabling the implementation of evidence based practice: a conceptual framework. Qual Health Care QHC. 1998;7:149–58.CrossRefPubMedGoogle Scholar
  27. 27.
    Gough D, Thomas J, Oliver S. Clarifying differences between review designs and methods. Syst Rev. 2012;1:28.CrossRefPubMedPubMedCentralGoogle Scholar
  28. 28.
    Russell CL. An overview of the integrative research review. Prog Transplant Aliso Viejo. 2005;15:8–13.CrossRefGoogle Scholar
  29. 29.
    Torraco RJ. Writing integrative literature reviews: guidelines and examples. Hum Resour Dev Rev. 2005;4:356–67.CrossRefGoogle Scholar
  30. 30.
    Victor L. Systematic reviewing. Soc Res UPDATE. 2008;54.Google Scholar
  31. 31.
    Roy Rosenzweig Center fo History and New Media. Zotero | Home.Google Scholar
  32. 32.
    VERBI Software-Consult-Sozialforschung GmbH. MAXQDA, software for qualitative data analysis. Berlin: VERBI Soft; 1989.Google Scholar
  33. 33.
    Koffel JB. Use of recommended search strategies in systematic reviews and the impact of librarian involvement: a cross-sectional survey of recent authors. PLoS One. 2015;10  https://doi.org/10.1371/journal.pone.0125931.
  34. 34.
    Patton MQ. Qualitative research and evaluation methods. Thousand Oaks: Sage Publications; 2002.Google Scholar
  35. 35.
    Booth A, Sutton A, Papaioannou D. Systematic approaches to a successful literature review. Los Angeles: Sage; 2016.Google Scholar
  36. 36.
    Hsieh H-F, Shannon SE. Three approaches to qualitative content analysis. Qual Health Res. 2005;15:1277–88.CrossRefPubMedGoogle Scholar
  37. 37.
    Kuckartz U. Qualitative text analysis: a guide to methods, practice and using software. Sage; 2014. Google Scholar
  38. 38.
    Morse JM. Qualitative health research: creating a new discipline. Walnut Creek: Left Coast Press; 2012.Google Scholar
  39. 39.
    Aarons GA, Fettes DL, Hurlburt MS, Palinkas LA, Gunderson L, Willging CE, et al. Collaboration, negotiation, and coalescence for interagency-collaborative teams to scale-up evidence-based practice. J Clin Child Adolesc Psychol. 2014;43:915–28.CrossRefPubMedPubMedCentralGoogle Scholar
  40. 40.
    Aarons GA, Miller EA, Green AE, Perrott JA, Bradway R. Adaptation happens: a qualitative case study of implementation of the incredible years evidence-based parent training programme in a residential substance abuse treatment programme. J Child Serv. 2012;7:233–45.CrossRefGoogle Scholar
  41. Alexander JA, Cohen GR, Wise CG, Green LA. The policy context of patient-centered medical homes: perspectives of primary care providers. J Gen Intern Med. 2012;28:147–53.
  42. Amodeo M, Lundgren L, Cohen A, Rose D, Chassler D, Beltrame C, et al. Barriers to implementing evidence-based practices in addiction treatment programs: comparing staff reports on motivational interviewing, adolescent community reinforcement approach, assertive community treatment, and cognitive-behavioral therapy. Eval Program Plann. 2011;34:382–9.
  43. Becker DR, Drake RE, Bond GR. The IPS supported employment learning collaborative. Psychiatr Rehabil J. 2014;37:79–85.
  44. Bejerholm U, Larsson L, Hofgren C. Individual placement and support illustrated in the Swedish welfare system: a case study. J Vocat Rehabil. 2011;35:59–72.
  45. Cilenti D, Brownson RC, Umble K, Erwin PC, Summers R. Information-seeking behaviors and other factors contributing to successful implementation of evidence-based practices in local health departments. J Public Health Manag Pract. 2012;18:571–6.
  46. Dadich A, Hosseinzadeh H. Healthcare reform: implications for knowledge translation in primary care. BMC Health Serv Res. 2013;13. https://doi.org/10.1186/1472-6963-13-490.
  47. Drake RE, Frey W, Bond GR, Goldman HH, Salkever D, Miller A, et al. Assisting social security disability insurance beneficiaries with schizophrenia, bipolar disorder, or major depression in returning to work. Am J Psychiatry. 2013;170:1433–41.
  48. El-Mallakh P, Howard PB, Rayens MK, Roque AP, Adkins S. Organizational fidelity to a medication management evidence-based practice in the treatment of schizophrenia. J Psychosoc Nurs Ment Health Serv. 2013;51:35–44.
  49. El-Mallakh P, Howard PB, Bond GR, Roque AP. Challenges of implementing a medication management evidence-based practice in a community mental health setting: results of a qualitative study. Issues Ment Health Nurs. 2014;35:517–25.
  50. Fox RA, Mattek RJ, Gresl BL. Evaluation of a university-community partnership to provide home-based, mental health services for children from families living in poverty. Community Ment Health J. 2013;49:599–610.
  51. Gannon M, Qaseem A, Snooks Q, Snow V. Improving adult immunization practices using a team approach in the primary care setting. Am J Public Health. 2012;102:e46–52.
  52. Glisson C, Schoenwald SK, Hemmelgarn A, Green P, Dukes D, Armstrong KS, et al. Randomized trial of MST and ARC in a two-level evidence-based treatment implementation strategy. J Consult Clin Psychol. 2010;78:537.
  53. Greenwood RM, Stefancic A, Tsemberis S, Busch-Geertsema V. Implementations of Housing First in Europe: successes and challenges in maintaining model fidelity. Am J Psychiatr Rehabil. 2013;16:290–312.
  54. Guerrero EG. Managerial challenges and strategic solutions to implementing organizational change in substance abuse treatment for Latinos. Adm Soc Work. 2013;37:286–96.
  55. Hasson H, Andersson M, Bejerholm U. Barriers in implementation of evidence-based practice: supported employment in Swedish context. J Health Organ Manag. 2011;25:332–45.
  56. Jacobs JA, Dodson EA, Baker EA, Deshpande AD, Brownson RC. Barriers to evidence-based decision making in public health: a national survey of chronic disease practitioners. Public Health Rep. 2010;125:736–42.
  57. Keay L, Hunter K, Brown J, Simpson JM, Bilston LE, Elliott M, et al. Evaluation of an education, restraint distribution, and fitting program to promote correct use of age-appropriate child restraints for children aged 3 to 5 years: a cluster randomized trial. Am J Public Health. 2012;102:e96–e102.
  58. Knaeps J, DeSmet A, Van Audenhove C. The IPS fidelity scale as a guideline to implement supported employment. J Vocat Rehabil. 2012;37:13–23.
  59. Kostick KM, Whitley R, Bush PW. Client-centeredness in supported employment: specialist and supervisor perspectives. J Ment Health. 2010;19:523–31.
  60. Langley AK, Nadeem E, Kataoka SH, Stein BD, Jaycox LH. Evidence-based mental health programs in schools: barriers and facilitators of successful implementation. School Ment Health. 2010;2:105–13.
  61. Lee MY, Teater B, Greene GJ, Solovey AD, Grove D, Scott Fraser J, et al. Key processes, ingredients and components of successful systems collaboration: working with severely emotionally or behaviorally disturbed children and their families. Adm Policy Ment Health Ment Health Serv Res. 2012;39:394–405.
  62. Mahoney M, Bien M, Comfort M. Adaptation of an evidence-based HIV prevention intervention for women with incarcerated partners: expanding to community settings. AIDS Educ Prev. 2013;25:1–13.
  63. Menear M, Reinharz D, Corbière M, Houle N, Lanctôt N, Goering P, et al. Organizational analysis of Canadian supported employment programs for people with psychiatric disabilities. Soc Sci Med. 2011;72:1028–35.
  64. Myers JJ, Bradley-Springer L, Dufour M-SK, Koester KA, Beane S, Warren N, et al. Supporting the integration of HIV testing into primary care settings. Am J Public Health. 2012;102:e25–32.
  65. Nelson G, Stefancic A, Rae J, Townley G, Tsemberis S, Macnaughton E, et al. Early implementation evaluation of a multi-site Housing First intervention for homeless people with mental illness: a mixed methods approach. Eval Program Plann. 2014;43:16–26.
  66. Nutting PA, Crabtree BF, Stewart EE, Miller WL, Palmer RF, Stange KC, et al. Effect of facilitation on practice outcomes in the National Demonstration Project model of the patient-centered medical home. Ann Fam Med. 2010;8(Suppl 1):S33–44, S92.
  67. Parker L, Maman S, Pettifor A, Chalachala JL, Edmonds A, Golin CE, et al. Feasibility analysis of an evidence-based positive prevention intervention for youth living with HIV/AIDS in Kinshasa, Democratic Republic of the Congo. AIDS Educ Prev. 2013;25:135–50.
  68. Perlick DA, Straits-Troster K, Strauss JL, Norell D, Tupler LA, Levine B, et al. Implementation of multifamily group treatment for veterans with traumatic brain injury. Psychiatr Serv. 2013;64:534–40.
  69. Powell BJ, Hausmann-Stabile C, McMillen JC. Mental health clinicians’ experiences of implementing evidence-based treatments. J Evid-Based Soc Work. 2013;10:396–409.
  70. Rapp CA, Etzel-Wise D, Marty D, Coffman M, Carlson L, Asher D, et al. Barriers to evidence-based practice implementation: results of a qualitative study. Community Ment Health J. 2010;46:112–8.
  71. Sanchez K, Thompson S, Alexander L. Current strategies and barriers in integrated health care: a survey of publicly funded providers in Texas. Gen Hosp Psychiatry. 2010;32:26–32.
  72. Silva TB, Mauad EC, Carvalho AL, Jacobs LA, Shulman LN. Difficulties in implementing an organized screening program for breast cancer in Brazil with emphasis on diagnostic methods. Rural Remote Health. 2013;13:2321.
  73. Solberg LI, Crain AL, Jaeckels N, Ohnsorg KA, Margolis KL, Beck A, et al. The DIAMOND initiative: implementing collaborative care for depression in 75 primary care clinics. Implement Sci. 2013;8:135.
  74. Stefancic A, Henwood BF, Melton H, Shin S-M, Lawrence-Gomez R, Tsemberis S. Implementing Housing First in rural areas: Pathways Vermont. Am J Public Health. 2013;103:S206–9.
  75. Stergiopoulos V, O’Campo P, Gozdzik A, Jeyaratnam J, Corneau S, Sarang A, et al. Moving from rhetoric to reality: adapting Housing First for homeless individuals with mental illness from ethno-racial groups. BMC Health Serv Res. 2012;12:345.
  76. Taylor EC, Nickel NC, Labbok MH. Implementing the ten steps for successful breastfeeding in hospitals serving low-wealth patients. Am J Public Health. 2012;102:2262–8.
  77. Turner KM, Nicholson JM, Sanders MR. The role of practitioner self-efficacy, training, program and workplace factors on the implementation of an evidence-based parenting intervention in primary care. J Prim Prev. 2011;32:95–112.
  78. van Bodegom-Vos L, Verhoef J, Dickmann M, Kleijn M, van Vliet I, Hurkmans E, et al. A qualitative study of barriers to the implementation of a rheumatoid arthritis guideline among generalist and specialist physical therapists. Phys Ther. 2012;92:1292–305.
  79. Winickoff JP, Nabi-Burza E, Chang Y, Finch S, Regan S, Wasserman R, et al. Implementation of a parental tobacco control intervention in pediatric practice. Pediatrics. 2013;132:109–17.
  80. Wise CG, Alexander JA, Green LA, Cohen GR, Koster CR. Journey toward a patient-centered medical home: readiness for change in primary care practices. Milbank Q. 2011;89:399–424.
  81. Ziedonis DM, Wang X, Li T, Kim SS, Tonelli ME, Li S, et al. Addressing tobacco through organizational change in a hospital-based mental health center in China: the intervention and lessons learned in a pilot implementation project. J Dual Diagn. 2012;8:148–57.
  82. Becker DR, Drake RE, Bond GR, Nawaz S, Haslett WR, Martinez RA. A national mental health learning collaborative on supported employment. Psychiatr Serv. 2011;62:704–6.
  83. Benjaminsen L. Policy review update: results from the Housing First based Danish homelessness strategy. Eur J Homelessness. 2013;7.
  84. Chamberlain P, Roberts R, Jones H, Marsenich L, Sosna T, Price JM. Three collaborative models for scaling up evidence-based practices. Adm Policy Ment Health Ment Health Serv Res. 2012;39:278–90.
  85. Collins CB Jr, Hearn KD, Whittier DN, Freeman A, Stallworth JD, Phields M. Implementing packaged HIV-prevention interventions for HIV-positive individuals: considerations for clinic-based and community-based interventions. Public Health Rep. 2010;125(Suppl 1):55–63.
  86. Drake RE, Becker DR. Why not implement supported employment? Psychiatr Serv. 2011;62:1251.
  87. Feinberg E, Silverstein M, Ferreira-Cesar Z. Integrating mental health services for mothers of children with autism. Psychiatr Serv. 2013;64:930.
  88. Foster GA, Alviar A, Neumeier R, Wootten A. A tri-service perspective on the implementation of a Centering Pregnancy model in the military. J Obstet Gynecol Neonatal Nurs. 2012;41:315–21.
  89. Hyder AA, Allen KA, Pietro GD, Adriazola CA, Sobel R, Larson K, et al. Addressing the implementation gap in global road safety: exploring features of an effective response and introducing a 10-country program. Am J Public Health. 2012;102:1061–7.
  90. Keller C, Goering P, Hume C, Macnaughton E, O’Campo P, Sarang A, et al. Initial implementation of Housing First in five Canadian cities: how do you make the shoe fit, when one size does not fit all? Am J Psychiatr Rehabil. 2013;16:275–89.
  91. Knox KL, Stanley B, Currier Glenn W, Brenner L, Ghahramanlou-Holloway M, Brown G. An emergency department-based brief intervention for veterans at risk for suicide (SAFE VET). Am J Public Health. 2012;102:S33–7.
  92. Knutagaard M, Kristiansen A. Not by the book: the emergence and translation of Housing First in Sweden. Eur J Homelessness. 2013;7:93.
  93. Lloyd CM, Morris PA, Portilla XA. Implementing the foundations of learning project: considerations for preschool intervention research. J Prev Interv Community. 2014;42:282–99.
  94. McGraw SA, Larson MJ, Foster SE, Kresky-Wolff M, Botelho EM, Elstad EA, et al. Adopting best practices: lessons learned in the collaborative initiative to help end chronic homelessness (CICH). J Behav Health Serv Res. 2010;37:197–212.
  95. Rapp CA, Goscha RJ, Carlson LS. Evidence-based practice implementation in Kansas. Community Ment Health J. 2010;46:461–5.
  96. Robinson BE, Galbraith JS, Lund SM, Hamilton AR, Shankle MD. The process of adaptation of a community-level, evidence-based intervention for HIV-positive African American men who have sex with men in two cities. AIDS Educ Prev. 2012;24:206–27.
  97. Schneider J, Akhtar A. Implementation of individual placement and support: the Nottingham experience. Psychiatr Rehabil J. 2012;35:325.
  98. Tew J, Atkinson R. The Chromis programme: from conception to evaluation. Psychol Crime Law. 2013;19:415–31.
  99. Whittaker R, Matoff-Stepp S, Meehan J, Kendrick J, Jordan E, Stange P, et al. Text4baby: development and implementation of a national text messaging health information service. Am J Public Health. 2012;102:2207–13.
  100. Damschroder LJ, Hagedorn HJ. A guiding framework and approach for implementation research in substance use disorders treatment. Psychol Addict Behav. 2011;25:194–205.
  101. Main Page - CFIR Wiki. http://cfirwiki.net/wiki/index.php?title=Main_Page. Accessed 4 May 2017.
  102. Zarrett N, Skiles B, Wilson DK, McClintock L. A qualitative study of staff’s perspectives on implementing an after school program promoting youth physical activity. Eval Program Plann. 2012;35:417–26.
  103. Coulon SM, Wilson DK, Griffin S, St. George SM, Alia KA, Trumpeter NN, et al. Formative process evaluation for implementing a social marketing intervention to increase walking among African Americans in the Positive Action for Today’s Health trial. Am J Public Health. 2012;102:2315–21.
  104. Willis CD, Riley BL, Stockton L, Abramowicz A, Zummach D, Wong G, et al. Scaling up complex interventions: insights from a realist synthesis. Health Res Policy Syst. 2016;14:88.
  105. Milat AJ, Bauman A, Redman S. Narrative review of models and success factors for scaling up public health interventions. Implement Sci. 2015;10:113.
  106. May CR, Johnson M, Finch T. Implementation, context and complexity. Implement Sci. 2016;11:141.
  107. Brooke-Sumner C, Petersen I, Asher L, Mall S, Egbe CO, Lund C. Systematic review of feasibility and acceptability of psychosocial interventions for schizophrenia in low and middle income countries. BMC Psychiatry. 2015;15. https://doi.org/10.1186/s12888-015-0400-6.
  108. Hamilton AS, Mittman BS. Implementation science in health care. In: Brownson RC, Colditz GA, Proctor EK, editors. Dissemination and implementation research in health: translating science to practice. 2nd ed. New York: Oxford University Press; 2018. p. 385–400.

Copyright information

© The Author(s). 2018

Open Access. This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The Creative Commons Public Domain Dedication waiver (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated.

Authors and Affiliations

  • Dennis P. Watson (1)
  • Erin L. Adams (2)
  • Sarah Shue (3)
  • Heather Coates (4)
  • Alan McGuire (5)
  • Jeremy Chesher (6)
  • Joanna Jackson (7)
  • Ogbonnaya I. Omenka (7)

  1. Department of Social and Behavioral Sciences, Indiana University Richard M. Fairbanks School of Public Health, Indianapolis, USA
  2. Department of Psychology, Indiana University-Purdue University Indianapolis, Indianapolis, USA
  3. School of Health and Rehabilitation Sciences, Indiana University-Purdue University Indianapolis, Indianapolis, USA
  4. University Library, Center for Digital Scholarship, Indiana University-Purdue University Indianapolis, Indianapolis, USA
  5. Richard L. Roudebush VA, Indianapolis, USA
  6. Department of Environmental Health Sciences, Indiana University Richard M. Fairbanks School of Public Health, Indianapolis, USA
  7. Department of Health Policy and Management, Indiana University Richard M. Fairbanks School of Public Health, Indianapolis, USA