Addressing the Adherence-Adaptation Debate: Lessons from the Replication of an Evidence-Based Sexual Health Program in School Settings
Whether high adherence to program protocols is necessary to achieve program outcomes remains an area of considerable debate. The objectives of this study were to determine the frequency, type, and rationale of adaptations made in the implementation of an evidence-based program, and to compare program outcomes between intervention and comparison participants by level of adaptation. A total of 1608 participants in 45 classrooms participated. The percentage of adaptations was calculated by classroom. Thematic qualitative analysis was used to categorize the types of, and rationales for, adaptations. Program outcomes by level of adaptation were examined using logistic regression analyses and mean differences. Propensity score matching methods were used to create comparability between adaptation subgroup participants and comparison participants. Adaptations ranged from 2% to 97% across classrooms, with a mean of 63%. Thematic analysis revealed that adaptations were made to the delivery of content rather than to the content itself, and were made in response to participant needs and setting constraints. Program outcomes did not appear to be reduced for the high-adaptation subgroup. Understanding both the rationale (intent) for and the type of adaptation made is crucial to understanding the complexity of adaptations. These findings support allowing facilitators some flexibility and autonomy to adapt the delivery of prescribed content to participant needs and setting constraints.
Keywords: Program implementation · Evidence-based program · Sexual and reproductive health programs · Adolescents · Program implementation fidelity
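The analysis pairs adaptation-subgroup participants with comparison participants via propensity score matching. As a rough illustration only (not the authors' procedure, and using hypothetical participant ids and precomputed scores), a greedy 1:1 nearest-neighbour match within a caliper might be sketched as:

```python
def greedy_match(treated, controls, caliper=0.1):
    """Greedy 1:1 nearest-neighbour matching on propensity scores.

    treated, controls: dicts mapping participant id -> propensity score.
    Returns a list of (treated_id, control_id) pairs whose score
    difference falls within the caliper.
    """
    available = dict(controls)          # controls not yet matched
    pairs = []
    for t_id, t_score in sorted(treated.items(), key=lambda kv: kv[1]):
        if not available:
            break
        # nearest remaining control on the propensity score
        c_id = min(available, key=lambda c: abs(available[c] - t_score))
        if abs(available[c_id] - t_score) <= caliper:
            pairs.append((t_id, c_id))
            del available[c_id]         # match without replacement
    return pairs


# toy example with hypothetical propensity scores
treated = {"t1": 0.30, "t2": 0.62}
controls = {"c1": 0.28, "c2": 0.60, "c3": 0.90}
print(greedy_match(treated, controls))  # [('t1', 'c1'), ('t2', 'c2')]
```

In practice, matching analyses of this kind are typically run with dedicated tooling (e.g., full matching rather than greedy matching); this sketch only conveys the idea of pairing on score proximity without replacement.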
Its contents are solely the responsibility of the authors and do not necessarily represent the official views of the funder. J. Parekh was supported by the National Institute of Allergy and Infectious Diseases (NIAID T32 AI050056-12).
Compliance with Ethical Standards
Conflict of Interest
The authors declare that they have no conflict of interest.
All procedures performed in studies involving human participants were in accordance with the ethical standards of Johns Hopkins University’s IRB and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.
Informed consent was obtained from all individual participants included in the study.
This project is supported by the Administration for Children and Families’ Family and Youth Services Bureau (FYSB), utilizing federal funds HHS-2010-ACF-ACYF-PREP-0125.