
Prevention Science, Volume 20, Issue 7, pp 1074–1088

Addressing the Adherence-Adaptation Debate: Lessons from the Replication of an Evidence-Based Sexual Health Program in School Settings

  • Jenita Parekh
  • Elizabeth Stuart
  • Robert Blum
  • Valerie Caldas
  • Brooke Whitfield
  • Jacky M. Jennings

Abstract

Whether high adherence to program protocols is necessary to achieve program outcomes is an area of ongoing debate. The objectives of this study were to determine the frequency, type, and rationale of adaptations made in the implementation of an evidence-based program, and to compare program outcomes between intervention and comparison participants by level of adaptation. A total of 1608 participants in 45 classrooms took part. The percentage of adaptations was calculated for each classroom. Thematic qualitative analysis was used to categorize the types of and rationales for adaptations. Program outcomes by level of adaptation were estimated using logistic regression analyses and mean differences, and propensity score matching was used to create comparability between adaptation-subgroup participants and comparison participants. Adaptations ranged from 2% to 97% across classrooms, with a mean of 63%. Thematic analysis revealed that adaptations concerned the delivery of content rather than the content itself, and were made in response to participant needs and setting constraints. Program outcomes did not appear to be reduced for the high-adaptation subgroup. Understanding both the rationale (intent) behind and the type of adaptation made is crucial to understanding the complexity of adaptations. These findings support allowing facilitators some flexibility and autonomy to adapt the delivery of prescribed content to participant needs and setting constraints.
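
The propensity score matching step can be illustrated with a short sketch. The following Python code is a minimal, hypothetical example of estimating propensity scores with logistic regression and performing 1:1 nearest-neighbor matching; the variable names, covariates, and matching structure are illustrative assumptions, not the study's actual specification (which may, for instance, have used full matching rather than 1:1 matching).

```python
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import NearestNeighbors

def propensity_score_match(df: pd.DataFrame, treat_col: str,
                           covariates: list) -> pd.DataFrame:
    """Match each intervention participant to the comparison participant
    with the closest estimated propensity score (1:1, with replacement)."""
    # Estimate each participant's probability of being in the
    # intervention group from baseline covariates.
    model = LogisticRegression(max_iter=1000)
    model.fit(df[covariates], df[treat_col])
    df = df.assign(pscore=model.predict_proba(df[covariates])[:, 1])

    treated = df[df[treat_col] == 1]
    comparison = df[df[treat_col] == 0]

    # For each intervention participant, find the comparison
    # participant with the nearest propensity score.
    nn = NearestNeighbors(n_neighbors=1).fit(comparison[["pscore"]])
    _, idx = nn.kneighbors(treated[["pscore"]])
    matched = comparison.iloc[idx.ravel()]

    # The matched sample can then be used for outcome analyses
    # (mean differences, logistic regression on outcomes, etc.).
    return pd.concat([treated, matched])

# Example with hypothetical column names:
# matched_df = propensity_score_match(
#     df, "intervention", ["age", "sex", "baseline_score"])
```

In practice, covariate balance between the matched groups would be checked (for example, via standardized mean differences) before estimating outcome differences on the matched sample.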

Keywords

Program implementation · Evidence-based program · Sexual and reproductive health programs · Adolescents · Program implementation fidelity

Notes

Funding

The contents of this article are solely the responsibility of the authors and do not necessarily represent the official views of the funders. J. Parekh was supported by the National Institute of Allergy and Infectious Diseases (NIAID T32 AI050056-12).

Compliance with Ethical Standards

Conflict of Interest

The authors declare that they have no conflict of interest.

Ethical Approval

All procedures performed in studies involving human participants were in accordance with the ethical standards of Johns Hopkins University’s IRB and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards.

Informed Consent

Informed consent was obtained from all individual participants included in the study.

Funding Information

This project was supported by the Administration for Children and Families’ Family and Youth Services Bureau (FYSB), using federal funds under HHS-2010-ACF-ACYF-PREP-0125.


Copyright information

© Society for Prevention Research 2019

Authors and Affiliations

  1. The Department of Population, Family and Reproductive Health, The Johns Hopkins Bloomberg School of Public Health, Baltimore, USA
  2. The Johns Hopkins Center for Child & Community Health, Johns Hopkins Bayview Medical Center, Baltimore, USA
  3. The Department of Mental Health, The Johns Hopkins Bloomberg School of Public Health, Baltimore, USA
  4. Child Trends, Inc., Bethesda, USA
