Evaluating Fidelity and Effectiveness of Interventions

Chapter in: Handbook of Injury and Violence Prevention

The implementation and dissemination of injury control interventions in community settings often follows the publication of findings from formal research trials. In the absence of strong evidence for an intervention, public health administrators must decide whether to solve problems by creating new solutions or by implementing interventions that appear promising and reflect current public health best practice. In either scenario, program officers need to understand whether the decision to implement the intervention led to an important public health effect in their community setting. The evaluation of the effectiveness of an intervention or program is particularly important if there are few previous studies of the program’s effectiveness in real-world settings, because the future diffusion of the program to other sites depends on its demonstrated effectiveness. The evaluation of the fidelity of the program implementation process is also an important task for program officials. Fidelity is defined as the accuracy or exactness with which an intervention replicates the prototype model. Program administrators must carefully monitor the fidelity of the program components as they were developed in previous research models. Failure to monitor the quality of implementation of the intervention procedures could reduce the effectiveness of the program.
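
To make the definition of fidelity above concrete, the short Python sketch below computes a crude fidelity index as the proportion of prototype components a site delivered as designed. This is purely illustrative and not a tool prescribed by the chapter; the Component structure, the component names, and the values are all invented for the example.

    from dataclasses import dataclass

    @dataclass
    class Component:
        # One core component of the prototype intervention (hypothetical example).
        name: str
        delivered: bool     # was the component delivered at this site?
        as_designed: bool   # was it delivered as specified in the prototype model?

    def fidelity_index(components):
        # Proportion of prototype components delivered as designed (0.0 to 1.0).
        if not components:
            raise ValueError("component list is empty")
        faithful = sum(1 for c in components if c.delivered and c.as_designed)
        return faithful / len(components)

    # Invented component list for a hypothetical home-visit injury prevention program.
    site_components = [
        Component("staff completed prototype training",   delivered=True,  as_designed=True),
        Component("full visit schedule delivered",        delivered=True,  as_designed=False),
        Component("prescribed safety curriculum covered", delivered=False, as_designed=False),
    ]

    print(f"Fidelity index: {fidelity_index(site_components):.2f}")  # prints 0.33

A low index like this would flag components whose implementation drifted from the prototype, which is the kind of monitoring the paragraph above describes.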

The purpose of this chapter is to review approaches toward the evaluation of program effectiveness and fidelity. The intended audience for this chapter is primarily injury control and public health program administrators and their staff. In this chapter, we describe two specific scenarios frequently faced by program administrators. The first is the development of an evaluation section of a funding proposal from a state government department of health to disseminate an injury prevention intervention. The second involves planning for a site visit to evaluate an injury prevention program in a state or local health department. Both require extensive preparation and coordination of people and activities. Both require careful attention to program implementation and evaluation strategies. We believe that the execution of these scenarios illustrates some of the practical principles regarding the evaluation of program fidelity and effectiveness.



Author information

Correspondence to Lawrence R. Berger MD, MPH or David C. Grossman PhD, MPH.


Copyright information

© 2008 Springer Science+Business Media, LLC

About this chapter

Cite this chapter

Berger, L.R., Grossman, D.C. (2008). Evaluating Fidelity and Effectiveness of Interventions. In: Doll, L.S., Bonzo, S.E., Sleet, D.A., Mercy, J.A. (eds) Handbook of Injury and Violence Prevention. Springer, Boston, MA. https://doi.org/10.1007/978-0-387-29457-5_26

  • DOI: https://doi.org/10.1007/978-0-387-29457-5_26

  • Publisher Name: Springer, Boston, MA

  • Print ISBN: 978-0-387-85769-5

  • Online ISBN: 978-0-387-29457-5
