
Translating research into evidence-based practice in juvenile justice: brand-name programs, meta-analysis, and key issues

Journal of Experimental Criminology

Abstract

Objectives

To investigate the utility of two main approaches for translating research into evidence-based practice in juvenile justice: (a) brand-name programs that are identified by lists of various expert groups and come with implementation and quality assurance packages offered by program developers; and (b) results of large-scale meta-analyses that offer a number of generalized strategies (or generics) for improving existing programs.

Methods

Informed by prospect theory, a first-stage analytic decision-tree model was developed that included three comparable evidence-based programs (two brand names and one generic). Implementation success was a key factor, and analyses were conducted under two conditions.

Results

Under the first condition, where brand-name programs have a large advantage in implementation success over generic programs, it was found that the brand-name programs had the highest expected values. Under the second condition, which considered the role of Lipsey et al.’s (2010) Standardized Program Evaluation Protocol, it was found that all three programs produced highly favorable expected values.

Conclusions

Brand-name programs and meta-analyses represent two rigorous and transparent approaches for advancing evidence-based practice in juvenile justice. State governments should consider the merits of both approaches through a decision-tree model, paying particular attention to implementation success as well as financial costs and benefits derived from rigorous cost–benefit analysis.
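To make the structure of such a model concrete, the following is a minimal sketch of a first-stage decision tree. All probabilities and dollar values are hypothetical placeholders, and the class and field names (ProgramOption, p_success, and so on) are illustrative inventions, not the study's actual inputs, which are derived from meta-analytic effect sizes and WSIPP-style cost-benefit estimates. The sketch only shows how the probability of implementation success weights each program's expected value.

```python
# Minimal decision-tree sketch: expected value of adopting a program,
# weighting net benefits by the probability of implementation success.
# All numbers are hypothetical placeholders, not the study's figures.

from dataclasses import dataclass


@dataclass
class ProgramOption:
    name: str
    p_success: float           # probability of successful implementation
    benefit_if_success: float  # net benefit per youth if implemented well ($)
    benefit_if_failure: float  # net benefit per youth if implementation fails ($)

    def expected_value(self) -> float:
        """EV = p * B_success + (1 - p) * B_failure."""
        return (self.p_success * self.benefit_if_success
                + (1 - self.p_success) * self.benefit_if_failure)


options = [
    ProgramOption("Brand-name A", p_success=0.80,
                  benefit_if_success=30_000, benefit_if_failure=-5_000),
    ProgramOption("Brand-name B", p_success=0.75,
                  benefit_if_success=28_000, benefit_if_failure=-4_000),
    ProgramOption("Generic",      p_success=0.50,
                  benefit_if_success=25_000, benefit_if_failure=-3_000),
]

# Rank the options by expected value, as a first-stage decision tree would.
for opt in sorted(options, key=lambda o: o.expected_value(), reverse=True):
    print(f"{opt.name}: expected value = ${opt.expected_value():,.0f} per youth")
```

In this toy setup, the first condition described in the Results corresponds to giving the brand-name options a much higher p_success than the generic, which is what drives their higher expected values; narrowing that implementation gap (as the SPEP is intended to do) brings the three options much closer together.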


Notes

  1. Mears (2010) reminds us that when viewed from an evaluation research perspective, evidence-based policy includes much more than the use of effective programs; it also includes “whether a policy is needed, whether it rests on solid theory, whether it is implemented well, and whether it is cost-efficient” (p. 43, emphasis in original).

  2. Importantly, both brand-name and generic programs are derived from studies that use the highest-quality research designs: randomized controlled experiments and rigorous quasi-experiments. For example, for Blueprints to certify a brand-name program as proven (“model”), one of the criteria is that it must have been evaluated with an experimental design. Similarly, Lipsey’s meta-analyses only include the highest-quality studies (e.g., Lipsey 2009).

  3. Lipsey and his colleagues (2010) note that one other approach can involve direct evaluation of individual programs. The implication is that a single evaluation demonstrating a program’s effectiveness can be influential on a much larger scale. Because this approach does not involve bringing together the accumulated research evidence on a particular intervention type or program, it is not the focus here.

  4. More detailed information on the application of the scoring system of these four factors can be found in Lipsey et al. (2010, pp. 30–32).

  5. One potential cautionary note for drawing upon this theoretical framework is that the nature of evaluation research has changed considerably in the last 20 years. At the time Shadish and his colleagues were formulating these views, there were no brand-name programs for juvenile offenders, and meta-analysis in criminology was only in its infancy. Incremental change meant using the findings from a program evaluation (more than likely just one) in an effort to improve upon an existing program. Major policy change meant using the findings from a program evaluation (again, more than likely just one) to replace an existing program or policy. It may be, however, that evaluation research is just operating at a different scale today, and that the change we are witnessing has more to do with within-group rather than between-group differences.

  6. The first step of the model involves estimating each program’s effectiveness, using meta-analytic techniques. Step two looks at whether previously measured program results can be replicated in the state. Step three involves an assessment of program costs, estimated on the basis of what it would cost the Washington State government to implement a similar program (if the program was not already operating in the state). Step four involves monetizing each program’s effects on crime, estimating savings to the justice system and to crime victims. The final step involves calculating the economic contribution of the program, expressed as a benefit-to-cost ratio. Based on this, programs can be judged on their independent and comparative monetary value (an illustrative calculation along these lines is sketched after these notes).

  7. More recent updates published by WSIPP no longer include the “other family-based therapy” program. This was confirmed by the director of WSIPP, Aos (personal communication 2012).

  8. The adjusted effect sizes for all three programs were considerably lower than the weighted mean effect sizes: FFT (−0.325 versus −0.586); MST (−0.169 versus −0.322); and other family-based therapy (−0.160 versus −0.273) (Aos et al. 2004b, pp. 17, 19). The second sketch after these notes expresses these adjustments as relative reductions.
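As a rough illustration of the steps outlined in note 6, here is a hedged sketch of the tail end of such a benefit-cost calculation. The function name and every input (the replication discount, the savings per unit of effect, the program cost) are hypothetical placeholders, not WSIPP's actual parameters; the real model involves discounting, marginal operating costs, and detailed victimization-cost estimates.

```python
# Hedged sketch of the final steps of a WSIPP-style benefit-cost model.
# All inputs are hypothetical; the actual model has many more components.

def benefit_cost_ratio(effect_size: float,
                       replication_adjustment: float,
                       benefit_per_unit_effect: float,
                       cost_per_participant: float) -> float:
    """Steps 2-5 in miniature: discount the meta-analytic effect size for
    expected real-world replication, monetize it, and divide by cost."""
    adjusted_effect = effect_size * replication_adjustment          # step 2
    # A negative effect size means less crime, hence a positive benefit.
    monetized_benefit = -adjusted_effect * benefit_per_unit_effect  # step 4
    return monetized_benefit / cost_per_participant                 # step 5

# Hypothetical example: an effect size of -0.30 on recidivism, halved for
# replication, $100,000 of justice-system and victim savings per unit of
# effect, and a $7,500 program cost per participant.
ratio = benefit_cost_ratio(-0.30, 0.5, 100_000, 7_500)
print(f"Benefit-to-cost ratio: {ratio:.1f} to 1")  # -> 2.0 to 1
```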
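The adjustments reported in note 8 can also be expressed as relative reductions. This snippet simply reproduces that arithmetic from the reported values (Aos et al. 2004b); no other figures are introduced.

```python
# Express the note-8 adjustments as relative reductions: how much smaller
# each adjusted effect size is than its weighted mean effect size
# (values from Aos et al. 2004b, pp. 17, 19).

effect_sizes = {
    "FFT":                        {"adjusted": -0.325, "weighted_mean": -0.586},
    "MST":                        {"adjusted": -0.169, "weighted_mean": -0.322},
    "Other family-based therapy": {"adjusted": -0.160, "weighted_mean": -0.273},
}

for program, es in effect_sizes.items():
    reduction = 1 - es["adjusted"] / es["weighted_mean"]
    print(f"{program}: adjusted ES is {reduction:.0%} smaller "
          f"than the weighted mean")
# Prints reductions of roughly 45% (FFT), 48% (MST), and 41% (other).
```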

References

  • Alexander, J. F., & Sexton, T. L. (2002). Functional family therapy: A model for treating high-risk, acting-out youth. In F. W. Kaslow & J. L. Lebow (Eds.), Comprehensive handbook of psychotherapy: Integrative/eclectic (Vol. 4, pp. 111–132). Hoboken: Wiley.

  • Andrews, D. A., & Bonta, J. (2010). The psychology of criminal conduct. Cincinnati: Anderson.

  • Andrews, D. A., & Dowden, C. (2006). Risk principle of case classification in correctional treatment: a meta-analytic investigation. International Journal of Offender Therapy and Comparative Criminology, 50, 88–100.

  • Andrews, D. A., Zinger, I., Hoge, R. D., Bonta, J., Gendreau, P., & Cullen, F. T. (1990). Does correctional treatment work? A clinically relevant and psychologically informed meta-analysis. Criminology, 28, 369–404.

  • Aos, S. (2012). Personal communication with the first author, June 4, 2012.

  • Aos, S., & Drake, E. K. (2010). WSIPP’s benefit-cost tool for states: Examining policy options in sentencing and corrections. Olympia: Washington State Institute for Public Policy.

  • Aos, S., Barnoski, R., & Lieb, R. (1998). Preventive programs for young offenders: effective and cost-effective. Overcrowded Times, 9(2), 1, 7–11.

  • Aos, S., Lieb, R., Mayfield, J., Miller, M. G., & Pennucci, A. (2004a). Benefits and costs of prevention and early intervention programs for youth. Olympia: Washington State Institute for Public Policy.

  • Aos, S., Lieb, R., Mayfield, J., Miller, M. G., & Pennucci, A. (2004b). Benefits and costs of prevention and early intervention programs for youth: Technical appendix. Olympia: Washington State Institute for Public Policy.

  • Barnett, W. S. (1996). Lives in the balance. Ypsilanti: High/Scope Press.

  • Bumbarger, B. K. (2012). Creating a state-level prevention support system for the dissemination of evidence-based programs (EBPs). Paper presented at the Blueprints conference, San Antonio, Texas, April 11–13, 2012.

  • Bumbarger, B. K., Perkins, D. F., & Greenberg, M. T. (2010). Taking effective prevention to scale. In B. Doll, W. Pfohl, & J. Yoon (Eds.), Handbook of youth prevention science (pp. 433–444). New York: Routledge.

  • Chamberlain, P., & Reid, J. B. (1998). Comparison of two community alternatives to incarceration for chronic juvenile offenders. Journal of Consulting and Clinical Psychology, 66, 624–633.

  • Coalition for Evidence-Based Policy (2010). Top tier evidence initiative: A validated resource, used by Congress and the Executive Branch, to identify social program models supported by definitive evidence of effectiveness. Available at: http://toptierevidence.org/wordpress/wp-content/uploads/TopTierProjectOverview-June2010.pdf.

  • Dodge, K. A. (2001). The science of youth violence prevention: progressing from developmental epidemiology to efficacy to effectiveness to public policy. American Journal of Preventive Medicine, 20(1S), 63–70.

  • Dowden, C., & Andrews, D. A. (1999). What works for female offenders: a meta-analytic review. Crime and Delinquency, 45, 438–452.

  • Dowden, C., & Andrews, D. A. (2000). Effective correctional treatment and violent reoffending: a meta-analysis. Canadian Journal of Criminology, 42, 449–468.

  • Drake, E. K., Aos, S., & Miller, M. G. (2009). Evidence-based public policy options to reduce crime and criminal justice costs: implications in Washington State. Victims and Offenders, 4, 170–196.

  • Elliott, D. S., & Mihalic, S. F. (2004). Issues in disseminating and replicating effective prevention programs. Prevention Science, 5, 47–52.

  • Farrington, D. P., & Welsh, B. C. (2007). Saving children from a life of crime: Early risk factors and effective interventions. New York: Oxford University Press.

  • Fixsen, D. L., Blase, K. A., Timbers, G. D., & Wolf, M. M. (2001). In search of program implementation: 792 replications of the teaching-family model. In G. A. Bernfeld, D. P. Farrington, & A. W. Leschied (Eds.), Offender rehabilitation in practice: Implementing and evaluating effective programs (pp. 149–166). Chichester: Wiley.

  • Fixsen, D. L., Blase, K. A., Naoom, S. F., & Wallace, F. (2009). Core implementation components. Research on Social Work Practice, 19, 531–540.

  • Fixsen, D. L., Blase, K. A., Metz, A., & van Dyke, M. (2013). Statewide implementation of evidence-based programs. Exceptional Children, 79, 213–230.

  • Gendreau, P. (1996). Offender rehabilitation: what we know and what needs to be done. Criminal Justice and Behavior, 23, 144–161.

  • Gendreau, P., & Andrews, D. A. (1996). Correctional program assessment inventory (CPAI) (6th ed.). Saint John: University of New Brunswick.

  • Gendreau, P., Goggin, C., & Smith, P. (2001). Implementation guidelines for correctional programs in the real world. In G. A. Bernfeld, D. P. Farrington, & A. W. Leschied (Eds.), Offender rehabilitation in practice: Implementing and evaluating effective programs (pp. 247–268). Chichester: Wiley.

  • Greenwood, P. W. (2006). Changing lives: Delinquency prevention as crime-control policy. Chicago: University of Chicago Press.

  • Greenwood, P. W., & Welsh, B. C. (2012). Promoting evidence-based practice in delinquency prevention at the state level: principles, progress, and policy directions. Criminology & Public Policy, 11, 493–513.

  • Hansson, S. O. (2005). Decision theory: A brief introduction. Stockholm: Department of Philosophy and the History of Technology, Royal Institute of Technology.

  • Henggeler, S. W., Schoenwald, S. K., Borduin, C. M., Rowland, M. D., & Cunningham, P. B. (1998). Multisystemic treatment of antisocial behavior in children and adolescents. New York: Guilford.

  • Howell, J. C. (2009). Preventing and reducing juvenile delinquency: A comprehensive framework (2nd ed.). Thousand Oaks: Sage.

  • Howell, J. C., & Lipsey, M. W. (2012). Research-based guidelines for juvenile justice programs. Justice Research and Policy, 14, 17–34.

  • Kahneman, D., & Tversky, A. (1979). Prospect theory: an analysis of decision under risk. Econometrica, 47, 263–292.

  • Kirkwood, C. W. (2002). Decision tree primer. Available at: http://www.public.asu.edu/~kirkwood/DAStuff/decisiontrees/index.html.

  • Landenberger, N. A., & Lipsey, M. W. (2005). The positive effects of cognitive-behavioral programs for offenders: a meta-analysis of factors associated with effective treatment. Journal of Experimental Criminology, 1, 451–476.

  • Latessa, E. J., & Holsinger, A. (1998). The importance of evaluating correctional programs: assessing outcome and quality. Corrections Management Quarterly, 2, 22–29.

  • Lee, S., Aos, S., Drake, E. K., Pennucci, A., Miller, M. G., & Anderson, L. (2012). Return on investment: Evidence-based options to improve statewide outcomes. Olympia: Washington State Institute for Public Policy.

  • Levy, J. S. (1992). An introduction to prospect theory. Political Psychology, 13, 171–186.

  • Lipsey, M. W. (2008). The Arizona standardized program evaluation protocol (SPEP) for assessing the effectiveness of programs for juvenile probationers. Nashville: Vanderbilt University.

  • Lipsey, M. W. (2009). The primary factors that characterize effective interventions with juvenile offenders: a meta-analytic overview. Victims and Offenders, 4, 124–147.

  • Lipsey, M. W., & Cullen, F. T. (2007). The effectiveness of correctional rehabilitation: a review of systematic reviews. Annual Review of Law and Social Science, 3, 297–320.

  • Lipsey, M. W., & Howell, J. C. (2012). A broader view of evidence-based programs reveals more options for state juvenile justice systems. Criminology & Public Policy, 11, 515–523.

  • Lipsey, M. W., Howell, J. C., & Tidd, S. T. (2007). The standardized program evaluation protocol (SPEP): A practical approach to evaluating and improving juvenile justice programs in North Carolina. Nashville: Vanderbilt University.

  • Lipsey, M. W., Howell, J. C., Kelly, M. R., Chapman, G., & Carver, D. (2010). Improving the effectiveness of juvenile justice programs: A new perspective on evidence-based practice. Washington, DC: Center for Juvenile Justice Reform, Georgetown University.

  • Lowenkamp, C. T., & Latessa, E. J. (2004). Investigating the relationship between program integrity and correctional program effectiveness. The Ohio Corrections Research Compendium, 2, 208–213.

  • Lowenkamp, C. T., Latessa, E. J., & Smith, P. (2006). Does correctional program quality really matter? The impact of adhering to the principles of effective intervention. Criminology & Public Policy, 5, 575–594.

  • Mears, D. P. (2007). Towards rational and evidence-based crime policy. Journal of Criminal Justice, 35, 667–682.

  • Mears, D. P. (2010). American criminal justice policy: An evaluation approach to increasing accountability and effectiveness. New York: Cambridge University Press.

  • Mears, D. P., Cochran, J. C., Greenman, S. J., Bhati, A. S., & Greenwald, M. A. (2011). Evidence on the effectiveness of juvenile court sanctions. Journal of Criminal Justice, 39, 509–520.

  • Olds, D. L. (2007). Preventing crime with prenatal and infancy support of parents: the nurse-family partnership. Victims and Offenders, 2, 205–225.

  • Pealer, J. A., & Latessa, E. J. (2004). Applying the principles of effective intervention to juvenile correctional programs. Corrections Today, 66(7), 26–29.

  • Pew Center on the States (2012). Results first: Helping states assess the costs and benefits of policy options and use that data to make decisions based on results. Available at: www.pewcenteronthestates.org. Retrieved May 17, 2012.

  • Rhoades, B. L., Bumbarger, B. K., & Moore, J. E. (2012). The role of a state-level prevention support system in promoting high-quality implementation and sustainability of evidence-based programs. American Journal of Community Psychology, 50, 386–401.

  • Shadish, W. R. (1987). Program micro- and macrotheories: A guide for social change. In L. Bickman (Ed.), Using program theory in evaluation (pp. 93–109). San Francisco: Jossey-Bass.

  • Shadish, W. R., Cook, T. D., & Leviton, L. C. (1991). Foundations of program evaluation: Theories of practice. Newbury Park: Sage.

  • Sherman, L. W., Farrington, D. P., Welsh, B. C., & MacKenzie, D. L. (Eds.). (2006). Evidence-based crime prevention (rev. ed.). New York: Routledge.

  • Tversky, A., & Kahneman, D. (1986). Rational choice and the framing of decisions. Journal of Business, 59(4), S251–S278.

  • U.S. Department of Health and Human Services (2001). Youth violence: A report of the Surgeon General. Rockville: Author.

  • Weisburd, D., Lum, C. M., & Petrosino, A. (2001). Does research design affect study outcomes in criminal justice? The Annals of the American Academy of Political and Social Science, 578, 50–70.

  • Welsh, B. C., Sullivan, C. J., & Olds, D. L. (2010). When early crime prevention goes to scale: a new look at the evidence. Prevention Science, 11, 115–125.


Acknowledgments

We are especially grateful to the editor and the anonymous reviewers for insightful comments. We also wish to thank Mark Lipsey and Buddy Howell for excellent comments on an earlier draft.

Author information


Corresponding author

Correspondence to Brandon C. Welsh.


About this article

Cite this article

Welsh, B.C., Rocque, M. & Greenwood, P.W. Translating research into evidence-based practice in juvenile justice: brand-name programs, meta-analysis, and key issues. J Exp Criminol 10, 207–225 (2014). https://doi.org/10.1007/s11292-013-9182-3

