The production of criminological experiments revisited: the nature and extent of federal support for experimental designs, 2001–2013

Abstract

Objectives

To assess the nature and extent of funding for randomized experiments in criminology and criminal justice from the National Institute of Justice (NIJ) since 2000.

Methods

Using data from official records of grant awards made by NIJ between fiscal years 2001 and 2013, we categorized each award as a randomized experiment, non-experimental evaluation research, non-evaluation social science research, social science program support, forensic science and technology research, or forensic science and technology support.

Results

While the bulk of NIJ funding goes to forensic science and technology support, we found 99 awards for experiments among the 800 social science awards. Support for the use of experimental designs increased during this 13-year period and was substantially greater than in the 1990s. The awards for experiments between 2001 and 2013 went to a variety of researchers and research organizations and addressed a wide array of criminal justice program areas.

Conclusions

Our findings document a marked increase in funding for experiments in recent years compared to the 1991–2000 period, when just 21 awards were made for experimental work. These findings suggest that NIJ has responded to a series of critiques regarding the methodological quality of funded projects by placing a greater emphasis on high-quality social science research.

Notes

  1. See http://www.nasonline.org/about-nas/mission/

  2. This critique of the methodology of evaluation studies was not only a problem with NIJ-funded research. Logan (1972), for example, identified methodological issues in all 96 of the correctional and prevention evaluations he reviewed.

  3. In an appendix, the Panel (NRC 1977: 254) summarized the assessment of several state planning agency officials that NIJ was “useful—but not very helpful” to planning agencies.

  4. While Garner and Visher (2003) reported zero experimental awards for fiscal year 2000, we found one experiment funded in that year. The results we present in Fig. 4 reflect this one award and so differ slightly from table 1 in Garner and Visher (2003).

  5. See http://nij.gov/publications/Pages/annual-reports.aspx

  6. The data we received were derived from OJP’s Grants Management System (GMS), which did not include information on all grants from 2001 to 2003. We identified and added a total of 144 awards available through NIJ.gov that were not part of our initial database. Interestingly, we also identified 65 grants in our database that we could not locate on NIJ.gov.

  7. Awards from these two sources include only grants and cooperative agreements. We were unable to obtain a listing of awards made through contracts and interagency agreements. Our data collection efforts faced limitations similar to those noted by the NRC (National Research Council 2010). The Committee commented on its inability to obtain a complete accounting of all money spent by NIJ: “For example, some data were available for only a limited number of years, and some data relating to the use of contracts or interagency agreements, known sources of funding for research in certain areas (e.g., body armor), often were not provided as part of the financial history submitted to the committee or available publicly. Although we were able to construct a broadly representative picture of NIJ’s funding history, we are aware that it is at best an approximate picture” (National Research Council 2010: 46). The Department of Justice (2009) audit of NIJ grants and contracts from fiscal years 2005 through 2007 references issues in systematic recordkeeping that may continue to contribute to deficiencies in the data available on all awards.

  8. We did not examine the methodology of the 179 awards in the comparison group in detail, but 33 of these awards mention using a quasi-experimental design in the abstract, 9 describe time series approaches, and 18 mention the use of propensity score matching to identify a comparison group.

  9. See more at http://www.nij.gov/topics/forensics/Pages/welcome.aspx and http://www.nij.gov/topics/technology/body-armor/Pages/welcome.aspx

  10. See more on NIJ funding for laboratory enhancement at: http://nij.gov/funding/Pages/laboratory-enhancement.aspx

  11. A listing of these 99 awards for experiments is available from the authors.

  12. We include in this total of 99 experimental awards two fairly large grants that proposed to use random assignment, but which, based on conversations with the principal investigators, we know did not ultimately lead to an experiment. If we remove these two awards, our average award size decreases, but still remains a sizable $578,389.

  13. We attempted to collect information on the principal investigator for each experiment. Because this information is not consistently reported in GMS, we were unable to locate all principal investigators, which limits our ability to assess the extent to which individual investigators at these universities and research centers are receiving multiple awards for experimental work.

  14. See http://www.policefoundation.org/content/randomized-experiments-at-police-foundation for more on randomized experiments conducted by the Police Foundation.

  15. We contacted the principal investigators for two of the three awards without a final report or scholarly article. The final report for one of the awards is forthcoming and the other award did not have a final report because funding was removed due to issues with recruiting subjects for the experiment.

  16. See http://www.nij.gov/funding/Pages/post-award-reporting.aspx#finalsummary for more on this change.

  17. See http://www.whitehouse.gov/sites/default/files/omb/memoranda/2012/m-12-14.pdf

  18. See http://www.whitehouse.gov/sites/default/files/omb/memoranda/2013/m-13-17.pdf

  19. See http://nij.gov/funding/awards/Pages/welcome.aspx

  20. See more at: http://www.nij.gov/funding/Pages/bridging-research-and-practice-program.aspx

References

  • Baker, S. H., & Rodriguez, O. (1977). Random time quota selection – an alternative to random selection in experimental evaluations. New York: Vera Institute of Justice.

  • Baker, S. H., & Sadd, S. (1979). The court employment project evaluation - final report. New York: Vera Institute of Justice.

  • Braga, A. A., Welsh, B. C., & Bruinsma, G. J. N. (2013). Integrating experimental and observational methods to improve criminology and criminal justice policy. In B. C. Welsh, A. A. Braga, & G. J. N. Bruinsma (Eds.), Experimental criminology: Prospects for advancing science and public policy (pp. 277–298). New York: Cambridge University Press.

  • Clear, T. C. (2010). Policy and evidence: the challenge of the American Society of Criminology: 2009 presidential address to the American Society of Criminology. Criminology, 48, 1–25.

  • Coalition for Evidence-Based Policy. (2002). Bringing evidence-driven progress to education: a recommended strategy for the U.S. Department of Education. Washington, DC: Coalition for Evidence-Based Policy.

  • Department of Justice. (2009). U.S. Department of Justice audit of the National Institute of Justice’s practices for awarding grants and contracts in fiscal years 2005 through 2007. Washington, DC: U.S. Department of Justice Office of the Inspector General Audit Division.

  • Farrington, D. P. (2006). Key longitudinal-experimental studies in criminology. Journal of Experimental Criminology, 2, 121–141.

  • Farrington, D. P., & MacKenzie, D. L. (2013). Long-term follow-ups of experimental interventions. Journal of Experimental Criminology, 9, 385–388.

  • Farrington, D. P., & Petrosino, A. (2001). The Campbell collaboration crime and justice group. The Annals of the American Academy of Political and Social Science, 578, 35–49.

  • Farrington, D. P., Gottfredson, D., Sherman, L. W., & Welsh, B. (2002). The Maryland scientific methods scale. In L. W. Sherman, D. P. Farrington, B. Welsh, & D. MacKenzie (Eds.), Evidence-based crime prevention (pp. 13–21). New York: Routledge.

  • Garner, J. H., & Maxwell, C. D. (2000). What are the lessons of the police arrest studies? Journal of Aggression, Maltreatment & Trauma, 4, 83–114.

  • Garner, J. H., & Visher, C. A. (1988). Policy experiments come of age. NIJ Reports, 211, 2–8.

  • Garner, J. H., & Visher, C. A. (2003). The production of criminological experiments. Evaluation Review, 27, 316–335.

  • General Accounting Office. (2002a). Drug courts: Better DOJ data collection and evaluation efforts needed to measure impact of drug court programs. Washington: U.S. General Accounting Office.

  • General Accounting Office. (2002b). Justice impact evaluations: one Byrne evaluation was rigorous; all reviewed violence against women office evaluations were problematic. Washington: U.S. General Accounting Office.

  • General Accounting Office. (2003). Justice outcome evaluations: design and implementation of studies require more NIJ attention. Washington: U.S. General Accounting Office.

  • Herk, M. (2009). The coalition for evidence-based policy: Its role in advancing evidence-based reform, 2004–2009. New York: William T. Grant Foundation.

  • Laub, J. H. (2011). The national institute of justice response to the report of the national research council: Strengthening the national institute of justice. Washington: National Institute of Justice, U.S. Department of Justice.

  • Lempert, R. O., & Visher, C. A. (1987). Randomized field experiments in criminal justice: Workshop proceedings. Washington: National Institute of Justice, U.S. Department of Justice.

  • Logan, C. H. (1972). Evaluation research in crime and delinquency: A reappraisal. Journal of Criminal Law, Criminology, and Police Science, 63, 378–387.

  • Lum, C., Koper, C., & Telep, C. W. (2011). The Evidence-Based Policing Matrix. Journal of Experimental Criminology, 7, 3–26.

  • Mears, D. P. (2007). Towards rational and evidence-based crime policy. Journal of Criminal Justice, 35, 667–682.

  • Mullen, J., Carlson, K., Earle, R., Blew, C., & Li, L. (1974). Pre-trial services: An evaluation of policy related research. Cambridge: Abt Associates.

  • National Institute of Justice. (2002a). National drug court evaluation multi-site longitudinal impact study. Washington: National Institute of Justice, U.S. Department of Justice.

  • National Institute of Justice. (2002b). Solicitation for the evaluation of the serious and violent offender reentry initiative. Washington: National Institute of Justice, U.S. Department of Justice.

  • National Institute of Justice. (2013). Building and enhancing criminal justice researcher practitioner partnerships FY 2013. Washington: National Institute of Justice, U.S. Department of Justice.

  • National Research Council. (1977). Understanding crime: An evaluation of the National Institute of Law Enforcement and Criminal Justice. S. O. White & S. Krislov (Eds.). Washington, DC: The National Academies Press.

  • National Research Council. (1978). Deterrence and incapacitation: estimating the effects of criminal sanctions on crime rates. Panel on Research on Deterrent and Incapacitative Effects. A. Blumstein, J. Cohen, & D. Nagin (Eds.). Washington, DC: The National Academies Press.

  • National Research Council. (1979). The rehabilitation of criminal offenders: problems and prospects. Panel on Research on Rehabilitative Techniques. L. Sechrest, S. O. White, & E. D. Brown (Eds.). Washington, DC: The National Academies Press.

  • National Research Council. (1986). Criminal careers and “career criminals”, vol. 1. Panel on Research on Criminal Careers. A. Blumstein, J. Cohen, J. A. Roth, & C. A. Visher (Eds.). Washington, DC: The National Academies Press.

  • National Research Council. (1993). Understanding and preventing violence, vol. 1. Panel on the Understanding and Control of Violent Behavior. A. J. Reiss, Jr. & J. A. Roth (Eds.). Washington, DC: The National Academies Press.

  • National Research Council. (2005). Improving evaluation of anticrime programs. Committee on Improving Evaluation of Anti-Crime Programs. M. W. Lipsey, (Ed.). Washington, DC: The National Academies Press.

  • National Research Council. (2010). Strengthening the National Institute of Justice. Committee on Assessing the Research Program of the National Institute of Justice. C. F. Wellford, B. M. Chemers, & J. A. Schuck, (Eds.). Washington, DC: The National Academies Press.

  • Nuttall, C. (2003). The Home Office and random allocation experiments. Evaluation Review, 27, 267–289.

  • Pager, D. (2003). The mark of a criminal record. American Journal of Sociology, 108, 937–975.

  • Palmer, T., & Petrosino, A. (2003). The “experimenting agency”: the California Youth Authority Research Division. Evaluation Review, 27, 228–266.

  • Roman, J. K., Reid, S. E., Chalfin, A. J., & Knight, C. R. (2009). The DNA field experiment: a randomized trial of the cost-effectiveness of using DNA to solve property crimes. Journal of Experimental Criminology, 5, 345–369.

  • Sampson, R. J., Raudenbush, S. W., & Earls, F. (1997). Neighborhoods and violent crime: a multilevel study of collective efficacy. Science, 277, 918–924.

  • Sampson, R. J., Winship, C., & Knight, C. (2013). Translating causal claims: principles and strategies for policy-relevant criminology. Criminology and Public Policy, 12, 587–616.

  • Sherman, L. W. (2013). The rise of evidence-based policing: Targeting, testing, and tracking. In M. Tonry (Ed.), Crime and Justice in America, 1975–2025 (42, pp. 377–451). Chicago: University of Chicago Press.

  • Sherman, L. W., & Berk, R. A. (1984). The specific deterrent effects of arrest for domestic assault. American Sociological Review, 49, 261–272.

  • Sherman, L., Gottfredson, D., MacKenzie, D., Eck, J., Reuter, P., & Bushway, S. (1997). Preventing crime: What works, what doesn’t, what’s promising. Washington: National Institute of Justice, U.S. Department of Justice.

  • Visher, C. A., & Weisburd, D. (1998). Identifying what works: recent trends in crime prevention strategies. Crime, Law and Social Change, 28, 223–242.

  • Wallace, J. W. (2011). Review of the coalition for evidence-based policy. New York: MDRC.

  • Weisburd, D. (2010). Justifying the use of non-experimental methods and disqualifying the use of randomized controlled trials: Challenging folklore in evaluation research in crime and justice. Journal of Experimental Criminology, 6, 209–227.

  • Welsh, B. C. (2006). Evidence-based policing for crime prevention. In D. Weisburd & A. A. Braga (Eds.), Police innovation: Contrasting perspectives (pp. 305–321). New York: Cambridge University Press.

  • Zimring, F. E. (1974). Measuring the impact of pretrial diversion from the criminal justice system. The University of Chicago Law Review, 41, 224–241.

Acknowledgments

We thank Dorothy Lee in the Office of General Counsel in the Office of Justice Programs for her assistance in providing award data from the Grants Management System and the content specialists in the National Criminal Justice Reference Service for their help in obtaining final reports for awards. Thanks also to Ronald Hubbard for his research assistance.

Author information

Corresponding author

Correspondence to Cody W. Telep.

Additional information

Journal of Experimental Criminology 10th Anniversary Special Issue

About this article

Cite this article

Telep, C.W., Garner, J.H. & Visher, C.A. The production of criminological experiments revisited: the nature and extent of federal support for experimental designs, 2001–2013. J Exp Criminol 11, 541–563 (2015). https://doi.org/10.1007/s11292-015-9239-6

Keywords

  • Awards
  • Federal funding
  • Grants
  • National Institute of Justice
  • Randomized experiments