
Improving the Evidence Base in Security: Systematic Reviews and Meta-Analysis

  • Matthew Manning

Abstract

For more than 30 years, researchers have been moving away from subjective narrative reviews and towards more objective techniques, such as systematic reviews and meta-analysis, to inform decision-making. The shift from subjective to objective analysis has resulted in an improved evidence base. This improvement has arguably led to better decisions, because the objective evidence made available contains less bias. Moreover, the extant literature is captured and analysed to produce scientific evidence regarding the issue or problem at hand. Systematic reviews and meta-analyses are now a matter of course in health, education and psychology, but have only recently been adopted to inform crime and justice policy. This chapter explains, in detail, the role of systematic reviews and meta-analysis, and how these methods produce scientific evidence regarding our knowledge of security and its effectiveness. A discussion of the pros and cons of these methods is provided, based on the author’s experience in publishing (Manning et al., 2010; Mazerolle et al., 2013) and reviewing meta-analyses. A general discussion of the meta-analytic technique is then provided using a case study of street lighting in the United States. Street lighting was selected as an example because it is a topic of interest to those studying and practising security. Further, the meta-analysis conducted by Welsh and Farrington (2008b) is a good example of a high-quality application of the method. This example also demonstrates the utility of meta-analysis and its importance in decision-making and policy development in the area of security and, more generally, crime science.
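
As a rough editorial illustration of the inverse-variance pooling that underlies most meta-analyses of this kind, the short Python sketch below computes a fixed-effect weighted average effect size with a 95 per cent confidence interval. The study names, effect sizes and variances are invented for demonstration only and are not taken from Welsh and Farrington (2008b) or from the chapter itself.

    import math

    # Hypothetical illustrative data: per-study effect sizes (e.g. log odds ratios
    # comparing experimental and control areas) with their variances. These values
    # are invented for demonstration, not drawn from any study reviewed in the chapter.
    studies = [
        {"name": "Study A", "effect": 0.35, "variance": 0.04},
        {"name": "Study B", "effect": 0.10, "variance": 0.02},
        {"name": "Study C", "effect": 0.22, "variance": 0.05},
    ]

    # Fixed-effect (inverse-variance) pooling: each study is weighted by the
    # reciprocal of its variance, so more precise studies contribute more.
    weights = [1.0 / s["variance"] for s in studies]
    pooled = sum(w * s["effect"] for w, s in zip(weights, studies)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))

    # 95% confidence interval around the average effect size.
    ci_low = pooled - 1.96 * pooled_se
    ci_high = pooled + 1.96 * pooled_se

    print(f"Average effect size: {pooled:.3f} (95% CI {ci_low:.3f} to {ci_high:.3f})")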

Keywords

Control Area · Crime Prevention · Narrative Review · Average Effect Size · Street Lighting


References

  1. Bennett, T., Holloway, K. and Farrington, D. (2008). The Effectiveness of Neighborhood Watch. The Campbell Collaboration, 4, 1–48.
  2. Bonell, C., Fletcher, A., Morton, M., Lorenc, T. and Moore, L. (2012). Realist Randomised Controlled Trials: A New Approach to Evaluating Complex Public Health Interventions. Social Science and Medicine, 75, 2299–2306.
  3. Bonell, C., Fletcher, A., Morton, M., Lorenc, T. and Moore, L. (2013). Methods Don’t Make Assumptions, Researchers Do: A Response to Marchal et al. Social Science and Medicine, 94, 81–82.
  4. Borenstein, M., Hedges, L., Higgins, J. and Rothstein, H. (2009). Introduction to Meta-Analysis. West Sussex, UK: John Wiley & Sons Ltd.
  5. Braga, A. (2007). The Effects of Hot Spots Policing on Crime. Campbell Systematic Reviews, 3, 1–36.
  6. Brown, P., Brunnhuber, K., Chalkidou, K., Chalmers, I., Clarke, C. and Fenton, M. (2006). How to Formulate Research Recommendations. BMJ, 333, 804–806.
  7. Buschman, T.J., Siegel, M., Roy, J.E. and Miller, E.K. (2011). Neural Substrates of Cognitive Capacity Limitations. Proceedings of the National Academy of Sciences of the United States of America, 108, 11252–11255.
  8. Centre for Reviews and Dissemination (2009). Systematic Reviews: CRD’s Guidance for Undertaking Reviews in Health Care. York, UK: CRD, University of York.
  9. Chalmers, I. (2007). The Lethal Consequences of Failing to Make Use of All Relevant Evidence about the Effects of Medical Treatments: The Need for Systematic Reviews, in Rothwell, P. (ed.) Treating Individuals. London: Lancet. pp. 37–58.
  10. Durlak, J.A. and Lipsey, M.W. (1991). A Practitioner’s Guide to Meta-Analysis. American Journal of Community Psychology, 19, 291–332.
  11. Eysenck, H.J. (1978). An Exercise in Mega-Silliness. American Psychologist, 33, 517.
  12. Farrington, D. and Welsh, B. (2002). Effects of Improved Street Lighting on Crime: A Systematic Review. London: Home Office Research, Development and Statistics Directorate.
  13. Glass, G., McGaw, B. and Smith, M.L. (1981). Meta-Analysis in Social Research. Newbury Park: Sage.
  14. Kontopantelis, E. and Reeves, D. (2012). Performance of Statistical Methods for Meta-Analysis When True Study Effects Are Non-Normally Distributed: A Simulation Study. Statistical Methods in Medical Research, 21, 409–426.
  15. Kontopantelis, E., Springate, D. and Reeves, D. (2013). A Re-Analysis of the Cochrane Library Data: The Dangers of Unobserved Heterogeneity in Meta-Analyses. PLoS ONE, 8, 1–12.
  16. Lipsey, M.W. and Wilson, D.B. (2001). Practical Meta-Analysis (Vol. 49). Thousand Oaks, CA: Sage Publications.
  17. Lum, C., Kennedy, L. and Sherley, A. (2006). The Effectiveness of Counter-Terrorism Strategies. Campbell Systematic Reviews, 2, 1–50.
  18. Manning, M., Homel, R. and Smith, C. (2010). A Meta-Analysis of the Effects of Early Developmental Prevention Programs in At-Risk Populations on Non-Health Outcomes in Adolescence. Children and Youth Services Review, 32, 506–519.
  19. Marchal, B., Westhorp, G., Wong, G., Van Belle, S., Greenhalgh, T., Kegels, G. and Pawson, R. (2013). Realist RCTs of Complex Interventions — An Oxymoron. Social Science and Medicine, 94, 124–128.
  20. Mazerolle, L., Bennett, S., Davis, J., Sargeant, E. and Manning, M. (2013). Legitimacy in Policing. The Campbell Collaboration, 9, 1–51.
  21. Pawson, R. and Tilley, N. (1997). Realistic Evaluation. London: Sage.
  22. Pawson, R. and Tilley, N. (2001). Realistic Evaluation Bloodlines. American Journal of Evaluation, 22, 317–324.
  23. Pease, K. (1999). A Review of Street Lighting Evaluations: Crime Reduction Effects, in Painter, K. and Tilley, N. (eds.) Surveillance of Public Space: CCTV, Street Lighting and Crime Prevention. Monsey, NY: Criminal Justice Press. pp. 47–76.
  24. Rachman, S.J. and Wilson, G.T. (1980). The Effects of Psychological Therapy. 2nd ed. Oxford: Pergamon.
  25. Rosenthal, R. (1979). The ‘File Drawer Problem’ and Tolerance for Null Results. Psychological Bulletin, 86, 638–641.
  26. Rosenthal, R. (1994). Parametric Measures of Effect Size, in Cooper, H. and Hedges, L. (eds.) The Handbook of Research Synthesis. New York: Russell Sage Foundation. pp. 231–244.
  27. Shadish, W. and Sweeney, R. (1991). Mediators and Moderators in Meta-Analysis: There’s a Reason We Don’t Let Dodo Birds Tell Us Which Psychotherapies Should Have Prizes. Journal of Consulting and Clinical Psychology, 59, 883–893.
  28. Sharpe, D. (1997). Of Apples and Oranges, File Drawers and Garbage: Why Validity Issues in Meta-Analysis Will Not Go Away. Clinical Psychology Review, 17, 881–901.
  29. Smith, M. and Glass, G. (1977). Meta-Analysis of Psychotherapy Outcome Studies. American Psychologist, 32, 752–760.
  30. Welsh, B. and Farrington, D. (2003). Effects of Improved Street Lighting on Crime: Protocol for a Systematic Review. Campbell Collaboration, 13, 1–51.
  31. Welsh, B. and Farrington, D. (2007). Improved Street Lighting and Crime Prevention. Stockholm: The Swedish National Council for Crime Prevention.
  32. Welsh, B. and Farrington, D. (2008a). Effects of Closed Circuit Television Surveillance on Crime. The Campbell Collaboration, 4, 1–73.
  33. Welsh, B. and Farrington, D. (2008b). Effects of Improved Street Lighting on Crime. Campbell Collaboration, 13, 1–51.
  34. Wolf, F.M. (1986). Meta-Analysis: Quantitative Methods for Research Synthesis. Newbury Park, CA: Sage Publications.
  35. Wortman, P.M. (1994). Judging Research Quality, in Cooper, H. and Hedges, L. (eds.) The Handbook of Research Synthesis. New York: Russell Sage Foundation. pp. 97–110.

Copyright information

© Matthew Manning 2014

