
Using Meta-Analysis in the Social Sciences to Improve Environmental Policy

  • Alexander Maki
  • Mark A. Cohen
  • Michael P. Vandenbergh
Chapter
Part of the World Sustainability Series book series (WSUSE)

Abstract

Policymakers have recently looked to the social sciences for effective strategies to address environmental issues, including how to change people’s environmental behaviors. Over the same period, social scientists have been challenged to improve how they assess, summarize, and convey the state of environmental social science. Meta-analysis, the quantitative review of existing research that pools data from multiple studies, is one method researchers use to assess the state of knowledge and share best practices. Developing new data reporting standards and systems would improve not only environmental social science, but also the interface between environmental social scientists and policymakers. In particular, dynamic meta-analyses, that is, meta-analyses updated frequently as new studies become available, would ensure that policymakers have access to up-to-date findings and would allow them to examine the subsets of studies that best approximate the contexts relevant to new policies. These new standards for conducting and reporting meta-analyses would allow environmental social scientists to inform policy more effectively and would help policymakers understand and assess the latest developments in the field.
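
To make the pooling step concrete, the sketch below shows a minimal random-effects meta-analysis (DerSimonian-Laird) in Python applied to hypothetical effect sizes; the study names, effect sizes, variances, and "context" tags are illustrative assumptions, not data from this chapter. Filtering on the context tag before pooling mimics how a dynamic meta-analysis could let a policymaker re-run the synthesis on the subset of studies closest to a planned intervention.

import math

# Hypothetical studies (illustrative only): standardized effect size d,
# its sampling variance, and a context tag a dynamic meta-analysis might filter on.
studies = [
    {"id": "study_a", "d": 0.30, "var": 0.020, "context": "household_energy"},
    {"id": "study_b", "d": 0.12, "var": 0.015, "context": "household_energy"},
    {"id": "study_c", "d": 0.45, "var": 0.030, "context": "workplace"},
    {"id": "study_d", "d": 0.22, "var": 0.025, "context": "household_energy"},
]

def random_effects_pool(effects, variances):
    # DerSimonian-Laird estimate of between-study variance (tau^2), then an
    # inverse-variance weighted pooled effect with an approximate 95% CI.
    k = len(effects)
    w = [1.0 / v for v in variances]                        # fixed-effect weights
    fe_mean = sum(wi * d for wi, d in zip(w, effects)) / sum(w)
    q = sum(wi * (d - fe_mean) ** 2 for wi, d in zip(w, effects))
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)
    w_re = [1.0 / (v + tau2) for v in variances]            # random-effects weights
    pooled = sum(wi * d for wi, d in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, (pooled - 1.96 * se, pooled + 1.96 * se), tau2

# "Dynamic" use: pool only the subset of studies matching the policy context of interest.
subset = [s for s in studies if s["context"] == "household_energy"]
pooled, ci, tau2 = random_effects_pool([s["d"] for s in subset],
                                       [s["var"] for s in subset])
print(f"Pooled d = {pooled:.2f}, 95% CI ({ci[0]:.2f}, {ci[1]:.2f}), tau^2 = {tau2:.3f}")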

Keywords

Meta-analysis · Environmental policy · Social sciences · Behavior change

Copyright information

© Springer International Publishing AG 2018

Authors and Affiliations

  • Alexander Maki (1)
  • Mark A. Cohen (2)
  • Michael P. Vandenbergh (3)
  1. Vanderbilt Institute of Energy and Environment and the Climate Change Research Network, Vanderbilt University, Nashville, USA
  2. Owen Graduate School of Management, Vanderbilt University, Nashville, USA
  3. Vanderbilt University Law School, Vanderbilt University, Nashville, USA