Abstract
This chapter examines how evaluations and evaluative thinking can help in the social epidemiologic study of complex interventions. There is increasing interest within the field of social epidemiology in studying interventions, as well as increasing pressure from funders and decision makers to make research more relevant for addressing social problems. Within the field of evaluation, there is a parallel move towards embracing the study of complex interventions − the very kinds of interventions that will almost invariably be the focus of social epidemiology. Using the example of interventions that seek to address health inequities in urban settings, we introduce a framework of steps through which evaluations can impact such health inequities. Rather than discussing a series of tools and methods, we use these steps to describe the importance of thinking evaluatively in addressing complex social problems. Specifically, we highlight a realist approach to evaluation. This approach focuses not only on whether an intervention works, but also on how it works, for whom and under what conditions (Pawson and Tilley 1997). This perspective marks a significant departure from the traditions of other branches of epidemiology, such as clinical epidemiology, where the whether question is paramount and the how question is less important, often because of the uniformity and simplicity of interventions (e.g., administration of a drug). Research within epidemiology on social interventions has been relatively uncommon to date, and this chapter seeks to provide some guidance for expanding the literature on the health effects of social interventions by engaging with cutting-edge theory on thinking evaluatively.
Abbreviations
- RCT: randomized controlled trial
- SES: socioeconomic status
References
Babiak KM (2009) Criteria of effectiveness in multiple cross-sectoral interorganizational relationships. Eval Program Plann 32:1–12
Berkman L (2004) Introduction: seeing the forest and the trees − from observation to experiments in social epidemiology. Epidemiol Rev 26:2–6
Bourdages J, Sauvageau L, Lepage C (2003) Factors in creating sustainable intersectoral community mobilization for prevention of heart and lung disease. Health Promot Int 18:135–144
Cook TD (2000) The false choice between theory-based evaluation and experimentation. New Dir Eval 87:27–34
Culyer AJ (2007) Equity of what in health care? Why the traditional answers don’t help policy – and what to do in the future. Healthc Pap 8:12–26
Fox J (1996) Promoting independent assessments of MDB anti-poverty investments: bringing civil society in. IDR Rep 12:1–9
Gerberding JL (2005) Protecting health: the new research imperative. JAMA 294:1403–1406
Henry GT, Mark MM (2003) Beyond use: understanding evaluation’s influence on attitudes and actions. Am J Eval 24:293–314
Kaufman JS, Poole C (2000) Looking back on “Causal Thinking in the Health Sciences”. Annu Rev Public Health 21:101–119
Leischow SJ, Milstein B (2006) Systems thinking and modeling for public health practice. Am J Public Health 96:403–405
Mark MM, Henry GT (2004) The mechanisms and outcomes of evaluation influence. Evaluation 10:35–57
Mark MM, Henry GT, Julnes G (2000) Evaluation: an integrated framework for understanding, guiding, and improving policies and programs. Jossey Bass, San Francisco
Massoud MR, Nielsen GA, Nolan K et al (2006) A framework for spread: from local improvements to system-wide change. IHI Innovation Series white paper. Institute for Healthcare Improvement, Cambridge
Mayne J (2001) Addressing attribution through contribution analysis: using performance measures sensibly. Can J Program Eval 16:1–24
Milstein B, Jones A, Homer JB et al (2007) Charting plausible futures for diabetes prevalence in the United States: a role for system dynamics simulation modeling. Prev Chronic Dis 4:A52
Morell JA (2010) Evaluation in the face of uncertainty: anticipating surprise and responding to the inevitable. Guilford Press, New York
Patton MQ (2010) Developmental evaluation: applying complexity concepts to enhance innovation and use. Guilford Press, New York
Pawson R (2006) Evidence-based policy: a realist perspective. Sage Publications, London
Pawson R, Sridharan S (2009) Theory-driven evaluation of public health programmes. In: Killoran A, Kelly M (eds) Evidence-based public health: effectiveness and efficiency. Oxford University Press, Oxford
Pawson R, Tilley N (1997) An introduction to scientific realist evaluations. In: Chelimsky E, Shadish WR (eds) Evaluation for the 21st century: a handbook. Sage Publications, Thousand Oaks
Pawson R, Greenhalgh T, Harvey G et al (2004) Realist synthesis: an introduction. In: ESRC research methods programme: RMP methods paper series. University of Manchester, Manchester
Petticrew MP, Chalabi Z, Jones D (2011) To RCT or not to RCT: deciding when more evidence is needed for public health policy and practice. J Epidemiol Community Health. Published online on 7 April 2011. http://jech.bmj.com/content/early/2011/06/07/jech.2010.116483.short
Riley BL, MacDonald J, Mansi O et al (2008) Is reporting on interventions a weak link in understanding how and why they work? A preliminary exploration using community heart health exemplars. Implement Sci 3:27
Rog D, Boback N, Barton-Villagrana H et al (2004) Sustaining collaboratives: a cross-site analysis of The National Funding Collaborative on Violence Prevention. Eval Program Plann 27:249–261
Sen A (2002) Why health equity? Health Econ 11:659–666
Shapira P, Kingsley G, Youtie J (1997) Manufacturing partnerships: evaluation in the context of government reform. Eval Program Plann 20:103–112
Smith A, Spenlehauer V (1994) Policy evaluation meets harsh reality: instrument of integration or preserver of disintegration? Eval Program Plann 17:277–287
Sridharan S, Gillespie D (2004) Sustaining problem-solving capacity in collaborative networks. Criminol Public Policy 3:221–250
Sridharan S, Nakaima A (2011) Ten steps to making evaluations matter. Eval Program Plann 34(2):135–146
Sridharan S, Campbell B, Zinzow H (2006) Developing a stakeholder-driven timeline of change for evaluations of social programs. Am J Eval 27:148–162
Sridharan S, Gardner B, Nakaima A (2009) Steps towards incorporating health inequities into a performance measurement and management framework: analysis of the hospital health equity plans. Toronto Central Local Health Integration Network, Toronto. http://www.torontoevaluation.ca/tclhin/PDF/LHIN%20final%20October%2015%20send.pdf. Accessed 11 Apr 2011
Stange KC (2009) The problem of fragmentation and the need for integrative solutions. Ann Fam Med 7:100–103
Treasury Board of Canada Secretariat (2009) Archived [2009-03-31] − policy on evaluation. http://www.tbs-sct.gc.ca/pol/doc-eng.aspx?id=12309. Accessed 11 Apr 2011
Watt G, O’Donnell C, Sridharan S (2011) Building on Julian Tudor Hart’s example of anticipatory care. Prim Health Care Res Dev 12:3–10
Whitehead M (1992) The concepts and principles of equity and health. Int J Health Serv 22:429–445
Copyright information
© 2012 Springer Science+Business Media B.V.
Cite this chapter
Sridharan, S., Dunn, J.R., Nakaima, A. (2012). Addressing Health Equities in Social Epidemiology: Learning from Evaluation(s). In: O’Campo, P., Dunn, J. (eds) Rethinking Social Epidemiology. Springer, Dordrecht. https://doi.org/10.1007/978-94-007-2138-8_12
Print ISBN: 978-94-007-2137-1
Online ISBN: 978-94-007-2138-8