Health Services and Outcomes Research Methodology, Volume 19, Issue 4, pp 241–258

Assessing the impacts of governance reforms on health services delivery: a quasi-experimental, multi-method, and participatory approach

  • Alan Zarychta
  • Krister P. Andersson
  • Elisabeth D. Root
  • Jane Menken
  • Tara Grillos

Abstract

Despite considerable advances in developing new and more sophisticated impact evaluation methodologies and toolkits, policy research continues to suffer from persistent challenges in achieving the evaluation trifecta: identifying effects, isolating mechanisms, and influencing policy. For example, evaluation studies are routinely hampered by problems of establishing valid counterfactuals due to endogeneity and selection effects with respect to policy reform. Additionally, robust evaluation studies often must contend with heterogeneity in treatment, staggered timing, and variation in uptake. And finally, on practical grounds, researchers frequently struggle to involve policymakers and practitioners throughout the research process in order to engender the type of trust needed for policy influence. While it can be difficult to generalize about appropriate evaluation methodologies across contexts, prominent policy interventions like governance reforms for improving health services delivery nonetheless demand rigorous and comprehensive evaluation strategies that can produce valid results and engage policymakers. Drawing on illustrations from our research on health sector decentralization in Honduras, in this paper we present a quasi-experimental, multi-method, and participatory approach that addresses these persistent challenges to policy evaluation.
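
To make the counterfactual challenge concrete, here is a minimal sketch of one common quasi-experimental estimator for reforms adopted at different times: a two-way fixed effects difference-in-differences regression. It is illustrative only, not the authors' actual specification; the data file and the variable names (muni, year, decentralized, coverage) are assumptions introduced here.

```python
# Minimal two-way fixed effects difference-in-differences sketch.
# Illustrative only: the file name, variable names, and the
# parallel-trends assumption are ours, not the paper's specification.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical municipality-year panel: 'decentralized' switches to 1
# once a municipality adopts the reform; 'coverage' is a health services
# delivery outcome (e.g., share of births attended by skilled staff).
df = pd.read_csv("honduras_panel.csv")  # hypothetical file

# Municipality and year fixed effects absorb time-invariant local traits
# and common shocks; under parallel trends, the coefficient on
# 'decentralized' is the difference-in-differences estimate.
model = smf.ols("coverage ~ decentralized + C(muni) + C(year)", data=df)
result = model.fit(cov_type="cluster", cov_kwds={"groups": df["muni"]})
print(result.params["decentralized"], result.bse["decentralized"])
```

With staggered adoption and heterogeneous treatment effects, a pooled two-way fixed effects coefficient can be misleading; event-study specifications and matched comparisons are common complements, consistent with the multi-method strategy the paper describes.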

Keywords

Impact evaluation · Policy analysis · Causal inference · Mixed methods

Notes

Funding

This project was completed with financial support from the National Science Foundation (Award Numbers DGE-1144083 & SMA-1328688), Social Science Research Council, University of Colorado Boulder, and University of Chicago. We are especially grateful for the support and assistance we received from staff at the Ministry of Health in Honduras and the Regional Health Authority of Intibucá. All errors and omissions are our own.

Compliance with ethical standards

Conflict of interest

The authors declare they have no conflicts of interest.

Ethical approval

All procedures performed in the study involving human participants were in accordance with the ethical standards of the institutional and/or national research committee and with the 1964 Helsinki declaration and its later amendments or comparable ethical standards. Approval for the research was received from the University of Colorado Boulder Institutional Review Board (Protocol #12-0318).

Informed consent

Informed consent was obtained from all individual participants included in the study.

Supplementary material

Supplementary material 1: 10742_2019_201_MOESM1_ESM.docx (DOCX 1827 kb)

Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2019

Authors and Affiliations

  1. University of Chicago, Chicago, USA
  2. University of Colorado Boulder, Boulder, USA
  3. The Ohio State University, Columbus, USA
  4. Purdue University, West Lafayette, USA
