PharmacoEconomics, Volume 37, Issue 11, pp 1371–1381

Can You Repeat That? Exploring the Definition of a Successful Model Replication in Health Economics

  • Emma McManus
  • David Turner
  • Tracey Sach
Review Article

Abstract

The International Society for Pharmacoeconomics and Outcomes Research (ISPOR) modelling taskforce suggests decision models should be thoroughly reported and transparent. However, the level of transparency required, and how transparency should be assessed, are yet to be defined. One way may be to attempt to replicate the model and its outputs: the ability to replicate a decision model could demonstrate adequate reporting transparency. This review aims to explore published definitions of replication success across all scientific disciplines and to consider how such a definition should be tailored for use in health economic models. A literature review was conducted to identify published definitions of a ‘successful replication’. Using these as a foundation, several definitions of replication success applicable to replications of economic decision models were constructed, and the associated strengths and weaknesses of each are discussed. A substantial body of literature discussing replicability was found; however, relatively few studies (ten) explicitly defined a successful replication. These definitions varied from subjective assessments to the expectation that exactly the same results be reproduced. Whilst the definitions that were found may help to construct a definition specific to health economics, none completely encompassed the unique requirements of decision models. Replication is widely discussed in other scientific disciplines; however, as yet, there is no consensus on how replicable models should be within health economics or what constitutes a successful replication. Replication studies can demonstrate how transparently a model is reported, identify potential calculation errors and inform future reporting practices. Replication may therefore be a useful adjunct to other transparency or quality measures.

Notes

Acknowledgements

The authors would like to thank the delegates of the Bristol Health Economists’ Study Group (2018), where an earlier version of this paper was discussed.

Compliance with Ethical Standards

Conflict of interest

The authors, Emma McManus, David Turner, and Tracey Sach, have no conflicts of interest to declare.

Funding

Funding for this project was received from CLAHRC East of England (Project HE7: Dec 2015). This paper was written whilst also being funded by Professor Tracey Sach’s Career Development Fellowship (CDF award-2014-07-006) supported by the National Institute for Health Research. The views expressed in this paper are those of the authors and not necessarily those of the NHS, the National Institute for Health Research or the Department of Health and Social Care.

Copyright information

© Springer Nature Switzerland AG 2019

Authors and Affiliations

  1. Norwich Medical School, University of East Anglia, Norwich, UK
