Good Practice Guidelines for Decision-Analytic Modelling in Health Technology Assessment
The use of decision-analytic modelling for the purpose of health technology assessment (HTA) has increased dramatically in recent years. Several guidelines for best practice have emerged in the literature; however, there is no agreed standard for what constitutes a ‘good model’ or how models should be formally assessed. The objective of this paper is to identify, review and consolidate existing guidelines on the use of decision-analytic modelling for the purpose of HTA and to develop a consistent framework against which the quality of models may be assessed.
The review and resultant framework are summarised under the three key themes of Structure, Data and Consistency. ‘Structural’ aspects relate to the scope and mathematical structure of the model including the strategies under evaluation. Issues covered under the general heading of ‘Data’ include data identification methods and how uncertainty should be addressed. ‘Consistency’ relates to the overall quality of the model.
The review of existing guidelines showed that although authors may provide a consistent message regarding some aspects of modelling, such as the need for transparency, they are contradictory in other areas. Particular areas of disagreement are how data should be incorporated into models and how uncertainty should be assessed.
For the purpose of evaluation, the resultant framework is applied to a decision-analytic model developed as part of an appraisal for the National Institute for Health and Clinical Excellence (NICE) in the UK. As a further assessment, the review based on the framework is compared with an assessment provided by an independent experienced modeller not using the framework.
It is hoped that the framework developed here may form part of the appraisal process for assessment bodies such as NICE, and for decision models submitted to peer-reviewed journals. However, given the speed with which decision-modelling methodology advances, the framework will require continual updating.
Keywords: Health Technology Assessment; Probabilistic Sensitivity Analysis; Review Team; Appraisal Committee; Decision Uncertainty
Elisabeth Fenwick provided the independent assessment of the case study models used here. The expert advisory group comprised: Pelham Barton (Health Service Management Centre, University of Birmingham, UK); Colin Green (National Coordinating Centre for Health Technology Assessment, University of Southampton, UK); Luke Vale (Health Service Research Unit, University of Aberdeen, UK); Chris McCabe (School of Health and Related Research, University of Sheffield, UK); Suzy Paisley (School of Health and Related Research, University of Sheffield, UK); Alec Miners (Technology Appraisals Team, National Institute for Health and Clinical Excellence, UK); Steve Palmer (Centre for Health Economics, University of York, UK); and Elisabeth Fenwick (Centre for Health Economics, University of York, UK). However, the views in this paper are those of the authors alone.
This study was funded by the NHS Research and Development Health Technology Assessment Programme. The views expressed are those of the authors and do not necessarily reflect those of the UK Department of Health.
Mark Sculpher also received funding via a Career Award in Public Health funded by the NHS Research and Development Programme.
The authors have no conflicts of interest that are directly relevant to the contents of this article.