Issues in the Design of Discrete Choice Experiments

  • Richard Norman (corresponding author)
  • Benjamin M. Craig
  • Paul Hansen
  • Marcel F. Jonker
  • John Rose
  • Deborah J. Street
  • Brendan Mulhern
Part of the topical collection "From the International Academy of Health Preference Research".


The use of preference-elicitation tasks, in particular discrete choice experiments (DCEs), in health economics has grown significantly in recent decades [1]. The most widely used DCE approach asks respondents to consider a series of hypothetical choices between alternatives (here called choice tasks) and to specify which alternative they prefer. Choice tasks have a longer-established history in other fields, especially psychology, transportation, marketing and agriculture. Health preference studies have been conducted for about as long [2, 3], but not to the same extent. This relatively late uptake of preference evidence in health is surprising in some respects: patient and population values concerning health have always been key components of questions ranging from health policy to clinical practice, yet they often cannot be directly observed, a problem exacerbated by the lack of a perfectly competitive market [4]. Though there is broad consensus on the value...
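To make the idea of a choice task concrete, the sketch below simulates one respondent answering a single binary task under a conditional (multinomial) logit model, the workhorse model in DCE analysis. The attributes, levels and part-worth values here are invented purely for illustration; they are not estimates from any study cited above.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

# Hypothetical part-worths (illustrative only): out-of-pocket cost ($),
# waiting time (weeks), treatment effectiveness (%).
beta = np.array([-0.02, -0.15, 0.05])

# One choice task: two treatment profiles described on the same attributes.
alt_a = np.array([100.0, 2.0, 70.0])  # dearer, faster, more effective
alt_b = np.array([60.0, 6.0, 60.0])   # cheaper, slower, less effective

# Systematic utilities and logit choice probabilities.
v = np.array([alt_a @ beta, alt_b @ beta])  # [1.2, 0.9]
p = np.exp(v) / np.exp(v).sum()             # p[0] ≈ 0.574

# Simulate the respondent's stated choice for this task.
choice = rng.choice(["A", "B"], p=p)
print(choice, p)
```

In a real DCE, many such tasks (generated from an experimental design) are pooled across respondents, and the inverse problem is solved: the analyst estimates the part-worths `beta` from the observed choices.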


Compliance with Ethical Standards

Conflict of interest

No funding was received for the writing of this commentary. RN, DS, BMC, MFJ and BM have no potential conflicts of interest. JR is a co-developer of Ngene. PH is a co-developer of 1000minds.


References

  1. Clark MD, Determann D, Petrou S, Moro D, de Bekker-Grob EW. Discrete choice experiments in health economics: a review of the literature. Pharmacoeconomics. 2014;32(9):883–902.
  2. Thorndike EL. Valuations of certain pains, deprivations, and frustrations. Pedagog Semin J Genet Psychol. 1937;51(2):227–39.
  3. Thurstone LL. The method of paired comparisons for social values. J Abnorm Soc Psychol. 1927;21:384–400.
  4. Arrow KJ. Uncertainty and the welfare economics of medical care. Am Econ Rev. 1963;53(5):941–73.
  5. David HA. The method of paired comparisons. 2nd ed. London: Oxford University Press; 1988.
  6. Gonzalez JM, Johnson FR, Levitan B, Noel R, Peay H. Symposium title: preference evidence for regulatory decisions. Patient. 2018;11(5):467–73.
  7. Groothuis-Oudshoorn CGM, Flynn TN, Yoo HI, Magidson J, Oppe M. Key issues and potential solutions for understanding healthcare preference heterogeneity free from patient-level scale confounds. Patient. 2018;11(5):463–6.
  8. Coast J, Al-Janabi H, Sutton EJ, Horrocks SA, Vosper AJ, Swancutt DR, et al. Using qualitative methods for attribute development for discrete choice experiments: issues and recommendations. Health Econ. 2012.
  9. Reed Johnson F, Lancsar E, Marshall D, Kilambi V, Muhlbacher A, Regier DA, et al. Constructing experimental designs for discrete-choice experiments: report of the ISPOR Conjoint Analysis Experimental Design Good Research Practices Task Force. Value Health. 2013;16(1):3–13.
  10. Street DJ, Burgess L. The construction of optimal stated choice experiments: theory and methods. Hoboken: Wiley; 2007.
  11. Cook RD, Nachtsheim CJ. A comparison of algorithms for constructing exact D-optimal designs. Technometrics. 1980;22(3):315–24.
  12. Meyer RK, Nachtsheim CJ. The coordinate-exchange algorithm for constructing exact optimal experimental designs. Technometrics. 1995;37(1):60–9.
  13. Fiebig D, Keane M, Louviere J, Wasi N. The generalized multinomial logit model: accounting for scale and coefficient heterogeneity. Mark Sci. 2010;29(3):393–421.
  14. Scarpa R, Rose JM. Design efficiency for non-market valuation with choice modelling: how to measure it, what to report and why. Aust J Agric Resour Econ. 2008;52(3):253–82.
  15. Kanninen B. Optimal design of choice experiments for non-market valuation. In: Stated preference: what do we know? Where do we go? Washington, DC: Environmental Law Institute; 2000.
  16. Atkinson AC, Donev AN. Optimum experimental designs. Oxford: Oxford Science Publications; 1992.
  17. Rummel M, Kim TM, Aversa F, Brugger W, Capochiani E, Plenteda C, et al. Preference for subcutaneous or intravenous administration of rituximab among patients with untreated CD20+ diffuse large B-cell lymphoma or follicular lymphoma: results from a prospective, randomized, open-label, crossover study (PrefMab). Ann Oncol. 2017;28(4):836–42.
  18. Brazier J, Ratcliffe J, Salomon JA, Tsuchiya A. Measuring and valuing health benefits for economic evaluation. Oxford: Oxford University Press; 2007.
  19. Erdem S, Campbell D, Hole AR. Accounting for attribute-level non-attendance in a health choice experiment: does it matter? Health Econ. 2014;24(7):773–89.
  20. Hole AR, Norman R, Viney R. Response patterns in health state valuation using endogenous attribute attendance and latent class analysis. Health Econ. 2016;25(2):212–24.
  21. Jonker MF, Donkers B, de Bekker-Grob EW, Stolk EA. The effect of level overlap and color coding on attribute non-attendance in discrete choice experiments. Value Health. 2018;21(7):767–71.
  22. Louviere JJ, Islam T, Wasi N, Street D, Burgess L. Designing discrete choice experiments: do optimal designs come at a price? J Consum Res. 2008;35(2):360–75.
  23. Grossman H, Schwabe R. Design for discrete choice experiments. In: Dean A, Morris M, Stufken J, Bingham D, editors. Handbook of design and analysis of experiments. Boca Raton: CRC Press; 2015.
  24. ChoiceMetrics Pty Ltd. Ngene user manual and reference guide (version 1.2). 2018. Accessed 15 Nov 2018.
  25. Norman R, Viney R, Aaronson NK, Brazier JE, Cella DF, Costa DSJ, et al. Using a discrete choice experiment to value the QLU-C10D: feasibility and sensitivity to presentation format. Qual Life Res. 2016;25(3):637–49.
  26. Mulhern B, Norman R, Street DJ, Viney R. One method, many methodological choices: a structured review of discrete-choice experiments for health state valuation. Pharmacoeconomics. 2018.
  27. Janssen EM, Marshall DA, Hauber AB, Bridges JFP. Improving the quality of discrete-choice experiments in health: how can we assess validity and reliability? Expert Rev Pharmacoecon Outcomes Res. 2017;17(6):531–42.
  28. Hansen P, Ombler F. A new method for scoring multi-attribute value models using pairwise rankings of alternatives. J Multi Criteria Decis Anal. 2008;15:87–107.
  29. Thokala P, Devlin N, Marsh K, Baltussen R, Boysen M, Kalo Z, et al. Multiple criteria decision analysis for health care decision making—an introduction: report 1 of the ISPOR MCDA Emerging Good Practices Task Force. Value Health. 2016;19(1):1–13.
  30. Green PE, Krieger AM, Wind Y. Thirty years of conjoint analysis: reflections and prospects. Interfaces. 2001;31(3):S56.
  31. de Bekker-Grob EW, Donkers B, Jonker MF, Stolk EA. Sample size requirements for discrete-choice experiments in healthcare: a practical guide. Patient. 2015;8(5):373–84.

Copyright information

© Springer Nature Switzerland AG 2018

Authors and Affiliations

  • Richard Norman (1; corresponding author)
  • Benjamin M. Craig (2)
  • Paul Hansen (3)
  • Marcel F. Jonker (4, 5, 6)
  • John Rose (7)
  • Deborah J. Street (8)
  • Brendan Mulhern (8)

  1. School of Public Health, Curtin University, Perth, Australia
  2. University of South Florida, Tampa, USA
  3. Department of Economics, University of Otago, Dunedin, New Zealand
  4. Duke Clinical Research Institute, Duke University, Durham, USA
  5. Erasmus Choice Modeling Centre, Erasmus University Rotterdam, Rotterdam, The Netherlands
  6. Erasmus School of Health Policy and Management, Erasmus University Rotterdam, Rotterdam, The Netherlands
  7. Centre for Business Intelligence and Data Analytics, University of Technology Sydney, Sydney, Australia
  8. Centre for Health Economics Research and Evaluation (CHERE), University of Technology Sydney, Sydney, Australia
