Experiments in Market Research

  • Living reference work entry
  • First Online:
Handbook of Market Research

Abstract

The question of how a certain activity (e.g., the intensity of communication activities during the launch of a new product) influences important outcomes (e.g., sales, preferences) is one of the key questions in applied as well as academic marketing research. While such questions may be answered from observed values of activities and the corresponding outcomes using survey and/or archival data, it is often not possible to claim that the particular activity actually caused the observed changes in the outcomes. To demonstrate cause-effect relationships, experiments take a different route: instead of merely observing activities, experimentation involves the systematic variation of an independent variable (the factor) and the observation of its effect on the outcome. The goal of this chapter is to discuss the parameters relevant to the proper execution of experimental studies. Among other issues, this involves decisions regarding the number of factors to be manipulated, the measurement of the outcome variable, the environment in which to conduct the experiment, and the recruitment of participants.
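The experimental logic sketched above — randomly assigning participants to levels of a manipulated factor and comparing outcomes across conditions — can be illustrated with a small simulation. The scenario, group sizes, and effect sizes below are hypothetical and invented purely for illustration; the comparison uses a simple two-sided permutation test so that only the Python standard library is needed.

```python
import random
import statistics

random.seed(42)

# Hypothetical example: does a high-intensity launch campaign (treatment)
# raise purchase intention (outcome on a 1-7 scale) relative to a
# low-intensity campaign (control)? All values are simulated.

def run_experiment(n_per_group=100):
    # Random assignment to exactly one level of the manipulated factor
    # is what licenses the causal interpretation of a group difference.
    control = [random.gauss(4.0, 1.0) for _ in range(n_per_group)]
    treatment = [random.gauss(4.5, 1.0) for _ in range(n_per_group)]
    return control, treatment

def permutation_test(a, b, n_iter=5000):
    # Two-sided permutation test on the difference in group means:
    # repeatedly reshuffle group labels and count how often a difference
    # at least as large as the observed one arises by chance.
    observed = abs(statistics.mean(b) - statistics.mean(a))
    pooled = a + b
    extreme = 0
    for _ in range(n_iter):
        random.shuffle(pooled)
        diff = abs(statistics.mean(pooled[:len(a)])
                   - statistics.mean(pooled[len(a):]))
        if diff >= observed:
            extreme += 1
    return extreme / n_iter

control, treatment = run_experiment()
p = permutation_test(control, treatment)
print(f"mean difference: "
      f"{statistics.mean(treatment) - statistics.mean(control):.2f}")
print(f"permutation p-value: {p:.4f}")
```

Because assignment to conditions is random, any systematic difference between the groups can be attributed to the manipulated factor rather than to self-selection — the key contrast with purely observational survey or archival data.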

Author information

Corresponding author

Correspondence to Torsten Bornemann.

Copyright information

© 2018 Springer International Publishing AG, part of Springer Nature

About this entry

Cite this entry

Bornemann, T., Hattula, S. (2018). Experiments in Market Research. In: Homburg, C., Klarmann, M., Vomberg, A. (eds) Handbook of Market Research. Springer, Cham. https://doi.org/10.1007/978-3-319-05542-8_2-1

  • DOI: https://doi.org/10.1007/978-3-319-05542-8_2-1

  • Publisher Name: Springer, Cham

  • Print ISBN: 978-3-319-05542-8

  • Online ISBN: 978-3-319-05542-8

  • eBook Packages: Springer Reference Business and Management, Reference Module Humanities and Social Sciences, Reference Module Business, Economics and Social Sciences
