Questions on honest responding
This article presents a new method for reducing socially desirable responding in Internet self-reports of desirable and undesirable behavior. The method moves the request for honest responding, often included in the introduction to surveys, into the questioning phase of the survey. Because over a quarter of Internet survey participants do not read survey instructions, respondents were asked whether they had responded honestly, rather than being asked to answer honestly. Posing the honesty message in the form of questions draws attention to the message, increases its processing, and puts subsequent questions in context with the questions on honest responding. In three studies (Study I: n = 475; Study II: n = 1,015; Study III: n = 899), we tested whether presenting the questions on honest responding before questions on desirable and undesirable behavior could increase the honesty of responses, under the assumption that attributing less desirable behavior to oneself and/or admitting to more undesirable behavior can be taken to indicate more honest responding. In all three studies, participants who received the questions on honest responding before the questions on the target behavior gave, on average, significantly fewer socially desirable responses, though the effect sizes were small in all cases (Cohen's d between 0.02 and 0.28 for single items, and from 0.17 to 0.34 for sum scores). We discuss the overall findings and the possible mechanisms behind the influence of the questions on honest responding on subsequent questions, and offer suggestions for future research.
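The effect sizes above are reported as Cohen's d, the standardized mean difference between two groups. For readers who want to relate the reported values to their own data, a minimal sketch of the common pooled-standard-deviation form is shown below (an illustrative computation, not code from the article; the function name and example data are ours):

```python
from statistics import mean, stdev

def cohens_d(group_a, group_b):
    """Cohen's d with a pooled standard deviation.

    d = (mean_a - mean_b) / s_pooled, where s_pooled combines the two
    sample variances weighted by their degrees of freedom.
    """
    na, nb = len(group_a), len(group_b)
    sa, sb = stdev(group_a), stdev(group_b)  # sample SDs (n - 1 denominator)
    s_pooled = (((na - 1) * sa**2 + (nb - 1) * sb**2) / (na + nb - 2)) ** 0.5
    return (mean(group_a) - mean(group_b)) / s_pooled

# Hypothetical example: two small groups whose means differ by one pooled SD
print(cohens_d([1, 2, 3], [2, 3, 4]))  # -1.0
```

By the usual rules of thumb, the single-item effects reported here (d = 0.02 to 0.28) fall at or below the conventional "small" benchmark of 0.2, which is why the authors flag them as small despite statistical significance.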
Keywords: Questions on honest responding · Socially desirable responding · Sensitive questions · Self-reports · Internet surveys
This work was funded in part by the Eimskip Fund of the University of Iceland (Háskólasjóður Eimskipafélags Íslands). The funding source had no role in the study design; in the collection, analysis, or interpretation of the data; in the writing of the report; or in the decision to submit the article for publication. The authors acknowledge networking support from COST Action IS1004, WEBDATANET: www.webdatanet.eu.