
Identification of Measurement Problems of Survey Items and Scales Using Paradata

  • Jochen Mayerl
  • Henrik Andersen
  • Christoph Giehl
Chapter
Part of the Schriftenreihe der ASI - Arbeitsgemeinschaft Sozialwissenschaftlicher Institute book series (SASI)

Abstract

This article discusses several applications of paradata, in the form of response latencies, for identifying survey measurement error. Specifically, it presents empirical analyses of response latencies as they pertain to problems such as acquiescence bias, question order effects (contrast and assimilation effects) and social desirability bias. It demonstrates that response latencies can provide helpful insight into cognitive processes that would otherwise be unobservable. Finally, we briefly touch on the challenges involved in the collection and use of paradata.

Keywords

Paradata, measurement error, response effects, response latencies, social desirability, acquiescence, question order, dual-process theory



Copyright information

© Springer Fachmedien Wiesbaden GmbH, ein Teil von Springer Nature 2019

Authors and Affiliations

  • Jochen Mayerl (Chemnitz, Germany)
  • Henrik Andersen (Chemnitz, Germany)
  • Christoph Giehl (Kaiserslautern, Germany)
