
Effects of Personalized Versus Generic Implementation of an Intra-Organizational Online Survey on Psychological Anonymity and Response Behavior: A Field Experiment

Journal of Business and Psychology

Abstract

Purpose

In the present study, we investigated the effects of online survey implementation strategies on perceived anonymity and employee response behavior in organizational surveys.

Design/Methodology/Approach

A field experiment (N = 815) compared two commonly used online survey implementation strategies. One group of employees received a personalized invitation to the survey and a log-in password, while the other group received a generic invitation and was not required to enter a password.
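The two conditions contrast a personalized, password-protected invitation with a generic open link. As an illustration only (the paper does not describe its tooling, and the function and field names below are hypothetical), the random assignment of employees to the two conditions could be sketched as:

```python
import random
import secrets

def assign_conditions(employee_emails, seed=42):
    """Randomly split employees into the two implementation conditions.

    Personalized condition: named invitation plus a unique log-in password.
    Generic condition: general invitation, no password required.
    """
    rng = random.Random(seed)  # fixed seed makes the split reproducible
    shuffled = list(employee_emails)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2

    invitations = []
    for i, email in enumerate(shuffled):
        personalized = i < half
        invitations.append({
            "email": email,
            "condition": "personalized" if personalized else "generic",
            # Only the personalized condition gets an individual password.
            "password": secrets.token_urlsafe(8) if personalized else None,
        })
    return invitations
```

This is a minimal sketch of the experimental contrast, not the authors' procedure; in practice the personalized invitations would also address each employee by name.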

Findings

The results showed that the applied implementation strategies had no substantial effects on perceived anonymity. Moreover, there were no significant effects on nonresponse or on participants' responses to closed-ended and open-ended survey questions.

Implications

The present study proposes that online surveys are not a uniform phenomenon and that differences in how they are implemented need to be considered. However, the findings indicate that the use of personalized implementation strategies does not necessarily lead to a substantial decrease in perceived anonymity or automatically result in reduced data quality. Thus, in many cases, the investigated implementation strategies are unlikely to cause serious reductions in perceived anonymity or in the quality of responses to organizational online surveys.

Originality/Value

Despite the frequent use of online surveys in organizations, little is known about how online implementation strategies affect perceptions of anonymity and response behavior. This study is one of the few empirical examinations of the psychological consequences of different implementation strategies frequently used in organizational surveying.

Fig. 1



Corresponding author

Correspondence to Karsten Mueller.


Cite this article

Mueller, K., Straatmann, T., Hattrup, K. et al. Effects of Personalized Versus Generic Implementation of an Intra-Organizational Online Survey on Psychological Anonymity and Response Behavior: A Field Experiment. J Bus Psychol 29, 169–181 (2014). https://doi.org/10.1007/s10869-012-9262-9
