Promoting Participation in a Culture of Sustainability Web Survey
The Sustainability Cultural Indicators Program (SCIP) at the University of Michigan is designed to measure and track the university’s progress in moving the campus community toward a culture of sustainability (Callewaert and Marans 2017). SCIP gathers these data through an annual web survey. Web surveys generally attain lower response rates than other modes of data collection, and they are also at risk of other forms of nonresponse, such as breakoffs, which occur less frequently in other modes. Breakoffs commonly happen very early in a web survey, often on the informed consent screens required by Institutional Review Boards (IRBs), before respondents reach the survey content. Many methods (prenotification, incentives, etc.) are used to try to increase participation and reduce breakoffs. This paper investigates the efficacy of two experiments designed to increase participation and reduce breakoffs in two SCIP surveys. The first experiment examines the effect of “celebrity endorsement”: in the final email reminder, nonrespondents were randomized to receive either a reminder with a link to the survey or a reminder that also contained a link to a video of a head coach from the U-M Department of Athletics encouraging them to participate. The second experiment investigates informed consent screen design: one group was presented a screen formatted as a traditional informed consent form, while the other group was presented a screen with the most important items visible and the rest of the information available through a series of accordion menus.
Keywords: Email · Paradata · Sustainability survey · Consent screen design · Breakoff · Celebrity endorsement
The authors thank the Office of the Provost at the University of Michigan for funding the SCIP. Thanks to John Callewaert and Robert Marans for allowing the research to be conducted on the SCIP and to the reviewers for their useful suggestions.
- Callegaro, M. (2010). Do you know which device your respondent has used to take your online survey? Survey Practice, 3(6), 1–12.
- Callegaro, M. (2013). Paradata in web surveys. In F. Kreuter (Ed.), Improving surveys with paradata: Analytic uses of process information (pp. 259–279). Hoboken: Wiley.
- Callegaro, M., Lozar Manfreda, K., & Vehovar, V. (2015). Web survey methodology. Thousand Oaks: Sage.
- Callewaert, J., & Marans, R. W. (2017). Measuring progress over time: The sustainability cultural indicators program at the University of Michigan. In W. Leal Filho (Ed.), Handbook of theory and practice of sustainable development in higher education (Vol. 2). New York: Springer.
- Cook, C., Heath, F., & Thompson, R. L. (2000). A meta-analysis of response rates in web- or internet-based surveys. Educational and Psychological Measurement, 60(6), 821–836.
- Couper, M. P. (1998). Measuring survey quality in a CASIC environment. In Proceedings of the survey research methods section, American Statistical Association (pp. 41–49).
- Crawford, S. D., McCabe, S. E., Saltz, B., Boyd, C. J., Freisthler, B., & Paschall, M. J. (2004). Gaining respondent cooperation in college web-based alcohol surveys: Findings from experiments at two universities. Paper presented at the 59th annual conference of the American Association for Public Opinion Research, Phoenix, AZ, May.
- Dillman, D. A., Smyth, J. D., & Christian, L. M. (2014). Internet, phone, mail, and mixed-mode surveys: The tailored design method (4th ed.). Hoboken: Wiley.
- Harmon, M. A., Westin, E. C., & Levin, K. Y. (2005). Does type of pre-notification affect web survey response rates? Paper presented at the 60th annual conference of the American Association for Public Opinion Research, Miami Beach, FL, May.
- Holland, L., Couper, M. P., & Schroeder, H. (2014). Pre-notification strategies for mixed-mode data collection. Paper presented at the 69th annual conference of the American Association for Public Opinion Research, Anaheim, CA, May.
- Lozar Manfreda, K., Bosnjak, M., Berzelak, J., Haas, I., & Vehovar, V. (2008). Web surveys versus other survey modes: A meta-analysis comparing response rates. International Journal of Market Research, 50(1), 79–104.
- Pew Research Center. (2015, April). The smartphone difference. http://www.pewinternet.org/2015/04/01/us-smartphone-use-in-2015/.
- The American Association for Public Opinion Research. (2016). Standard definitions: Final dispositions of case codes and outcome rates for surveys (9th ed.). AAPOR.
- Tuten, T. L. (2005). Do reminders encourage response or affect response behaviors? Reminders in web-based surveys. Paper presented at the ESF workshop on internet survey methodology, Dubrovnik, Croatia, September.
- Vehovar, V., Batagelj, Z., Lozar Manfreda, K., & Zaletel, M. (2002). Nonresponse in web surveys. In R. M. Groves, D. A. Dillman, J. L. Eltinge, & R. J. A. Little (Eds.), Survey nonresponse (pp. 229–242). New York: Wiley.