Use of Email Paradata in a Survey of Sustainability Culture
Survey data collection is often used to study and gauge public perceptions of sustainability, and it can consume considerable resources. It seems natural that collecting data about sustainability, and the culture of sustainability, should itself be done sustainably, optimizing the use of available resources. This paper investigates respondent engagement with web survey email invitations, which matters because less sustainable contact methods are often used in follow-up to raise response rates. The paper uses data from the Sustainability Cultural Indicators Program (SCIP) survey at the University of Michigan (U-M). During the 2014 and 2015 data collections, email paradata were used to understand sample members’ engagement with the emails asking them to complete a survey. Engagement is determined by combining email paradata with paradata from the survey about access and completion. Low engagement may mean the email was never received (e.g., spam filtering or a bad email address) or never opened. High engagement with low survey access (and completion) may mean that other attributes of the design (e.g., survey length, survey topic, incentive, or how the data will be used) affect the decision to participate and may need to be addressed by researchers. The data also show when emails are opened, allowing survey practitioners to target optimal times for contact attempts in order to gain cooperation. Three engagement analyses were conducted. The first examines the open rate for each email type (prenotification, invitation, reminder 1, etc.). The second examines the elapsed time (lag) between the sending and opening of each email type. The third examines the optimal day to send an email invitation to elicit response. This information can inform future survey design decisions and provide insight into nonresponse.
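The first two analyses above can be illustrated with a minimal sketch. The record layout, field names, and values below are hypothetical and not drawn from the SCIP dataset; the sketch only shows how an open rate per email type and a send-to-open lag could be computed from email paradata of this general shape.

```python
from datetime import datetime
from statistics import median

# Hypothetical email paradata: one record per email sent, with the email
# type, the send timestamp, and the open timestamp (None if never opened).
records = [
    {"type": "invitation", "sent": "2014-09-08 09:00", "opened": "2014-09-08 09:45"},
    {"type": "invitation", "sent": "2014-09-08 09:00", "opened": None},
    {"type": "reminder1",  "sent": "2014-09-15 09:00", "opened": "2014-09-16 10:30"},
    {"type": "reminder1",  "sent": "2014-09-15 09:00", "opened": "2014-09-15 11:00"},
]

FMT = "%Y-%m-%d %H:%M"

def open_rate(records, email_type):
    """Share of emails of a given type that were ever opened."""
    sent = [r for r in records if r["type"] == email_type]
    opened = [r for r in sent if r["opened"] is not None]
    return len(opened) / len(sent)

def median_lag_hours(records, email_type):
    """Median hours between sending and opening, among opened emails."""
    lags = [
        (datetime.strptime(r["opened"], FMT)
         - datetime.strptime(r["sent"], FMT)).total_seconds() / 3600
        for r in records
        if r["type"] == email_type and r["opened"] is not None
    ]
    return median(lags)

print(open_rate(records, "invitation"))       # 0.5
print(median_lag_hours(records, "reminder1"))  # 13.75
```

Grouping the same lag computation by the weekday of the send timestamp would give a simple starting point for the third analysis (optimal send day).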
Keywords: Email paradata · Sustainability culture · Web survey
The authors thank the Office of the Provost at the University of Michigan for funding the SCIP. Thanks also go to John Callewaert and Robert Marans for allowing the research to be conducted on the SCIP and to the reviewers for their useful suggestions.