Use of Email Paradata in a Survey of Sustainability Culture

  • Andrew L. Hupp
  • Heather M. Schroeder
  • Andrew D. Piskorowski
Chapter
Part of the World Sustainability Series book series (WSUSE)

Abstract

Survey data collection is often used to study or gauge public perceptions of sustainability, and it can consume considerable resources. It seems natural that collecting data about sustainability and the culture of sustainability should itself be done sustainably, optimizing the use of available resources. This paper investigates respondent engagement with web survey email invitations, which matters because less sustainable contact methods are often used in follow-up to raise response rates. It draws on data from the Sustainability Cultural Indicators Program (SCIP) survey at the University of Michigan (U-M). During the 2014 and 2015 data collections, email paradata were used to understand sample members’ engagement with the emails asking them to complete the survey. Engagement is determined by combining email paradata with survey paradata on access and completion. Low engagement may mean a sample member never received the email (e.g., it was flagged as spam or sent to a bad address) or never opened it. High engagement combined with low survey access (and completion) may mean that other attributes of the design (e.g., survey length, survey topic, incentive, how the data will be used) are affecting the decision to participate and may need to be addressed. The data also provide insight into when emails are opened, allowing survey practitioners to target optimal times to attempt contact and gain cooperation. Three engagement analyses were conducted: the first examines the open rate for each email type (prenotification, invitation, reminder 1, etc.); the second examines the elapsed time (lag) between sending and opening each email type; the third examines the optimal day to send an email invitation to elicit response. This information can inform future survey design decisions and provide insight into nonresponse.
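
A minimal sketch of how these three analyses might be computed from email paradata, assuming a hypothetical file email_paradata.csv with one row per email sent and columns email_type, sent_at, and opened_at (an illustration under those assumptions, not the authors' actual pipeline):

    import pandas as pd

    # Hypothetical paradata: one row per email sent to a sample member.
    # opened_at is the timestamp of the first open, missing if never opened.
    emails = pd.read_csv("email_paradata.csv", parse_dates=["sent_at", "opened_at"])

    # Analysis 1: open rate by email type (prenotification, invitation, reminder 1, ...).
    open_rate = emails.groupby("email_type")["opened_at"].apply(lambda s: s.notna().mean())

    # Analysis 2: median lag (hours) between sending and first open, opened emails only.
    opened = emails.dropna(subset=["opened_at"])
    lag_hours = (opened["opened_at"] - opened["sent_at"]).dt.total_seconds() / 3600
    lag_by_type = lag_hours.groupby(opened["email_type"]).median()

    # Analysis 3: day of week on which invitations are opened,
    # a rough proxy for the optimal day to attempt contact.
    invites = opened[opened["email_type"] == "invitation"]
    opens_by_day = invites["opened_at"].dt.day_name().value_counts()

    print(open_rate, lag_by_type, opens_by_day, sep="\n\n")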

Keywords

Email · Paradata · Sustainability culture · Web survey

Acknowledgements

The authors thank the Office of the Provost at the University of Michigan for funding the SCIP. Thanks also go to John Callewaert and Robert Marans for allowing the research to be conducted on the SCIP and to the reviewers for their useful suggestions.

Copyright information

© Springer International Publishing AG 2018

Authors and Affiliations

  • Andrew L. Hupp¹
  • Heather M. Schroeder¹
  • Andrew D. Piskorowski¹
  1. Survey Research Center, University of Michigan, Ann Arbor, USA
