
Talking to People I: Surveys

  • Uwe Hasebrink
  • Sascha Hölig
Chapter

Abstract

Surveys consist of standardized interviews with larger samples of individuals, conducted in order to make inferences about a specific population. They are among the most popular methods in the social sciences; this is particularly true for research in different areas of policy, where surveys, initiated by all kinds of actors involved in policy-making, have become an integral element of the whole policy cycle. With regard to media policy research, surveys can provide data on patterns of media use and on opinions about specific media policy issues. To achieve meaningful results, researchers who consider applying surveys have to reflect on their main characteristics: surveys are based on self-reports, they are reactive methods, and they rely on standardized measurements. Designing a survey involves the following steps: defining the relevant population, sampling, deciding on specific comparative designs, selecting a mode of interviewing, designing the questionnaire, and analyzing and presenting the data. As an illustration of surveys that are particularly relevant for media policy research, two international studies are briefly presented: the Reuters Institute Digital News Report on current trends in news consumption, and the EU Kids Online survey on children’s and young people’s online experiences and online safety.
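The sampling and estimation steps named above can be illustrated with a minimal sketch. The code below draws a simple random sample from a hypothetical population frame and computes a proportion estimate with a 95% confidence interval; the population size, sample size, and simulated response rate are illustrative assumptions, not figures from the chapter.

```python
import math
import random

# Hypothetical sampling frame: IDs for a population of 100,000 individuals.
population = list(range(100_000))

random.seed(42)  # fixed seed so the sketch is reproducible
n = 1_000
# Simple random sample without replacement from the frame.
sample = random.sample(population, n)

# Simulated binary responses (e.g. "uses online news daily"),
# with an assumed true rate of 0.6 for illustration only.
responses = [1 if random.random() < 0.6 else 0 for _ in sample]

p_hat = sum(responses) / n               # sample proportion
se = math.sqrt(p_hat * (1 - p_hat) / n)  # standard error of the proportion
margin = 1.96 * se                       # 95% confidence half-width
print(f"estimate: {p_hat:.3f} +/- {margin:.3f}")
```

In practice, survey estimates also involve weighting and design effects for non-simple samples; the sketch only shows the textbook case of simple random sampling.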

References

  1. American Association for Public Opinion Research (AAPOR). (2016). Standard definitions: Final dispositions of case codes and outcome rates for surveys (9th ed.). Lenexa, KS: AAPOR. Retrieved July 25, 2018, from https://www.aapor.org/Standards-Ethics/Standard-Definitions-(1).aspx.
  2. Banducci, S., & Stevens, D. (2015). Surveys in context. Public Opinion Quarterly, 79, 214–243.
  3. Boehm, M., Bowman, D., & Zinn, J. O. (2013). Survey research and the production of evidence for social policy. Social Policy & Society, 12(2), 309–318.
  4. Borgman, C. L. (2015). Big data, little data, no data: Scholarship in the networked world. Cambridge, MA: MIT Press.
  5. Carpenter, S. (2018). Ten steps in scale development and reporting: A guide for researchers. Communication Methods and Measures, 12(1), 25–44. https://doi.org/10.1080/19312458.2017.1396583.
  6. Elmelund‐Præstekær, C., Hopmann, D. N., & Pedersen, R. T. (2017). Survey methods, traditional, public opinion polling. In J. Matthes, C. S. Davis, & R. F. Potter (Eds.), The international encyclopedia of communication research methods. Wiley. https://doi.org/10.1002/9781118901731.iecrm0245.
  7. Erba, J., Ternes, B., Bobkowski, P., Logan, T., & Liu, Y. (2018). Sampling methods and sample populations in quantitative mass communication research studies: A 15-year census of six journals. Communication Research Reports, 35(1), 42–47.
  8. Esser, F., & Hanitzsch, T. (2012). On the why and how of comparative inquiry in communication studies. In F. Esser & T. Hanitzsch (Eds.), The handbook of comparative communication research (pp. 3–22). New York, NY: Routledge.
  9. Eveland, W. P., Jr., Hutchens, M. J., & Shen, F. (2009). Exposure, attention, or “use” of news? Assessing aspects of the reliability and validity of a central concept in political communication research. Communication Methods and Measures, 3(4), 223–244. https://doi.org/10.1080/19312450903378925.
  10. Fink, A. (2013). How to conduct surveys: A step-by-step guide (5th ed.). Thousand Oaks, CA: Sage.
  11. Fowler, F. J. (2009). Survey research methods (4th ed.). London: Sage.
  12. Hasebrink, U. (2011). Giving the audience a voice: The role of research in making media regulation more responsive to the needs of the audience. Journal of Information Policy, 1, 321–336.
  13. Hasebrink, U., Görzig, A., Haddon, L., Kalmus, V., & Livingstone, S. (2011). Patterns of risk and safety online: In-depth analyses from the EU Kids Online survey of 9–16 year olds and their parents in 25 countries. London: LSE; EU Kids Online. http://eprints.lse.ac.uk/39356/.
  14. Hasebrink, U., Livingstone, S., Haddon, L., & Ólafsson, K. (2009). Comparing children’s online opportunities and risks across Europe: Cross-national comparisons for EU Kids Online (2nd ed.). London: EU Kids Online. http://eprints.lse.ac.uk/24368/.
  15. Hasebrink, U., & Lobe, B. (2013). The cultural context of risk: On the role of intercultural differences for safer Internet issues. In B. O’Neill, E. Staksrud, & S. McLaughlin (Eds.), Towards a better Internet for children? Policy pillars, players and paradoxes (pp. 283–299). Göteborg: Nordicom.
  16. Hastak, M., Mazis, M. B., & Morris, L. A. (2001). The role of consumer surveys in public policy decision making. Journal of Public Policy & Marketing, 20(2), 170–185.
  17. Henry, T. G. (1990). Practical sampling. Newbury Park: Sage.
  18. Hepp, A., Breiter, A., & Friemel, T. (2018). Digital traces in context: An introduction. International Journal of Communication, 12, 439–449.
  19. Herbst, S. (1993). Numbered voices: How opinion polling has shaped American politics. Chicago: University of Chicago Press.
  20. Hocevar, K. P., & Flanagin, A. J. (2017). Online research methods, quantitative. In J. Matthes, C. S. Davis, & R. F. Potter (Eds.), The international encyclopedia of communication research methods. Wiley. https://doi.org/10.1002/9781118901731.iecrm0174.
  21. Holcomb, J., & Spalsbury, A. (2005). Teaching students to use summary statistics and graphics to clean and analyze data. Journal of Statistics Education, 13(3). https://doi.org/10.1080/10691898.2005.11910567.
  22. Hooghe, M., Stolle, D., Mahéo, V. A., & Vissers, S. (2010). Why can’t a student be more like an average person? Sampling and effects in social science field and laboratory experiments. The Annals of the American Academy of Political and Social Science, 628(1), 85–96. https://doi.org/10.1177/0002716209351516.
  23. Hooker, C. M., & de Zúniga, H. G. (2017). Survey methods, online. In J. Matthes, C. S. Davis, & R. F. Potter (Eds.), The international encyclopedia of communication research methods. Wiley. https://doi.org/10.1002/9781118901731.iecrm0244.
  24. Lavrakas, P. J. (2008). Encyclopedia of survey research methods (Vol. 2). London: Sage.
  25. Livingstone, S., Haddon, L., Görzig, A., & Ólafsson, K. (2011). Risks and safety on the Internet: The perspective of European children. London: EU Kids Online. http://eprints.lse.ac.uk/33731/.
  26. Livingstone, S., Ólafsson, K., O’Neill, B., & Donoso, V. (2012). Towards a better Internet for children. London: EU Kids Online.
  27. Livingstone, S., Ólafsson, K., & Staksrud, E. (2011). Social networking, age and privacy. London: EU Kids Online. http://eprints.lse.ac.uk/35849/.
  28. Newman, N., Fletcher, R., Kalogeropoulos, A., Levy, D. A. L., & Nielsen, R. K. (2017). Reuters Institute digital news report 2017. Oxford: Reuters Institute for the Study of Journalism.
  29. O’Neill, B., Staksrud, E., & McLaughlin, S. (Eds.). (2013). Towards a better Internet for children? Policy pillars, players and paradoxes. Göteborg: Nordicom.
  30. Peterson, R. A. (2000). Constructing effective questionnaires. Thousand Oaks, CA: Sage. https://doi.org/10.4135/9781483349022.
  31. Rubin, R. B., Palmgreen, P., & Sypher, H. E. (Eds.). (2009). Communication research measures II: A sourcebook. New York: Routledge.
  32. Ryan, K., Gannon-Slater, N., & Culbertson, M. J. (2012). Improving survey methods with cognitive interviews in small- and medium-scale evaluations. American Journal of Evaluation, 33(3), 414–430.
  33. Sala, E., & Lillini, R. (2015). Undercoverage bias in telephone surveys in Europe: The Italian case. International Journal of Public Opinion Research, 29(1), 133–156.
  34. Scherpenzeel, A. C., & Bethlehem, J. G. (2011). How representative are online panels? Problems of coverage and selection and possible solutions. In M. Das, P. Ester, & L. Kaczmirek (Eds.), Social and behavioral research and the Internet: Advances in applied methods and research strategies (pp. 105–132). New York: Routledge.
  35. Smyth, J. D., & Pearson, J. E. (2011). Internet survey methods: A review of strengths, weaknesses, and innovations. In M. Das, P. Ester, & L. Kaczmirek (Eds.), Social and behavioral research and the Internet: Advances in applied methods and research strategies (pp. 11–44). New York: Routledge.

Further Reading

  1. Elmelund‐Præstekær, C., Hopmann, D. N., & Pedersen, R. T. (2017). Survey methods, traditional, public opinion polling. In J. Matthes, C. S. Davis, & R. F. Potter (Eds.), The international encyclopedia of communication research methods. Wiley. https://doi.org/10.1002/9781118901731.iecrm0245.
  2. Fink, A. (2013). How to conduct surveys: A step-by-step guide (5th ed.). Thousand Oaks, CA: Sage.
  3. Fowler, F. J. (2008). Survey research methods. London: Sage.
  4. Lavrakas, P. J. (2008). Encyclopedia of survey research methods (Vol. 2). London: Sage.
  5. Smyth, J. D., & Pearson, J. E. (2011). Internet survey methods: A review of strengths, weaknesses, and innovations. In M. Das, P. Ester, & L. Kaczmirek (Eds.), Social and behavioral research and the Internet: Advances in applied methods and research strategies (pp. 11–44). New York: Routledge.

Copyright information

© The Author(s) 2019

Authors and Affiliations

  • Uwe Hasebrink (1)
  • Sascha Hölig (1)
  1. Leibniz Institute for Media Research | Hans-Bredow-Institut, Hamburg, Germany
