Abstract
This chapter assesses Falk and Needham’s (2011) study of the California Science Center’s long-term impact on the Los Angeles population’s scientific understanding, attitudes, and interests. The study has been put forward as a good model of long-term impact evaluation for other researchers and informal science learning institutions to emulate. Moreover, the study’s claims about the Science Center’s positive impacts have been widely cited. To contribute to methodological development in informal science learning research, we critically examine the authors’ methods and claims, identifying major challenges to the validity and reliability of the research approach. We focus on sampling, data collection, and analysis practices, including the use of self-report data, the practice of asking parents to report impacts on behalf of their children, and non-representative sampling methods. An important innovation claimed by the authors is an indicator-based impact measure (a ‘marker’) designed to limit their reliance on self-report data. Our essay highlights this measure’s limitations, while pointing to alternative approaches that could more validly assess long-term learning or attitudinal impacts. We also outline directions for improving the statistical analysis and its interpretation. Ultimately, we conclude that Falk and Needham’s sanguine conclusions about the Science Center’s impacts are not empirically justified. We recommend that future research employ more direct measurements of learning outcomes grounded in established social scientific methodology to evaluate informal science learning impacts.
Sections of this chapter are reprinted with permission from Wiley & Sons. The reference for the original article is as follows:
Jensen, E., & Lister, T. (2016). Evaluating indicator-based methods of measuring long-term impacts of a science center on its community (comment). Journal of Research in Science Teaching, 53(1), 60–64.
A rejoinder for this chapter follows in Chap. 14.
References
Bureau, U. C. (2008–2012). American FactFinder. Retrieved September 10, 2014, from Selected Economic Characteristics 2008–2012: http://factfinder2.census.gov/faces/tableservices/jsf/pages/productview.xhtml?src=bkmk.
Bureau, U. C. (2000). Profile of general demographic characteristics: 2000. Retrieved September 6, 2014, from American FactFinder: http://factfinder2.census.gov/faces/tableservices/jsf/pages/productview.xhtml?src=bkmk.
Bureau, U. C. (2010). Profile of general population and housing characteristics: 2010. Retrieved September 6, 2014, from American FactFinder: http://factfinder2.census.gov/faces/tableservices/jsf/pages/productview.xhtml?src=bkmk.
Calsyn, R. J. (1992). Acquiescence in needs assessment studies of the elderly. The Gerontologist, 32(2), 246–252.
Cannell, C., Miller, P., & Oksenberg, L. (1981). Research on interviewing techniques. In S. Leinhardt (Ed.), Sociological methodology (pp. 389–437). San Francisco: Jossey-Bass.
Dawson, E., & Jensen, E. (2011). Towards a ‘contextual turn’ in visitor research: Evaluating visitor segmentation and identity-related motivations. Visitor Studies, 14(2), 127–140.
Falk, J. H., & Gillespie, K. L. (2009). Investigating the role of emotion in Science Center visitor learning. Visitor Studies, 12(2), 112–132.
Falk, J. H., & Needham, M. D. (2011). Measuring the impact of a science center on its community. Journal of Research in Science Teaching, 48(1), 1–12.
Falk, J. H., & Needham, M. D. (2013). Factors contributing to adult knowledge of science and technology. Journal of Research in Science Teaching, 50(4), 431–452.
Falk, J. H., & Needham, M. D. (2016). Utilizing indicator–based methods: Measuring the impact of a science center on its community. Journal of Research in Science Teaching, 53(1), 65–69.
Falk, J. H., & Storksdieck, M. (2005). Using the contextual model of learning to understand visitor learning from a science center exhibition. Science Education, 89, 744–778.
Hood, M. (1995). A view from ‘outside’ research on community audiences. Visitor Studies: Theory, Research and Practice, 7, 77–87.
Jensen, E. (2014a). Evaluating children’s conservation biology learning at the zoo. Conservation Biology.
Jensen, E. (2014b). The problems with science communication evaluation. Journal of Science Communication, 1, C04.
Jensen, E., Dawson, E., & Falk, J. (2011). Dialogue and synthesis: Developing consensus in visitor research methodology. Visitor Studies, 14(2), 158–161.
Jensen, E., & Lister, T. (2016). Evaluating indicator-based methods of measuring long-term impacts of a science center on its community (comment). Journal of Research in Science Teaching, 53(1), 60–64.
Krippendorff, K. (2013). Content analysis: An introduction to its methodology (3rd ed.). Thousand Oaks, CA: SAGE Publications.
Krosnick, J. A. (1999). Survey research. Annual Review of Psychology, 50, 537–567.
Lenski, G. E., & Leggett, J. C. (1960). Caste, class, and deference in the research interview. American Journal of Sociology, 65(5), 463–467.
Miller, J. D. (2001). The acquisition and retention of scientific information by American adults. In J. H. Falk (Ed.), Free-choice science education: How we learn science outside of school (pp. 93–114). New York, NY: Teachers College Press.
Miller, J. D. (2004). Public understanding of, and attitudes toward, scientific research: What we know and what we need to know. Public Understanding of Science, 13, 273–294.
Moss, A., Jensen, E., & Gusset, M. (2015). Evaluating the contribution of zoos and aquariums to Aichi Biodiversity Target 1. Conservation Biology, 29(2), 537–544.
National Science Board. (2006). Science and engineering indicators. Washington, DC: U.S. Government Printing Office.
Neuendorf, K. A. (2002). The content analysis guidebook. Thousand Oaks, CA: SAGE Publications.
St. John, M., & Perry, D. (1993). A framework for evaluation and research: Science, infrastructure, and relationships. In S. Bicknell & G. Farmelo (Eds.), Museum visitor studies in the 90s (pp. 59–66). London: Science Museum.
Tourangeau, R., Rips, L., & Rasinski, K. (2000). The psychology of survey response. Cambridge: Cambridge University Press.
Wagoner, B., & Jensen, E. (2014). Microgenetic evaluation: Studying learning in motion. In Yearbook of idiographic science: Reflexivity and change. Charlotte, NC: Information Age Publishers.
Copyright information
© 2017 Springer International Publishing AG
Cite this chapter
Jensen, E., Lister, T. (2017). The Challenges of ‘Measuring Long-Term Impacts of a Science Center on Its Community’: A Methodological Review. In: Patrick, P. (eds) Preparing Informal Science Educators. Springer, Cham. https://doi.org/10.1007/978-3-319-50398-1_13
DOI: https://doi.org/10.1007/978-3-319-50398-1_13
Publisher Name: Springer, Cham
Print ISBN: 978-3-319-50396-7
Online ISBN: 978-3-319-50398-1
eBook Packages: Education (R0)