Practice makes proficient: teaching undergraduate students to understand published research
The ability to critically evaluate and comprehend empirical articles is a key scientific skill valued by most undergraduate institutions for students in the sciences. Students often find it difficult not only to summarize empirical journal articles, but also to grasp the quality and rigor of the investigation behind a source. In this paper, we use instructional scaffolds (reading worksheets, or RWs, with tutorials) to help students comprehend, and ultimately transfer, the skills necessary for critically evaluating primary research sources. We assess students’ learning of these skills with a multiple-choice assessment of Journal Article Comprehension (JAC). Students in experimental classes, who received the instructional scaffolds, improved on the JAC post-test compared with students in control classes. This result shows that students acquired fundamental research skills, such as understanding the components of research articles. We also found that the experimental classes’ improvement on the JAC post-test extended to a written summary test. This result suggests that students in the experimental group developed discipline-specific science process skills that allowed them to apply JAC skills to a near-transfer task of writing a summary.
Keywords: Reading empirical articles · Instructional scaffolds · Assessment · Learning outcomes · Cognitive psychology
The authors thank Amy Shapiro and Scott Hinze for access to their cognitive psychology courses. We thank James Bradley for his assistance in creating the summary coding scheme. Judy Sims-Knight gave invaluable advice on several statistical issues, and Susan Goldman and Micki Chi provided important suggestions for best practices in establishing inter-rater reliability. We also thank several anonymous reviewers for their feedback. Preliminary results from the pilot studies were presented at the 15th Annual European Association for Research on Learning and Instruction conference (EARLI 2013, Munich, Germany) and the 4th Biennial Conference of the International Society for the Psychology of Science and Technology (ISPST, 2012, Pittsburgh, PA).