Metacognition and Learning, Volume 14, Issue 2, pp 215–228

Accuracy in judgments of study time predicts academic success in an engineering course

  • Justin G. Gyllen
  • Thomas F. Stahovich (corresponding author)
  • Richard E. Mayer
  • Amirali Darvishzadeh
  • Negin Entezari

Abstract

The present work examines the accuracy of college students' self-reports of study time. In a 10-week mechanical engineering course, 99 students accessed their textbook, homework solutions, graded work, and lecture slides via custom software that recorded objective measures of reading time. In addition, the students provided subjective judgments of the time they spent reading these materials. Comparisons between the objective and subjective measures reveal that students significantly overestimated the time they spent with the textbook, homework solutions, graded work, and lecture slides, with higher-performing students overestimating to a lesser degree. The difference between subjective and objective measures of study time correlated significantly and negatively with final course grade for the textbook (r = −.31), homework solutions (r = −.39), and lecture slides (r = −.24), but not for graded work (r = −.05). These findings call into question the utility of self-report data in studies of student study habits and showcase the value of objective, technology-based measures of those habits.
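
The analysis described above is straightforward to sketch: for each student, compute a miscalibration score (self-reported minus logged study time) for a given material and correlate it with final course grade. Below is a minimal illustrative sketch in Python; the variable names and placeholder values are ours, not the authors' code or data.

    # Minimal sketch of the correlation analysis described in the abstract.
    # Miscalibration = self-reported study time minus software-logged time;
    # the paper reports that this difference correlates negatively with
    # final grade (e.g., r = -.31 for the textbook). All values below are
    # illustrative placeholders, not the study's data.
    import numpy as np
    from scipy.stats import pearsonr

    judged_hours = np.array([12.0, 8.5, 20.0, 15.0, 6.0, 10.0])   # self-reports
    logged_hours = np.array([7.5, 7.0, 11.0, 9.0, 5.5, 8.0])      # software logs
    final_grade = np.array([78.0, 91.0, 62.0, 70.0, 88.0, 83.0])  # course grade (%)

    # Positive values indicate overestimation of study time.
    overestimation = judged_hours - logged_hours

    # Pearson correlation between overestimation and final grade.
    r, p = pearsonr(overestimation, final_grade)
    print(f"r = {r:.2f}, p = {p:.3f}")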

Keywords

Study strategies · Evaluation methodologies · Interactive learning environments · Learning management systems

Notes

Funding information

This project was supported by the National Science Foundation under Award Numbers 0935239, 1432820, and 1612511.

Compliance with ethical standards

The authors declare that they have no conflicts of interest.


Copyright information

© Springer Science+Business Media, LLC, part of Springer Nature 2019

Authors and Affiliations

  • Justin G. Gyllen (1)
  • Thomas F. Stahovich (1), corresponding author
  • Richard E. Mayer (2)
  • Amirali Darvishzadeh (3)
  • Negin Entezari (3)

  1. Department of Mechanical Engineering, Bourns College of Engineering, University of California, Riverside, USA
  2. Department of Psychological and Brain Sciences, University of California, Santa Barbara, USA
  3. Department of Computer Science and Engineering, Bourns College of Engineering, University of California, Riverside, USA
