
Investigating the validity of web-enabled mechanistic case diagramming scores to assess students’ integration of foundational and clinical sciences

  • Kristi J. Ferguson
  • Clarence D. Kreiter
  • Ellen Franklin
  • Thomas H. Haugen
  • Fred R. Dee

Abstract

As medical schools have changed their curricula to address foundational and clinical sciences in a more integrated fashion, teaching methods such as concept mapping have been incorporated into small-group learning settings. Methods for assessing students' ability to apply such integrated knowledge, however, are less well developed. The purpose of this project was to assess the validity of scores on a focused version of concept maps called mechanistic case diagrams (MCDs), which are hypothesized to enhance existing tools for assessing integrated knowledge that supports clinical reasoning. The data were drawn from the medical school graduating class of 2018 (N = 136 students). In 2014–2015, we implemented a total of 16 case diagrams in case analysis groups within the Mechanisms of Health and Disease (MOHD) strand of the pre-clinical curriculum. These cases were based on topics taught in the MOHD lectures and small-group sessions. We created an overall score across all 16 cases for each student and correlated these scores with performance in the preclinical curriculum [overall performance in the MOHD integrated foundational basic science courses and in the Clinical and Professional Skills (CAPS) courses] and with United States Medical Licensing Examination (USMLE) scores [Step 1, taken following the core clerkships, and Step 2 Clinical Knowledge (CK), taken at the beginning of the fourth year of medical school]. MCD scores correlated with students' overall basic science scores (r = .46, p = .0002) and their overall performance in the CAPS courses (r = .49, p < .0001). They also correlated significantly with the standardized examination measures, including USMLE Step 1 (r = .33, p < .0001) and USMLE Step 2 CK (r = .39, p < .0001). These results provide preliminary validity evidence that MCDs may be useful in identifying students who have difficulty integrating foundational and clinical sciences.
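The validity analysis described above is a straightforward correlational design: one overall MCD score per student, correlated with each curricular and licensing-exam outcome using Pearson's r. The sketch below shows one way such an analysis could be run. It is a minimal illustration in Python, not the authors' actual pipeline; the column names (mcd_total, mohd_score, caps_score, step1, step2ck) and input file are hypothetical.

    import pandas as pd
    from scipy import stats

    # One row per student: overall MCD score across the 16 case diagrams
    # plus each outcome measure. Column names are illustrative only.
    df = pd.read_csv("mcd_outcomes.csv")

    outcomes = {
        "mohd_score": "MOHD basic science courses",
        "caps_score": "Clinical and Professional Skills courses",
        "step1": "USMLE Step 1",
        "step2ck": "USMLE Step 2 CK",
    }
    for col, label in outcomes.items():
        # Pearson product-moment correlation and its two-tailed p value
        r, p = stats.pearsonr(df["mcd_total"], df[col])
        print(f"MCD total vs {label}: r = {r:.2f}, p = {p:.4f}")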

Keywords

Basic science education · Integration · Learner assessment · Mechanistic case diagrams · Validity

Notes

Acknowledgements

This project was funded by the Stemmler Fund of the National Board of Medical Examiners. Development of the software was funded by grants from the University of Iowa’s Innovations in Teaching with Technology Fund and from the Office of Consultation and Research in Medical Education’s Educational Development Fund. Approval for this project was obtained from the University of Iowa Carver College of Medicine Umbrella IRB on June 21, 2016 (Project ID # 201509).

Compliance with ethical standards

Conflict of interest

All authors declare that they have no conflict of interest.


Copyright information

© Springer Nature B.V. 2019

Authors and Affiliations

  1. General Internal Medicine (Emeritus), Department of Internal Medicine, University of Iowa Carver College of Medicine, Iowa City, USA
  2. Department of Family Medicine, and Consultant, Office of Consultation and Research in Medical Education, University of Iowa Carver College of Medicine, Iowa City, USA
  3. Office of Student Affairs and Curriculum, University of Iowa Carver College of Medicine, Iowa City, USA
  4. Department of Pathology, Pathology and Laboratory Service, Veterans Administration Medical Center, Iowa City, USA
  5. University of Iowa Carver College of Medicine, Iowa City, USA
  6. Department of Pathology (Emeritus), University of Iowa Carver College of Medicine, Iowa City, USA
