
Measuring Coaching in Undergraduate Medical Education: The Development and Psychometric Validation of New Instruments

  • Patricia A. Carney
  • Erin M. Bonura
  • Jeff A. Kraakevik
  • Amy Miller Juve
  • Leslie E. Kahl
  • Nicole M. Deiorio

Abstract

Background

Coaching is emerging as a novel approach to guide medical students toward becoming competent, reflective physicians and master adaptive learners. However, no instruments currently exist to measure academic coaching at the undergraduate medical education level.

Objective

To describe the development and psychometric assessment of two instruments designed to measure academic coaching of medical students, with the goal of creating a robust measurement model for this educational paradigm.

Design

Observational psychometric study.

Participants

All medical students in the 2014 and 2015 cohorts and all of their coaches were invited to complete the instruments being tested; analyses included 662 student responses from 292 medical students and 468 coaching responses from 22 coaches. Medical student response rates were 75.7% for the 2014 cohort and 75.5% for the 2015 cohort, and the overall coach response rate was 71%.

Main Measures

Two 31-item instruments were initially developed, one for medical students to assess their coach and one for faculty coaches to assess their students; both evaluated coaching against definitions we formulated from the existing literature. Each instrument was administered to two cohorts of medical students and coaches in 2015 and 2016. An exploratory factor analysis was conducted using principal component analysis as the extraction method and Varimax with Kaiser normalization as the rotation method.
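
The abstract does not include analysis code; the following is a minimal Python sketch of the analytic steps named above (principal component extraction, eigenvalue-based retention, and Varimax rotation with Kaiser normalization). The response matrix, variable names, and choice of libraries are illustrative assumptions, not the authors' implementation or the study data.

```python
# Minimal sketch: principal component extraction, Kaiser-criterion retention
# (eigenvalue > 1.0), and Varimax rotation with Kaiser normalization.
# The response matrix below is simulated, NOT the study data.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler


def varimax(loadings, normalize=True, max_iter=500, tol=1e-6):
    """Varimax rotation; rows are items, columns are components."""
    L = loadings.copy()
    if normalize:  # Kaiser normalization: scale item rows to unit communality
        comm = np.sqrt((L ** 2).sum(axis=1))
        L = L / comm[:, None]
    n_items, k = L.shape
    R = np.eye(k)
    score = 0.0
    for _ in range(max_iter):
        Lr = L @ R
        u, s, vt = np.linalg.svd(
            L.T @ (Lr ** 3 - Lr @ np.diag((Lr ** 2).sum(axis=0)) / n_items)
        )
        R = u @ vt
        new_score = s.sum()
        if new_score < score * (1 + tol):  # converged
            break
        score = new_score
    L = L @ R
    if normalize:  # undo the Kaiser normalization
        L = L * comm[:, None]
    return L


# Simulated 31-item Likert responses (1-5) from 292 hypothetical students.
rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(292, 31)).astype(float)

X = StandardScaler().fit_transform(responses)   # standardize the items
pca = PCA().fit(X)
eigenvalues = pca.explained_variance_

n_keep = int((eigenvalues > 1.0).sum())         # Kaiser criterion
# Unrotated loadings: eigenvectors scaled by the square root of eigenvalues.
loadings = pca.components_[:n_keep].T * np.sqrt(eigenvalues[:n_keep])
rotated_loadings = varimax(loadings)            # items x retained components
```

In practice the rotated loadings would then be inspected to group items into interpretable domains; the domains the study actually identified are reported under Key Results.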

Key Results

For medical students' assessment of coaching, eighteen items reflecting four domains with eigenvalues higher than 1.0 were retained: Promoting Self-Monitoring, Relationship Building, Promoting Reflective Behavior, and Establishing Foundational Ground Rules. For the faculty assessment of coaching, sixteen items reflecting two domains with eigenvalues higher than 1.0 were retained: the Practice of Coaching and Relationship Formation.
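
As a follow-on to the sketch under Main Measures, the fragment below illustrates one common convention for grouping retained items into domains from a rotated loading matrix: each item is assigned to the component on which it loads most strongly, subject to a minimum loading. The 0.40 cutoff and the assignment rule are illustrative assumptions; the abstract does not state the authors' retention rule beyond the eigenvalue criterion.

```python
# Continues the previous sketch and reuses its `rotated_loadings` array.
import numpy as np

THRESHOLD = 0.40  # illustrative cutoff, not taken from the paper

abs_loadings = np.abs(rotated_loadings)
best_component = abs_loadings.argmax(axis=1)          # strongest component per item
retained = abs_loadings.max(axis=1) >= THRESHOLD      # items meeting the cutoff

domains = {c: np.where(retained & (best_component == c))[0].tolist()
           for c in range(rotated_loadings.shape[1])}
print(f"{retained.sum()} items retained across {len(domains)} components")
```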

Conclusions

We successfully developed and psychometrically validated surveys designed to measure key aspects of the coaching relationship, coaching processes, and reflective outcomes. The new validated instruments offer a robust measurement model for academic coaching.

KEY WORDS

undergraduate medical education; academic coaching

Notes

Acknowledgments

The authors gratefully acknowledge Patrick Chung, BS, and Elaine Waller, BS, for their assistance with data capture and management.

Funding Information

This work was supported by the Dean’s Office at Oregon Health & Science University, an Accelerating Change in Medical Education Grant from the American Medical Association, and the Research Program in Family Medicine at Oregon Health & Science University.

Compliance with Ethical Standards

All research activities related to this study were reviewed and approved by OHSU’s Institutional Review Board (IRB no. 10873).

Conflict of Interest

The authors declare that they do not have a conflict of interest.

Supplementary material

ESM 1 (DOCX 35 kb)


Copyright information

© Society of General Internal Medicine 2019

Authors and Affiliations

  • Patricia A. Carney
    • 1
    Email author
  • Erin M. Bonura
    • 1
  • Jeff A. Kraakevik
    • 1
  • Amy Miller Juve
    • 1
  • Leslie E. Kahl
    • 1
  • Nicole M. Deiorio
    • 1
  1. Oregon Health & Science University, Portland, OR, USA
