Abstract
When you are trying something completely new, like an interprofessional medical home training clinic, how do you know whether it works? This question has different answers depending on the audience: the funder, considering return on investment; the institution, deciding whether to continue the program after the grant expires; the trainees, judging whether participation "works" for them; or the faculty, wanting to know "how" and "why" it works. Each local CoE site faced these tensions in trainee assessment and local program evaluation, and the CoE as a whole faced them in enterprise-wide evaluation. Our site has learned much along the way about the differences between program evaluation and trainee assessment, and about the differing expectations for simple, complicated, and complex adaptive system evaluations. In this chapter, we share lessons learned that can guide your assessment and evaluation plans.
© 2015 Springer International Publishing Switzerland
Smith, C.S., Gerrish, W.G., Weppner, W.G. (2015). Implications for Evaluation. In: Interprofessional Education in Patient-Centered Medical Homes. Springer, Cham. https://doi.org/10.1007/978-3-319-20158-0_7
Print ISBN: 978-3-319-20157-3
Online ISBN: 978-3-319-20158-0