Low-Value Care and Clinician Engagement in a Large Medicare Shared Savings Program ACO: a Survey of Frontline Clinicians



Background

Although the Medicare Shared Savings Program (MSSP) created new incentives for organizations to improve healthcare value, Accountable Care Organizations (ACOs) have achieved only modest reductions in the use of low-value care.


Objective

To assess ACO engagement of clinicians and whether engagement was associated with clinicians’ reported difficulty implementing recommendations against low-value care.


Design

Cross-sectional survey of ACO clinicians in 2018.


Participants

1289 clinicians in the Physician Organization of Michigan ACO, including generalist physicians (18%), internal medicine specialists (16%), surgeons (10%), other physician specialists (27%), and advanced practice providers (29%). Response rate was 34%.

Main Measures

Primary exposures included clinicians’ participation in ACO decision-making, awareness of ACO incentives, perceived influence on practice, and perceived quality improvement. Our primary outcome was clinicians’ reported difficulty implementing recommendations against low-value care.


Key Results

Few clinicians participated in the decision to join the ACO (3%). Few clinicians were aware of ACO incentives, including knowing the ACO was accountable for both spending and quality (23%), successfully lowered spending (9%), or faced upside risk only (3%). Few agreed (moderately or strongly) the ACO changed compensation (20%), practice (19%), or feedback (15%) or that it improved care coordination (17%) or inappropriate care (13%). Clinicians reported they had difficulty following recommendations against low-value care 18% of the time; clinicians reported patients had difficulty accepting recommendations 36% of the time. Increased ACO awareness (1 standard deviation [SD]) was associated with decreased difficulty (− 2.3 percentage points) implementing recommendations (95% confidence interval [CI] − 3.8, − 0.7), as was perceived quality improvement (1 SD increase, − 2.1 percentage points, 95% CI, − 3.4, − 0.8). Participation in ACO decision-making and perceived influence on practice were not associated with recommendation implementation.


Conclusions

Clinicians participating in a large Medicare ACO were broadly unaware of and unengaged with ACO objectives and activities. Whether low clinician engagement limits ACO efforts to reduce low-value care warrants further longitudinal study.


Introduction

Encouraging clinicians to decrease inappropriate or low-value care is a central goal of payment reform. In Medicare Accountable Care Organizations (ACOs), groups of providers assume responsibility for the spending and quality outcomes of a defined patient population. Despite the possibility of shared savings, emerging data suggest that Medicare ACOs have had only a modest effect on the use of low-value services.1, 2 There are multiple explanations for ACOs’ limited success in reducing utilization, including the strength of collective incentives and the lack of specialist physician participation. One unexplored explanation is organizations’ failure to engage frontline clinicians in the practice changes necessary to become a successful ACO. Since the inception of the ACO model, policymakers have commented on ACOs’ need to ensure clinician awareness of ACO goals,3, 4 provide useful performance feedback,5, 6 and create payment structures that align clinicians’ and organizations’ incentives and norms.5, 7,8,9,10,11

The perspective of individual clinicians has been largely absent from these policy conversations. Research has primarily relied on data gathered from surveys and interviews of ACO executives and physician leaders.12,13,14,15,16 A recent national survey of primary care physicians (PCPs) and internal medicine specialists during the early ACO experience (2014–2015) found that Medicare ACOs had limited success engaging physicians in decision-making, awareness of ACO incentives, or changing care delivery.17 However, there are no data on ACO engagement of other physician specialists (who make an outsized contribution to total spending), physician assistants, or advanced practice nurses.18 Further, how engagement of these clinicians relates to their perceived ability to provide high-value care is not known.

In this context, we designed and administered a survey to individual clinicians in the Physician Organization of Michigan (POM) ACO. The POM ACO is the largest Medicare Shared Savings Program (MSSP) ACO in Michigan and among the ten largest in the country.19 We asked POM ACO clinicians about their engagement by the ACO, as well as their reported ability to implement recommendations against low-value care. We hypothesized that clinicians would report limited ACO engagement but that ACO engagement would be positively associated with clinicians’ reported ability to implement recommendations against low-value care.


Methods

Study Design

We conducted a cross-sectional survey between February and July 2018. Our survey assessed four dimensions of ACO engagement of clinicians: (1) involvement in the decision to join the ACO; (2) awareness of ACO incentives and initiatives; (3) perceived influence of the ACO on practice; and (4) perceived effect of the ACO on quality improvement. Our survey also assessed clinicians’ reported ability to follow recommendations against low-value care.


The POM ACO comprises approximately 80,000 attributed beneficiaries and 5128 clinicians from ten clinician organizations: Michigan Medicine; Integrated Health Associates; Huron Valley Physicians Association; MidMichigan Health; St Mary’s of Michigan; Answer Health; Wexford/Crawford PHO; Oakland Southfield Physician; United Physicians; and Olympia Medical (Table 1). The POM ACO communicates with ACO clinicians via a semi-annual letter sent to all participating clinicians that summarizes cost and quality performance and goals for the upcoming performance year. The POM ACO also assists the ten participating physician organizations in communicating with individual clinicians and implementing practice changes to meet performance goals.

Table 1 Characteristics of Respondents in the Accountable Care Organization

Our sample frame included all clinicians listed as participants in the POM ACO administrative roster, including physicians, advanced practice providers (physician assistants, nurse practitioners, certified nurse anesthetists, certified nurse midwives), and other clinicians (clinical social workers, psychologists, audiologists, podiatrists, optometrists, chiropractors, and physical therapists).

Survey Development

We selected survey domains from existing literature and four semi-structured interviews with ACO leaders. We then created or adapted survey items that mapped to those domains. We refined our survey based on two cognitive interviews and pretests with a purposive sample of 10 clinicians representing diverse specialties (e.g., urologists, NPs, PCPs, interventional cardiologists). The survey domains, items, and adapted item sources14, 17, 20,21,22,23,24,25 are described in the Supplemental Methods and Table A1 in the Online Appendix.


Our main exposures encompassed four dimensions of ACO engagement (Table 2, Tables A2–A4 in the Online Appendix). First, we asked respondents to indicate their level of involvement in the decision to participate in the ACO (involved in the decision-making process, not involved but aware, not involved or aware). Second, we assessed respondent awareness of ACO incentives and initiatives, such as whether the ACO was held accountable for both spending and quality. Third, we assessed respondent perception of the ACO’s influence on their practice, for example, whether joining an ACO had changed how the respondent practices medicine. Fourth, we assessed respondent perception of the ACO’s effect on quality improvement, for example, whether joining an ACO had had a positive impact on care coordination.

Table 2 ACO Engagement Measures and Scales


Our main outcome was the respondent’s reported difficulty implementing recommendations against low-value care. We presented respondents with four recommendations drawn from the Choosing Wisely® campaign.26 Because Choosing Wisely® specifically targets unnecessary or harmful treatment and testing, implementation of these recommendations would likely help to achieve MSSP ACO objectives—reducing healthcare spending while maintaining minimum quality standards. All respondents were presented with the following recommendation, “Don’t recommend cancer screening in adults with life expectancy of less than 10 years.” The other three recommendations were based on the respondent’s specialty (Table 3, Table A5 in the Online Appendix).26 We asked two questions for each recommendation: “Do you find this recommendation easy or difficult to follow most of the time?” (easy to follow, difficult to follow, does not apply to my practice) and “Do most patients find this recommendation easy or difficult to accept?” (easy to accept, difficult to accept, does not apply to my practice).

Table 3 Examples of Recommendations Against Low-Value Care Presented to Respondents

Survey Administration

We administered the pilot (n = 100) and full (n = 5028) surveys in February and May 2018, respectively, including responses from each survey in the cohort of eligible respondents. We mailed survey invitations to clinician practice addresses; each invitation contained the survey description, a token gift (a $2 bill and a cork coaster with the State of Michigan outline), a unique access code, and a link to the online survey (hosted by Qualtrics).27 We sent up to three follow-up reminders to non-respondents at 1, 2, and 5 weeks, either by email when possible (74% of roster) or by postcard (26%).

We used the American Association for Public Opinion Research RR1 response rate for the overall survey.28 After survey administration, we restricted the analytic sample to clinicians most frequently represented in the Choosing Wisely® campaign (physicians, physician assistants, advanced practice nurses), excluding clinical social workers, psychologists, audiologists, optometrists, podiatrists, chiropractors, physical therapists, and other/unknown clinicians (PhD, MBA, MHSA). We excluded clinicians who responded “does not apply to my practice” for all recommendations against low-value care. We also excluded pediatricians, as Medicare ACOs focus on adult beneficiaries (see Figure A1 and Supplemental Methods of the Online Appendix for the CONSORT diagram and response rate calculation).
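For reference, AAPOR RR1 is the most conservative response rate: completed surveys divided by all eligible cases. A minimal sketch using the counts reported in the Results section (1620 completes among 4819 eligible clinicians):

```python
# AAPOR RR1: complete interviews divided by all eligible cases.
# Counts are those reported in the Results section of this study.
def aapor_rr1(completes: int, eligible: int) -> float:
    return completes / eligible

rate = aapor_rr1(1620, 4819)
print(f"{rate:.0%}")  # → 34%
```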


Statistical Analysis

We estimated linear probability fixed effects models to assess the association between ACO engagement and the probability of a clinician reporting difficulty implementing a given recommendation against low-value care. Our analysis was conducted at the clinician-recommendation-response level.

We estimated three models for each dimension of ACO engagement (e.g., ACO Awareness). First, we estimated an unadjusted model that did not account for clinician or organizational characteristics but did account for the fact that different clinical specialties were shown different recommendations. This model included fixed effects for each unique recommendation displayed in the survey across all respondents (n = 63), each specialty-specific block of recommendations (n = 27), and whether the question pertained to the clinician following the recommendation or the patient accepting the recommendation. By evaluating only within-specialty variation, this model accounted for potential confounding introduced by variation across specialties in ACO engagement and in Choosing Wisely® recommendations, namely their strength of evidence29 and clinical and financial relevance.30,31,32 Second, we estimated a model that also adjusted for unobserved differences across the ten clinician organizations in the ACO by adding fixed effects for the clinician’s organization. Third, we estimated a model that further adjusted for clinician gender, age, clinician type/specialty (generalist physician, physician with internal medicine specialty, physician with other specialty, surgeon, physician assistant, advanced practice nurse), and professional activity (direct patient care, teaching, research, administration/management, other). In this final model, we compared differences in ACO engagement and recommendation implementation among clinicians in the same specialty, of the same clinician type, and practicing within the same organization.
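The estimation strategy above — an ordinary-least-squares linear probability model with dummy variables absorbing the fixed effects — can be sketched on synthetic data. All variable names and numbers below are hypothetical; this illustrates the technique, not the study's actual models:

```python
import numpy as np

# Synthetic clinician-recommendation responses (all values invented).
rng = np.random.default_rng(0)
n = 2000

block = rng.integers(0, 4, size=n)            # specialty-specific recommendation block
awareness = rng.normal(0.0, 1.0, size=n)      # standardized ACO Awareness scale

# Data-generating process: block-specific baseline difficulty (the fixed
# effects) plus a small negative effect of awareness on reported difficulty.
baseline = np.array([0.30, 0.20, 0.25, 0.15])
p_difficult = np.clip(baseline[block] - 0.02 * awareness, 0.0, 1.0)
difficult = (rng.random(n) < p_difficult).astype(float)

# Linear probability model: regress the 0/1 outcome on the exposure plus
# one dummy per block (no global intercept needed), estimated by OLS.
X = np.column_stack([awareness, np.eye(4)[block]])
beta, *_ = np.linalg.lstsq(X, difficult, rcond=None)

print(f"estimated awareness effect: {beta[0]:+.3f}")  # near the true -0.02
```

Because the fixed effects are block dummies, the awareness coefficient is identified only from within-block variation, mirroring the within-specialty comparison described in the text.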

To reduce potential bias from survey nonresponse and generalize estimates to the target population (the POM ACO), we applied post-stratification survey weights incorporating characteristics associated with nonresponse (in this case, clinician organization). We used iterative proportional fitting, or raking, to calibrate survey weights.33 To reduce bias from missing data among respondents, we used multiple imputation for all models and implemented a recently developed quadratic-rule procedure to select the number of imputations needed to achieve estimate and standard error replicability.34 We also tested for variation across clinician type, clinician organization, and whether the question pertained to clinician vs. patient difficulty following the recommendation (details in the Online Appendix).
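Raking itself is a short iterative procedure: respondent weights are alternately rescaled so each weighted margin matches a known population total. A minimal illustration with invented counts (not the study's roster data):

```python
import numpy as np

# Respondents cross-classified by two characteristics, e.g. clinician
# organization (rows) and clinician type (columns). Counts are illustrative.
sample = np.array([[40.0, 10.0],
                   [20.0, 30.0]])

# Known population margins, e.g. taken from the full ACO roster.
row_targets = np.array([70.0, 30.0])   # organization totals
col_targets = np.array([55.0, 45.0])   # clinician-type totals

# Iterative proportional fitting: alternately scale weights so each
# weighted margin matches its target, until convergence.
weights = np.ones_like(sample)
for _ in range(100):
    weighted = weights * sample
    weights *= (row_targets / weighted.sum(axis=1))[:, None]
    weighted = weights * sample
    weights *= (col_targets / weighted.sum(axis=0))[None, :]

weighted = weights * sample
print("row margins:", weighted.sum(axis=1))   # ≈ row_targets
print("col margins:", weighted.sum(axis=0))   # ≈ col_targets
```

The same idea extends to more margins by cycling through each one in turn.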

This study was deemed exempt from review by the University of Michigan Health Sciences and Behavioral Sciences Institutional Review Board.


Results

Respondent Characteristics

Of the 4819 eligible clinicians, 1620 completed the survey (response rate of 34%; Figure A1 in the Online Appendix). Response rates differed across the clinician organizations but were otherwise comparable across clinician specialty and sex (Table A2 in the Online Appendix). The analytic sample for the present analysis included 1289 respondents (Table 1). Respondents represented a wide range of clinician types and specialties, including generalist physicians (18%), internal medicine specialists (16%), surgeons (10%), other physician specialists (27%), advanced practice nurses (18%), and physician assistants (11%) (Table 1). Direct patient care was the most common professional activity (85%). Most respondents were either employed by a medical school (59%) or employed by a hospital (25%), and a majority served on the clinical staff of Michigan Medicine (78%).

ACO Engagement

ACO engagement of clinicians was low (Table 2). Most respondents were neither involved in nor aware of the decision to join the ACO (69%); 28% were aware but not involved, and 3% were involved in the decision-making process. Respondents reported limited awareness of ACO incentives and initiatives (Table 2, Table A3 in the Online Appendix). For example, 23% knew that the ACO was accountable for both spending and quality, 9% knew the ACO had successfully lowered spending, 6% knew the ACO was accountable for only Medicare patients, and 3% knew the ACO only faced upside risk (i.e., could not lose money). Across the six ACO Awareness scale items, the mean respondent knew 0.6 items and the median respondent knew 0 items.

Respondents perceived the ACO to have had minimal influence on their practice (Table 3, Table A4 in the Online Appendix). Few respondents agreed (moderately or strongly) that joining an ACO has “made me more aware of controlling treatment costs” (26%), “changed how I am compensated” (20%), or “changed how I practice medicine” (18%). Only 15% felt they received “useful feedback on ACO cost and quality performance.” Respondents perceived that the ACO had a minimally positive effect on quality improvement (Table 3, Table A5 in the Online Appendix). Few respondents, for example, felt the ACO had a positive effect on their ability to coordinate care (16%), reduce inappropriate care (13%), or reduce unnecessary hospitalizations (12%).

Implementation of Recommendations Against Low-Value Care

Respondents provided 8470 responses for 27 specialty-specific blocks containing 63 unique recommendations against low-value care (Table 3, Table A6, Figure A2 in the Online Appendix). Recommendations were “difficult to follow” for clinicians 18% of the time (standard deviation, 38%) and “difficult to accept” for patients 36% of the time (standard deviation, 48%). The finding that respondents typically considered recommendations more difficult for patients to accept than for clinicians to follow was consistent across clinician type and organization (Figure A2). For instance, generalist physicians reported recommendations were “difficult to follow” 13% of the time and “difficult to accept” for patients 40% of the time (Figure A2). At the same time, perceived difficulty varied widely across specific recommendations against low-value care (Table A6).

Association between ACO Engagement and Recommendations Against Low-Value Care

Some dimensions of ACO engagement were associated with implementation of recommendations against low-value care (Figs. 1, 2, and 3). In models fully adjusted for organization and clinician characteristics, awareness of the decision to join the ACO was not significantly associated with reported difficulty implementing recommendations against low-value care (Fig. 1; − 1.9 percentage points, 95% confidence interval [CI] − 4.3 to 0.5). Increased awareness of ACO incentives and initiatives was associated with greater reported ability to implement recommendations against low-value care (Fig. 2). After adjusting for organization and clinician characteristics, a 1 standard deviation (SD) increase in ACO awareness was associated with a 2.3 percentage point decrease in reported difficulty implementing recommendations against low-value care (95% CI − 3.8, − 0.7). This represents a 9% improvement in reported ability to implement recommendations (2.3 percentage points divided by the base likelihood of 27.1%). Perceived influence of the ACO on practice was not associated with respondents’ reported difficulty implementing recommendations against low-value care (Fig. 3, panel a; 1.0 percentage point, 95% CI − 0.4 to 2.4). Conversely, a 1 SD increase in perceived quality improvement was associated with a 2.1 percentage point decrease in reported difficulty implementing recommendations (Fig. 3, panel b; 95% CI − 3.4 to − 0.8).

Figure 1

Association between clinician involvement in decision to join ACO and recommendations against low-value care. Models compare differences between clinicians who were not involved but were aware of decision to join ACO (panel a) or were involved in decision-making process (panel b) versus the reference group, clinicians who were not involved in or aware of the decision to join the ACO. Models are described in the main text. Survey weights were applied to generalize to the Physician Organization of Michigan ACO. Multiple imputation was used for missing data. ACO, accountable care organization; CI, confidence interval.

Figure 2

Association between clinician ACO awareness and recommendations against low-value care. Estimated change is for a 1 standard deviation increase in the ACO Awareness scale. The scale and models are described in the main text and in Table 2. Survey weights were applied to generalize to the Physician Organization of Michigan ACO. Multiple imputation was used for missing data. ACO, accountable care organization; CI, confidence interval.

Figure 3

Association between perceived ACO impact on practice and quality and recommendations against low-value care. Estimated change is for a 1 standard deviation increase in either the ACO Practice Change scale (panel a) or the ACO Quality Improvement scale (panel b). Scales and models are described in the main text and in Table 2. Survey weights were applied to generalize to the Physician Organization of Michigan ACO. Multiple imputation was used for missing data. ACO, accountable care organization; CI, confidence interval.

This pattern of results was robust across clinician organization (Michigan Medicine vs. non-Michigan Medicine) and clinician type (Table A7). Although patients were perceived to be more resistant than clinicians to recommendations against low-value care (Figure A2), both clinicians’ reported ability to follow recommendations and patients’ perceived ability to accept recommendations were associated with ACO awareness and perceived quality improvement (Table A8).


Discussion

In a survey of one of the largest MSSP ACOs in the country, we found limited engagement of frontline clinicians charged with implementing ACO value-based initiatives. Few clinicians participated in the decision to join the ACO, fewer still were aware of new organizational financial incentives created by the MSSP, and most reported that the ACO had limited effect on practice or quality improvement. At the same time, some aspects of clinician engagement—in particular, improved awareness of ACO incentives and ability to improve care quality—were associated with improvement in clinicians’ reported ability to implement recommendations against low-value care. Taken together, our results suggest that limited engagement of ACO clinicians may hamper ACO efforts to reduce low-value care.

There are few data on the degree to which ACOs have engaged individual clinicians in efforts to improve healthcare value. A national survey of PCPs (~ 78%) and internal medicine specialists (~ 18%) during the early MSSP experience (2014–2015) found MSSP ACOs had a modest perceived effect on practice change (e.g., half agreed ACOs had influenced care).17 Our study extends these results, finding little ACO engagement among physician specialists (e.g., anesthesiologists, dermatologists), surgeons, physician assistants, and advanced practice nurses. Our finding that physicians perceive Choosing Wisely® recommendations to be more difficult for patients to accept than for clinicians to follow is consistent with findings from a national survey of Medicare and VA PCPs,20 and suggests that ACO efforts to lower spending may benefit from promoting patient education and clinician-patient conversations regarding high-value healthcare decisions.

Our study is the first to test whether ACO clinician engagement is associated with reported ability to practice high-value care. Why were ACO awareness and perceived ACO quality improvement associated with an improved reported ability to implement recommendations against low-value care, while perceived practice change in the ACO was not? One possibility is that direct knowledge of ACO rules and incentives is particularly important for high-value care. Another possibility is that some third unobserved characteristic (e.g., the desire to provide high-value care) may drive ACO awareness and perceived changes in quality but not practice change. Moving forward, quasi-experimental studies of longitudinal data are needed to determine whether ACO engagement plays a causal role in reducing low-value care.

Our study must be interpreted in the context of several limitations. First, data from a single, large Medicare ACO in Michigan may not generalize to other ACOs. Second, 78% of respondents were from one health system (Michigan Medicine), although survey weights were used to generalize results to the ACO and findings were robust across organizations in sensitivity analyses. Third, the moderate response rate (34%) raises the possibility of response bias, e.g., clinicians with strong opinions about ACOs might have disproportionately responded to the survey. However, this seems unlikely given respondents’ limited ACO awareness and relatively tepid perceptions of change in the ACO. Instead, our response rate likely reflects the increasing difficulty of conducting clinician surveys, particularly without use of large financial incentives.35 Fourth, our study measured reported ability to implement recommendations and not actual clinician behavior. Finally, causation cannot be inferred from this cross-sectional design. Although our analytic approach controlled for fixed differences across organizations and clinician type and specialty, it is possible that clinicians with greater desire to provide high-value care choose to engage ACOs in a more effective manner and report greater ability to implement recommendations against low-value care (i.e., confounding).

These limitations notwithstanding, our study has important policy implications. The accountable care model encompasses a wide diversity of ACOs, with studies suggesting greater savings among physician-led ACOs than among hospital-led ACOs.36 Our finding of low clinician engagement in the POM ACO (which includes multiple hospitals) is consistent with the possibility that hospital-led ACOs’ inability to lower spending may be partially due to greater difficulty engaging frontline clinicians. At the same time, the diversity of ACOs belies an essential commonality—all ACOs rely on frontline clinicians to improve quality and eliminate low-value care.17 If there is indeed a causal relationship between ACO awareness and high-value care, ACOs’ limited engagement of individual clinicians observed in this and previous studies may help explain findings that ACOs have had little-to-no effect on spending and quality.1, 17, 36,37,38,39,40,41


Conclusion

Systematic improvements by ACOs to healthcare value will likely require consistent engagement of frontline clinicians. Our study underscores clinician uncertainty regarding ACO participation, incentives, and initiatives. Future research should test interventions ACOs can use to engage individual clinicians more effectively. Whether low clinician engagement plays a causal role in limiting ACO success warrants longitudinal evaluation of observed clinician behavior.


References

1. Schwartz A, Chernew M, Landon B, McWilliams J. Changes in low-value services in year 1 of the Medicare Pioneer accountable care organization program. JAMA Intern Med. 2015;175(11):1815–1825. doi:10.1001/jamainternmed.2015.4525

2. Hollingsworth JM, Nallamothu BK, Yan P, et al. Medicare accountable care organizations are not associated with reductions in the use of low-value coronary revascularization. Circ Cardiovasc Qual Outcomes. 2018. doi:10.1161/CIRCOUTCOMES.117.004492

3. Berwick D. Launching accountable care organizations--the proposed rule for the Medicare Shared Savings Program. N Engl J Med. 2011;364(16):e32.

4. Crosson F. Analysis & commentary: the accountable care organization: whatever its growing pains, the concept is too vitally important to fail. Health Aff. 2011;30(7):1250–1255.

5. Fisher E, McClellan M, Safran D. Building the path to accountable care. N Engl J Med. 2011;365(26):2445–2447.

6. Shortell S, Colla C, Lewis V, Fisher E, Kessell E, Ramsay P. Accountable care organizations: the national landscape. J Health Polit Policy Law. 2015;40(4):647–668.

7. DeCamp M, Farber N, Torke A, et al. Ethical challenges for accountable care organizations: a structured review. J Gen Intern Med. 2014;29(10):1392–1399.

8. Weissman J, Bailit M, D’Andrea G, Rosenthal M. The design and application of shared savings programs: lessons from early adopters. Health Aff. 2012;31(9):1959–1968.

9. Ryan A, Shortell S, Ramsay P, Casalino L. Salary and quality compensation for physician practices participating in accountable care organizations. Ann Fam Med. 2015;13(4):321–324.

10. Robinson J. The end of managed care. JAMA. 2001;285(20):2622–2628.

11. Shortell S, Waters T, Clarke K, Budetti P. Physicians as double agents: maintaining trust in an era of multiple accountabilities. JAMA. 1998;280(12):1102–1108.

12. Shortell S, McClellan S, Ramsay P, Casalino L, Ryan A, Copeland K. Physician practice participation in accountable care organizations: the emergence of the unicorn. Health Serv Res. 2014;49(5):1519–1536.

13. Fisher E, Shortell S, Kreindler S, VanCitters A, Larson B. A framework for evaluating the formation, implementation, and performance of accountable care organizations. Health Aff. 2012;31(11):2368–2378.

14. Colla CH, Lewis VA, Shortell SM, Fisher ES. First national survey of ACOs finds that physicians are playing strong leadership and ownership roles. Health Aff. 2014;33(6):964–971. doi:10.1377/hlthaff.2013.1463

15. Lin MP, Muhlestein D, Carr BG, Richardson LD, Wiler JL, Schuur JD. Engagement of accountable care organizations in acute care redesign: results of a national survey. J Gen Intern Med. 2018. doi:10.1007/s11606-018-4525-4

16. Lewis VA, Tierney KI, Fraze T, Murray GF. Care transformation strategies and approaches of accountable care organizations. Med Care Res Rev. 2017.

17. Schur CL, Sutton JP. Physicians in Medicare ACOs offer mixed views of model for health care cost and quality. Health Aff. 2017;36(4):649–654. doi:10.1377/hlthaff.2016.1427

18. Reschovsky JD, Hadley J, Saiontz-Martinez CB, Boukus ER. Following the money: factors associated with the cost of treating high-cost Medicare beneficiaries. Health Serv Res. 2011;46(4):997–1021. doi:10.1111/j.1475-6773.2011.01242.x

19. Centers for Medicare and Medicaid Services. Shared Savings Program Accountable Care Organizations Public Use File. 2016. https://www.cms.gov/research-statistics-data-and-systems/downloadable-public-use-files/sspaco/index.html. Accessed September 17, 2019.

20. Zikmund-Fisher BJ, Kullgren JT, Fagerlin A, Klamerus ML, Bernstein SJ, Kerr EA. Perceived barriers to implementing individual Choosing Wisely® recommendations in two national surveys of primary care providers. J Gen Intern Med. 2017;32(2):210–217. doi:10.1007/s11606-016-3853-5

21. SteelFisher GK, Blendon RJ, Sussman T, Connolly JM, Benson JM, Herrmann MJ. Physicians’ views of the Massachusetts health care reform law — a poll. N Engl J Med. 2009;361(19):e39. doi:10.1056/NEJMp0909851

22. Osborn R, Moulds D, Schneider EC, Doty MM, Squires D, Sarnak DO. Primary care physicians in ten countries report challenges caring for patients with complex health needs. Health Aff. 2015;34(12):2104–2112. doi:10.1377/hlthaff.2015.1018

23. Tilburt JC, Wynia MK, Sheeler RD, et al. Views of US physicians about controlling health care costs. JAMA. 2013;310(4):380–388. doi:10.1001/jama.2013.8278

24. American Medical Association. American Medical Association Physician Practice 2016 Benchmark Survey. https://www.ama-assn.org/about/physician-practice-benchmark-survey.

25. Kemper P, Blumenthal D, Corrigan JM, et al. The design of the Community Tracking Study: a longitudinal study of health system change and its effects on people. Inquiry. 1996;33(2):195–206.

26. ABIM Foundation. Choosing Wisely. http://www.choosingwisely.org/clinician-lists/. Accessed September 17, 2019.

27. Dillman D. Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method. New York: John Wiley and Sons; 2014.

28. American Association for Public Opinion Research. Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys. 9th ed. Lenexa; 2016.

29. Admon AJ, Gupta A, Williams M, et al. Appraising the evidence supporting Choosing Wisely® recommendations. J Hosp Med. 2018. doi:10.12788/jhm.2964

30. Kerr EA, Hofer TP. Deintensification of routine medical services: the next frontier for improving care quality. JAMA Intern Med. 2016;176:978–980. doi:10.1001/jamainternmed.2016.2292

31. Markovitz AA, Hofer TP, Froehlich W, et al. An examination of deintensification recommendations in clinical practice guidelines: stepping up or scaling back? JAMA Intern Med. 2018;178(3):414–416. doi:10.1001/jamainternmed.2017.7198

32. Morden NE, Colla CH, Sequist TD, Rosenthal MB. Choosing wisely--the politics and economics of labeling low-value services. N Engl J Med. 2014;370(7):589–592. doi:10.1056/NEJMp1314965

33. Kolenikov S. Calibrating survey data using iterative proportional fitting (raking). Stata J. 2014;14(1):22–59.

34. von Hippel PT. How many imputations do you need? A two-stage calculation using a quadratic rule. Sociol Methods Res. 2018; in press.

35. Cho YI, Johnson TP, VanGeest JB. Enhancing surveys of health care professionals: a meta-analysis of techniques to improve response. Eval Health Prof. 2013;36(3):382–407. doi:10.1177/0163278713496425

36. McWilliams JM, Hatfield LA, Chernew ME, Landon BE, Schwartz AL. Early performance of accountable care organizations in Medicare. N Engl J Med. 2016;374(24):2357–2366.

37. McWilliams JM. Changes in Medicare Shared Savings Program savings from 2013 to 2014. JAMA. 2017;316(16):1711–1712.

38. Colla CH, Wennberg DE, Meara E, et al. Spending differences associated with the Medicare Physician Group Practice Demonstration. JAMA. 2012;308(10):1015–1023. doi:10.1001/2012.jama.10812

    CAS  Article  Google Scholar 

  39. 39.

    Colla CH, Lewis VA, Lao L, O’Malley AJ, Chang CH, Fisher ES. Association between Medicare accountable care organization implementation and spending among clinically vulnerable beneficiaries. JAMA Intern Med. 2016.

  40. 40.

    Markovitz AA, Hollingsworth JM, Ayanian JZ, Norton EC, Yan PL, Ryan AM. Performance in the Medicare Shared Savings Program after Accounting for Nonrandom Exit: An Instrumental Variable Analysis. Ann Intern Med. 2019. doi:https://doi.org/10.7326/M18-2539

    Article  Google Scholar 

  41. 41.

    Markovitz AA, Hollingsworth JM, Ayanian JZ, et al. Risk adjustment in medicare ACO program deters coding increases but may lead ACOs to drop high-risk beneficiaries. Health Aff. 2019. doi:https://doi.org/10.1377/hlthaff.2018.05407

    Article  Google Scholar 



Acknowledgments

This work benefited from guidance on survey development and administration from Lisa Holland, M.A., Tedi Engler, B.S., and Eve Kerr, M.D., M.P.H. The authors acknowledge the Physician Organization of Michigan ACO for its commitment to evaluation and research.


Funding

Mr. Markovitz is supported by the Horowitz Foundation for Social Policy, AHRQ grant R36HS025615, and the University of Michigan Rackham Hammel Research Award. Dr. Ryan is supported by National Institute on Aging grant R01AG047932. Dr. Hollingsworth is supported by AHRQ grants R01HS024728 and 1R01HS024525-01A1. The Physician Organization of Michigan ACO provided funds for survey mailings.

Author information



Corresponding author

Correspondence to John M. Hollingsworth, MD, MS.

Ethics declarations

Conflict of Interest

Dr. Rozier, Dr. Goold, Dr. Ayanian, Dr. Norton, and Dr. Peterson have no disclosures to make. The remaining author disclosures are listed in the funding statement.


The funding sources had no role in the design and conduct of the study; collection, management, analysis, and interpretation of the data; preparation of the manuscript; or decision to submit the manuscript for publication.

Additional information

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic Supplementary Material


Supplementary material (DOCX 1655 kb)


Cite this article

Markovitz, A.A., Rozier, M.D., Ryan, A.M. et al. Low-Value Care and Clinician Engagement in a Large Medicare Shared Savings Program ACO: a Survey of Frontline Clinicians. J GEN INTERN MED 35, 133–141 (2020). https://doi.org/10.1007/s11606-019-05511-8


Keywords

  • healthcare reform
  • health policy
  • health services research
  • stakeholder engagement
  • survey research