Journal of Healthcare Leadership, Volume 14

Engaging Frontline Physicians in Value Improvement: A Qualitative Evaluation of Physician-Directed Reinvestment

Authors: Vilendrer S, Amano A, Asch SM, Brown-Johnson C, Lu AC, Maggio P

Received 24 August 2021

Accepted for publication 17 February 2022

Published 8 April 2022, Volume 2022:14, Pages 31–45

DOI https://doi.org/10.2147/JHL.S335763

Checked for plagiarism Yes

Review by Single anonymous peer review

Peer reviewer comments 3

Editor who approved publication: Professor Russell Taichman



Stacie Vilendrer,1 Alexis Amano,1 Steven M Asch,1,2 Cati Brown-Johnson,1 Amy C Lu,3 Paul Maggio4

1Division of Primary Care and Population Health, Stanford School of Medicine, Stanford, CA, 94305, USA; 2VA Center for Innovation to Implementation, Menlo Park, CA, 94025, USA; 3Department of Anesthesia, Stanford School of Medicine, Stanford, CA, 94305, USA; 4Department of Surgery, Stanford School of Medicine, Stanford, CA, 94305, USA

Correspondence: Stacie Vilendrer, Division of Primary Care and Population Health, Stanford University School of Medicine, 1265 Welch Road, Mail Code 5475, Stanford, CA, 94305, USA, Email [email protected]

Purpose: Physicians can limit upward trending healthcare costs, yet legal and ethical barriers prevent the use of direct financial incentives to engage physicians in cost-reduction initiatives. Physician-directed reinvestment is an alternative value-sharing arrangement in which a health system reinvests a portion of savings attributed to physician-led cost reduction initiatives back into professional areas of the physicians’ choosing. Formal evaluations of such programs are lacking.
Methods: To understand the impact of Stanford Health Care’s physician-directed reinvestment program in its first year (2017–2018) on physician engagement, adherence to program requirements around safety and fund use, and factors facilitating program dissemination, semi-structured qualitative interviews with physician participants, non-participants, and administrative stakeholders were conducted July–November 2019. Interview transcripts were qualitatively analyzed through an implementation science lens. To support contextual analysis of the qualitative data, a directional estimate of the program’s impact on cost from the perspective of the health system was calculated: the annual maintenance cost (derived from interviewees’ self-reported time estimates and public salary data) was subtracted from the internally accounted total savings of the first-year cohort to obtain the annual net benefit, which was then divided by the annual maintenance cost.
Results: Physician participation was low compared with the overall physician population (n=14 of approximately 2300 faculty physicians), though 32 qualitative interviews suggested deep engagement across physician participants and adherence to target program requirements. Reinvestment funds activated intrinsic motivators, such as autonomy, purpose, and inter-professional relations, and extrinsic motivators, such as the direction of resources and external recognition. Ongoing challenges included limited physician awareness of healthcare costs and the need for increased clarity around which projects rise above one’s existing job responsibilities. Administrative data showed a direct cost savings of $8.9M, implying an 11-fold return on investment; both figures exclude physician time, which was not directly compensated.
Conclusion: A physician-directed reinvestment program appeared to facilitate latent frontline physician innovation towards value, though additional evaluation is needed to understand its long-term impact.

Keywords: physician incentives, professional autonomy, cost savings, motivation, quality improvement, program evaluation, quality indicators, health care, organizational innovation, work engagement

Plain Language Summary

New strategies are needed to combat rising healthcare prices. Physicians, whose decisions directly influence care utilization, are prohibited from receiving direct financial incentives to reduce costs due to legal and ethical barriers. Here we evaluate an alternative physician engagement method: “physician-directed reinvestment” is a value-sharing arrangement wherein a health system reinvests a portion of savings attributed to physician-led cost reduction initiatives back into professional areas of the physician’s choosing.

A qualitative assessment explored one such program July–November 2019 at Stanford Health Care, a large academic medical center. Thirty-two interviews with program physician participants, physician non-participants, and administrative stakeholders were conducted to understand the breadth and depth of physician engagement in the program, adherence to program requirements, and factors influencing program growth. Results indicate limited breadth of engagement, with just 14 physician participants in the first-year cohort out of approximately 2300 faculty. However, these physicians were highly engaged and described how the reinvestment fund structure provided intrinsic (autonomy, purpose, positive relations) and extrinsic (resources, external recognition) motivators. Stakeholders adhered to program requirements, but ongoing challenges included the need to increase physician awareness of healthcare costs to encourage more high-yield project ideas and to clarify which projects rise above one’s job responsibilities to justify acceptance into the program. A directional estimate of the program’s impact on cost, based on self-reported time estimates and public salary data, suggested a favorable 11-fold return on investment from the health system’s perspective. These results indicate a physician-directed reinvestment program may facilitate frontline physician innovation towards value.

Introduction

As healthcare expenses in the United States continue to grow exponentially,1 payers are exploring payment models that reward high value care. Most health systems still predominantly operate on a fee-for-service basis where a high volume of care is financially rewarded2 but are gradually responding to these new value-oriented payment models. A shared understanding of what constitutes value in healthcare has evolved over time, and the resulting “value equation” often includes quality, service, and safety outcomes in the numerator and cost in the denominator.3–5 For health systems to reduce costs while maintaining or improving quality and other positive aspects of value, they must meaningfully engage frontline clinicians. Physicians in particular have a disproportionate influence over the cost of care through their ordering decisions.6–8 Their professional culture also aspires to protect quality of care despite downward pressure on cost.9,10

Yet, engaging physicians remains a challenge given their skepticism and competing demands on their time.10,11 A well-designed incentive program may facilitate physician focus on waste reduction despite these barriers. While financial rewards are an extrinsic motivator that have been linked with positive behavior change in physicians,12 the most effective incentive programs also activate intrinsic motivators, such as a desire for professional autonomy and purpose.11,13,14 Furthermore, anti-gainsharing laws limit direct financial incentives from hospitals to employed physicians without explicit regulatory oversight, thereby limiting their practical and ethical use.15–17 These facts suggest an opportunity to explore alternative incentive structures that activate physician creativity around value-promoting innovation.

Some health systems are experimenting with an alternative value-sharing arrangement termed “physician-directed reinvestment”, an explicit agreement in which a health system reinvests a portion of savings attributed to physician-led cost reduction initiatives back into mission-driven areas of the physician’s choosing, including pursuit of their scholarly research, capital investment, or education. Two academic health systems report the model has increased physician engagement and achieved cost savings while maintaining or improving quality.18 Formal evaluations of the impact of such programs on physician engagement and other implementation outcomes are lacking.

Methods

We designed a qualitative evaluation of a physician-directed reinvestment intervention in a large academic health center to understand the program’s impact on physician breadth and depth of engagement in cost-reduction activities, stakeholder adherence to program design to ensure quality and appropriate fund use (fidelity), and factors influencing program dissemination. The evaluation was designed and presented following the Consolidated Framework for Implementation Research.19 Finally, to support contextual analysis of the qualitative data and to inform future dissemination efforts, we provide a directional estimation of the program’s impact on cost from the perspective of the health system based on administrative data and self-reported time estimates.

Setting

Stanford Health Care (SHC) implemented a hospital-wide physician-directed reinvestment program called the Cost Savings Reinvestment Program (CSRP) in 2017 with the primary goal of improving physician engagement in value-improvement efforts (program description in Appendix A).18 Projects are categorized under Clinical Improvement, which aims to improve clinical processes across the health system, and Product Utilization, which aims to reduce direct costs of clinical devices and products. Notably, the reinvestment funds cannot be used for compensation but instead are reinvested in areas related to the institution’s clinical, research, and educational mission. As downward pressure on cost has the potential to negatively impact quality, standardized quality “balancing measures” are identified and tracked by physician leaders and a two-tiered oversight committee throughout each project’s implementation.

Population and Sampling

Interview participants were purposefully drawn from three populations to promote validity through data source triangulation and negative case analysis:20,21 frontline physicians who participated in the program, administrators with a leadership or operational connection to the program, and a comparator group of frontline physicians who had not participated in the program but who were already recognized as improvement leaders. Participating physicians were selected by reaching out to all key physician leaders from each of the first-year cohort of projects accepted into CSRP spanning July 2017 through June 2018. Administrators included physician chairs of Departments sponsoring projects in the first-year cohort and individuals with an operational and/or financial affiliation to the program. Finally, a physician comparator population was drawn from purposefully selected physicians recognized as quality and/or value improvement leaders in their departments who had not already participated in the program. All potential participants were contacted by email, and interviews were conducted in-person and by phone following verbal informed consent by each participant that included consent to publish anonymized responses. Interviews were conducted until thematic saturation within each target sub-population was reached.

Qualitative Analysis

Interviews using a semi-structured protocol (Appendix B) were conducted from July-November 2019 by a single author (SV) and transcribed verbatim. We conducted a multi-step thematic analysis24 of interviews using a combined inductive and deductive approach. Data disassembly was achieved through code assignments derived from topic guide content (deductive) and emergent themes (inductive). Two authors coded all interviews in sequence (AA, SV) using qualitative software (NVivo 12, QSR International, Melbourne, AUS) and discrepancies were discussed and resolved on a weekly basis using a consensus coding approach25 (CBJ, SV, AA). Data were reassembled through analysis of each individual code to identify patterns and comparisons across interviewee types to support context analysis. Data are interpreted and presented through an implementation science lens, focusing on physician adoption and engagement, adherence to intended program design, and facilitators/barriers to program dissemination.19,26 Results were reviewed in a modified member check using synthesized data with two CSRP program experts to verify the validity of the findings.22,23 Participant department and project are masked, gender pronouns are randomly assigned, and certain details are excluded where they may compromise interviewee identity.

Description of Cost-Saving Estimates Using Administrative Data

A directional estimate of the program’s impact on cost was calculated based on a review of internal cost accounting data and reported in 2018 USD ($). To estimate the annual cost of running the program, excluding indirect institution and program initiation costs,27 estimated personnel time was drawn from qualitative interviews, followed by consensus discussions with the primary operational leaders. Salary estimates were drawn from national data adjusted for geographic region28 and include a 37.8% upward adjustment to account for employee benefits.29

Annual program maintenance cost was subtracted from reported total savings to obtain the net benefit attributable to the program for year 1, both excluding and including benefits from reinvestment funds. The net benefit was divided by the program maintenance cost to obtain an approximate return on investment (ROI) for the program, included here given the likelihood that ROI calculations (rather than other cost analyses30) will inform future program development and the increased role ROI analyses have played in the literature.31–34

Actual reimbursement amounts reflecting “price” from payers were used neither in the program to calculate reinvestment funds nor in this descriptive evaluation; instead, all monetary values reported here are drawn from standard cost accounting software used internally within the health system.18 This project was reviewed by the Stanford Institutional Review Board and did not qualify as human subjects research (Protocol ID #51946).

Results

The first year cohort included 7 Clinical Improvement and 2 Product Utilization projects formally led by 14 physicians (out of approximately 2300 faculty) across 6 departments (examples in Table 1). Interviews included the majority of participating physicians (n=12 of 14 available), department chairs (n=5 of 6 available) as well as a number of key informants including program administrators, non-physician clinical staff, and executive sponsors (n=10). We ceased interviews with the comparator group upon reaching thematic saturation (n=5) (Appendix C).

Table 1 Cost Savings Reinvestment Program – Year 1 Example Project Descriptions and Balancing Measures

Minimal Breadth but Strong Depth of Engagement in First-Year Cohort

The primary goal of CSRP is to improve physician engagement in value-improvement efforts, which can be measured through both breadth and depth of engagement within a population. The absolute number of physicians who led a project through CSRP in the first-year cohort was low compared with the overall physician population (n=14 of approximately 2300 faculty physicians). While all participating physicians and key stakeholders could accurately describe CSRP in their own words, the physicians in the comparator group could not. Even some participating physicians reported happenstance circumstances that allowed them to participate: “She [local physician director] knew of it [CSRP], and she asked me if I would apply … Otherwise I would have never heard about this” (Participating Physician 3).

However, the depth of engagement amongst physician participants was strong. Time commitment varied from “hundreds” of hours (Participating Physician 5, 6) amongst physician leaders dedicated to a single project to “a couple” hours amongst department chairs who supported projects (Chair 1, 2); many individuals reported total project time for a physician leader was approximately 10–50 hours (Participating Physician 1, 2, 3, 7; Chair 3; Administrator 10). Physician leaders’ activities depended on project type. Clinical Improvement project time was spent primarily in communicating project goals and process changes to internal stakeholders across functions and departments. Physician leaders involved in Product Utilization projects were focused on supporting staff in negotiation with vendors.

Participating physicians also reported broader engagement outside of their core project team, as some projects required buy-in from entire work units. As with the physician participants leading the projects, this time was not formally measured or compensated. CSRP also encouraged conversations around value improvement that would not otherwise have happened:

CSRP fostered and engendered a lot of very active conversations in this space amongst physicians who otherwise couldn’t care less over whether … a [product] costs 50 bucks or 40 bucks … So you get physicians really thinking about that …. (Participating Physician 11)

Adherence to Program Requirements

Interviewees were asked to comment on stakeholder fidelity to two features essential to CSRP’s implementation protocol: 1) maintenance of quality despite downward pressure on cost, and 2) the appropriate utilization of reinvestment funds towards non-compensation uses (Table 2). A lapse in adherence to CSRP protocol in these areas could negatively impact patient safety or infringe upon anti-gainsharing laws, respectively.

Table 2 Fidelity to Intended Design in a Physician-Directed Reinvestment Program at an Academic Medical Center

Perceptions of CSRP’s efforts to ensure quality were favorable, despite the potential tension introduced by a cost-reduction incentive. Stakeholders reported that proposals presenting clinical concerns were reliably rejected during the review process. In this way, the CSRP selection process maintained a strict filter, admitting only those projects that upheld existing quality standards.

Following project selection, physician project leaders were also asked to provide regular reports on the quality balancing measures that were included in their application and previously discussed with the oversight committee. Consistent with the program’s intended design, diverse stakeholders reported balancing measures were agreed upon at the outset of project initiation and maintained throughout program participation. For Clinical Improvement projects, patient outcome measures were typically tracked on a month-to-month basis and could include codes, mortality, length of stay, and readmissions depending on the project (see Table 1). In contrast, balancing measures for Product Utilization projects were reported to be less relevant given these projects focused on negotiating lower prices for clinically similar products: “They are more-or-less variations of the same thing” (Chair 5). In many of these negotiations, the product itself reportedly stayed the same; the price was simply negotiated downwards.

Project leaders also reported incidental improvements to clinical throughput or safety as a result of their intervention. CSRP efforts were also felt to contribute directly to the overall quality improvement of the organization. Regarding a quality ranking, “We’ve gone from the sixth spot to the third spot … And CSRP is part of this journey where this work actually shows up” (Administrator 5).

Regarding appropriate use of the reinvestment funds, physicians and department chairs in particular appeared to understand that redirected funds could not be used for salary support. The one exception was a physician who believed reinvestment funds could be used for bonus compensation, though her department chair understood otherwise. Fund use itself had not been determined at the time of interviews, though participating physicians shared ongoing aspirations, including efforts that allowed the hospital to become the first in the nation to offer a particular diagnostic test. Program administrators tracked reinvestment fund use in the years following project completion, which included primarily clinical research, followed by investment in capital equipment and clinical education, all of which complied with program rules.

Factors Influencing Program Dissemination

Interviewees identified a number of facilitators and barriers to program dissemination across all domains of Consolidated Framework for Implementation Research,19 including the inner setting of the hospital, the outer setting of the broader healthcare landscape, the CSRP as an intervention itself, and the individuals and processes involved (Table 3).

Table 3 Facilitators and Barriers to Cost Savings Reinvestment Program Dissemination Using the Consolidated Framework for Implementation Research (CFIR)

Facilitators

Perspectives on the program were largely positive amongst all stakeholders, particularly participating physicians. Many facilitators related to characteristics of the intervention itself. Several interviewees shared that CSRP aligned incentives between the health system, physicians, and patients. Interviewees also described aspects of the intervention that motivated intrinsically, including heightened autonomy, purpose, and an opportunity to relate to others, as well as aspects that motivated extrinsically, such as the funds themselves and the public recognition of their achievements. Notably, the desire to reduce cost for its own sake, particularly in the context of a nation facing high health costs, was cited as a desirable characteristic. Strong emphasis was placed on the reinvestment dollars serving as a “common language” to facilitate engagement in dialogue around cost-saving innovation. The administration’s commitment to sharing resources with physicians was seen as a positive for many, facilitating a greater sense of fairness for shared effort, and participants appreciated how their projects were prioritized in otherwise long queues to access administrative and operational data. Many physicians also acknowledged that CSRP provided a justification for time spent on value-improving initiatives, overcoming other clinical and academic pressures. Finally, trust in the organization showed up as both a facilitator and a barrier and seemed to depend on the interviewee’s personal experience with the health system.

Barriers

Many interviewees corroborated the lack of widespread awareness amongst physicians about CSRP and some commented on the reasons for this: “I don’t think it’s something that is salient in people’s minds … it [the program] feels daunting to them” (Non-Participating Physician 1). Clinical demands on individuals’ time, suboptimal physician attendance at department meetings, a lack of programmatic oversight capacity, challenges with ongoing data access, fear of reduced future resources, and difficulty identifying high-yield cost-saving ideas were all cited as key barriers. Some suggested CSRP could do more in this area to provide physician operational education and cost data. Others felt program outreach was not personalized, with some physicians believing CSRP did not apply to them. Finally, fee-for-service inertia appeared as a barrier in both the inner and outer setting: first, in the system’s dominant reimbursement structure, which rewarded volume production more than value creation, and second, in the US health system, which was cited as paying more for its clinical supplies and devices than other countries.

Notably, interviewees described an ongoing debate as to whether a project overlapped with the responsibilities of one’s job. A non-participating physician felt CSRP led to “double dipping” in which a physician was already paid “to be a good doctor” yet was rewarded for “doing something that was already part of [one’s] job” (Non-Participating Physician 3). Alternatively, one participating physician felt the project brought to CSRP constituted “an extraordinary amount of effort”, requiring heightened administrative and inter-department collaboration (Participating Physician 1). Learnings from this debate facilitated the introduction of formal criteria to support the committee’s selection of projects (Appendix D). Finally, there were also concerns that the existence of the incentive could lead to suboptimal behavior. Administrators noted that physicians may choose to wait to implement a value-improvement initiative until they were enrolled in the program, something “we already see a little bit now … That’s my number one concern” (Administrator 3).

Directional Estimation of the ROI Using Administrative Data

A secondary goal of the program was to improve value across the organization, defined as decreasing direct cost of care while maintaining or improving the quality of care. As this goal was discussed by many interviewees, a directional estimate of the magnitude of the program’s benefit from the health system’s perspective is included here.

Administrative data drawn from standard cost-accounting software and validated internally as part of determining the size of reinvestment fund distribution suggested the first-year cohort saved approximately $8.9M, with $7.5M attributed to Clinical Improvement and $1.4M attributed to Product Utilization projects. Based on interview data, an estimated 23 individuals spent approximately 0.01 to 1.0 Full Time Equivalent (FTE) facilitating ongoing maintenance of the program, suggesting an annual maintenance cost of approximately $492,178 based on public salary estimates (Appendix E). This suggests an ROI of 11-fold if reinvestment funds are excluded as a benefit, or 17-fold if reinvestment funds are included. Notably, this directional estimate does not include time from physician participants and other individuals who contributed to successful project completion, as the program did not provide direct compensation or protected time for participants to complete project activities.
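The arithmetic behind this directional estimate can be sketched as a minimal calculation using only the figures stated here. Note that the dollar amount of the reinvestment payout itself is not reported, so only the fund-inclusive ROI can be derived directly; the 11-fold figure additionally subtracts the reinvested share from the benefit.

```python
# Sketch of the directional ROI arithmetic, using figures reported in the text.
# The reinvestment payout amount is not stated, so only the ROI with
# reinvestment funds counted as a benefit is reproduced here.

total_savings = 8_900_000     # first-year cohort savings (2018 USD)
maintenance_cost = 492_178    # estimated annual program maintenance cost

# ROI = (benefit - cost) / cost, with reinvestment funds included as a benefit
net_benefit = total_savings - maintenance_cost
roi_including_funds = net_benefit / maintenance_cost
print(round(roi_including_funds))  # 17, matching the reported ~17-fold figure
```

Excluding the reinvested share from the benefit (its amount is given only in the program's internal accounting) lowers the numerator and yields the reported 11-fold figure.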

Discussion

Stanford Health Care’s Cost Savings Reinvestment Program is an example of physician-directed reinvestment, which calls on frontline physicians to identify and implement their own value-improvement projects with the promise of being able to direct a portion of attributable savings back into a professional area of their choosing. This evaluation found a small number of physicians were highly engaged in cost-reduction efforts as a result of the program. Diverse stakeholders followed program requirements designed to maintain quality and appropriate fund use. Further, various elements of the program both supported and hindered its ongoing dissemination as described above. The favorable 11-fold ROI from the health system’s perspective is included as a directional estimation only and must be interpreted with caution (Appendix F). Nevertheless, these collective data suggest additional evaluation focused on longitudinal outcomes as well as dissemination to non-academic and outpatient settings is warranted. Such efforts may help healthcare administrators understand the promise of physician-directed reinvestment to engage physicians in cost-reduction efforts.

Novel Engagement on Both Sides of the Value Equation

Physician-directed reinvestment is rare amongst physician engagement strategies in that it uses reinvestment funds to simultaneously engage physicians in cost and quality improvement (Figure 1). Whereas many physicians are comfortable leading quality improvement efforts, their role in cost reduction efforts (if any) typically follows a top-down approach without incentives—creating an ongoing source of tension between physicians and administrators.14,35,36 Interview data suggests the program aligns incentives across these stakeholders and facilitates inter-professional dialogue, which has previously been recognized to promote physician engagement.11 Further, as anti-gainsharing laws limit direct financial incentives for cost-reduction activities, our data suggests reinvestment funds are a unique tool for administrators to capture the attention of busy frontline physicians.

Figure 1 Initiatives to engage physicians in value improvement and the legality of direct financial compensation.

Benefits of a Multi-Factorial Incentive Program

While diverse aspects of the program were felt to promote uptake, the novel structure of the reinvestment funds was a dominant theme in interviews. These funds appeared to activate intrinsic motivators, such as autonomy, purpose, and inter-professional relations, and extrinsic motivators, such as the direction of resources and external recognition of achievement. Empiric studies of mixed incentives in real-world healthcare settings are scant,11,13,37–40 and CSRP offers a rare opportunity to explore the co-presence of intrinsic and extrinsic motivators in healthcare.

The reinvestment funds appeared as a sort of kindling: first, they served as a “common language” to ignite conversations and projects. In addition, they offered a “justification” to spend time on value improvement activities when competing clinical and academic priorities called. Finally, they offered physicians the ability to make a positive workplace change through the project itself and subsequent redirection of reinvestment funds. These observations suggest reinvestment funds may be a distant cousin to traditional financial incentives. Rather than providing direct personal benefit, these funds may be a tool to capture others’ attention, a professional acknowledgement of hard work, and a vehicle to exercise autonomy in shaping one’s environment.

Barriers to Dissemination

One of the primary critiques of incentives is they can “crowd out” one’s intrinsic motivation to perform a desired behavior, leading to poor long-term performance.37,40 This phenomenon appeared to be less salient in these data but did arise in the debate around what activities constitute one’s existing job versus a special project meriting a reward. This issue also appeared in an administrator’s “number one concern” that physicians may perversely wait to implement a best practice change until enrolled in the program. Participating physicians generally disagreed with this premise—many cited the desire to improve cost of care as an end in itself and felt the program facilitated their latent desire to improve value. Despite these concerns, the health system has continued to support and promote the program. Understanding this tension and its long-term consequences is an area for future work.

The data also revealed a number of opportunities to improve participants’ experience in the program and to increase uptake amongst new physicians. Access to clinical data to inform project development was seen as a positive resource that the program provided for physicians, but ongoing access to evolving data still remained a challenge in a system with limited information technology resources. Further, the reported lack of high yield project ideas is consistent with prior work showing physician awareness of healthcare costs is limited.41–44 Efforts to promote physician education around cost45–47 and/or present cost data within the electronic medical record48–50 may encourage stronger ideation amongst frontline physicians. Finally, the disconnect between project savings measured through internal accounting software in “soft” dollars and payout from the health system to the medical school in “hard” dollars was a previously recognized concern18 but appeared to be overshadowed by the oversight committee’s desire to support a diversity of projects.

Potential to Benefit Physician Satisfaction

The association between increased engagement and job satisfaction is recognized, though considerable debate exists as to the causal direction of this link.11 Increasing physicians’ sense of autonomy and purpose in the workplace, both active in CSRP, have been previously recognized as potential solutions to physician burnout.51 Interviewees also reported an increased sense of fairness as a result of receiving a direct benefit from their additional effort, which has been positively associated with professional satisfaction.52 Quantifying the program’s impact on job satisfaction may therefore be a direction for future work.

Evaluation Limitations

Selection bias in the first year cohort could lead to an overestimate of engagement and adherence relative to future participants, as volunteer “early adopter” physicians may be amongst the most engaged in the system. As the program expands beyond the initial group of high performing physician leaders who opted into the first year cohort, careful and continual monitoring of fidelity to program requirements, including quality and fund use, will be needed. In addition, a project could shift costs to other, unmeasured areas, a possibility that did not emerge in the qualitative data presented here but remains an area for future research. The interview format also has potential to introduce social desirability bias, as interviewees may have sought to portray themselves or their organizations in a favorable light; interview probes related to unfavorable aspects of the program sought to minimize this bias.

Conclusion

These early data suggest a physician-directed reinvestment program at an academic medical center effectively engaged a minority of physicians towards value improvement through a simultaneous focus on quality and cost, leading to a favorable ROI, though additional longitudinal analysis is needed. Program requirements to maintain quality and appropriate fund use were followed, but ongoing challenges included the need to increase physician awareness of healthcare costs to foster more high-yield project ideas and to clarify which projects rise above one’s job responsibilities to justify acceptance into the program. Finally, the novel structure of the reinvestment funds appeared to activate intrinsic motivators, such as autonomy, purpose, and inter-professional relations, and extrinsic motivators, such as direction of resources and external recognition of achievement. The co-presence of both intrinsic and extrinsic motivators appeared to help facilitate, in the words of one department chair, the “magic” that has uncovered latent physician innovation towards value.
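The directional ROI referenced above follows the calculation described in the Methods: annual maintenance cost is subtracted from the total attributed savings to obtain annual net benefit, which is then divided by the annual maintenance cost. A minimal sketch of this arithmetic, using illustrative dollar figures that are not the program’s actual data:

```python
def reinvestment_roi(total_savings: float, maintenance_cost: float) -> float:
    """Directional ROI from the health system's perspective.

    net benefit = total attributed savings - annual maintenance cost
    ROI         = net benefit / annual maintenance cost
    """
    net_benefit = total_savings - maintenance_cost
    return net_benefit / maintenance_cost

# Hypothetical example: $500,000 in attributed savings against
# $100,000 in annual program maintenance cost.
print(reinvestment_roi(500_000, 100_000))  # → 4.0, ie, a 4:1 return
```

Because the maintenance cost here was derived from self-reported time estimates and public salary data, the result should be read as a directional estimate supporting the qualitative findings, not a formal budget impact analysis.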

Ethical Approval

This project was reviewed by the Stanford Institutional Review Board and did not qualify as human subjects research (Protocol ID #51946).

Consent for Publication

All authors consent to the publication of this manuscript in its current form with included graphics.

Acknowledgments

The authors would like to acknowledge Renee Box, Michelle DeNatale, and Quinn McKenna from Stanford Health for their support of this project.

Funding

This study was supported through the Stanford-Intermountain Fellowship in Population Health, Delivery Science, and Primary Care, and neither institution played an editorial role in the writing of this manuscript.

Disclosure

The authors report no conflicts of interest in this work.

References

1. Keehan SP, Cuckler GA, Poisal JA, et al. National health expenditure projections, 2019–28: expected rebound in prices drives rising spending growth. Health Aff. 2020;39(4):704–714. doi:10.1377/hlthaff.2020.00094

2. Zuvekas SH, Cohen JW. Fee-for-service, while much maligned, remains the dominant payment method for physician visits. Health Aff. 2016;35(3):411–414. doi:10.1377/hlthaff.2015.1291

3. Porter ME. What is value in health care? N Engl J Med. 2010;363(26):2477–2481. doi:10.1056/NEJMp1011024

4. Scheurer D, Crabtree E, Cawley PJ, Lee TH. The value equation: enhancing patient outcomes while constraining costs. Am J Med Sci. 2016;351(1):44–51. doi:10.1016/j.amjms.2015.10.013

5. Chambers P, Benz L, Boat A. Patient and family experience in the healthcare value equation. Curr Treat Options Pediatr. 2016;2(4):267–279. doi:10.1007/s40746-016-0072-6

6. Porter ME, Lee TH. The strategy that will fix health care. Harvard Business Review; October 2013.

7. Agrawal S, Taitsman J, Cassel C. Educating physicians about responsible management of finite resources. JAMA. 2013;309(11):1115. doi:10.1001/jama.2013.1013

8. Crosson F. Change the microenvironment. Delivery system reform essential to control costs. Mod Healthc. 2009;39(17):20–21.

9. Byrnes J. Great physician engagement is key to great quality. Physician Leadersh J. 2015;2(2):40–42.

10. Snell AJ, Briscoe D, Dickson G. From the inside out: the engagement of physicians as leaders in health care settings. Qual Health Res. 2011;21(7):952–967. doi:10.1177/1049732311399780

11. Perreira TA, Perrier L, Prokopy M, Neves-Mera L, Persaud DD. Physician engagement: a concept analysis. J Healthc Leadersh. 2019;11:101–113. doi:10.2147/JHL.S214765

12. Flodgren G, Eccles MP, Shepperd S, Scott A, Parmelli E, Beyer FR. An overview of reviews evaluating the effectiveness of financial incentives in changing healthcare professional behaviours and patient outcomes. Cochrane Database Syst Rev. 2011. doi:10.1002/14651858.CD009255

13. Phipps-Taylor M, Shortell SM. More than money: motivating physician behavior change in Accountable Care Organizations. Milbank Q. 2016;94(4):832–861. doi:10.1111/1468-0009.12230

14. Kaissi A. Enhancing physician engagement: an international perspective. Int J Health Serv. 2014;44(3):567–592. doi:10.2190/HS.44.3.h

15. Reynolds M. Gainsharing: a cost-reduction strategy that may be back. HFM. 2002;56(1):58–64.

16. Wilensky GR, Wolter N, Fischer MM. Gain sharing: a good concept getting a bad name? Health Aff. 2007;26(1):w58–w67. doi:10.1377/hlthaff.26.1.w58

17. Office of Inspector General | U.S. Department of Health and Human Services. Advisory Opinions. Available from: https://oig.hhs.gov/compliance/advisory-opinions/index.asp. Accessed December 20, 2018.

18. Vilendrer SM, Asch SM, Anzai Y, Maggio P. An incentive to innovate: improving health care value and restoring physician autonomy through physician-directed reinvestment. Acad Med. 2020;95(11):1702–1706. doi:10.1097/ACM.0000000000003650

19. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC. Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci. 2009;4(1). doi:10.1186/1748-5908-4-50

20. Carter N, Bryant-Lukosius D, DiCenso A, Blythe J, Neville AJ. The use of triangulation in qualitative research. Oncol Nurs Forum. 2014;41(5):545–547. doi:10.1188/14.ONF.545-547

21. Morse JM. Critical analysis of strategies for determining rigor in qualitative inquiry. Qual Health Res. 2015;25(9):1212–1222. doi:10.1177/1049732315588501

22. Birt L, Scott S, Cavers D, Campbell C, Walter F. Member checking: a tool to enhance trustworthiness or merely a nod to validation? Qual Health Res. 2016;26(13):1802–1811. doi:10.1177/1049732316654870

23. Harvey L. Beyond member-checking: a dialogic approach to the research interview. Int J Res Method Educ. 2015;38(1):23–38. doi:10.1080/1743727X.2014.914487

24. Yin R. Qualitative Research from Start to Finish. The Guilford Press; 2011.

25. Miles MB, Huberman AM, Saldana J. Qualitative Data Analysis. 4th ed. Los Angeles: Sage; 2019.

26. Proctor E, Silmere H, Raghavan R, et al. Outcomes for implementation research: conceptual distinctions, measurement challenges, and research agenda. Adm Policy Ment Health. 2011;38(2):65–76. doi:10.1007/s10488-010-0319-7

27. Mauskopf JA, Sullivan SD, Annemans L, et al. Principles of good practice for budget impact analysis: report of the ISPOR task force on good research practices—Budget impact analysis. Value Health. 2007;10(5):336–347. doi:10.1111/j.1524-4733.2007.00187.x

28. Metropolitan and nonmetropolitan area occupational employment and wage estimates. San Jose-Sunnyvale-Santa Clara, CA: U.S. Bureau of Labor Statistics; March 29, 2019. Available from: https://www.bls.gov/oes/2018/may/oes_41940.htm#(4). Accessed February 21, 2020.

29. News Release: employer costs for employee compensation - March 2020. U.S. Bureau of Labor Statistics; March, 2020. Available from: www.bls.gov/news.release/pdf/ecec.pdf. Accessed February 21, 2020.

30. Neumann PJ. Why don’t Americans use cost-effectiveness analysis? Am J Manag Care. 2004;10(5):308–312.

31. Asch DA, Pauly MV, Muller RW. Asymmetric thinking about return on investment. N Engl J Med. 2016;374(7):606–608. doi:10.1056/NEJMp1512297

32. Kangovi S, Mitra N, Grande D, Long JA, Asch DA. Evidence-based community health worker program addresses unmet social needs and generates positive return on investment. Health Aff. 2020;39(2):207–213. doi:10.1377/hlthaff.2019.00981

33. Sim SY, Watts E, Constenla D, Brenzel L, Patenaude BN. Return on investment from immunization against 10 pathogens in 94 low- and middle-income countries, 2011–30. Health Aff. 2020;39(8):1343–1353. doi:10.1377/hlthaff.2020.00103

34. Bonnabry P, François O. Return on investment: a practical calculation tool to convince your institution. Eur J Hosp Pharm. 2020;27(2):111–113. doi:10.1136/ejhpharm-2018-001733

35. Levinson W, Born K, Wolfson D. Choosing wisely campaigns: a work in progress. JAMA. 2018;319(19):1975. doi:10.1001/jama.2018.2202

36. Markovitz AA, Rozier MD, Ryan AM, et al. Low-value care and clinician engagement in a large medicare shared savings program ACO: a survey of frontline clinicians. J Gen Intern Med. 2020;35(1):133–141. doi:10.1007/s11606-019-05511-8

37. Judson TJ, Volpp KG, Detsky AS. Harnessing the right combination of extrinsic and intrinsic motivation to change physician behavior. JAMA. 2015;314(21):2233. doi:10.1001/jama.2015.15015

38. Herzer KR, Pronovost PJ. Physician motivation: listening to what pay-for-performance programs and quality improvement collaboratives are telling us. Jt Comm J Qual Patient Saf. 2015;41(11):522–528. doi:10.1016/s1553-7250(15)41069-4

39. Doran T, Maurer KA, Ryan AM. Impact of provider incentives on quality and value of health care. Annu Rev Public Health. 2017;38(1):449–465. doi:10.1146/annurev-publhealth-032315-021457

40. Kao AC. Driven to care: aligning external motivators with intrinsic motivation. Health Serv Res. 2015;50:2216–2222. doi:10.1111/1475-6773.12422

41. Gandhi R, Stiell I, Forster A, et al. Evaluating physician awareness of common health care costs in the emergency department. CJEM. 2018;20(4):539–549. doi:10.1017/cem.2017.43

42. Allan GM, Lexchin J, Wiebe N. Physician awareness of drug cost: a systematic review. PLoS Med. 2007;4(9):e283. doi:10.1371/journal.pmed.0040283

43. Allan GM, Lexchin J. Physician awareness of diagnostic and nondrug therapeutic costs: a systematic review. Int J Technol Assess Health Care. 2008;24(02):158–165. doi:10.1017/S0266462308080227

44. Vijayasarathi A, Duszak R, Gelbard RB, Mullins ME. Knowledge of the costs of diagnostic imaging: a survey of physician trainees at a large academic medical center. J Am Coll Radiol. 2016;13(11):1304–1310. doi:10.1016/j.jacr.2016.05.009

45. Korn LM, Reichert S, Simon T, Halm EA. Improving physicians’ knowledge of the costs of common medications and willingness to consider costs when prescribing. J Gen Intern Med. 2003;18(1):31–37. doi:10.1046/j.1525-1497.2003.20115.x

46. Ginzburg SB, Schwartz J, Deutsch S, Elkowitz DE, Lucito R, Hirsch JE. Using a problem/case-based learning program to increase first and second year medical students’ discussions of health care cost topics. J Med Educ Curric Dev. 2019;6:238212051989117. doi:10.1177/2382120519891178

47. Zanotti K, Somasegar S, Hooper MW, Hopp E. Improving Value-based care education in a fellowship by incorporating ACGME competencies. J Grad Med Educ. 2019;11(6):668–673. doi:10.4300/JGME-D-19-00311.1

48. Feldman LS, Shihab HM, Thiemann D, et al. Impact of providing fee data on laboratory test ordering: a controlled clinical trial. JAMA Intern Med. 2013;173(10):903. doi:10.1001/jamainternmed.2013.232

49. Silvestri MT, Xu X, Long T, et al. Impact of cost display on ordering patterns for hospital laboratory and imaging services. J Gen Intern Med. 2018;33(8):1268–1275. doi:10.1007/s11606-018-4495-6

50. Goetz C, Rotman SR, Hartoularos G, Bishop TF. The effect of charge display on cost of care and physician practice behaviors: a systematic review. J Gen Intern Med. 2015;30(6):835–842. doi:10.1007/s11606-015-3226-5

51. West CP, Dyrbye LN, Shanafelt TD. Physician burnout: contributors, consequences and solutions. J Intern Med. 2018;283(6):516–529. doi:10.1111/joim.12752

52. Friedberg MW, Chen PG, Van Busum KR, et al. Factors affecting physician professional satisfaction and their implications for patient care, health systems, and health policy. Rand Health Q. 2014;3(4):1.

53. Ruiz Colón G, Yang J, Svec D, et al. Physicians leading physicians: a physician engagement intervention decreases inappropriate use of IICU level of care accommodations. Am J Med Qual. 2021;36(6):387–394. doi:10.1097/01.JMQ.0000735480.43566.f9
