
Comparing Virtual vs In-Person Immersive Leadership Training for Physicians


Received 4 May 2023

Accepted for publication 8 July 2023

Published 11 August 2023, Volume 2023:15, Pages 139–152

DOI https://doi.org/10.2147/JHL.S411091


Review by Single anonymous peer review


Editor who approved publication: Dr Pavani Rangachari



Claudia SP Fernandez,1 Caroline N Hays,1 Georgina Adatsi,1 Cheryl C Noble,2 Michelle Abel-Shoup,1 AnnaMarie Connolly3,4

1Department of Maternal and Child Health, UNC Gillings School of Global Public Health, University of North Carolina at Chapel Hill, Chapel Hill, NC, USA; 2Evaluation Consultant, Scotts Valley, CA, USA; 3American College of Obstetricians and Gynecologists, Washington, DC, USA; 4Department of Obstetrics and Gynecology (Emeritus), UNC School of Medicine, University of North Carolina at Chapel Hill, Chapel Hill, NC, USA

Correspondence: Claudia SP Fernandez, Maternal and Child Health Department, Gillings School of Global Public Health, 135 Dauer Drive, University of North Carolina at Chapel Hill, 426 Rosenau Hall, Chapel Hill, NC, 27599, USA, Tel +1 919-451-6231, Fax +1-919-966-0458, Email [email protected]

Purpose: The COVID-19 pandemic disrupted in-person workforce development programs. Our immersive physician-oriented leadership institute was suspended in 2020, resumed in 2021 with a virtual program, and reconvened in-person training in 2022. We used this opportunity to compare the participant experience, including reported knowledge acquisition and ability gains, between these nearly identical curricula delivered in vastly different circumstances and formats.
Participants and Methods: We describe the differences in immersive leadership training implementation and the adaptations made for virtual vs in-person engagement of two cohorts of OB-GYN physicians. Data were collected from virtual (n=32) and in-person (n=39) participants via post-session surveys. Quantitative data reported include participant ratings of knowledge gain and ability gain. Qualitative data were obtained via open-ended feedback questions for each session and on the overall experience.
Results: Knowledge and ability scores indicated strong, statistically significant gains in both formats, with some reported learning gains higher in the virtual training. Qualitative analysis of participant feedback identified a number of positive themes common to the in-person and virtual settings, with virtual participants noting how the construction of the virtual program produced highly effective experiences and engagement. Constructive or negative feedback on the virtual setting included time constraints (eg, a desire for more sessions overall or more time per session) and technical difficulties. Positive comments focused on the effectiveness of the experience in both formats and the surprising ability to connect meaningfully with others, even in a virtual environment. However, many comments also clearly supported a preference for in-person over virtual experiences.
Conclusion: Immersive physician leadership training can be effectively delivered via virtual or in-person methods, resulting in significant reported gains of knowledge and skills. These programs provide valuable interpersonal connections and skills to support physician leadership. While both formats are effective, participants clearly prefer in-person leadership development experiences and interpersonal learning.

Keywords: workforce development, leadership, physicians, virtual training, onsite training

Plain Language Summary

Physician leadership training is most commonly provided in face-to-face settings where participants are brought together for a few days of immersive and intensive training. However, the pandemic of 2020 disrupted the ability to convene groups for this kind of development experience. The American College of Obstetricians and Gynecologists (ACOG)-Robert C. Cefalo Leadership Institute trains obstetricians and gynecologists who are active in their professional organization (ACOG) in a 4-day program each year. In 2020, this program was suspended; in 2021, it was offered virtually; and in 2022, it was offered in a face-to-face setting. Because the curricula were nearly identical between the virtually deployed program and the in-person, onsite program, we compared participants’ reported knowledge and skills gains, as well as their open-ended comments about the two experiences. We found that participants’ learning grew by a statistically significant amount in both types of training situations, and in some cases, the numbers were even higher in the virtual setting. Participants’ comments indicated that the leadership training and development was engaging and helped them build meaningful connections and practical skills in both the virtual and in-person environments; however, there was a clear preference for in-person training.

Introduction

Physicians are recognized leaders in healthcare who have important influence over the internal policies and culture of their teams and organizations. Further, physician leaders can successfully impact systems of care through employing a sophisticated set of value-guided skills that go beyond basic and specialty medical training. Physician leadership extends to more than just the bottom line of the enterprise1 as these leaders and their organizations increasingly focus on reducing health disparities in their communities and addressing health equity.2–4

Given that medical and specialty training is an intensive experience, leadership strengthening and development opportunities typically occur at mid-career to senior-career phases, when physicians are more likely to impact organizations and systems. At these career points, they begin to fill roles in the leadership chain of command in their institutions1,5–7 and engage in efforts to advance health equity in their communities.8,9 In noting the importance of effective physician leadership,10 both Frich et al11 and Hopkins et al5 describe how leadership programs targeted to physicians aim to strengthen common leadership competencies. They note that while programs generally show increased self-assessed knowledge and expertise, relating those outcomes to increased self-awareness and organizational or system-level impacts remains a persistent challenge.

Given the demands and impact of the pandemic, physician leadership training takes on even greater importance in terms of leader skills for creating psychological safety12,13 on healthcare teams, for addressing physician resilience,14 for improving communication skills during rapidly evolving and complex challenges,15,16 and for understanding equity concerns subsequent to the crisis.13,17 Yet, the pandemic interrupted capacity-building and leadership development opportunities.9 One such program was the American College of Obstetricians and Gynecologists (ACOG) Robert C. Cefalo National Leadership Institute (www.ACOGLeadershipInstitute.org) (hereafter referred to as ACOG-Cefalo), which was suspended in March 2020 due to COVID-19, held virtually in 2021, and then resumed in-person in 2022.

Since its inception in 2006, the ACOG-Cefalo program has trained over 600 physician leaders in women’s healthcare. The ACOG-Cefalo leadership program serves OB-GYNs who are active in their professional organization and participate by nomination from their District leadership. The program provides an intensive, fast-paced experience in which participants learn how to employ “equity-centered leadership skills” to nurture thought diversity, promote psychological safety, and address the needs of diverse communities. These skills are developed through training in tools for building effective partnerships, strengthening participants’ abilities to lead innovative teams, and promoting innovative and creative thinking. Participants gain skills to advocate effectively for their patients and profession as they advocate for change within their organizations and communities. Topics also include successful negotiation skills to create win/win outcomes for all stakeholders in order to foster results that strengthen, rather than diminish, their partnerships. Through the program’s concrete, tools-based communications core, participants gain skills to translate complex scientific findings into understandable, accessible language for patients, other medical professionals, policy makers, and the media. Participants learn how to manage communications in times of high tension and crisis, helping them lead their teams through the turbulent waters that healthcare organizations worldwide so commonly face. The ACOG-Cefalo program also offers a focus on physician resiliency, providing both experiential learning and coaching on strategies to prevent personal and professional burnout.

In previous studies, the ACOG-Cefalo program has demonstrated effectiveness in moving the needle on competency and efficacy, as well as on preparedness for new leadership opportunities.10 These findings were highlighted in the Geerts et al18 systematic review of physician leadership programs, which indicated that the outcomes of the ACOG-Cefalo program were notable when compared to the body of available literature. It is important to note that providing training in such sophisticated and nuanced skills became far more complex under the fully remote operations and travel restrictions imposed starting in 2020. In this analysis, we seek to understand the impacts on physician leadership development given the pandemic-related disruption in learning deployment strategies coupled with the increased stress and demands on healthcare providers. We explore differences in program outcomes regarding skill development and leadership self-efficacy by comparing evaluation data from virtual and in-person leadership training programs with two cohorts of physician leaders participating in the ACOG-Cefalo program in 2021 (virtual) and 2022 (in-person).

Methods

Setting

The 2021 ACOG-Cefalo program was hosted virtually using the Zoom19 platform coupled with the Whova Meeting App20 to foster connection, networking, and organization from pre-program to 3 months post-program. The Whova App stored recordings of virtual sessions and provided electronic forums/functionality for spontaneous participant-directed connection and “meet ups” either during breaks or after the program. The 2021 program was held over 4 days, convening from 11:00 AM through 6:00 PM eastern time to accommodate participants more reasonably across 4 time zones. The degree of use of the App was not a focus of this study, and these data were not collected.

The 2022 ACOG-Cefalo program was hosted in-person over 3.5 days, from 8:00 AM through 5:00–6:00 PM eastern time. The Zoom platform was not used, although the Whova Meeting App was offered as a general meeting facilitation tool, giving both cohorts identical opportunities to meet and connect in self-directed ways, both during and post-program. The degree of use of the Whova App was not a focus of this study, and these data were not collected. Table 1 provides a comparison of the convening formats, numbers of sessions offered, number of training hours, and number of attendees in the virtual and the onsite programs. Both programs were interactive, grounded in adult learning theory,21 and, to the greatest extent possible, provided by the same instructors.

Table 1 Comparison of 2021 and 2022 ACOG Robert C. Cefalo National Leadership Institutes

Data Collection

Participant demographics were collected through self-report on exit surveys. Participants in the 2021 and 2022 cohorts were asked to electronically provide feedback through QualtricsXM22 on their experience at the end of each session. Participants rated sessions using a 7-point Likert scale, where 1 = Very Poor, 2 = Poor, 3 = Fair, 4 = Average, 5 = Good, 6 = Excellent, and 7 = Outstanding.23 For in-person training attendees not wishing to use the e-format, paper forms were made available. Ratings of shifts in knowledge and ability were gathered using a retrospective pre- and post-test, with participants asked at the end of each session to rate their knowledge of the session content and their ability to use the session content before and after attending the session on a 7-point Likert scale, with 1 being the lowest and 7 being the highest.23–27

Open-ended qualitative feedback via QualtricsXM22 was sought from participants at the end of each session and in an exit survey at the end of the institute. At the end of each session, the two open-ended prompts were 1) “Please provide any comments or feedback about this presentation. Is there anything we need to consider to make it better for the next cohort?” and 2) “Do you have any general feedback about the retreat thus far?” In the exit survey, the three open-ended prompts were 1) “Reflecting on your overall experience participating in this retreat, please provide any comments or feedback (i.e., What worked well? What did not work well? What should we keep for next time? Any suggestions for improvement?)”, 2) “Were there any skills or lessons you learned this week that were particularly ‘sticky’? [‘sticky’ in that they stuck with you, strongly resonated with you, or moved you?] If so, please describe”, and 3) “Reflecting on your response to the previous question, do you feel like the lessons you listed resonated even more in the context of the COVID-19 pandemic? If so, why do you think that?” In the 2021 exit survey, an additional qualitative prompt was “What made virtual training an attractive option for you to commit to, given that society has not fully exited the COVID-19 crisis yet?” All IRB protocols and ethical procedures were followed (IRB protocol #18-2037).

Data Analysis

Data collected using Qualtrics XM survey software were exported into a secure MS Excel database for preliminary descriptive analyses. StataSE 16 64-bit software was used for statistical analysis and reporting. The final sample excluded individuals with missing data on a test-by-test basis. Individual session evaluation data were analyzed in MS Excel for descriptive statistics, including means and standard deviations. The differences in means were calculated between the mean pre- and post-ratings for each knowledge and ability question in each individual session, and nonparametric testing in StataSE was used to assess the significance of the differences in means. A nonparametric approach was chosen because of the low sample sizes in individual sessions. The Wilcoxon matched-pairs signed-rank test was used to test for significant differences between participants’ retrospective pre- and post-test ratings of knowledge and ability as a result of attending a specific session.
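To make this pipeline concrete, the sketch below shows how a single session’s retrospective pre- and post-ratings could be summarized and tested with the Wilcoxon matched-pairs signed-rank test. It is a minimal illustration in Python; the actual analysis was performed in MS Excel and StataSE 16, and the column names and example ratings shown here are hypothetical.

```python
# Minimal sketch of the per-session retrospective pre/post analysis.
# Hypothetical column names and example ratings; the study's analysis
# was performed in MS Excel and StataSE 16, not in Python.
import pandas as pd
from scipy.stats import wilcoxon

# Each row is one participant's retrospective pre- and post-session
# knowledge rating on the 7-point scale (1 = lowest, 7 = highest).
session = pd.DataFrame({
    "knowledge_pre":  [3, 4, 2, 5, 3, 4, 5, 3],
    "knowledge_post": [6, 6, 5, 7, 6, 6, 7, 5],
})

# Exclude participants with missing data on a test-by-test basis.
paired = session.dropna(subset=["knowledge_pre", "knowledge_post"])

# Descriptive statistics: means, standard deviations, and mean difference.
print(paired.agg(["mean", "std"]).round(2))
delta = (paired["knowledge_post"] - paired["knowledge_pre"]).mean()
print(f"Mean difference (post - pre): {delta:.2f}")

# Wilcoxon matched-pairs signed-rank test: a nonparametric test suited
# to small samples of paired, ordinal Likert-type ratings.
stat, p_value = wilcoxon(paired["knowledge_pre"], paired["knowledge_post"])
print(f"Wilcoxon statistic = {stat:.1f}, p = {p_value:.4f}")
```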

Qualitative feedback from each year was analyzed by two graduate-level research assistants to identify emergent themes. Each feedback statement was coded independently to determine the frequency of each theme in each year. Participants often covered multiple topics in a single feedback submission, so multiple codes may apply to each response. Data were analyzed anonymously, so there is the possibility that one participant may have submitted the same or similar feedback in response to multiple prompts.
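As a simple illustration of the theme-frequency tally described above, the hypothetical sketch below counts how often each theme code appears in each program year when a single response can carry multiple codes. The theme labels and coded responses are invented for illustration; the actual coding was performed manually by two research assistants.

```python
# Hypothetical sketch of tallying manually coded qualitative feedback by year.
# Theme labels and coded responses are invented for illustration only.
from collections import Counter

# Each coded response is a (year, [theme codes]) pair; a single response
# may carry multiple codes.
coded_responses = [
    (2021, ["connection", "desire_for_more_content"]),
    (2021, ["technical_difficulties", "connection"]),
    (2022, ["connection", "breakout_groups"]),
    (2022, ["general_positive"]),
]

# Count how often each theme appears within each program year.
theme_counts: dict[int, Counter] = {}
for year, codes in coded_responses:
    theme_counts.setdefault(year, Counter()).update(codes)

for year in sorted(theme_counts):
    print(year, dict(theme_counts[year]))
```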

Results

Descriptive Results

Participants in the two program years were highly similar with respect to age, gender, race, years since residency and in their types of specialty practice (Table 2). Fewer attendees participated in the virtual session as compared to the in-person session (32 vs 39).

Table 2 Demographics* of ACOG Robert C. Cefalo National Leadership Institute Participants in 2021 and 2022

Quantitative Results

Regardless of virtual or in-person delivery format, participants rated the sessions highly, and retrospective pre- and post-test scores for knowledge and ability showed statistically significant differences for each of the sessions (Table 3). The mean pre- to post-differences in both knowledge and ability ratings were larger for the virtual program than for the in-person program for each session that had a direct counterpart, with the exception of Peer Coaching (an interactive exercise).

Table 3 Session Satisfaction and Changes in Knowledge and Ability Scores Across Virtual (2021) and in-Person (2022) ACOG Robert C. Cefalo National Leadership Institutes

Qualitative Results

Analysis of qualitative themes from open-ended response data collected after each session (Table 4) indicates highly similar responses from the virtual and the in-person formats regarding success in connecting and networking with other attendees, reactions to breakout group experiences, learning new concepts, and general positive comments about the experiences. In general, there were more comments from virtual participants than from in-person participants. Differences that emerged included virtual participants’ desire for more program content, comments on the challenges of the virtual setting itself, and a desire for more small group time; these themes did not emerge in the in-person setting.

Table 4 Comparison of Qualitative Results Between Virtual (2021)a and in-Person (2022)b Program Delivery

Discussion

This study compares differently delivered but nearly identical programs in equity-centered leadership training for physicians and indicates that the experience can be equally impactful whether the format is virtual or in-person. Our findings were similar to those of Nilaad et al,28 who studied learning outcomes of a live virtual vs in-person pharmacy curriculum in medical and pharmacy students, and to those of Reddy et al,29 who examined the acceptability of virtual vs in-person grand rounds. The data here show that participants responded positively in both situations and reported similarly large learning gains. Program curricula followed industry-standard models,30 employing practically focused skill-building sessions31 grounded in adult learning theory,21 as well as valid and reliable psychological assessment instruments, including a multi-rater 360 survey. Content topics were taught by expert faculty who remained largely identical across the two years and two deployment models (Table 3). In leadership development programs, psychological assessment tools are used to increase both self-awareness and awareness of others, a crucial skill of emotional intelligence, as detailed by Fernandez, Steffen, and Upshaw30 and Fernandez and Fernandez.31

We have previously shown in the Clinical Scholars program that pivoting an in-person equity-centered leadership training to a virtual one did not sacrifice learning gains when training interprofessional teams of healthcare providers.9 Similar to the present work, that research investigated the short-term outcomes of training. However, the Clinical Scholars program is a funded, 3-year development program serving interprofessional teams working to ameliorate health disparities in their local communities, whereas the ACOG-Cefalo program serves physician leaders participating as individuals and is a short-term intervention (4 days virtually, 3.5 days in-person). The finding of positive responses to our equity-centered leadership training regardless of format (virtual or in-person) seems to be supported by the significant improvements in knowledge and abilities across sessions, whether the program was delivered virtually or in-person to the physicians in the cohorts reported here.

There were changes made to accommodate the virtual environment, such as shortening the convening time to prevent “zoom burnout”32 and to accommodate business hours for the four time zones in which participants resided. Based on prior work9 describing the efforts required to pivot an in-person program to a virtual one, we paid particular attention to crafting breakout sessions to be strategic in relationship to content, timing, and group make-up. Our research group was surprised and gratified (internal communications) that the virtual participants wanted more content, more examples, and more interaction given the tiring nature of day-long virtual meetings.32

There was an interesting trend for a higher proportion of the 2021 virtual participants to respond to the session evaluation requests, even though the virtual cohort was smaller than the in-person cohort of 2022. In the virtual environment, we dedicated time at the end of each session and provided a visual reminder to evaluate the program, along with a link that was loaded into the chat function. During that time, screens were blanked out and a “fluid balancing” break was offered. During the in-person program, at the end of each presentation a verbal reminder to tap into the electronic evaluation form was given, a break was offered, and participants would often immediately start forming small groups and chatting. The lower response rates of the in-person participants might be a function of the more social nature of the in-person setting and the greater challenge of placing the evaluation link conveniently before them. Links were sent via email each morning; however, because the program provides physical notebooks, not all participants had a computer active during the sessions. Indeed, we discouraged the use of computers during in-session time to focus participants’ attention on the session content and interaction, quite a contrast to the virtual setting where everyone participated via computer. Links were functional via mobile devices as well as laptops.

An observation our team made was that, after the program, the 2021 virtual cohort only minimally used the program-based email connection system or the Whova meeting app-based networking system provided. However, the 2022 in-person group, which had access to the same systems to foster further networking, connected dozens of times per week for a few months post-program, choosing to use the group email. The observed content of the 2022 group’s connections centered on strategizing policy issues and collaborating across states and districts. While we did not collect data around this difference, we hypothesize that the interaction could be attributable to several factors. First, there could be individual cohort differences. Second, significant contextual changes in US healthcare policy around women’s health emerged in 2022,33 which would reasonably present a shared area of concern for these physicians. Third, it is possible that a higher level of comfort developed in the cohort by being in the same shared space together. Any or all of these possibilities could have fostered the supportive connections observed in the weeks and months after the in-person program.

Participants in both programs noted success in connecting with other participants during the program. For the virtual program, connections were carefully planned and implemented to ensure that participants interacted with every cohort member during the 4 days of training. For the in-person program, some strategic planning around participant groupings occurred but most opportunities happened organically. While both delivery formats of the program were successful in terms of a variety of participant scores, the qualitative feedback was clear that a preference remains for such programs to be offered in-person whenever possible (Table 4).

One very interesting finding was the trend for knowledge acquisition and ability gains to be rated higher by the virtual participants. In most cases, the content of the sessions, the speakers, and even the slides were identical. To prevent “zoom fatigue”,32 a great deal of attention was paid to frequent breakouts into small groups for content processing and application in the 2021 virtual program, which could have played a role in this difference, in addition to the program days being shorter. The difference could also have been related to the participants themselves, whose willingness to take part in the first-ever virtually deployed ACOG-Cefalo program might indicate a special degree of motivation and interest in leadership development and strengthening, particularly leadership with an equity-centered lens. Several individuals nominated for the 2021 program deferred their enrollment until the program would be held in person again, partially accounting for the smaller virtual cohort. In both the virtual and in-person formats, participants rated the equity-centered leadership content as impacting their knowledge and skills, resulting in statistically significant shifts in scores regardless of the format of the learning and engagement.

The preference of mid-career and senior-level healthcare professionals for in-person training experiences incorporating multiple delivery methods that focus on practice-based learning and feedback is well known.9,11,18,34–37 Our work and that of others support the usefulness of in-person, intensive “Leadership Boost” trainings for developing leadership skills.2,10,38–40 While across their careers physicians can be exposed to a variety of experiences that ultimately contribute to their skills in building teams and leading their organizations through challenging times, this intensive “skills boosting” approach to training can be an important leadership strengthening component that offers a faster track to self-insight and leadership skills acquisition.2,4,9–11,18,38–40 While such approaches are more supportive of work–life integration concerns, we also believe the immersive experience strongly supports learning, whether that learning is conducted virtually or in-person. We use the retrospective pre- and post-test, with questions on ability, as a proxy for measuring self-efficacy.10,24–27,40–48 “Training” is an experience (such as attending the program), while “development” refers to the ability to implement what one has learned in real-world settings. Implementation and use of skills in a workplace setting is fostered by first supporting participants’ development through a focus on specific competencies in a learning environment.2,40–49 In our evaluation approach, we use the “ability” measure as an indicator of self-efficacy.44,45 Monitoring such self-efficacy (ability) indicators across cohorts over time allows the program faculty to adapt and transition the curriculum to meet contextual changes in societal concerns, such as an increasing focus on physician resilience over the years and a greater emphasis on tools for managing interpersonal and organizational conflict over diversity-related tensions.

The work of Throgmorton et al6 found that building relationships and confidence in physicians through leadership training can contribute to ongoing physician resilience. The suspension of such development opportunities during the early phases of the pandemic had the unfortunate side-effect of robbing the physician workforce of yet another arm of support for their resilience. Our curriculum addressed the topic, with large gains noted in both knowledge (a delta of 2.25 for Virtual and 1.93 for In-person venues) and ability (a delta of 2.63 for Virtual and 2.43 for In-person). From this experience, we believe that it is markedly helpful to physician participants for programs to directly address the topic of resilience and to present skills and tools, regardless of the format of the interaction.

Leadership development programs10,11,18 focus on building leadership skills with the goal of strengthening the leadership potential and effectiveness of participants; thus, leadership development leads to both a greater number of leaders and a greater degree of leadership across the profession.50 Measures presented here show significant gains, even in the short-term, in participants’ reported ability, our indicator of self-efficacy, immediately after completing a retreat, regardless of whether that experience was virtual or in-person. Fassiotto et al1 and Throgmorton et al6 both showed that participants take on expanded leadership roles, are more likely to hold regional or national leadership positions, implement new projects and ideas, and engage in team development6 after completing leadership training. Our prior work has also found similar outcomes.10 Further, since the ACOG-Cefalo program launched in 2006, 19 presidents of national professional organizations for physicians in three countries have emerged from alumni of this program (unpublished data).

Limitations

As with all studies utilizing self-reported measures, this study is not without limitations. Social desirability bias may influence reports of satisfaction, knowledge, and ability ratings.51 Participants may desire to be seen as more knowledgeable or more skilled. A positive regard for program faculty may also influence question scoring. While this study offers comparison groups experiencing a nearly identical curriculum offered either virtually or in-person, it does not include a traditional control group (ie, a group convened but not engaged in leadership content), as that would be highly impractical.52 Indeed, the relatively few studies offering a control group in leadership and management training tend to be quite small53,54 or focus on delivering similar leadership materials via differing methods.55 Retrospective pre- and post-tests allow the cohort to act as its own control by comparing ratings of knowledge and ability before and after the training and are highly useful when a true control group for comparison is not practical or possible.

It was unfortunate that the meeting app (Whova) was not significantly used or accessed by participants in either cohort. It may be that such a sophisticated application lends itself better to a more complex meeting with several competing sessions, and that less complex events, where the participants are generally all together, simply do not need the organizational benefits such technology provides. The participants’ limited use of the Whova platform prevents us from better understanding how alternative technologies can facilitate networking.

Conclusion

This comparison of nearly identical leadership training programs supports the efficacy of either virtual or in-person modalities as highly effective strategies for physician leadership development. While some participants may be more motivated to attend in a virtual environment, overall, the qualitative feedback data support a strong preference for an in-person experience, which seemed to facilitate stronger development of professional networks post-program. Significant improvements in knowledge and ability across a wide variety of sophisticated skills should be expected in physician leadership development whether the program employs a virtual or in-person platform. Our experience indicates that tools for creating psychological safety and inclusive cultures, understanding motivation, engaging in crisis and media communications, negotiation skills, leading change, physician resilience, and more can be effectively imparted in either format when instituting physician leadership programs. Further studies should assess the degree to which these improvements in physician learning endure over time, relate to career or leadership opportunity advancement, and serve to improve system-level problems.

Acknowledgments

The authors would like to acknowledge Emilie Mathura, DO, and Ms Suzanne Singer, MPH, for their contributions to this work. This work was supported by the American College of Obstetricians and Gynecologists.

Disclosure

Dr Claudia Fernandez: Mr Ruben Fernandez, JD, is the co-author of It-FACTOR Leadership, a text used in the ACOG-Cefalo Leadership Institute and is related to the corresponding author. He also serves as an executive coach in the program. Dr Claudia SP Fernandez reports grants to University of North Carolina at Chapel Hill, during the conduct of the study. The authors report no other conflicts of interest in this work.

References

1. Fassiotto M, Maldonado Y, Hopkins J. A long-term follow-up of a physician leadership program. J Health Organ Manag. 2018;32(1):56–68. doi:10.1108/JHOM-08-2017-0208

2. Fernandez CSP, Noble CC, Chandler C, et al. Equity-centered leadership training found to be both relevant and impactful by interprofessional teams of healthcare clinicians: recommendations for workforce development efforts to update leadership training. Consult Psychol J. 2022;2022:1.

3. Fernandez CSP, Corbie-Smith G, Green M, Brandert K, Noble C, Guarav D. Clinical scholars: effective approaches to leadership development. In: Fernandez CSP, Corbie-Smith G, editors. Leading Community Based Changes in the Culture of Health in the US: Experiences in Developing the Team and Impacting the Community. London: InTech Publishers; 2021:9–28.

4. Fernandez CSP, Corbie-Smith G. Leading Community Based Changes in the Culture of Health in the US: Experiences in Developing the Team and Impacting the Community. 1st ed. InTech Publishers; 2021.

5. Hopkins J, Fassiotto M, Ku MC, Mammo D, Valantine H. Designing a physician leadership development program based on effective models of physician education. Health Care Manag Rev. 2018;43(4):293–302. doi:10.1097/HMR.0000000000000146

6. Throgmorton C, Mitchell T, Morley T, Snyder M. Evaluating a physician leadership development program – a mixed methods approach. J Health Organ Manag. 2016;30(3):390–407. doi:10.1108/JHOM-11-2014-0187

7. Dannels SA, Yamagata H, McDade SA, et al. Evaluating a leadership program: a comparative, longitudinal study to assess the impact of the Executive Leadership in Academic Medicine (ELAM) program for women. Acad Med. 2008;83:488–495. doi:10.1097/ACM.0b013e31816be551

8. Corbie G, Brandert K, Noble CC, et al. Advancing health equity through equity-centered leadership development with interprofessional healthcare. J Gen Intern Med. 2022;37:4120–4129. doi:10.1007/s11606-022-07529-x

9. Fernandez CSP, Green M, Noble CC, et al. Training “pivots” from the pandemic: a comparison of the clinical scholars leadership program in-person vs. virtual synchronous training. J Healthc Leadersh. 2021;13:63–75. doi:10.2147/JHL.S282881

10. Fernandez CSP, Noble CC, Jensen ET, Chapin J. Improving leadership skills in physicians: a 6-month retrospective study. J Leadersh Stud. 2016;9(4):6–19. doi:10.1002/jls.21420

11. Frich JC, Brewster AL, Cherlin EJ, Bradley EH. Leadership development programs for physicians: a systematic review. J Gen Int Med. 2015;30(5):656–674. doi:10.1007/s11606-014-3141-1

12. Edmondson AC. The Fearless Organization. John Wiley & Sons; 2018.

13. Nembhard IM, Edmondson AC. Making it safe: the effects of leader inclusiveness and professional status on psychological safety and improvement efforts in health care teams. J Organ Behav. 2006;27:941–966. doi:10.1002/job.413

14. Rangachari P, Woods JL. Preserving organizational resilience, patient safety, and staff retention during COVID-19 requires a holistic consideration of the psychological safety of healthcare workers. Int J Environ Res Public Health. 2020;17(12):4267. doi:10.3390/IJERPH17124267

15. Schwartz RW, Pogge C. Physician leadership: essential skills in a changing environment. Am J Surg. 2000;180(3):187–192. doi:10.1016/S0002-9610(00)00481-5

16. Kain NA, Jardine CG. “Keep it short and sweet” Improving risk communication to family physicians during public health crises. Can Fam Physician. 2020;66(3):e99–e106.

17. Pang EM, Sey R, de Beritto T, Lee HC, Powell CM. Advancing health equity by translating lessons learned from NICU family visitations during the COVID-19 pandemic. NeoReviews. 2021;22(1):e1–e6. doi:10.1542/NEO.22-1-E1

18. Geerts J, Goodall AH, Agius S. Evidence-based leadership development for physicians: a systematic literature review. Soc Sci Med. 2020;246:112709. doi:10.1016/j.socscimed.2019.112709

19. Zoom Video Communications, Inc. Zoom Meetings; 2020. Available from: www.zoom.us. Accessed July 21, 2023.

20. Whova Event App. All-in-one event management software for in-person, hybrid, and virtual events; 2020. Available from: www.whova.com. Accessed July 21, 2023.

21. Merriam S. Andragogy and Self-Directed Learning: Pillars of Adult Learning Theory. New Directions for Adult and Continuing Education. © Jossey-Bass, A Publishing Unit of John Wiley & Sons, Inc; 2001.

22. QualtricsXM. Software. Provo, UT; 2020 Available from: www.qualtrics.com. Accessed July 21, 2023.

23. Lozano LM, García-Cueto E, Muñiz J. Effect of the number of response categories on the reliability and validity of rating scales. Methodology. 2008;4(2):73–79. doi:10.1027/1614-2241.4.2.73

24. Lam TCM, Bengo P. A comparison of three retrospective self-reporting methods of measuring change in instructional practice. Am J Eval. 2003;24(1):65–80. doi:10.1016/S1098-2140(02)00273-4

25. Pratt CC, McGuigan WM, Katzev AR. Measuring program outcomes: using retrospective pretest methodology. Am J Eval. 2000;21(3):341–349. doi:10.1016/S1098-2140(00)00089-8

26. Sprangers M, Hoogstraten J. Pretesting effects in retrospective pretest-posttest designs. J Appl Psychol. 1989;74(2):265–272. doi:10.1037/0021-9010.74.2.265

27. Rohs FR. Response shift bias: a problem in evaluating leadership development with self-report pretest-posttest measures. J Agric Educ. 1999;40(4):28–37. doi:10.5032/jae.1999.04028

28. Nilaad SD, Lin E, Bailey J, et al. Learning outcomes in a live virtual versus in-person curriculum for medical and pharmacy students. ATS Sch. 2022;3(3):399–412. PMID: 36312802; PMCID: PMC9585697. doi:10.34197/ats-scholar.2022-0001OC

29. Reddy GB, Ortega M, Dodds SD, Brown MD. Virtual versus in-person grand rounds in orthopaedics: a framework for implementation and participant-reported outcomes. J Am Acad Orthop Surg Glob Res Rev. 2022;6(1):e21.00308. PMID: 35044329; PMCID: PMC8772702. doi:10.5435/JAAOSGlobal-D-21-00308

30. Fernandez CSP, Steffen D, Upshaw V. Leadership for public health. In: Shi L, Johnson JS, editors. Novick and Morrow’s Public Health Administration: Principles for Population Based Management. 4th ed. Sudbury, MA: Jones and Bartlett Publishers; 2020.

31. Fernandez C, Fernandez R. It-Factor Leadership: Become a Better Leader in 13 Steps. Chapel Hill, NC: FastTrack Leadership; 2014.

32. Mheidly N, Fares MY, Fares J. Coping with stress and burnout associated with telecommunication and online learning. Front Public Health. 2020;8:672. doi:10.3389/fpubh.2020.574969

33. Coen-Sanchez K, Ebenso B, El-Mowafi IM, Berghs M, Idriss-Wheeler D, Yaya S. Repercussions of overturning Roe V. Wade for women across systems and beyond borders. Reprod Health. 2022;19:184. doi:10.1186/s12978-022-01490-y

34. Grason H, Kavanagh L, Dooley S, et al. Findings from an assessment of state title v workforce development needs. Matern Child Health J. 2012;16(1):7–20. doi:10.1007/s10995-010-0701-9

35. Grimm BL, Johansson P, Nayar P, Apenteng BA, Opoku S, Nguyen A. Assessing the education and training needs of Nebraska’s public health workforce. Front Public Health. 2015;3:161. doi:10.3389/fpubh.2015.00161

36. Lacerenza C, Reyes DL, Marlow SL, Joseph DL, Salas E. Leadership training design, delivery, and implementation: a meta-analysis. J Appl Psychol. 2017;102(12):1686–1718. doi:10.1037/apl0000241

37. Sonnino RE. Health care leadership development and training: progress and pitfalls. J Healthc Leadersh. 2016;8:19–29. doi:10.2147/JHL.S68068

38. McGrath ER, Bacso D, Andrews JG, Rice SA. Intentional interprofessional leadership in maternal and child health. Leadersh Health Serv. 2019;32(2):212–225. doi:10.1108/LHS-04-2018-0026

39. Margolis LH, Rosenberg A, Umble K, Chewning L. Effects of interdisciplinary training on MCH professionals, organizations and systems. Matern Child Health J. 2013;17:949–958. doi:10.1007/s10995-012-1078-8

40. Fernandez CSP, Noble CC, Jensen E, Steffen D. Moving the needle: a retrospective pre- and post-analysis of improving perceived abilities across 20 leadership skills. Matern Child Health J. 2014;19:343–352. doi:10.1007/s10995-014-1573-1

41. Tsoh JY, Kuo AK, Barr JW, et al. Developing faculty leadership from ‘within’: a 12-year reflection from an internal faculty leadership development program of an academic health sciences center. Med Educ Online. 2018;24(1):1.

42. Dave G, Noble C, Chandler C, Corbie-Smith G, Fernandez CSP. Clinical scholars: using program evaluation to inform leadership development. In: Fernandez CSP, Corbie-Smith G, editors. Leading Community Based Changes in the Culture of Health in the US - Experiences in Developing the Team and Impacting the Community. IntechOpen; 2021.

43. Chiaburu DS, Lindsay DR. Can do or will do? The importance of self-efficacy and instrumentality for training transfer. Hum Resource Dev Int. 2008;11(2):199–206. doi:10.1080/13678860801933004

44. Anderson DW, Krajewski HT, Goffin RD, Jackson DN. A leadership self-efficacy taxonomy and its relation to effective leadership. Leadersh Q. 2008;19(5):595–608. doi:10.1016/j.leaqua.2008.07.003

45. Fitzgerald S, Schutte NS. Increasing transformational leadership through enhancing self-efficacy. J Manag Dev. 2010;29(5):495–505. doi:10.1108/02621711011039240

46. Paglis LL, Green SG. Leadership self-efficacy and managers’ motivation for leading change. J Organ Behav. 2002;23(2):215–235. doi:10.1002/job.137

47. Paglis LL. Leadership self-efficacy: research findings and practical applications. J Manag Dev. 2010;29(9):771–782. doi:10.1108/02621711011072487

48. Pearlmutter S. Self-efficacy and organizational change leadership. Adm Soc Work. 1998;22(3):23–38. doi:10.1300/J147v22n03_02

49. Packard T, Jones L. An outcomes evaluation of a leadership development initiative. J Manag Dev. 2015;34(2):153–168. doi:10.1108/JMD-05-2013-0063

50. Grimm BL, Tibbits M, Maloney S, Johansson P, Siahpush M. Suggestions for strengthening the public health leadership development model. Pedagogy Health Promot. 2017;4(2):88–92. doi:10.1177/2373379917721721

51. Furnham A. Response bias, social desirability and dissimulation. Pers Indiv Differ. 1986;7(3):385–400. doi:10.1016/0191-8869(86)90014-0

52. Day CS, Tabrizi S, Kramer J, Yule AC, Ahn BS. Effectiveness of the AAOS leadership fellows program for orthopaedic surgeons. J Bone Joint Surg Am. 2010;92:2700–2708. doi:10.2106/JBJS.J.00272

53. LoVasco L, Smith L, Yorke AM, Talley SA. Students in a doctor of physical therapy program. J Allied Health. 2019;48(3):209–216.

54. Malling B, Mortensen L, Bonderup T, Scherpbier A, Ringsted C. Combining a leadership course and multi-source feedback has no effect on leadership skills of leaders in postgraduate medical education. An intervention study with a control group. BMC Med Educ. 2009;9:72. doi:10.1186/1472-6920-9-72

55. LoPresti L, Ginn P, Treat R. Using a simulated practice to improve practice management learning. Fam Med. 2009;41(9):640–645.
