
Training “Pivots” from the Pandemic: Lessons Learned Transitioning from In-Person to Virtual Synchronous Training in the Clinical Scholars Leadership Program


Received 19 September 2020

Accepted for publication 14 December 2020

Published 17 February 2021 Volume 2021:13 Pages 63–75

DOI https://doi.org/10.2147/JHL.S282881




Claudia SP Fernandez,1 Melissa A Green,2 Cheryl C Noble,3 Kathleen Brandert,4 Katherine Donnald,5 Madison R Walker,1 Ellison Henry,1 Angela Rosenberg,5 Gaurav Dave,2 Giselle Corbie-Smith2

1Department of Maternal and Child Health, UNC Gillings School of Global Public Health, University of North Carolina at Chapel Hill, Chapel Hill, NC, USA; 2Center for Health Equity Research, UNC School of Medicine Department of Social Medicine, University of North Carolina at Chapel Hill, Chapel Hill, NC, USA; 3Private Evaluation Consultancy, Scotts Valley, CA, USA; 4Office of Public Health Practice and Department of Health Promotion, Social and Behavioral Health, College of Public Health, University of Nebraska Medical Center, Omaha, NE, USA; 5Inside Out Enneagram Consulting, Pittsboro, NC, USA

Correspondence: Claudia SP Fernandez
Department of Maternal and Child Health, UNC Gillings School of Global Public Health, University of North Carolina at Chapel Hill, 426 Rosenau Hall, 134 Dauer Drive, Chapel Hill, NC, 27599, USA
Tel +1 919-843-5560
Fax +1-919-966-0458
Email [email protected]

Introduction: Since the inception of distance-based teaching modalities, a debate has ensued over the quality of online versus in-person instruction. Due to the COVID-19 pandemic, a number of teaching environments—including leadership development trainings for post-graduate learners—have been thrust into exploring the virtual learning environment more thoroughly. One three-year leadership development program for interdisciplinary healthcare professionals transitioned three simultaneous leadership intensives from in-person to online in the spring of 2020.
Methods: Documented changes in overall training length, session length, and session format are described. Further, evaluative data were collected from participants at both retreats via post-session surveys. Ninety-three participants attended the 2019 retreat, and 92 participants attended the 2020 virtual retreat. Quantitative data from three rating questions per session are reported: 1) overall session satisfaction, 2) participants’ reported knowledge gain, and 3) participants’ reported ability gain. Qualitative data were obtained via two open-ended feedback questions per session.
Results: In comparing pre/post scores for knowledge and ability, participants had meaningful (and in some cases greater) self-reported gains in knowledge and ability measures in the online environment, as compared to the in-person environment. Participants reported statistically significant gains in all sessions for both knowledge and ability. Qualitative analysis of participant feedback identified a number of positive themes similar across the in-person and virtual settings. Negative or constructive feedback on the virtual setting included time-constraint issues (eg too much content in one session, a desire for more sessions overall), technical difficulties, and the loss of social connection and networking with fellow participants as compared to in-person trainings.
Discussion: While meaningful shifts in knowledge and ability ratings indicate that a successful transition to online learning is possible, several disadvantages remain: the preparation time for both faculty and participants was considerable, overall content in each session had to be reduced due to time constraints, and participants reported feeling the loss of one-on-one connections with their peers in the training. Lessons learned in transitioning leadership training from an in-person to an online experience are highlighted.

Keywords: leadership, training, virtual, clinical scholars, pandemic, workforce development

Plain Language Summary

Leadership training is commonly provided using face-to-face methods, where groups are brought together for multi-day onsite intensive skills-development training. Clinical Scholars (funded by the Robert Wood Johnson Foundation) is one such leadership training program, comprising three years of leadership and health equity training for interdisciplinary teams of healthcare providers across the United States. Due to the COVID-19 pandemic, the CS program was required to quickly pivot three of its seven in-person intensive retreats, with 92 participants in attendance, to a virtual format. We found that successful transformation from onsite to virtual training required shortened sessions (half days vs full days), trimmed content (about 35% of what was originally delivered), and marked increases in staffing (approximately 6 additional hours per session: roughly three for technical support with faculty and three or more for faculty “translation” of the material). Nonetheless, measures of knowledge and skills indicate that virtual learning can be effective; however, participants report missing the face-to-face peer interactions and learning opportunities. While the adaptations can support successful learning and skill-building, the costs in terms of content and peer connection are significant and should be seriously considered before making such pivots to the virtual environment permanent post-pandemic.

Introduction

Since the inception of distance-based teaching modalities, a debate has ensued over the quality of online versus in-person instruction.1,2 When online teaching strategies were nascent, serious apprehensions were expressed,3 with some holding concerns over issues such as teaching quality, faculty experience, technological capability, and student learning.2–4 These concerns held true whether the education was conducted in a “training and development” approach for post-graduate learners or in degree-bearing educational programs.3,5 During this time of rapid development, programs also demonstrated some success with distance-based virtual approaches.5–7 For example, Alexander and colleagues found high rates of acceptability (76% strongly agreed, 24% agreed to recommend the course) of a hybrid nursing course that provided a large proportion of the work via the internet.6 Further, they found that of 58 course completers (73% of course initiators), a full 80% strongly agreed that they intended to use the information gained in the course at their job.6

Garrison and colleagues posit that the transition to virtual environments requires dynamic inquiry and learning across the three elements of cognitive presence, social presence, and teaching presence.8 Effective programs, whether offered in an in-person or virtual format, foster interactions between faculty and participants, among participants, and between participants and the offered content.9 Kolb noted that higher social/peer connectivity (eg relationship to instruction and content) enables meaningful learning and experiences.5 As many of these authors discuss, transitioning a curriculum from a traditional face-to-face learning environment to a virtual environment generally takes some level of expertise, effort and time, faculty training, user (participant) training, and infrastructure investment. Yet, whether educational or training programs had fully embraced online education or not, the arrival of the SARS-CoV-2 (COVID-19) pandemic quickly altered the academic landscape, requiring education and training programs nationwide to fully transition to distance-based online teaching.

Leadership training programs for post-graduate learners are typically rooted in an immersive, distraction-free, face-to-face training methodology using a “retreat” or “intensive” model, and now face many of the same challenges.5,10 If the programs are long term (more than 6 months), it is typical to offer some type of continuing connectivity, usually via a webinar series, to develop and maintain shared identity and learning.5 Some programs have reported implementing ongoing executive coaching,11–13 online connectivity between faculty and participants,14 and/or online self-directed learning.2,3,15 Due to the COVID-19 pandemic, and similar to higher education, leadership training programs across the nation have had to either halt training or pivot mid-course to distance-based/online technology. Programs grounded in an intensive in-person leadership training model suddenly faced unique challenges to smoothly enact this pivot, given that the pedagogy of such programs is often based on adult learning theory, which uses a high degree of interaction, may incorporate simulation-based learning, and suggests participants benefit greatly from informal peer-to-peer learning and networking. Leadership development programs traditionally incorporate several evidence-based psychological assessments and simulation activities into their pedagogy,16 which present additional challenges to translate meaningfully to a distance-based format.

The Clinical Scholars National Leadership Institute (online at ClinicalScholarsNLI.org), more commonly known as Clinical Scholars (CS), faced the need to quickly redesign the spring 2020 in-person training and development program, which offered three different, concurrent intensives to the 99 participants then enrolled as 1st-, 2nd-, or 3rd-year Institute Fellows. Funded by the Robert Wood Johnson Foundation (RWJF) in 2015, CS is a three-year leadership development program enrolling interdisciplinary teams of mid-career and beyond healthcare professionals in cohorts of up to 35 members. Fellows of the CS program progress through seven unique onsite curricular experiences (referred to by a color-coding system for clarity: Red, Orange, Yellow, Green, Blue, Indigo, and Violet). Onsite intensive programs traditionally convene every 6 months, with Red, Yellow, and Blue offered each fall and Orange, Green, and Indigo offered in the spring. The Violet curriculum represents a graduation program and is convened separately. In spring 2020, the Orange, Green, and Indigo intensive programs were quickly re-designed from multi-day in-person programs to a multi-day virtual format (Table 1). This sudden change gave the CS leadership team the opportunity to compare identical elements of the same curriculum when offered in person (2019) and virtually (2020), and to share lessons learned from the re-design and virtual deployment. This paper describes the transition from onsite to virtual learning of the three simultaneous leadership training intensives originally planned for spring 2020 (the Orange, Green, and Indigo curricula) and includes session planning logistics and a comparison of participant knowledge and ability acquisition between a face-to-face and a virtual learning environment.

Table 1 Comparison of 2019 and 2020 Spring Retreat Programs

Materials and Methods

Descriptive Data Collection

In late February 2020, the three original residential onsite programs (Orange, Green, Indigo) were rescheduled from April to June as residential programs and reduced from a five-day to a four-day schedule. As the pandemic continued to worsen in April 2020, the format was re-planned for a virtual platform in June. The CS team examined the sessions planned, abbreviated the traditional eight-hour teaching days, and adapted select topics for virtual sessions. Table 1 compares the number of sessions and training hours for each color-coded retreat in both years, with 2019 representing onsite/in-person training and 2020 representing virtual training. Not all sessions from 2019 were offered in 2020. Descriptions of session structure and content represent the “teacher” perspective of those delivering the program.

Quantitative and Qualitative Data Collection

For all 2019 and 2020 retreat sessions, Fellows completed evaluation surveys to rate each training session along seven dimensions, three of which are presented in this analysis: overall presentation satisfaction, reported change in session-specific knowledge, and reported change in the ability to utilize session-specific skills (analysis of all dimensions will be presented in a forthcoming publication). All ratings were obtained using a 7-point scale (strongly, moderately, or mildly disagree; neutral; mildly, moderately, or strongly agree).17 A retrospective pre- and post-test format was used for both knowledge and ability ratings, with participants asked to rate their session-specific knowledge and ability both before and after attending the session. This approach was used to diminish the effect of response-shift bias on the evaluations,18–21 is supported by previous research, and is commonly used in educational and training programs.12,20–28 All session-specific knowledge and ability questions were developed by the evaluation team based on the learning objectives provided by the presenters for each session. To ensure accuracy, questions were reviewed by presenters and curriculum staff for both the face-to-face residential sessions in 2019 and the newly adapted virtual sessions in 2020.
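As a concrete illustration of this rating scheme, the sketch below maps the 7-point agreement labels to numeric scores and computes one retrospective pre-to-post change. This is a minimal sketch, not the study instrument: the label strings, function name, and example values are ours.

```python
# Illustrative only: maps the 7-point agreement scale to numeric scores
# and computes a retrospective pre-to-post change for a single rating item.
LIKERT = {
    "strongly disagree": 1, "moderately disagree": 2, "mildly disagree": 3,
    "neutral": 4,
    "mildly agree": 5, "moderately agree": 6, "strongly agree": 7,
}

def score_change(pre_label: str, post_label: str) -> int:
    """Return post minus pre for one retrospectively rated item."""
    return LIKERT[post_label] - LIKERT[pre_label]

# A Fellow who, looking back, rates pre-session knowledge "mildly disagree"
# and post-session knowledge "moderately agree" shows a change of +3.
print(score_change("mildly disagree", "moderately agree"))  # 6 - 3 = 3
```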

Open-ended qualitative feedback data were collected in both 2019 and 2020. Fellows were asked to respond to two open-ended questions. The first question solicited feedback on the specific session they had just attended, and the second question solicited overall feedback on the entire retreat so far. For the in-person 2019 retreat, session-specific feedback was collected via the following prompt: “Is there any other feedback you would like to share about this particular session? Anything we can do to make it better for the next cohort?” Overall session feedback was collected via the following prompt: “Do you have any feedback or thoughts to provide about the onsite institute thus far?” For the 2020 virtual/distance-learning retreat, the open-ended questions were adapted to capture Fellows’ experience in a virtual setting. Session-specific feedback was collected via the following prompt:

Reflecting on your experience participating in this session in a VIRTUAL setting, please provide any comments or feedback. (i.e., What worked well? What didn’t work well? Any suggestions for improvement?)

Overall retreat feedback was collected via the following prompt: “How can we improve the virtual retreat experience overall if we need to offer similar learning opportunities in the future?”

Data were obtained using the online REDCap survey software (www.project-redcap.org). In 2019, Fellows were strongly encouraged to complete surveys online, though paper surveys were made available to those who preferred them. All surveys were completed online during the 2020 retreat. The Institutional Review Board (IRB) determined that this study was educational in nature and exempt from IRB review and approval.

Data Analysis

Quantitative Data

Data were collected using REDCap software and exported into a secure MS Excel© database for preliminary descriptive analyses. IBM SPSS Statistics 27© software was used for statistical analysis and reporting. The final sample excluded individuals with missing data on a test-by-test basis. Data were assessed using descriptive analyses, including mean, maximum, and minimum values, and 95% confidence intervals. In addition, the differences in means were calculated between the pre- and post-values for each knowledge and ability question, and nonparametric testing was used to assess significance (Table 2). A nonparametric approach was chosen because of the low sample sizes from each training session and an assumption that the evaluative data had a non-normal, likely left-skewed distribution. The Wilcoxon signed-rank test was chosen to explore whether there was a substantive difference between Fellows’ pre- and post-session levels of knowledge and ability as a result of attending a particular learning session. Two tests were run using data from each learning session: one to explore differences in knowledge, and one to explore differences in reported ability. Missing data were treated using pairwise deletion for each test.
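The study ran these tests in SPSS; purely as an illustrative analogue, a minimal Python/scipy sketch of the same procedure (pairwise deletion, then a Wilcoxon signed-rank test on retrospective pre/post ratings) is shown below, using invented ratings rather than study data.

```python
# Illustrative analogue of the analysis described above; ratings are invented.
import numpy as np
from scipy.stats import wilcoxon

# Hypothetical retrospective pre/post knowledge ratings for one session
# (7-point scale); np.nan marks a Fellow who skipped an item.
pre = np.array([3, 4, 2, np.nan, 3, 4, 2, 3])
post = np.array([6, 6, 5, 6, 5, np.nan, 6, 5])

# Pairwise deletion: keep only Fellows with both a pre and a post rating.
mask = ~np.isnan(pre) & ~np.isnan(post)
diffs = post[mask] - pre[mask]

# Wilcoxon signed-rank test of whether post ratings differ from pre ratings.
stat, p = wilcoxon(pre[mask], post[mask])
print(f"n = {mask.sum()}, mean change = {diffs.mean():.2f}, p = {p:.3f}")
```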

Table 2 Session Satisfaction and Changes in Knowledge and Ability Scores Across Retreat Sessions: 2019 Face-to-Face vs 2020 Virtual Instruction

Qualitative Data

Session-specific and overall retreat open-ended feedback were collected from Fellows at both the 2019 in-person and 2020 virtual retreats. Data were collected using REDCap software and exported into a secure MS Excel© database for analysis. For this section of the analysis, data from all sessions at each retreat were included, in contrast to the quantitative analysis, which only included data from those sessions taught at both the 2019 and 2020 retreats. Data from the 2019 retreat included feedback from 46 total sessions; data from the 2020 retreat included feedback from 26 total sessions. To gain an overall understanding of Fellows’ experiences, feedback from both open-ended questions (session-specific and overall) from each of the three cohorts in attendance at each retreat was combined for the analysis. Data from each retreat were analyzed independently by a graduate-level research assistant (MRW) to identify emergent themes. The analysis was conducted to determine feedback theme frequency. As such, multiple themes were coded from many individual submissions, because Fellows often covered more than one topic in their feedback. Data were collected anonymously; thus, the same person may have submitted the same or similar feedback multiple times. The frequency of topics addressed in the feedback was tabulated, and similar topics were combined into emergent themes (eg the theme of “Structural Adjustment” includes topics such as change session order, provide additional materials, etc.).
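As a sketch of the tabulation step described above, topic frequencies can be rolled up into emergent themes as shown below. The codes, themes, and submissions here are invented for illustration; the study’s coding was done by hand, not with this tooling.

```python
# Illustrative only: tabulate coded feedback topics into emergent themes.
from collections import Counter

# Each open-ended submission may carry multiple coded topics.
coded_submissions = [
    ["change session order", "provide additional materials"],
    ["technical difficulties"],
    ["provide additional materials", "more breakout time"],
]

# Map individual topics to broader emergent themes (eg "Structural Adjustment").
THEME_OF = {
    "change session order": "Structural Adjustment",
    "provide additional materials": "Structural Adjustment",
    "more breakout time": "Time Constraints",
    "technical difficulties": "Technical Difficulties",
}

theme_counts = Counter(
    THEME_OF[topic] for codes in coded_submissions for topic in codes
)
print(theme_counts.most_common())
# [('Structural Adjustment', 3), ('Technical Difficulties', 1), ('Time Constraints', 1)]
```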

Results

Descriptive Data

The CS leadership team decided to reduce the number of curriculum sessions offered virtually because of the combination of the emotional toll of the coronavirus on healthcare professionals, the exhaustion concomitant with day-long virtual meetings, and the recognition that many Fellows were balancing multiple other needs, particularly at home. Based on judgments of best fit, the CS leadership team decided that specific sessions would be taught virtually and synchronously, some would be taught virtually and asynchronously with live-webinar discussion to follow during the summer, and some would be offered via self-directed learning without specific follow-up. Sessions that were based on simulations were removed from the schedule for inclusion in a future program when in-person convening might be possible. The change in date due to the original postponement also influenced the decision to reduce the number of sessions. The content from the 2019 in-person sessions was translated into a virtual format for the 2020 virtual retreat, with a total of 11 sessions held in both the spring 2019 and spring 2020 retreats. Seven of the 11 sessions (63.64%) were taught by the same instructors in both 2019 and 2020 (see Table 2). In 2019, all sessions were mandatory for cohort members. In 2020, due to the effects of the pandemic, all sessions were encouraged but not mandatory. Table 2 describes the virtual synchronous sessions only, not those offered in later follow-on or self-paced modalities.

Quantitative Data

Table 2 compares the overall session satisfaction ratings and pre- and post-scores for knowledge and ability for the learning sessions presented at both the in-person (2019) and virtual (2020) retreats. Participants rated all sessions highly in both years: the lowest overall mean session satisfaction rating was 5.81 in 2019 (Media Communications) and 5.09 in 2020 (Concept Mapping), while the highest ratings for 2019 and 2020 were 6.89 and 6.79 (both for Presenting, Feedback, and Coaching). Analysis of participants’ reported knowledge increases showed a range of 1.46–2.42 in 2019 and of 1.27–2.54 in 2020. The reported increases in ability ranged from 1.22 to 2.29 in 2019 and from 1.39 to 2.67 in 2020. Wilcoxon signed-rank tests for each session indicated that post-session ranks were statistically significantly higher than pre-session ranks for all sessions in both the in-person and virtual retreats (see Appendix 1 for Wilcoxon signed-rank test results).

Qualitative Data

Fellows provided positive feedback and suggestions for future retreats. Table 3 compares the top emergent themes identified in the qualitative content analysis from the 2019 and 2020 spring retreats. A total of 619 submissions were received across all sessions in the 2019 retreat, and 416 total submissions were received across all sessions in the 2020 retreat. Frequencies and code definitions are also provided.

Table 3 Combined Session-Specific and Overall Retreat Feedback: Face-to-Face vs Virtual Instruction

Examples of positive feedback provided by Fellows for the 2019 retreat include: an overall positive experience; content presented was helpful and useful for team Wicked Problem Impact Project (WPIP) work, as well as in their careers more generally; in-person team time and time with other Fellows (to work on specific exercises, to get feedback, or to socialize) was greatly appreciated; and session facilitators were engaging and knowledgeable. Primary constructive feedback revolved around structural changes that could be implemented to improve the retreat experience, such as altering the session order, providing additional materials, and adding pre-retreat work. Additionally, some Fellows suggested content-specific adjustments, such as including more inclusive examples, or noted that certain sessions were not relevant to all Fellows depending on their existing skill sets.

Fellows reported that the virtual/distance-learning format of the spring 2020 retreat provided unique benefits and barriers to their overall learning and experience. Examples of positive feedback for the 2020 virtual retreat included: an overall positive experience despite the new virtual environment; appreciation of the innovative and frequent use of “breakout rooms” (ie, a Zoom meeting feature that allows a large meeting to be split into multiple smaller meetings, allowing for more intimate and participatory discussions); appreciation of the multiple opportunities to practice as a group; appreciation of the multiple opportunities to receive feedback from peers; content presented was helpful and useful for team WPIP work, as well as in their careers more generally; and appreciation of the engaging and dynamic speakers, who helped prevent Zoom fatigue and increase engagement with the curriculum.

While Fellows reported significant positive feedback for the 2020 virtual retreat, the top-reported themes concerned constructive changes for future retreats. Fellows reported more varied and frequent disadvantages within the virtual format, including: a decreased sense of social connection; too much content crammed into a short session and/or important content left out due to time constraints; too little time in sessions and breakout rooms, as well as too few sessions overall; technical difficulties (eg internet connectivity, trouble hearing); logistical difficulties (eg having to keep track of multiple link locations, receiving a higher number of emails about the retreat); and the overall feeling that a virtual experience does not compare to an in-person experience, regardless of how well the virtual retreat was executed.

Discussion

Prior to 2020, the CS program did not have any plans to alter the retreat design to a virtual format. Due to concerns over the COVID-19 pandemic, in February 2020 the CS leadership team postponed the April spring onsite intensive programs to June. However, in April 2020 the severity of the outbreak became clear and the team decided to shift to a virtual retreat. While distance-based education has stirred a debate over both the quality of the learning experience and its impact on participant learning, the CS experience found that meaningful self-reported gains in knowledge and ability are possible in a virtual format, even on nuanced and sophisticated topics (Table 2). In standard 360-degree review assessments, a 0.5 difference is held to be a meaningful and notable difference with respect to behavioral ratings;29 however, the changes reported here are nearly always at least three times greater than that level. Previous studies22,27 have noted statistically significant changes in similar ratings of skills development comparable to the levels seen in Table 2. The smallest changes in knowledge ratings in each year were 1.46 (Peer Coaching, 2019) and 1.27 (Recovery, Resilience & Resistance: Communities Responding to Disasters, 2020), while the largest changes in knowledge ratings were 2.42 [Presenting, Feedback, and Coaching (a practice-based session), 2019] and 2.54 (Policy Solutions for Wicked Problems, 2020). In terms of ability ratings, the smallest score changes each year were 1.22 (Media Communications, 2019) and 1.39 (Presenting Your Best You, 2020), while the largest ability gains measured 2.29 and 2.67 (Policy Solutions for Wicked Problems, 2019 and 2020, respectively). Statistical analysis further supported the meaningful difference between these pre- and post-test ratings. Clearly, these participants report that the “needle is moving”22 to a profound degree on knowledge and skills development. Our results are consistent with previous findings2,30–32 that effective online learning environments share three characteristics: (1) course material designed for learners, delivered by well-prepared instructors who facilitate an interactive environment [Course Materials]; (2) a sense of community [Connection]; and (3) engagement of newly available technology [Technology]. Each of these characteristics is described below.

Course Materials

In the transition from onsite to virtual learning, the Clinical Scholars staff team had to be extremely deliberate about which material to cover, how to properly prepare faculty, and how best to offer an engaging, immersive, and interactive learning environment.

Overall, when it came to course content, our experience showed that providing leadership training virtually was far less efficient than onsite training, due to several factors. As noted above, unlike the typical in-person intensive held in 2019, it was not possible to hold 5 days of 8-hour programming in the virtual environment in 2020, for a variety of reasons. First, we were conscious that many of our participants were frontline healthcare responders in the COVID pandemic, were personally impacted by coronavirus infection, and/or were caretaking for family members. Next, because of the tiring nature of continual on-screen interaction, CS offered only half-day curricula instead of full days. The co-convened CS virtual offerings in 2020 ran from 12:00 PM to 5:30 PM Eastern time (in-person times in 2019 were 8:00 AM to 5:30 PM, plus optional evening social events). The shortened time resulted in the synchronous curricula offering only about 35% (range: 25–46%) of the content offered in 2019. A final concern was the range of time zones in which our participants lived (six time zones, from Puerto Rico to Hawaii), which limited the hours during which the program could occur within reasonable business hours for all participants.
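As a rough illustration of this content-reduction arithmetic (the hours below are hypothetical; Table 1 holds the actual session counts and hours), a curriculum that moved from five 8-hour days to four half-day blocks of synchronous teaching would retain

```latex
% Hypothetical figures, for illustration only (actual hours are in Table 1):
\frac{4 \times 3.5\ \text{h synchronous (2020)}}{5 \times 8\ \text{h (2019)}}
  = \frac{14}{40} = 0.35 \approx 35\%
```

of the original contact time, which matches the scale of reduction reported above.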

Due to these time constraints, the team was forced to make difficult decisions about which content to cut, based on criteria including a) best fit for the new format (eg removing interactive simulations that could not be adapted to the virtual setting); b) context-relevant content (eg Communicating in High Stakes Situations); c) faculty experience with distance-based teaching; and d) ability to meet the overall program core competencies. Despite the hard choices about which sessions to drop, delay, or transform to self-directed learning, the presented curriculum met all of the CS learning objectives and advanced the program core competencies for these sessions.14 However, to ensure that as much content as possible was covered in the new, shortened time frame, some of the sessions required pre-work on the part of participants. This pre-work had not been required in previous years as part of the in-person events and is not represented in Table 1. Pre-work involved completing short (~30 minute) online leadership and health equity modules (WeTrainLeaders.com, n.d.33), the content of which replaced some didactic session material so that the focus could be on the next layer of learning and skills building. This approach modeled the flipped-classroom method. However, even with all of these efforts, not all the information and skills taught in 2019 were incorporated in the 2020 sessions; thus, while learning objectives were met, they were not met as deeply or as broadly.

Another way in which the content shifted in the virtual environment was the time dedicated to sessions repeated in 2020 from the 2019 program. In an effort to support participant attention, the program endeavored to offer sessions in 2-hour time slots, despite the fact that many of the topics were 3- to 4-hour sessions in their in-person format in 2019. These time-related format changes resulted in about 25% of the original session content being trimmed (affected sessions included Concept Mapping, Communicating in High Stakes Situations, Negotiation Seminar, and Policy Solutions for Wicked Problems). In a few cases, the 2-hour time block format increased session time (usually by 25%), which allowed the session to continue much as in the previous year (affected sessions included Peer Coaching and Dealing with Conflict, which were both 90 minutes in 2019). Overall, taking into consideration the reduced session time and the deleted content, the 2020 experience represented ~35% of the materials presented in 2019.

Additionally, our staff worked at length and directly with faculty to prepare them for the virtual teaching space and alleviate technical difficulties.30 Ni1 noted that faculty transitioning to virtual environments may experience a reduced sense of control over the learning environment, which was confirmed by one of the CS external speakers (personal communication). Sun and Chen’s review of effective online practices found that faculty effectively transitioning to online learning environments shift their teaching toward more precise instructions and content delivery.2 In our program, helping external faculty transition to the virtual environment entailed on average 2–3 additional hours per speaker, plus an additional 2–3 hours of technical behind-the-scenes work to prepare for the sessions. As the CS team is made up of many subject-matter experts, internal program faculty typically provide more than 50% of the sessions for any intensive training experience. Internal faculty found that adapting an already highly interactive training session to a virtual environment required a minimum of 3 additional hours of “translation” time. This work included adapting slides, adapting or creating new exercises to suit the virtual environment, creating detailed “Facilitator and Producer Guides”, creating collaborative workspaces for participants (eg Google Docs), and developing new learner-support handouts. The qualitative results above suggest that more targeted work with internal and external speakers was needed to ensure they did not cover more material than could be addressed in the given time frame.

As part of the adaptation to virtual deployment, the CS team worked purposefully to create virtual spaces that encouraged participant engagement and interaction. For example, the CS team required that for approximately every 30 minutes of virtual programming, a session needed to offer at least one form of significant participant engagement and interaction. This engagement could occur via a variety of strategies but mostly took the form of breakout rooms with assigned activities, which were tracked in a web-based shareable space. Staff worked with both internal and external speakers to adapt the content of the most crucial lessons, to avoid a lecture format dominating the experience or the speaker rushing through content. In another strategy designed to cover as much content as possible with as much participant interaction as possible, some sessions embraced a “flipped classroom” approach, as described above. This approach allowed more of the synchronous time to be devoted to interactive application exercises, which is a possible explanation for the slightly higher average knowledge and ability gains in the virtual environment as compared to the in-person one. While greater time was devoted to those exercises, Fellows were exposed to less content overall, which could translate to broader knowledge and ability gains in 2019 but slightly lower perceived skills in that group.
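The roughly one-interaction-per-30-minutes rule described above was enforced through manual review of session plans; as a hypothetical planning aid (not the CS team’s actual tooling), such a rule could also be checked mechanically:

```python
# Hypothetical check of the "engagement roughly every 30 minutes" planning rule.
def meets_engagement_rule(interaction_starts, session_len_min, max_gap_min=30):
    """True if no stretch of the session exceeds max_gap_min without a
    participant-engagement activity (breakout, poll, shared-doc exercise)."""
    points = [0] + sorted(interaction_starts) + [session_len_min]
    return all(b - a <= max_gap_min for a, b in zip(points, points[1:]))

# A 120-minute session with breakouts starting at minutes 25, 55, 85, and 110
# has gaps of 25, 30, 30, 25, and 10 minutes, so it passes.
print(meets_engagement_rule([25, 55, 85, 110], 120))  # True
```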

Lessons Learned: Course Materials

● Virtual learning faces time constraints that do not exist for in-person trainings, including training across multiple time zones and screen-time fatigue.

● Virtual content delivery is far less efficient than in-person delivery.

● Because intensive virtual leadership development was not common pre-COVID-19, program staff will need to work closely with faculty to ensure they create engaging, interactive spaces for virtual learning.

● Reduction of content to fit the required form-factor is necessary and can be challenging for faculty.

● Successful in-person trainings do not automatically convert to successful virtual trainings without attention and intention around engagement and interaction.

Connection

Quantitative results show that it is possible to move the needle on knowledge and ability in a virtual learning environment about as well as in an in-person environment. However, a significant goal of the Clinical Scholars program is not only to create knowledgeable and skilled healthcare leaders, but also to connect them to one another for stronger and broader peer (alumni) networks. Effective leadership initiatives pair immersive learning formats with reflexive self-assessment and opportunities to network with and receive feedback from peers.10 While the evaluation results suggest we were able to meaningfully recreate the learning part of that goal, the connection part fell short of our program’s expectations. Creating networking and relationship-building opportunities as participants discuss and “make meaning” of the experience presents unique challenges in a virtual environment as compared to an immersive in-person one. During the 2019 programs, those opportunities often fell into place naturally, through participants eating meals together, walking to meeting rooms or exercising at the start or end of days, talking during program snack breaks, or relaxing together at social events in the evenings at the venue. A program does not have to intentionally plan a walk to the snack table; it just happens. In the virtual environment, everything must be planned. We started by creating group guidelines and expectations that cameras are on and microphones are at the ready. We continued that thinking with the intentional creation of engagement and interaction in sessions, partly for the learning component and partly for peer interaction and sharing. While qualitative feedback themes from the spring 2020 virtual deployment indicated a high degree of acceptance of the new approaches tried (given the pandemic) and an appreciation for the skills learned, a clear theme emerged: informal peer-to-peer connection time is critical. We needed to purposefully plan virtual small-group “snack breaks” and “fireside chats.” Upon realizing this late in the game, in spring 2020 the CS team experimented with optional, spontaneous 1-hour “listening” sessions to hear participants’ thinking and responses to the curriculum. While those were helpful, in the future the CS team would offer more frequent and timely open spaces for relationship building and networking that are participant-driven and get to the heart of the connections that 2020 participants said they were lacking.

Although virtual sessions have been offered since the pandemic started with the intent of supporting general peer networking and coping with COVID, participants noted that personal connection, networking, and informal peer-to-peer learning are more challenging in a virtual environment.

Lessons Learned: Connection

● Informal inter-participant connection time is important.

● Offer pre- and post-convening times each day during the virtual training for the participants to reconnect, network, share perspectives, and process the experience within their cohort.

Technology

When the decision was made to switch to online learning, the CS staff knew immediately that they would need additional skills to produce high-quality training in the virtual environment. Given that this was our first deployment of an all-virtual retreat, the CS team participated in a Virtual Facilitation Bootcamp rooted in the Technology of Participation® methods34 to enhance skills in producing sessions. Additionally, CS staff reviewed the literature to glean best practices for hosting virtual meetings, and applied lessons from their own participant-centered training and experiences with Zoom convenings to the 2020 trainings. To share this learning, the team created a group resource hub, which was made available to all team members and external faculty presenters. In addition, to ensure the faculty felt comfortable with the technology, the CS staff adapted a checklist and speaker form used for onsite sessions to confirm audio, video, and internet connection quality, as well as to identify session components and Zoom technology needs (such as polling features and breakout rooms) and other technology needs (such as videos, Google documents and spreadsheets, and online visual team collaboration software).

In addition, our team identified staff to serve as “producers” to manage virtual synchronous sessions. These individuals were responsible for tasks such as admitting program participants (preventing “Zoom bombing”) and managing late entries, questions, and technical assistance needs. We assigned two producers to each session and created Producer/Facilitator Guidelines for each session, with explicit minute-by-minute instructions including links to group activities (such as a Google Doc or a website) for breakout exercises. Sessions led by external speakers also involved a faculty host. Thus, with this additional work, we found that virtual sessions required twice the staffing of our typical onsite sessions.

While even an onsite program requires advance planning including site visits, the amount of pre-planning to make a successful pivot to distance-based virtual synchronous delivery was considerable. In addition, every session was given a technology dry run to ensure a smooth deployment. One hidden cost of providing the curriculum in a virtual format was the time investment in working with speakers. Our team estimated that working with speakers’ virtual teaching and technology challenges required an average of 2–3 hours per speaker, as noted above. This time was in addition to the previously listed translation-to-virtual activities.

Lessons Learned: Technology

● Additional skills development and training for staff and faculty are highly desirable when making a pivot to a new technology or a new approach.

● Managing synchronous virtual sessions requires far greater staff resources.

In sum, the Clinical Scholars program experienced a highly successful transition to virtual delivery, but not without concomitant costs that effectively reduced the content delivered and increased the infrastructure requirements. While the program experienced financial savings in terms of transportation and venue costs, virtual deployment remains far less efficient and comprehensive than face-to-face delivery. While the inefficiencies for the teaching role are evident, it is important to recognize that this concept of “efficiency” for learners cannot be measured in time alone. Although travel time was not required and full days of engagement became half days, it is crucial to recognize that when effectiveness is sacrificed to either time constraints or the constraints of a virtual interactive platform, less learning happens, making the experience inherently less efficient. The learning may still be high quality, but there is invisible added time when that learning is either moved elsewhere (such as to self-directed asynchronous formats) or simply does not occur. In the experience reported here, we carefully ensured that all the overarching competencies could still be addressed; even so, some content was inevitably missed in making the pivot from onsite to virtual. In general, materials can be explored in both greater depth and breadth in onsite convenings, which also offer the benefit of participant-initiated informal processing of learning, which itself both enhances learning and promotes networking.

Even if a workforce development program given in the virtual environment matched all the learning objectives and skills training accomplished in the face-to-face environment, there would still be “invisible” costs to consider. The cost of participants’ time away from clinical and other duties to attend such programs is not insignificant. This must be considered, particularly given that the inefficiencies of virtual learning would require doubling or tripling the total time needed to deliver an identical program offering a comparable amount of content and materials to an in-person training. While virtual delivery can be successful, our experience indicates that it achieves far more targeted and selective learning than residential programs do. We encourage others to assess the above factors before concluding that post-pandemic virtual delivery is an appropriate selection merely to save on venue and transportation costs.

Limitations

This examination has several limitations. First, our program seemed to benefit from the fact that all our participants had prior experience with one another and had engaged in many sessions in their Fall 2018 program (Red, Yellow, and Blue curricula), which were designed to develop trust and networking. Third-year Fellows had the benefit of working with their cohort for five previous training retreats, second-year Fellows for three, and first-year Fellows for one. The insights might be far more limited when applied to groups who convene as strangers and have not yet had the opportunity to bond. Second, given that this was the first major pivot to virtual delivery for the vast majority of our speakers and staff, a certain learning curve is to be expected. While our staff has extensive experience administering a robust distance-based, multi-platform component of the program (including webinars, executive and team coaching, and expert consultants), running three simultaneous virtual intensive retreats was quite different. If the pandemic lasts multiple years, system improvements and experience may streamline delivery. Further, as virtual learning becomes more acceptable and “normal” to participants, it may be that session times or intensity can increase. However, it may also be that the emphasis on participant interaction and engagement will instead shape the format and delivery of in-person convened training.

There are limitations in the quantitative data and statistical procedures. As a result of the small sample sizes from each training session, a nonparametric approach (the Wilcoxon signed-rank test) was chosen in the analysis phase. Nonparametric analyses tend to have lower power, which may be exacerbated by a small sample size.

Additionally, there are limitations in the way the qualitative data were collected and analyzed. First, the questions used to gather session-specific and overall feedback differed from 2019 to 2020. Second, the questions were framed to elicit constructive improvement feedback rather than acknowledgement of elements that were working well, which could have biased the data towards identifying problems. Third, all data were coded by a single graduate-level research assistant, introducing an increased likelihood of human error and bias into the coding and interpretation of results. Finally, because all evaluative data are collected anonymously to ensure confidentiality, it is possible that some Fellows may have submitted the same feedback topic for multiple sessions, which could skew the qualitative themes toward the experience of a small number of Fellows. However, the purpose of the qualitative analysis was to better understand the overall themes of Fellows’ experience and to inform internal programmatic discussion about how to improve the retreat curriculum and delivery. The data are not intended to be generalizable to all training programs, but to inform the lessons learned by Clinical Scholars program staff as they orchestrated the shift from in-person to virtual retreat delivery.

Future Research

Future research could benefit those making the pivot from onsite to virtual delivery of leadership training by focusing on (1) how to design course materials for learners specifically suited to virtual delivery; (2) how to prepare and support instructors to facilitate an interactive and engaging virtual environment; (3) how to create a sense of community or “cohort-ness” among participants in the virtual environment; and, (4) development and engagement of newly available technology.2

Conclusions

The current realities of globalization and public safety require educators to have effective skills in adapting face-to-face instruction to online distributed environments. Rapid adaptation of leadership training for a virtual environment requires faculty and design teams to reimagine how learners and faculty interact with the material and with each other. Leadership training offered in a virtual environment can be a positive experience and advance knowledge and skills development. Educators need time and personnel support to find the formula for navigating technology and managing sufficient connectivity across faculty-participant, participant-participant, and participant-content interactions. Not all content may fit the virtual synchronous learning platform; thus, the practical costs of exclusively virtual environments include sacrifices in content, camaraderie, and potentially even competency development.

Data Sharing Statement

These data are not publicly available but can be shared upon request.

Ethics Approval

The Institutional Review Board (IRB) determined that this study was educational in nature and exempt from IRB review and approval (IRB #16-1817). Fellows in the Clinical Scholars Program provide consent for their data to be used for educational research and program improvement purposes.

Publication Consent

The authors of this manuscript consent for it to be published.

Acknowledgments

The authorship team would like to thank Ms. Lia Garman for her contributions to the manuscript through background research and Ms. Rachel Berthiaume for her contributions to transitioning the program to the virtual experience.

Angela Rosenberg is retired faculty from the University of North Carolina at Chapel Hill, Chapel Hill, North Carolina, USA.

Author Contributions

All authors contributed to data analysis, drafting or revising the article, have agreed on the journal to which the article will be submitted, gave final approval of the version to be published, and agree to be accountable for all aspects of the work.

Funding

This work was supported by a generous grant from the Robert Wood Johnson Foundation.

Disclosure

Dr Claudia SP Fernandez is a principal author of modules in the FastTrack Leadership (FTL) Library; FTL is a private company that collaborates to provide the online leadership library used in the Clinical Scholars Program and in other leadership institutes. The company is owned and operated by Mr. Ruben Fernandez, JD (spouse), who is also a leadership expert and is contracted as both a faculty member and an executive coach in the program (as one of the many consultants who do such work). This activity is governed by a Conflict of Interest Management Plan at the University and is reviewed annually. Dr Kathleen Brandert reports: I am a co-author on a FastTrack Leadership Library online module (mentioned in the paper). The authors report no other potential conflicts of interest for this work.

References

1. Ni AY. Comparing the effectiveness of classroom and online learning: teaching research methods. J Public Aff Educ. 2013;19(2):199–215. doi:10.1080/15236803.2013.12001730

2. Sun A, Chen X. Online education and its effective practice: a research review. JITE Res. 2016;15:157–190.

3. Silbergh D, Lennon K. Developing leadership skills: online versus face-to-face. J Eur Ind Train. 2006;30(7):498–511. doi:10.1108/03090590610704376

4. Maki RH, Maki WS, Patterson M, Whittaker PD. Evaluation of a Web-based introductory psychology course: i. Learning and satisfaction in on-line versus lecture courses. Behav Res Meth Instrum Comput. 2000;32(2):230–239. doi:10.3758/BF03207788

5. Kolb DG, Prussia G, Francoeur J. Connectivity and leadership: the influence of online activity on closeness and effectiveness. J Leader Organ Stud. 2009;15(4):342–352. doi:10.1177/1548051809331503

6. Alexander LK, Dail K, Davis MV, et al. A pilot hybrid internet/classroom-based communicable disease continuing education course for public health nurses in North Carolina: lessons learned. J Public Health Manag Pract. 2005;11(6):S119–S122. doi:10.1097/00124784-200511001-00020

7. Kleiman G, Wolf MA. Going to scale with online professional development: the friday institute MOOCs for educators (MOOC-Ed) initiative. North Carolina State University College of Education. In: Dede C, Eisenkraft A, Frumin K, Hartley A, editors. Teacher Learning in the Digital Age: Online Professional Development in STEM Education. Cambridge, MA: Harvard Education Press; 2016:49–68.

8. Garrison DR, Anderson T, Archer W. Critical inquiry in a text-based environment: computer conferencing in higher education. Internet High Educ. 1999;2(2–3):87–105. doi:10.1016/S1096-7516(00)00016-6

9. Marks RB, Sibley SD, Arbaugh JB. A structural equation model of predictors for effective online learning. J Manag Educ. 2005;29(4):531–563. doi:10.1177/1052562904271199

10. Haden NK, Hendricson WD, Killip JW, et al. Developing dental faculty for the future: ADEA/AAL institute for teaching and learning, 2006–09. J Dent Educ. 2009;73(11):1320–1335. doi:10.1002/j.0022-0337.2009.73.11.tb04824.x

11. Eide T, Dulmen SV, Eide H. Educating for ethical leadership through web-based coaching. Nurs Ethics. 2016;23(8):851–865. doi:10.1177/0969733015584399

12. Fernandez CSP, Noble CC, Jensen ET, Martin L, Stewart M. A retrospective study of academic leadership skill development, retention and use: the experience of the food systems leadership institute. J Leadersh Educ. 2016;15(2):150–171. doi:10.12806/V15/I2/R4

13. Patel N, Vemuri D, Frasso R, Myers JS. Perceptions of health care executives on leadership development skills for residents after participating in a longitudinal mentorship program. Am J Med Qual. 2019;34(1):80–86. doi:10.1177/1062860618786798

14. Fernandez CSP, Corbie-Smith G, Green M, Brandert K, Noble C, Dave G. Clinical scholars national leadership institute: effective approaches to leadership development. In: Fernandez CSP, Corbie-Smith G, editors. Leading Community Based Changes in the Culture of Health in the US: Experiences in Developing the Team and Impacting the Community. InTech Publishers; 2021. In press.

15. Fernandez CSP, Noble C, Jensen E. An examination of the self-directed online leadership learning choices of public health professionals: the MCH PHLI experience. J Public Health Manag Pract. 2017;23(5):454–460. doi:10.1097/PHH.0000000000000463

16. Lucas R, Goldman EF, Scott AR, Dandar V. Leadership development programs at academic health centers: results of a national survey. Acad Med. 2018;93(2):229–236. doi:10.1097/ACM.0000000000001813

17. Lozano LM, García-Cueto E, Muñiz J. Effect of the number of response categories on the reliability and validity of rating scales. Methodology. 2008;4(2):73–79.

18. Furnham A. Response bias, social desirability and dissimulation. Pers Individ Dif. 1986;7(3):385–400. doi:10.1016/0191-8869(86)90014-0

19. Howard GS, Dailey PR, Gulanick NA. The feasibility of informed pretests in attenuating response-shift bias. Appl Psychol Meas. 1979;3(4):481–494. doi:10.1177/014662167900300406

20. Rohs FR. Response shift bias: a problem in evaluating leadership development with self-report pretest-posttest measures. J Agric Educ. 1999;40(4):28–37. doi:10.5032/jae.1999.04028

21. Thomas EV, Wells R, Baumann SD, et al. Comparing traditional versus retrospective pre-/post-assessment in an interdisciplinary leadership training program. Matern Child Health J. 2019;23(2):191–200. doi:10.1007/s10995-018-2615-x

22. Fernandez CSP, Noble CC, Jensen E, Steffen D. Moving the Needle: a retrospective pre- and post-analysis of improving perceived abilities across 20 leadership skills. Matern Child Health J. 2015;19(2):343–352. doi:10.1007/s10995-014-1573-1

23. Fernandez CSP, Noble CC, Jensen ET, Chapin J. Improving leadership skills in physicians: a six month retrospective study. J Leadersh Stud. 2016;9(4):6–19. doi:10.1002/jls.21420

24. Lam TCM, Bengo P. A comparison of three retrospective self-reporting methods of measuring change in instructional practice. Am J Eval. 2003;24(1):65–80. doi:10.1177/109821400302400106

25. Pratt CC, McGuigan WM, Katzev AR. Measuring program outcomes: using retrospective pretest methodology. Am J Eval. 2000;21(3):341–349. doi:10.1177/109821400002100305

26. Rockwell S, Kohn H. Post-then-pre evaluation. J Ext. 1989;27(2).

27. Saleh SS, Williams D, Balougan M. Evaluating the effectiveness of public health leadership training: the NEPHLI experience. Am J Public Health. 2004;94(7):1245–1249. doi:10.2105/AJPH.94.7.1245

28. Sprangers M, Hoogstraten J. Pretesting effects in retrospective pretest-posttest designs. J Appl Psychol. 1989;74(2):265–272.

29. Musselwhite C. Discovery Leadership Profile: Facilitator Manual. Toronto, Canada: MHS Talent Development; 2015.

30. Arbaugh JB, Duray R. Technological and structural characteristics of student learning and satisfaction with web-based courses. Manag Learn. 2002;33(3):331–347. doi:10.1177/1350507602333003

31. Arbaugh JB, Benbunan-Fich R. The importance of participant interaction in online environments. Decis Support Syst. 2007;43(3):853–865. doi:10.1016/j.dss.2006.12.013

32. Roddy C, Amiet DL, Chung J, et al. Applying best practice online learning, teaching, and support to intensive online environments: an integrative review. Front Educ. 2017;2:59. doi:10.3389/feduc.2017.00059

33. FastTrack Leadership. Library of short online leadership and health equity modules. Available from: WeTrainLeaders.com. Accessed August 25, 2020.

34. Technology of Participation. Available from: https://www.top-training.net/w/. Accessed September 18, 2020.
