
Measuring the Impact of a Faculty Development Program on Clinical Educators

Authors: Nair BR, Gilligan C, Jolly B

Received 4 November 2021

Accepted for publication 27 January 2022

Published 9 February 2022, Volume 2022:13, Pages 129–136

DOI https://doi.org/10.2147/AMEP.S347790




Balakrishnan R Nair, Conor Gilligan, Brian Jolly

School of Medicine and Public Health, and Academy of Clinical Educators, University of Newcastle, Callaghan, NSW, Australia

Correspondence: Conor Gilligan, School of Medicine and Public Health, and Academy of Clinical Educators, University of Newcastle, Callaghan, NSW, 2308, Australia, Tel +61240420553, Email [email protected]

Introduction: An Academy of Clinical Educators (ACE) was established at the University of Newcastle to support and build capacity among existing and prospective medical educators. ACE established a Certificate of Clinical Teaching and Supervision (CCTS) program, the final assessment of which was a reflective piece on how the course had affected participants’ practice as clinical teachers or supervisors and how the changes were expected to affect learner achievement. We conducted a qualitative evaluation of these reflections to explore the impact of the CCTS on participants’ teaching.
Methods: Thirty-one participants (of 90 completers to date) consented for their written reflections to undergo qualitative thematic analysis and completed a survey exploring their preparation for, and experience of, the program and their application of the skills learnt.
Results: Most participants reported applying the skills gained through the CCTS to their teaching practice to a large (n=23; 72%) or very large (n=5; 16%) extent. Four themes emerged from the qualitative data, aligned with the topics of the CCTS: teaching structure; feedback; orientation; and assessment. Participants described the application of more structured approaches to orientation, teaching, and feedback; positive student responses; and self-reported satisfaction with the changes adopted.
Discussion: The CCTS has motivated change in the teaching practice of participants. Although the evidence presented here is limited by its self-reported nature, descriptions of actual changes in practice were detailed and specific enough to suggest they could act as a proxy for objectively measured change in behaviour and outcome.
Conclusion: A faculty development program delivered to clinicians with a range of teaching and education-related roles, from varied clinical disciplines and professions, can promote improved, structured teaching and feedback.

Keywords: clinical education, faculty development, feedback, supervision, assessment

Introduction

Medical education relies upon the contribution of health professionals to the teaching and supervision of students and more junior staff. In many cases, this teaching and supervision occurs without training, and often without adequate recognition. Faculty development is often conducted, or at least offered to some extent, in an effort to address this gap, but assessment of the effectiveness of these activities is frequently underdeveloped or neglected.1 Evaluation of the efforts of medical education academies and faculty development programs has largely been limited to survey-based self-reported satisfaction with the program and change in knowledge, skills, and confidence.1 These outcomes represent the lower levels of Kirkpatrick’s evaluation model,1,2 with only a minority of evaluations measuring actual impacts on learning and performance, or institutional change.1 Institutional and structural change is likely to be required to promote long-term impacts of faculty development and a cultural shift in the value placed on educational skills and efforts. Some such changes have been reported as a result of the academy movement in US medical schools,3 but measures of the success of capacity building efforts can be challenging to define.4

An Academy of Clinical Educators (ACE) was established in 2017 at the School of Medicine and Public Health at the University of Newcastle. Its mission was to bring people interested in health professions education together and to provide a forum for discussion, debate, and learning. We were concerned that, with modern technology and the fast pace of medicine, teaching was receiving diminishing attention and value.5 This phenomenon is well recognised internationally, with many academic health centres focusing more on the patient care and research arms of their tripartite mission, to the detriment of the important role they play in education.6 The purpose of the ACE was to support existing medical educators and to build capacity among clinicians interested in medical education. The core group and leadership team comprised the Dean of the medical program, a senior clinician as Director, the Professor of Medical Education, and the Director of the Teaching and Learning Unit of the University.

The Academy design drew inspiration from those emerging internationally.5,6 However, most, including the eight academies described by Irby et al in their review, are predicated upon a membership of distinguished educators carefully selected through peer review.3 Capacity building in institutions, by contrast, often requires a focus on faculty socialisation and collective action at all levels.7 There is often a gap between the expectations of teachers at the clinical “coal face” and those of curriculum planners, so it is necessary to promote faculty development in ways specifically targeted at inclusivity and good communication.7 Unless academic faculty and clinicians “knot-work”, theoretical concepts in learning may not translate into real outcomes.8 There is evidence that, for change to occur, a critical mass of individuals sponsored and supported by the faculty is needed. Therefore, ACE was open to all clinicians, including nursing and allied health clinicians, provided they had involvement with the medical program and a track record of, or willingness to engage in, teaching. In this way, the ACE adopted a capacity-building and strengthening approach, rather than merely rewarding those already active in health professions education. This goal is not at odds with the missions of other academies, which share a focus on mentoring junior faculty, enhancing teaching skills, and advocating for educational scholarship in promotion.3

So far, the ACE has attracted over 350 members through professional networks and university communications. Programs launched by the academy include webinars and seminars to update members on the latest developments in medical education. To address the need and desire among clinical faculty for upskilling, a Certificate of Clinical Teaching and Supervision (CCTS) course was launched and accredited through the University of Newcastle Centre for Teaching and Learning. The CCTS program consisted of four interactive modules of three to four hours each. Modules have been delivered both face-to-face and online (live videoconference sessions) to accommodate participant location and pandemic-related restrictions. The module topics were Clinical Bedside Teaching, Supervision and Feedback, Good Practice in Assessment, and Workplace-Based Assessment. So far, 180 members have enrolled in the program and 90 have completed all four modules and the final assessment task. Participants usually complete the modules within a six-month period; upon completion, as a final assessment task before being awarded the CCTS, members are required to write a reflective piece on how the course has led them to reconsider their practice as a clinical teacher or supervisor and how they expect that change to impact student achievement.

Our aim was to study the outcome of the CCTS program. We were interested in the long-term outcomes and impact on the practice of teaching, assessment and feedback as described in these final assessments.

Methods

All participants who had completed the CCTS were invited to complete a brief online survey including questions exploring their teaching experience, preparation for teaching, and application of skills from the CCTS. The survey also invited participants to provide consent for the use of their written reflection in qualitative analysis. Thirty-two CCTS recipients completed the online survey and 31 (34.4% of completers to mid-2021) gave consent for the analysis of their written reflections.

The questionnaire was developed specifically for this study to collect information to support the qualitative analysis and to describe the participant group. Survey findings were analysed descriptively only. The reflective pieces were based on the same trigger provided to all participants:

The assessment will allow participants to demonstrate that they have considered or implemented changes to their clinical teaching or supervisory practice based on skills derived from attendance at the Certificate workshops. These changes may already have been implemented, or may be ones that you are considering for the future. Identify a topic in one of the Certificate modules that would lend itself to your reconsidering your practice as a clinical supervisor or teacher. Describe your current or past practice and associated student learning outcomes.

Thematic analysis9 was conducted on the written reflections of the consenting participants. Two researchers (CG, KN) separately read the reflections to identify themes and met to discuss and agree upon final themes. Initial reading identified common concepts, keywords, or concatenated words, which were then categorised and grouped into themes and subthemes. Subsequent steps led to the consolidation of subthemes under the overall theme headings. All researchers met to ensure agreement on the themes and on the interpretation of the findings in the context of their experience as module designers, facilitators, and assessors.
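
The survey component, as noted above, was analysed descriptively only, as simple frequency counts and percentages for each response option. The short sketch below is purely illustrative and is not part of the study’s methods: it assumes hypothetical response labels and reproduces, in Python, the kind of tabulation underlying the figures reported for one item in the Results.

from collections import Counter

def summarise_item(responses):
    """Return {option: (count, rounded percentage)} for a list of responses."""
    counts = Counter(responses)
    total = len(responses)
    return {option: (n, round(100 * n / total)) for option, n in counts.items()}

# Hypothetical responses (n=32) to the item on the extent of skill application,
# chosen to match the counts reported in the Results section.
applied_skills = (
    ["Very large extent"] * 5
    + ["Large extent"] * 23
    + ["Slight extent"] * 1
    + ["Not sure"] * 3
)

for option, (n, pct) in summarise_item(applied_skills).items():
    print(f"{option}: n={n} ({pct}%)")  # eg "Large extent: n=23 (72%)"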

This project was approved by the University of Newcastle Human Research Ethics Committee, approval number H-2020-0430.

Results

Participants included a range of clinicians from different fields, including geriatric medicine, surgery, general medicine, psychiatry, medical administration, and allied health fields (eg, physiotherapy, pharmacy), as well as non-clinical academics, with clinical experience (years since graduation) ranging from one year to 40 years. The participants were representative of the larger pool of ACE members and CCTS participants. In addition to their formal clinical training, 19 respondents also listed a range of professional development programs in which they had previously participated. Six participants listed “Teaching on the Run”, and others listed various short courses and workshops offered through the university or the clinical specialty training colleges with which they were associated. Four participants described more formal programs, including a Masters of Clinical Education and other professional certificates in education more generally.

All participants had some teaching experience, ranging from classroom teaching, clinical supervision, or bedside/ward-based teaching of undergraduate students, to supervision and bedside teaching of interns or residents. Most participants had taught medical students or trainees, while a minority reported having taught pharmacy students or interns, nursing, psychology, and other allied health students. The majority of participants (n=18; 56%) reported having taught or supervised more than 16 students in the previous 12 months, while eight (25%) reported teaching fewer than five students in that period and the remainder between five and 15.

Most participants reported feeling well (n=29; 90%) or extremely well (n=3; 10%) prepared for their teaching/supervision role as a result of the CCTS. All participants reported that the CCTS had improved their readiness for teaching, well (n=26; 81%) or extremely well (n=6; 19%). Most participants reported having been able to apply the skills gained through the CCTS to their teaching practice to a large (n=23; 72%) or very large (n=5; 16%) extent, while a minority reported being able to apply the skills slightly (n=1; 3%) or were not sure (n=3; 9%).

Four overarching themes emerged from the qualitative data, in line with the focus topics of the CCTS modules: teaching structure; feedback; orientation; and assessment. Within each theme, participants drew from the module content and described changes they had made or planned to make to their teaching practice. The evidence presented here is reliant upon the experiences and personal reflections of the participants, some of which contained reference to largely anecdotal feedback from students.

Structure

Several participants reflected on learning that motivated them to formalise (provide more structure to) teaching opportunities and learning goals, having predominantly conducted ad hoc, opportunistic teaching in clinical contexts. In both their teaching and their own learning experiences, participants recalled predominantly ad hoc opportunities that relied on student self-direction to identify and consolidate learning. All participants recognised the potential value in formalising teaching opportunities, many referring to the use of formal models to structure teaching. For example:

As the majority of this teaching and supervision is informal and unstructured which goes alongside clinical work, I have generally paid little attention to the format and content of this teaching and supervision … After attending the workshops as part of the [CCTS], I realised that there is great benefit to teaching and supervision which is structured and targeted. (Male, physician, ward-based supervision)

I noted on reflection that there was … no goal setting, minimal autonomy and little feedback to students from the members of the team who supervised them the most (residents/registrars). This I believe contributed to a sense of confusion from students around expectations, disengagement and passive involvement and made feedback often generic and less goal driven. (Male, resident, involved in clinical supervision)

Particularly in the classroom or simulation context, formal models were recognised as valuable and applied differently to suit different student groups and learning goals. This is illustrated by one participant who describes:

I have used the Set-Dialogue-Closure method for interacting with the students … I have found it very practical approach in helping students get engaged in my teaching sessions.

For the CBL sessions … I adopted the Doto model … (it) was very useful in making students understand the particular technique that they needed to learn. It has helped the students learn the clinical skills faster within the short period they had. (Male, international medical graduate [IMG], classroom teacher)

Several participants described having already implemented, or planning to implement, an approach involving working with students to establish learning goals and using these as the basis of teaching and feedback. Some participants who had already implemented this approach described positive feedback from students in response to the change, as well as an increased sense of satisfaction in their teaching.

Getting a background of their previous learning, getting their learning goals and then asking at the end for feedback, is a process that I have started to use in all my teaching sessions. This has been met with positive feedback so far. (Female, clinical supervisor and tutorial facilitator)

The improved articulation of learning goals was recognised as facilitating a more student-centred learning approach, and allowing for signposting of learning opportunities and greater clarification of the context for learning points and content.

A pearl that resonated with me was that we should put the learning in context first. This makes it easier to slot the learning into and around pre‐existing knowledge. I thought about times where I have simply taken one individual teaching point (ie a heart murmur) out of context and expected students to be able to conceptualise what is going on to a higher standard than is reasonable and give me a coherent answer. I also thought of times people had done that to me, and how much easier it would be if I had been given the context prior to being expected to come up with esoteric differentials (ie to say what my supervisor was thinking) with an element of stress regarding getting the answer wrong in front of my peers … that if instead of learning in the purely academic context, we frame the learning points in a clinical context, the learning is likely to be more effective. (Male, advanced trainee, clinical supervision)

By clearly identifying Goals, Tasks, Roles and Relationship between Trainee and Trainer a more productive and engaging bond could be established. The undergraduate who hold[s] back on asking due to feeling nervous and unsure would hopefully feel that raising any doubts would be welcomed. (Female, GP and clinical supervisor)

Orientation

In line with the formalisation of learning opportunities, and as a sub-theme within “structure”, several participants specifically described an intention to provide a more structured orientation to clinical placements. Such orientation is intended to create a safe learning environment and a climate of trust, as well as preparing students to receive feedback. One participant described the learning from the CCTS as having led her to “re-frame orientation as a useful teaching tool, rather than just an introductory formality”. (Female, junior medical officer, clinical teaching)

Through the provision of a clearer orientation to the rotation, the expectations and role of the supervisee, this will better identify the role of the supervisee within the greater team, granting them permission to participate, rather than be a passive observer. (Female, registrar, clinical supervision)

Those who had already implemented such an approach described the development of positive relationships with students as a result.

I now take 5 minutes with new medical students to provide a brief orientation, where I set out my expectations of their goals, tasks and roles within the term. I also give them the opportunity to set out their expectations of me as their supervisor and of the term in general … This helps set the term on the right foot, as well as reduces the need for me to repeat my expectations. Overall, it has led to better communication and less frustration. (Female, trainee, clinical supervision)

Feedback

Participants frequently referred to the value of models for the provision of feedback, in order to structure feedback conversations. Such conversations were commonly regarded as intimidating and uncomfortable prior to learning about these models and structured approaches.

The culture surrounding clinical supervision was poor and the suggestion of having supervision with older staff caused tension and division amongst the team. There was a feeling within the team that management were checking up on people rather than providing support and development. (Female, hospital pharmacist and clinical supervisor)

I now understand that the feedback is highly rated by the students. In the past I admit I have been daunted by having to ascribe a particular rating for each student as set out by the University. (Female, senior staff specialist, clinical supervisor)

I also adopted the SET-GO method10 of feedback that I learned in one of the modules of the clinical certificate workshop. I applied it during the CBL and PBL sessions. One difficulty that I faced with my PBL group was that [a] few students were not contributing enough during the PBL sessions. The SET-GO method of feedback is helping students to understand what goal they are expected to achieve from the sessions and what changes they can instigate to reach there. Students who were not contributing enough in the discussions started to participate more often. (Male, IMG, classroom teacher)

The articulation of learning goals, as described above, facilitated a more goal-oriented and tailored approach to meeting individual student needs.

To enhance my ability to deliver quality feedback, I [now] engage with a suite of feedback models. I find having a range of methods from Pendleton to Agenda-led outcomes-based analysis (ALOBA) means I am equipped to provide appropriate feedback depending on the needs and stage of the learner. The principle I have adopted reflects a student-centred, constructivist pedagogy. Therefore, the type and level of feedback is determined by the student and requires me to have an understanding of the skills and knowledge that the learners bring to the course. (Female, academic faculty, classroom teacher)

These principles rendered feedback discussions far less daunting for facilitators, and as a result, such discussions are likely to occur more routinely and be more effective for learners.

The idea of just telling students/trainees that “things are all right” and the reluctance in addressing specific issues … is a problem which I have encountered previously when there were deficiencies in a student or trainee’s performance but I have been reluctant and did not feel empowered to provide constructive feedback. The different models of providing feedback … have enabled me to see different ways of giving feedback and strategies that may be more suitable in different settings. (Male, physician, clinical supervision)

I have seen a significant improvement in the engagement of those I supervise and [their] interest in discussing the findings with me. I also think that they appear more comfortable, and as a result seem to be more willing to seek out further educational experiences with me. (Male, advanced trainee, clinical supervision)

My previous approach of minimal feedback and focussing on what was incorrect might also have left some students discouraged and more insecure about the procedure which might have set them up for failure in the acute setting. Since adopting the Pendleton approach and allocating more time to feedback I have received more positive feedback about the sessions as a whole and feel that students are leaving with key principles reinforced with a more positive focus. (Female, senior staff specialist, clinical supervision and classroom teaching)

Assessment

Several participants reflected more specifically on experiences of assessment in medical education, and described planned or already implemented changes to improve existing assessments. The participants’ comments reflect a recognition that assessment should align with other key elements of education and, importantly, reflect the learning objectives and include feedback. An increased recognition of the importance of assessment is reflected in one participant’s statement: “all tasks which are given to my candidate should be assessable”. Several participants described a recognition of the challenges associated with assessment, and the value of workplace-based assessment (WBA) approaches. As a result of the CCTS modules, one participant described a conviction that “the time is ripe to actively work towards a new model of assessment … ” (Male, senior physician, clinical supervisor) in their particular area. Another emphasised the value of WBA-style assessment in “encouraging a learning environment that provides more reliable feedback and more reliable assessment … ” (Male, senior specialist, clinical supervisor). Another participant described a shift to case-based discussion as both a teaching tool and an assessment, bringing together concepts highlighted across all themes and CCTS modules: “effective feedback … promoting autonomy and self-directed learning” (Female, senior specialist, clinical supervisor).

Discussion

Based on the reflections of participants in this faculty development program, it appears that the CCTS has successfully motivated change in the teaching practice of respondents. While the evidence presented here is limited by its self-reported nature (Kirkpatrick level 3a), the specific descriptions of actual changes in practice are promising and act as a useful proxy for objectively measured change in behaviour and outcome (level 3b).1,2 The changes described in the written reflections are also supported by the survey data, indicating that the majority of participants had been able to apply their new skills to their teaching to a large extent.

Such change in teaching practice is crucial to support effective learning and future faculty development efforts. Further research is required to establish a link between the application of teaching skills and learner outcomes as a direct result of this faculty development effort, but the literature supports benefits for learners associated with the teaching approaches described.11 While structural and organisational support is a necessity for sustained change,4 the success of our program in the absence of the financial support and structural change enjoyed by some academies is heartening.3

In keeping with the movement towards Miller’s extended pyramid of assessment,12,13 it may be argued that the ultimate goal of faculty development programs and academies is to strengthen the professional identity of participants as teachers, alongside their clinical and research roles and identities.14 Several researchers suggest that professional identity as a teacher can be enhanced through faculty development and institutional support.14,15 The findings of the current evaluation indicate that CCTS completers have adopted not only teaching approaches but also attitudes towards the value of teaching and their role in managing learning, suggesting enhanced identities as educators that are likely to influence behaviour in the long term.

The reflections of CCTS participants share a resounding sense of increased confidence in teaching and in applying structured approaches to teaching, feedback, and assessment. These findings are in keeping with the self-reported improvement in teaching skills found in other faculty development evaluations16–19 but go further to include detailed descriptions of the application of these skills. Only a minority of faculty development evaluations have exceeded Kirkpatrick level 3, but those which have report positive impacts derived from learner evaluations of teaching20 and peer observations of teaching.18

The participants in this evaluation represent only one third of the CCTS participants to date, but even this proportion represents an opportunity for positive impact on teaching and learning. The participants represent a range of medical and allied health disciplines and teach in a variety of clinical and classroom contexts. While some faculty development programs have been reported to have less success in preparing participants to assess students,21 the relatively heavy assessment focus of the CCTS has led to increased confidence in this area. The application of this confidence is yet to be thoroughly evaluated.

Conclusion

A faculty development program delivered to clinicians from a range of clinical disciplines and with a range of teaching and education-related roles can lead to the application of improved, structured teaching and feedback which is likely to improve student learning. Longer-term observational studies will be required to better understand the potential long-term impacts of the program.

Data Sharing Statement

Complete data can be obtained from the authors upon request.

Ethics Approval

This project was approved by the University of Newcastle Human Research Ethics Committee, approval number H-2020-0430. Respondents consented for their de-identified data to be used in publication.

Acknowledgments

The authors would like to thank Brian Kelly and Carol Miles for their involvement in establishing the ACE and in developing and delivering CCTS modules. We would also like to thank all ACE members and CCTS participants for their engagement in the program and their important contribution to medical education.

Author Contributions

All authors made a significant contribution to the work reported, whether that is in the conception, study design, execution, acquisition of data, analysis and interpretation, or in all these areas; took part in drafting, revising or critically reviewing the article; gave final approval of the version to be published; have agreed on the journal to which the article has been submitted; and agree to be accountable for all aspects of the work.

Funding

No funding was obtained for the completion of this work.

Disclosure

Professor Balakrishnan R Nair is an Associate Editor for Advances in Medical Education and Practice of Dove Medical Press. The authors report no other conflicts of interest in this work.

References

1. Alexandraki I, Rosasco RE, Mooradian AD. An evaluation of faculty development programs for clinician-educators: a scoping review. Acad Med. 2021;96(4):599–606. doi:10.1097/ACM.0000000000003813

2. Kirkpatrick D, Kirkpatrick J. Evaluating Training Programs: The Four Levels. San Francisco, CA: Berrett-Koehler; 2006.

3. Irby DM, Cooke M, Lowenstein D, Richards B. The academy movement: a structural approach to reinvigorating the educational mission. Acad Med. 2004;79(8):729–736.

4. Salajegheh M, Gandomkar R, Mirzazadeh A, Sandars J. Identification of capacity development indicators for faculty development programs: a nominal group technique study. BMC Med Educ. 2020;20(1):163. doi:10.1186/s12909-020-02068-7

5. Thibault GE, Neill JM, Lowenstein DH. The Academy at Harvard Medical School: nurturing teaching and stimulating innovation. Acad Med. 2003;78(7):673–681. doi:10.1097/00001888-200307000-00005

6. Wright SM, Kravet S, Christmas C, Burkhart K, Durso SC. Creating an academy of clinical excellence at Johns Hopkins Bayview Medical Center: a 3-year experience. Acad Med. 2010;85(12):1833–1839. doi:10.1097/ACM.0b013e3181fa416c

7. Jolly B. Faculty development for organisational change. In: Steinert Y, editor. Faculty Development in the Health Professions: A Focus on Research and Practice. Dordrecht: Springer; 2014:119–140.

8. Elmberger A, Bjorck E, Nieminen J, Liljedahl M, Bolander Laksov K. Collaborative knotworking - transforming clinical teaching practice through faculty development. BMC Med Educ. 2020;20(1):497. doi:10.1186/s12909-020-02407-8

9. Braun V, Clarke V. Thematic analysis. In: Cooper H, Camic PM, Long DL, Panter AT, Rindskopf D, Sher KJ, editors. APA Handbook of Research Methods in Psychology, Vol. 2. Research Designs: Quantitative, Qualitative, Neuropsychological, and Biological. American Psychological Association; 2012:55–71.

10. Silverman J, Draper J, Kurtz S. The Calgary-Cambridge approach to communication skills teaching II. The SET-GO approach/method of descriptive feedback. Educ Gen Pract. 1997;8:16–23.

11. Regehr G, Norman GR. Issues in cognitive psychology: implications for professional education. Acad Med. 1996;71(9):988–1001. doi:10.1097/00001888-199609000-00015

12. Rethans JJ, Norcini JJ, Baron-Maldonado M, et al. The relationship between competence and performance: implications for assessing practice performance. Med Educ. 2002;36(10):901–909. doi:10.1046/j.1365-2923.2002.01316.x

13. Ten Cate O, Carraccio C, Damodaran A, et al. Entrustment decision making: extending Miller’s pyramid. Acad Med. 2021;96(2):199–204. doi:10.1097/ACM.0000000000003800

14. Steinert Y, O’Sullivan PS, Irby DM. Strengthening teachers’ professional identities through faculty development. Acad Med. 2019;94(7):963–968. doi:10.1097/ACM.0000000000002695

15. Lieff S, Baker L, Mori B, Egan-Lee E, Chin K, Reeves S. Who am I? Key influences on the formation of academic identity within a faculty development program. Med Teach. 2012;34(3):e208–e215. doi:10.3109/0142159X.2012.642827

16. Branch WT, Kroenke K, Levinson W. The clinician-educator–present and future roles. J Gen Intern Med. 1997;12(Suppl 2):S1–S4. doi:10.1046/j.1525-1497.12.s2.16.x

17. Delver H, Jackson W, Lee S, Palacios M. FM POD: an evidence-based blended teaching skills program for rural preceptors. Fam Med. 2014;46(5):369–377.

18. Knight CL, Windish DM, Haist SA, et al. The SGIM TEACH program: a curriculum for teachers of clinical medicine. J Gen Intern Med. 2017;32(8):948–952. doi:10.1007/s11606-017-4053-7

19. Steinert Y, Boudreau JD, Boillat M, et al. The Osler Fellowship: an apprenticeship for medical educators. Acad Med. 2010;85(7):1242–1249. doi:10.1097/ACM.0b013e3181da760a

20. Mazotti L, Moylan A, Murphy E, Harper GM, Johnston CB, Hauer KE. Advancing geriatrics education: an efficient faculty development program for academic hospitalists increases geriatric teaching. J Hosp Med. 2010;5(9):541–546. doi:10.1002/jhm.791

21. Burgess A, van Diggele C, Mellis C. Faculty development for junior health professionals. Clin Teach. 2019;16(3):189–196. doi:10.1111/tct.12795
