
Developing and Mapping Entrustable Professional Activities with Saudi Meds Competency Framework: A Consensus Study

Authors: Hmoud AlSheikh M, Zaini RG, Iqbal MZ

Received 28 June 2022

Accepted for publication 16 October 2022

Published 28 October 2022, Volume 2022:13, Pages 1367–1374

DOI https://doi.org/10.2147/AMEP.S379184


Editor who approved publication: Dr Md Anwarul Azim Majumder



Mona Hmoud AlSheikh,1 Rania G Zaini,2 Muhammad Zafar Iqbal1

1Medical Education Department, College of Medicine, Imam Abdulrahman Bin Faisal University, Dammam, Saudi Arabia; 2Medical Education Department, Faculty of Medicine, Umm Al-Qura University, Makkah, Saudi Arabia

Correspondence: Mona Hmoud AlSheikh, Physiology Department, College of Medicine, Imam Abdulrahman Bin Faisal University, Dammam, Saudi Arabia, Tel +966 50 498 1912, Email [email protected]; [email protected]

Purpose: This study aimed to develop a national consensus on entrustable professional activities (EPAs) for Saudi undergraduate medical education and to map them to the “Saudi Meds” competency framework.
Methods: A three-phase approach was used. Phase 1 consisted of identifying and developing EPAs; Phase 2 consisted of building a national consensus on the developed EPAs (validation process); and Phase 3 consisted of mapping the validated EPAs to the Saudi Meds competency framework. Nominal group and modified Delphi techniques were used to develop consensus on the EPAs. Classical test theory-based item analysis was conducted to establish the validity and reliability of the finalized EPAs.
Results: Fifteen expert medical educationists and 109 academic leaders from 23 medical schools participated in the validation process. The study achieved consensus on 10 core EPAs with an overall reliability (Cronbach’s alpha) of 0.814. The item-total correlations ranged from 0.341 to 0.642.
Conclusion: This study resulted in a national consensus on generic, comprehensive and region-specific EPAs that have been mapped to the Saudi Meds competency framework. Our study is a first step toward facilitating EPA-based curricular reforms in Saudi medical schools.

Keywords: competency-based medical education, undergraduate training, entrustable professional activities, competency framework, Saudi Meds

Introduction

Medical students, as future healthcare providers, hold immense responsibility to provide safe and competent patient care amid increasing societal and institutional accountability. In pursuit of competence, medical education is undergoing a transformative change towards competency-based medical education (CBME), which is becoming a resurgent paradigm of educational theory and practice.1 CBME is an evidence-based approach to preparing physicians for practice who possess the desired knowledge, skills and attitudes outlined through careful consideration of societal and patient needs.2

The fundamental ideology of CBME is to develop standards that focus more on the practical aspects of medicine than on the mere acquisition of knowledge. Furthermore, it calls for systematically testing the competence of medical graduates throughout their training.3 In the past decade, many competency frameworks have been designed to guide the curricular design of undergraduate training programs. Classical examples include CanMEDS,4 the ACGME Outcome Project,5 the Scottish Doctor,6 and the Netherlands national framework.7 In line with competency-based training models, a recent development is the Saudi Meds competency framework8 designed for Saudi undergraduate medical programs. This framework was designed in response to the national “Saudi Future Doctor” vision9 and serves as a benchmark in the national implementation program “Saudi Medical Education Directives-Ministry of Education, Saudi Arabia.”10 The framework outlines six overarching competency domains (i.e., patient care, scientific approach, research and scholarship, professionalism, communication and collaboration, and community-oriented practice) and 30 associated competencies that are expected of medical graduates and reflect the principles of professional medical practice in Saudi Arabia.8 The essential purpose of designing this framework was not for it to serve as a unified national curriculum, but as a national framework that ensures equivalent standards across all Saudi undergraduate training programs.

Although the framework is holistic in design, a major issue is the suboptimal translation of its theoretical competencies into practice. Entrustable professional activities (EPAs) have been proposed to operationalize competencies in practice and eventually fill the longstanding theory-to-practice gap.7,11,12 EPAs have been defined as:

Core units of professional practice that can be fully entrusted to a trainee as soon as he or she has demonstrated the necessary competence to execute the activity unsupervised.13

EPAs serve as a valuable tool to assess the competence of a trainee through entrustment decisions that define learners’ level of expertise and autonomy. While competencies are descriptors of individuals, EPAs are units of professional activity or work that can be observed and entrusted to a trainee in a clinical workplace.14,15 An EPA usually requires proficiency in multiple competencies simultaneously and provides a more structured and realistic approach to competence assessment.16

Many scholars argue that the key principles underpinning EPAs (i.e., workplace training and entrustment) are generalizable and that an EPA-based curriculum can help create a framework of continuous and transparent assessment of undergraduate trainees, from the first year of medical school until graduation.17–20 With the growing acceptance and suitability of EPA-based undergraduate training worldwide, there has been a call to develop a national EPA framework to streamline Saudi undergraduate medical education training programs.21 This study aimed to develop a national consensus on a generic, comprehensive and region-specific set of EPAs and to map them to the existing national Saudi Meds competency framework. Locally contextualized EPAs linked to the Saudi Meds framework will help curriculum planners and medical educators configure competency-based undergraduate training programs by focusing training on actual professional activities and on the evaluation of trainees’ progress.21

Materials and Methods

A three-phase approach was used to develop EPAs (Phase 1), build a national consensus on the developed EPAs (Phase 2), and map them to the Saudi Meds competency framework (Phase 3). Each phase is described below.

Phase 1: EPA Identification and Development

In the first phase, the nominal group technique was used to outline an initial list of EPAs. We recruited expert medical educators with a national profile in undergraduate curriculum development and CBME. Prior to the expert meeting, a preliminary list of EPAs was drafted, drawn from a robust literature review of current EPA frameworks designed for undergraduate medical education. After devising a potential list of EPAs and recruiting participants, a face-to-face meeting of the working group was convened during the Saudi International Medical Education Conference (SIMEC). During the meeting, each EPA was briefly described to the experts to provide context and scope. Participants were then asked to brainstorm, critically analyze, and identify the professional activities relevant to local undergraduate medical education. The stages of our nominal group were: silent generation, round robin, clarification, and voting (ranking).

Phase 2: EPA Validation Process

The aim of Phase 2 was to validate the initial draft of EPAs (resulting from Phase 1) using a modified Delphi technique. In the first Delphi round, Phase 1 participants were approached again via an online questionnaire using the Question-Pro® (Survey Analytics LLC, Beaverton, Oregon, USA) survey tool. This time, we asked the experts to rate each EPA for clarity and relevance using a dichotomous (Yes/No) scale. We then calculated the recorded scores for clarity and relevance by adding the number of positive (Yes) responses and applied a cut-off of 80% positive responses: an EPA was retained if it received 80% or more positive responses and discarded otherwise. The results of the first Delphi round led to a revised list of EPAs that was used in the second Delphi round.
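For illustration, the retention rule in this round reduces to a simple proportion check. The minimal Python sketch below shows how such a decision could be computed; the EPA labels and vote counts are hypothetical and not the study’s data.

```python
# A minimal sketch (not the authors' code) of the 80% retention rule used in
# the first Delphi round. EPA labels and "Yes" counts are hypothetical.

def retain_epa(yes_votes: int, total_votes: int, cutoff: float = 0.80) -> bool:
    """Keep an EPA only if the proportion of positive (Yes) responses meets the cutoff."""
    return (yes_votes / total_votes) >= cutoff

# Hypothetical clarity/relevance votes from a 15-member expert panel.
yes_counts = {"EPA-A": 14, "EPA-B": 11, "EPA-C": 13}
decisions = {epa: retain_epa(n, total_votes=15) for epa, n in yes_counts.items()}
print(decisions)  # {'EPA-A': True, 'EPA-B': False, 'EPA-C': True}
```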

In the second Delphi round, we targeted academic leaders across the country who were involved in the development and implementation of medical curricula at their respective institutions. Participants were recruited purposively using a snowball sampling technique. We then emailed the recruited participants a brief description of the study and a request to complete the electronic survey, which remained active for six weeks. Participants were asked to rate each EPA according to its level of importance on a 5-point Likert scale (1 = not important at all; 2 = somewhat important; 3 = important; 4 = very important; 5 = extremely important). We did not include a neutral response option (i.e., “do not know”) in order to avoid complacency in participant responses. We also asked participants to suggest additional EPAs in open-ended questions at the end of the online questionnaire. We decided to include any suggested EPA provided it was an independent, discrete, observable and measurable task with a clear beginning and end.13 The questionnaire also included items requesting participants’ sociodemographic and academic information.

Data Analysis

We used classical test theory-based item analysis to determine the corrected item-total correlation and internal consistency (Cronbach’s alpha) for the full list of EPAs. Classical test theory was chosen because it is a reasonable reliability and validity testing approach when the sample size is relatively small.22 We calculated the corrected item-total correlation for each EPA and decided to keep EPAs with correlation values between 0.3 and 0.8; a correlation above 0.8 suggests redundancy, whereas one below 0.3 suggests noise (measurement error) in the true score estimation. Cronbach’s alpha was calculated to evaluate the internal consistency of the retained EPAs, with a desired value in the range of 0.7 to 0.9. Descriptive statistics (means and standard deviations) were calculated for each retained EPA. All statistical analyses were performed using the Statistical Package for the Social Sciences (SPSS) version 24.0.
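The analysis itself was carried out in SPSS; as an illustration only, the following minimal Python sketch computes the same two statistics (Cronbach’s alpha and corrected item-total correlations) and applies the 0.3–0.8 retention rule to simulated 5-point Likert ratings rather than the study’s data.

```python
# Illustrative re-implementation (not the authors' SPSS analysis) of the item
# analysis described above, applied to simulated Likert ratings.
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances.sum() / total_variance)

def corrected_item_total(items: pd.DataFrame) -> pd.Series:
    """Correlation of each item with the sum of all remaining items."""
    return pd.Series({
        col: items[col].corr(items.drop(columns=col).sum(axis=1))
        for col in items.columns
    })

# Simulated data: 109 respondents rating 10 EPAs on a 5-point scale.
# (Random ratings give near-zero correlations; real survey responses are correlated.)
rng = np.random.default_rng(seed=0)
ratings = pd.DataFrame(rng.integers(3, 6, size=(109, 10)),
                       columns=[f"EPA{i}" for i in range(1, 11)])

alpha = cronbach_alpha(ratings)
r_item_total = corrected_item_total(ratings)
retained = r_item_total[(r_item_total >= 0.3) & (r_item_total <= 0.8)]  # retention rule
print(f"Cronbach's alpha = {alpha:.3f}")
print(retained.round(3))
```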

Phase 3: EPAs and Competencies Mapping

After developing consensus on the list of EPAs, an online mapping link was generated in which all competency domains of the Saudi Meds framework were listed against each EPA. This link was shared with the same group of national academic leaders who participated in the EPA validation process. We asked the participants to decide whether a given competency domain was required to perform each EPA, using a dichotomous (Yes/No) scale. If more than 70% of participants voted “Yes”, that competency domain was considered “Essential” to perform the EPA; if 30–70% voted “Yes”, it was considered “Important”; and if less than 30% voted “Yes”, it was considered “Relevant”.
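To make the grading rule concrete, the short Python sketch below applies the same thresholds to hypothetical vote shares; the domain names come from the Saudi Meds framework, but the percentages are invented for illustration.

```python
# A minimal sketch of the vote-share thresholds used to grade a competency
# domain against an EPA. Vote shares below are hypothetical, not study data.

def grade_domain(yes_share: float) -> str:
    """Grade a domain by the proportion of participants voting 'Yes' for an EPA."""
    if yes_share > 0.70:
        return "Essential"
    if yes_share >= 0.30:
        return "Important"
    return "Relevant"

hypothetical_votes = {
    "Patient care": 0.85,
    "Communication and collaboration": 0.55,
    "Research and scholarship": 0.20,
}
print({domain: grade_domain(share) for domain, share in hypothetical_votes.items()})
# {'Patient care': 'Essential', 'Communication and collaboration': 'Important',
#  'Research and scholarship': 'Relevant'}
```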

Results

Phase 1: EPA Identification and Development

A group of 15 national medical education experts participated in Phase 1. They were presented with a list of 21 EPAs generated through an iterative literature review of existing EPA frameworks. After critical review, brainstorming and group discussion, 14 EPAs were finalized and carried forward to the first Delphi round.

Phase 2: EPA Validation Process

The same working group of 15 national expert medical educationists participated again in the first Delphi round, and consensus was achieved on 10 of the 14 EPAs. Participants did not suggest any further EPAs at this stage. In the second Delphi round, 186 national academic leaders were invited, of whom 109 (58.6%) participants from 23 medical schools completed the online survey. Most participants were Saudi (52%), male (61%), and had more than 10 years of experience in medical education (46%). Table 1 provides demographic information of the participants.

Table 1 Participant Demographics of Second Delphi Round Survey

The overall reliability (Cronbach’s alpha) of the 10 EPAs was 0.814. The mean scores of the 10 retained EPAs ranged from 4.06 to 4.83, with standard deviations ranging from 0.42 to 0.94. The item-total correlations ranged from 0.341 to 0.642. Table 2 provides descriptive and item-wise reliability statistics of the core EPAs. Participants also suggested some new EPAs, but none of them met our inclusion criteria; the suggested EPAs were either micro-level learning outcomes, competencies, or parts of existing EPAs.

Table 2 Descriptive and Item Wise Reliability Statistics of Core EPAs

Phase 3: EPAs and Competencies Mapping

Of the 109 participants, 94 (retention rate = 86.23%) participated in Phase 3. Two domains, “Patient care” and “Communication and collaboration”, received the highest ratings across the majority of EPAs: “Patient care” was found to be essential for seven EPAs, and “Communication and collaboration” was found to be essential for five EPAs. The domain “Research and scholarship” was found to be least relevant and was not considered important or essential for any of the EPAs (<30% of votes). Table 3 provides a complete mapping of the validated EPAs with the Saudi Meds competency domains.

Table 3 Mapping of EPAs with Saudi Meds Competency Domains

Discussion

This study aimed to develop a national consensus on generic, comprehensive and region-specific EPAs and to map them to the Saudi Meds competency framework in order to facilitate competency-based curricular reforms in local medical schools. We achieved a national consensus on the 10 enlisted EPAs and their mapping to the competency domains through expert educationists and academic leaders from almost all medical schools within the kingdom. A majority of respondents (95; 87%) endorsed the potential value of developing national EPAs for the undergraduate MBBS program. This advocacy is in line with many international initiatives that have incorporated EPAs in the undergraduate clinical curriculum.20,23,24 We used multiple methodological and analytical approaches, including nominal group and modified Delphi techniques and classical test theory-based item analysis, to make the validation process more robust. The final list of EPAs achieved the desired psychometric properties, providing strong reliability and validity evidence.

Expert medical educationists did not highlight any further activities in the first Delphi round, likely because they had been involved in generating the initial list of EPAs in Phase 1 of the study. Some EPAs were suggested by academic leaders in the second Delphi round but were not included because these suggestions did not meet our inclusion criteria. This is a common finding in other EPA validation studies.25 For any activity to qualify as an EPA, it has to be an independent, discrete, observable and measurable task with a clear beginning and end.13–16

EPA 7, “Participate in health quality improvement initiatives”, received variable importance ratings and the lowest mean score. This finding could be seen as a reflection of current practice in the Saudi healthcare system, where health quality improvement is executed either by national regulatory bodies or by specific departments with designated roles. A culture in which health quality assurance and development are also the responsibility of undergraduate students is yet to be recognized by the Saudi medical education and practice systems. The second-lowest mean was observed for EPA 8, “Perform general procedures of a physician”, probably because undergraduate trainees are mostly not allowed to perform procedures as part of patient safety protocols. Although prioritizing patient safety is commendable, restricting students from performing procedures might hinder their skills development. Alternative solutions such as simulation-based training could serve as a viable approach to overcome this challenge.26 Two EPAs (“Participate in health quality improvement initiatives” and “Educating patients on disease management, health promotion and preventive medicine”) have a strong representation of the “Research and scholarship” domain, as most medical colleges in the kingdom acknowledge research as a main goal in their vision and mission statements. Another driver for this recognition is that the regulatory body (Saudi Commission for Health Specialties) urges each graduate to have at least one publication for enrollment in any postgraduate training program.

The EPAs and their mapping to the Saudi Meds framework hold multiple potential implications for furthering CBME in Saudi Arabia. The Saudi Meds competency framework already serves as a benchmark for undergraduate medical education programs. Introducing EPAs will provide a means to better operationalize the Saudi Meds competency framework by defining the essential activities, and their required competencies, that students could be entrusted with. When systematically operationalized, entrustment decisions are known to provide structured feedback opportunities to trainees by pinpointing exactly what they need to achieve entrustment.27–29 Additionally, designing medical curricula around EPAs will help in trimming unnecessary science jargon, bridging the gap between theory and practice, and directing assessment to the most relevant competencies. Incorporating EPAs in a curriculum will essentially direct teaching and training activities towards real-life professional activities and will assist in bridging the gap between the planned and the actual curriculum. Finally, as the Saudi clinical postgraduate programs led by the Saudi Commission for Health Specialties (SCFHS) are already in the process of incorporating EPAs in their postgraduate curricula, EPA-based undergraduate training might help in creating a true continuum of medical education.30

Despite the robust methods used to achieve our objectives, this study is not without limitations. We recruited expert medical educationists using a purposive sampling technique, which might have introduced an unconscious bias. Although we targeted a highly expert group of academic leaders in the second Delphi round, it is quite possible that some of them were not familiar with the concept of EPAs. An orientation workshop on EPAs, their definition and purpose could have helped us overcome this limitation; however, this study was conducted in the early days of the COVID-19 pandemic, when online workshops were not yet the norm and it was difficult for us to organize such an event in an uncertain situation. Finally, this consensus study does not provide specifications (i.e., context, expected work description, required resources for entrustment decisions) for the enlisted EPAs. We call on academic leaders and researchers to address this crucial gap by designing contextually relevant specifications for each EPA that can be used to facilitate EPA-based undergraduate training programs.

Conclusion

This study resulted in a national consensus on generic, comprehensive and region-specific EPAs that have been mapped to the Saudi Meds competency framework. We achieved this consensus through expert educationists and academic leaders from almost all medical schools within the kingdom. Our study is a first step toward facilitating EPA-based curricular reforms in local medical schools. A long road lies ahead before these EPAs can be successfully implemented in Saudi undergraduate medical education.

Ethical Considerations

Ethical approval for this study was obtained from the ethics review committee of the Faculty of Medicine, Umm Al-Qura University. The study methods were carried out in accordance with relevant guidelines and regulations. All participants agreed to contribute to the study, and the questionnaire was anonymous to ensure confidentiality and enhance the validity of responses. All obtained data were treated confidentially.

Acknowledgments

We thank all expert medical educationists and academic leaders who participated in this study. At the time of paper submission, the third author (Muhammad Zafar Iqbal) was working as a postdoctoral fellow at the School of Physical and Occupational Therapy, Faculty of Medicine and Health Sciences, McGill University, Montreal, Quebec, Canada.

Disclosure

The authors report no conflicts of interest in this work.

References

1. Hawkins RE, Welcher CM, Holmboe ES, et al. Implementation of competency‐based medical education: are we addressing the concerns and challenges? Med Educ. 2015;49(11):1086–1102. doi:10.1111/medu.12831

2. Frank JR, Snell LS, Cate OT, et al. Competency-based medical education: theory to practice. Med Teach. 2010;32(8):638–645. doi:10.3109/0142159X.2010.501190

3. Humphrey-Murto S, Wood TJ, Ross S, et al. Assessment pearls for competency-based medical education. J Grad Med Educ. 2017;9(6):688–691. doi:10.4300/JGME-D-17-00365.1

4. Frank JR, Danoff D. The CanMEDS initiative: implementing an outcomes-based framework of physician competencies. Med Teach. 2007;29(7):642–647. doi:10.1080/01421590701746983

5. Swing SR. The ACGME outcome project: retrospective and prospective. Med Teach. 2007;29(7):648–654. doi:10.1080/01421590701392903

6. Simpson JG, Furnace J, Crosby J, et al. The Scottish doctor - Learning outcomes for the medical undergraduate in Scotland: a foundation for competent and reflective practitioners. Med Teach. 2002;24(2):136–143. doi:10.1080/01421590220120713

7. ten Cate O, Scheele F. Competency-based postgraduate training: can we bridge the gap between theory and clinical practice? Acad Med. 2007;82(6):542–547. doi:10.1097/ACM.0b013e31805559c7

8. Zaini RG, Bin Abdulrahman KA, Al-Khotani AA, Al-Hayani AMA, Al-Alwan IA, Jastaniah SD. Saudi Meds: a competence specification for Saudi medical graduates. Med Teach. 2011;33(7):582–584. doi:10.3109/0142159X.2011.578180

9. Zaini RG. National consensus of the vision of the “Saudi Future Doctor”: current status and future perspective of medical education in Saudi medical schools; 2007.

10. Shadid AM, Abdulrahman AK, Dahmash AB, et al. SaudiMEDs and CanMEDs frameworks: similarities and differences. Adv Med Educ Pract. 2019;10:273. doi:10.2147/AMEP.S191705

11. ten Cate O. Competency-based education, entrustable professional activities, and the power of language. J Grad Med Educ. 2013;5(1):6–7. doi:10.4300/JGME-D-12-00381.1

12. ten Cate O. Entrustability of professional activities and competency-based training. Med Educ. 2005;39(12):1176–1177. doi:10.1111/j.1365-2929.2005.02341.x

13. Ten Cate O, Chen HC, Hoff RG, Peters H, Bok H, van der Schaaf M. Curriculum development for the workplace using entrustable professional activities (EPAs): AMEE Guide No. 99. Med Teach. 2015;37(11):983–1002. doi:10.3109/0142159X.2015.1060308

14. Ten Cate O, Pool IA. The viability of interprofessional entrustable professional activities. Adv Health Sci Educ. 2019;25(5):1255–1262. doi:10.1007/s10459-019-09950-0

15. Iqbal MZ, Al-Eraky MM. Common myths about entrustable professional activities. Educ Med J. 2021;13(2):97–100. doi:10.21315/eimj2021.13.2.9

16. Ten Cate O, Taylor DR. The recommended description of an entrustable professional activity: AMEE Guide No. 140. Med Teach. 2021;43(10):1106–1114. doi:10.1080/0142159X.2020.1838465

17. Carrie Chen H, McNamara M, Teherani A, Cate OT, O’Sullivan P. Developing entrustable professional activities for entry into clerkship. Acad Med. 2016;91(2):247–255. doi:10.1097/ACM.0000000000000988

18. Englander R, Flynn T, Call S, et al. Toward defining the foundation of the MD Degree: core entrustable professional activities for entering residency. Acad Med. 2016;91(10):1352–1358. doi:10.1097/ACM.0000000000001204

19. Meyer EG, Chen HC, Uijtdehaage S, Durning SJ, Maggio LA. Scoping review of entrustable professional activities in undergraduate medical education. Acad Med. 2019;94(7):1040–1049. doi:10.1097/ACM.0000000000002735

20. ten Cate O, Graafmans L, Posthumus I, Welink L, van Dijk M. The EPA-based Utrecht undergraduate clinical curriculum: development and implementation. Med Teach. 2018;40(5):506–513. doi:10.1080/0142159X.2018.1435856

21. AlSheikh MH, Zaini RG. Time to develop entrustable professional activities for the Saudi-Med competency framework. Health Prof Educ. 2018;4(3):159–160.

22. De Champlain AF. A primer on classical test theory and item response theory for assessments in medical education. Med Educ. 2010;44(1):109–117. doi:10.1111/j.1365-2923.2009.03425.x

23. Van Loon KA, Driessen EW, Teunissen PW, Scheele F. Experiences with EPAs, potential benefits and pitfalls. Med Teach. 2014;36(8):698–702. doi:10.3109/0142159X.2014.909588

24. Schick K, Eissner A, Wijnen-Meijer M, et al. Implementing a logbook on entrustable professional activities in the final year of undergraduate medical education in Germany–a multicentric pilot study. GMS J Med Educ. 2019;36:6.

25. Iqbal MZ, Könings KD, Al-Eraky MM, van Merriënboer JJG. Entrustable professional activities for small-group facilitation: a validation study using modified Delphi technique. Teach Learn Med. 2021;33(5):536–545. doi:10.1080/10401334.2021.1877714

26. Tiyyagura G, Balmer D, Chaudoin L, et al. The greater good: how supervising physicians make entrustment decisions in the pediatric emergency department. Acad Pediatr. 2014;14(6):597–602. doi:10.1016/j.acap.2014.06.001

27. Hauer KE, Ten Cate O, Boscardin C, Irby DM, Iobst W, O’Sullivan PS. Understanding trust as an essential element of trainee supervision and learning in the workplace. Adv Health Sci Educ. 2014;19(3):435–456.

28. Ten Cate O, Chen HC. The ingredients of a rich entrustment decision. Med Teach. 2020;42(12):1413–1420. doi:10.1080/0142159X.2020.1817348

29. Ten Cate O, Hart D, Ankel F, et al. Entrustment decision making in clinical training. Acad Med. 2016;91(2):191–198. doi:10.1097/ACM.0000000000001044

30. Ten Cate O, Carraccio C. Envisioning a true continuum of competency-based medical education, training, and practice. Acad Med. 2019;94(9):1283–1288. doi:10.1097/ACM.0000000000002687
