
Attitudes Towards Introduction of Multiple Modalities of Simulation in Objective Structured Clinical Examination (OSCE) of Emergency Medicine (EM) Final Board Examination: A Cross-Sectional Study


Received 11 August 2020

Accepted for publication 17 November 2020

Published 1 December 2020 Volume 2020:12 Pages 441–449

DOI https://doi.org/10.2147/OAEM.S275764


Editor who approved publication: Dr Hans-Christoph Pape



Loui K Alsulimani,1– 3 Fayhan M Al-Otaiby,4 Yasser H Alnofaiey,5 Fares A Binobaid,6 Linda M Jafarah,7 Daniyah A Khalil8

1Department of Emergency Medicine, Faculty of Medicine, King Abdulaziz University, Jeddah, Saudi Arabia; 2Department of Medical Education, King Abdulaziz University, Jeddah, Saudi Arabia; 3Clinical Skills and Simulation Center, King Abdulaziz University, Jeddah, Saudi Arabia; 4Department of Emergency Medicine, International Medical Center, Jeddah, Saudi Arabia; 5Department of Emergency Medicine, Faculty of Medicine, Taif University, Taif, Saudi Arabia; 6Department of Emergency Medicine, King Abdulaziz Hospital, Makkah, Saudi Arabia; 7Department of Emergency Medicine, King Fahad Medical City, Riyadh, Saudi Arabia; 8Primary Healthcare Center, King Fahad General Hospital, Jeddah, Saudi Arabia

Correspondence: Loui K Alsulimani, Department of Emergency Medicine, King Abdulaziz University, 1 Jamha Street, Jeddah 80215, Saudi Arabia
Tel +966 503652778
Fax +966 126401000
Email [email protected]

Purpose: The Objective Structured Clinical Examination (OSCE) is the current modality of choice for evaluating the practical skills of graduating emergency medicine (EM) residents in the final Saudi board examination. This study aims to evaluate the attitudes of both residents and faculty towards the idea of utilizing multiple modalities of simulation in this high-stakes EM examination. The goal is to propose a method to improve the process of this examination.
Participants and Methods: The data were obtained using a cross-sectional survey questionnaire that was distributed to 141 participants, including both EM residents and instructors in the Saudi Board of Emergency Medicine. An online survey tool was used. The data were collected and subsequently analyzed to gauge the general and specific attitudes of both residents and instructors.
Results: Of the 141 participants, 136 provided complete responses; almost half were residents from all years, and the other half were primarily instructors (registrars, senior registrars, or consultants). Most of the participants from both groups (70% of the residents and 86% of the instructors) would like to see simulation incorporated into the final EM board OSCEs. Most of the participants (78%), however, had no experience with using multiple modalities of simulation in OSCEs. Overall, the majority (74.82%) expressed the belief that simulation-based OSCEs would improve the assessment of EM residents’ competencies. The modalities that received the most support were part-task trainers and hybrid simulation (70.71% and 70%, respectively).
Conclusion: From this study, we can conclude that both parties (residents and instructors) are largely willing to see multimodality simulation incorporated into the final board examinations. Stakeholders should interpret this consensus as an impetus to proceed with such an implementation. Input from both groups should be considered when planning for such a change in this high-stakes exam.

Keywords: assessment, multimodality, simulation, examinations, attitudes

Introduction

Healthcare workers should have a sufficient level of competency to practice medicine safely on real patients. Stakeholders pay special attention to ascertaining that all practicing physicians have reached this level of competency. High-stakes examinations are required to ensure that graduating residents have an acceptable level of competency (at the end of their training) to practice their specialties on real patients independently. Thus, these examinations should attain a high level of quality. Among the types of assessments implemented in high-stakes examinations, Objective Structured Clinical Examinations (OSCEs) are a common choice.1 An OSCE is an examination process made up of a series of stations of equal length, set in a circuit, that usually aims to test trainee competency.2 OSCEs have been used to assess learners’ cognitive skills in dealing with emergencies.3 Testing residents through an OSCE, however, should venture beyond cognitive skills; it should aim to achieve higher levels of quality, validity, and utility by targeting skills, behaviors, and attitudes.4 Incorporating simulation adds a dimension to the level of assessment achievable through the exclusive implementation of structured oral examinations.

Simulation in healthcare can be simply defined as “the application of a simulator to training and/or assessment”.5 Subtypes of simulation include standardized patients, part-task trainers, high-fidelity simulators, virtual simulation, and hybrid simulation. Simulation-based OSCEs have been implemented in various situations. For example, they have shown strong validity and reliability in the assessment of resuscitation.6 Moreover, a previous study revealed that internal medicine residents felt that using simulation in an OSCE reflected their performance in real life.7 Many studies investigating the utilization of simulation in assessments have been published.8–10 Multiple studies have uncovered a favorable attitude towards incorporating simulation into competency assessment.11–13 In anesthesia, multimodality simulation has been implemented successfully in the final high-stakes examination for many years.14,15 To the best of our knowledge, however, EM residents’ attitudes towards incorporating simulation into high-stakes examinations have not been sufficiently highlighted in the medical education literature.

Concerning EM-related examination practice, one study explored the development and validation of simulation as a resuscitation assessment tool designed for EM residents.6 Another study explicitly described the potential utilization of simulation in the assessment of EM residents’ competencies.1 Multiple assessment milestones, which are not limited to knowledge domains, are essentially required throughout all residency training programs, including EM residency training programs.7,16,17 An Australian study illustrated that OSCE simulation constitutes a valid, reliable, and acceptable assessment tool in the field of EM.7

The final certification examination in EM, as regulated by the Saudi Commission for Health Specialties (SCFHS), usually consists of several OSCE and structured oral stations, and the exam structure can vary based on the examination committee’s annual recommendations (https://www.scfhs.org.sa/en/Media/OtherPublications/Documents/OSCE%20MANUAL.pdf). Thus, it is not a typical OSCE if we look at all stations. According to the SCFHS, physicians are classified, according to specific requirements, as general practitioners, residents, registrars, senior registrars, or consultants.18 Residents are eligible to be classified as senior registrars once they pass this OSCE; if they fail it, they can only be classified as registrars. The organization and regulation of the exam have been under continuous development over the years. In this study, we aim to compare the attitudes of residents and faculty (instructors) towards the idea of incorporating multiple modalities of simulation into the final EM Saudi board certification OSCE. This should provide an initial impression and guidance for stakeholders interested in the developmental implementation of multimodality simulation in this high-stakes exam.

Participants and Methods

Study Design

This is a cross-sectional study using a survey questionnaire with closed-ended questions that was distributed among emergency medicine stakeholders involved in the Saudi Board training of EM personnel, including residents and instructors (board-eligible/certified physicians). Residents from all levels classified by the Saudi Commission for Health Specialties (SCFHS), R1–R4, were involved. Instructors, according to the SCFHS classification, included registrars (Saudi board eligible), senior registrars, and consultants (board-certified emergency physicians).

Inclusion and Exclusion Criteria

All residents included in the Saudi board training were included, along with all instructors eligible to provide training/supervision for them. The latter category comprises the more senior staff as per the SCFHS classification (registrars, senior registrars, and consultants). All those who were not registered as residents by the SCFHS (ie, not enrolled in EM board training) at the time of the study were excluded, even if their institutions granted them the title of resident. Physicians from centers not recognized as training centers by the SCFHS were not included.

Sampling and Sample Size

It was difficult to estimate the total number of registered physicians because there is no publicly available information covering all EM physicians in the country; therefore, a convenience sampling strategy was chosen. An updated list of examiners was provided by the head of the exam committee, however, and residents were accessed directly by their resident colleagues on the research team. The sample size required to achieve a power of 0.8 was calculated to be 141 in each group using an online calculation tool (https://www.sealedenvelope.com/power/binary-superiority/).
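For readers who wish to reproduce this kind of calculation programmatically rather than with the online tool, the sketch below shows a standard two-proportion (binary superiority) power calculation in Python. The assumed proportions (0.50 vs 0.65) are illustrative placeholders only, since the study does not report the effect size entered into the calculator, so the resulting n will not necessarily match the 141 per group cited above.

```python
# A minimal sketch of a binary-superiority sample size calculation,
# analogous to the online tool cited above. The two proportions are
# hypothetical placeholders; the study does not report the values used.
from statsmodels.stats.power import NormalIndPower
from statsmodels.stats.proportion import proportion_effectsize

# Cohen's h effect size for two assumed response proportions.
effect_size = proportion_effectsize(0.50, 0.65)

n_per_group = NormalIndPower().solve_power(
    effect_size=effect_size,
    alpha=0.05,              # two-sided significance level
    power=0.80,              # desired statistical power
    alternative="two-sided",
)
print(f"Required sample size per group: {n_per_group:.0f}")
```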

Survey Instrument

No survey tool (questionnaire) described in the literature served the exact purpose of this research, so a new survey tool was developed for this study based on a previous survey.19 The survey starts with an informed consent statement; participants who agree can proceed to complete the survey. The survey tool was built around three main constructs, in addition to the demographics of the participants, with the content of the questionnaire items adapted from previous studies.19 The three constructs were general attitudes towards the implementation, the domains of competency that could be targeted, and the modalities of simulation. Two experienced EM board-certified physicians reviewed the tool for face validity; the first is an examiner for the final EM OSCE, and the other is experienced in clinical simulation. The plan was to take the first 10 responses as a pilot study to verify the survey’s clarity and consistency; if the respondents raised no issues, this sample would be added to the main study sample. The questionnaire was built and distributed using the electronic survey tool SurveyMonkey (SurveyMonkey, Inc., Palo Alto, California, USA; www.surveymonkey.com). A Likert scale of 0 (not sure) to 4 (strongly agree) was used to assess the level of agreement for each key research variable.

The questionnaire consists of 11 questions divided into three sections. The first section collects demographic data. It starts with the participant’s status according to SCFHS registration, which includes residents of all four levels (R1, R2, R3, and R4) and attending board-certified/eligible physicians (registrar, senior registrar, and consultant). Consultants were further asked whether they had previously served as examiners for the final certification OSCE. Other questions addressed the participant’s current group of practice (adult, pediatric, or both), age, and gender.

The second section includes questions designed to assess the general attitudes of both learners and instructors towards the proposed implementation of simulation in the OSCE for final EM certification. Meanwhile, the third section contains questions designed to assess the participants’ attitudes towards the possibility of using simulation to assess specific competency domains, such as taking histories, performing physical examinations, procedural skills, clinical decision-making, communication with patients, communication with patients’ families, annotating medical records, conflict management, interaction with other healthcare workers, leadership, and teaching skills. The final section of the survey questionnaire addresses specific simulation modalities that could be incorporated into the final OSCE: standardized patients, part-task trainers, high-fidelity simulators, virtual simulation, and hybrid simulation.

Data Collection

The study was conducted between October 2017 and February 2018. The survey was provided to the targeted participants by several means, all without pressuring the participants. First, candidates were asked face-to-face to complete the survey in October 2017 during a national EM conference. Second, the questionnaire was distributed by email to examiners registered with the SCFHS on the list provided by the former head of the exam committee (the second author). Third, EM residents were asked during their academic activities to complete the survey. Additionally, the survey’s link was distributed through various social media groups of EM attendees and residents in Saudi Arabia.

Data Analysis

Data entered in SurveyMonkey were transferred into a Microsoft® Excel (Office 365) spreadsheet (Microsoft Corporation, Redmond, Washington, USA). The research team reviewed the data to check for any obvious inconsistencies and to gauge the extent of missing data before analysis. Percentages and statistical tests were based on the adjusted totals after accounting for missing data. The data were coded and analyzed using IBM SPSS Statistics for Windows (version 24; IBM Corp., Armonk, New York, USA). To analyze attitudes towards the introduction of multimodality simulation-based OSCEs appropriately, we analyzed the results of the two categories of participants separately: the instructors (consultants, registrars, and senior registrars) and the EM residents (R1 to R4). This facilitated comparison of the two groups’ answers to all questions.

Cronbach’s alpha was used to test the internal consistency of the questions designed to measure attitudes. To compare items questioning the participants’ attitudes towards the different variables included in the survey, Kruskal–Wallis tests were employed to check the statistical significance of the non-parametric data. Multiple comparison tests were utilized to assess the relationship between the demographic variables and the total belief scores. For the age group comparisons, ANOVA was used to test the relationship between the participants’ ages and their beliefs; ANOVA was likewise employed for the training variable. For the gender and classification variables, independent t-tests were employed. Significance (α) was set to 0.05 prior to examining the data. P-values were presented to reflect the statistical significance of differences between the key research variables.
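As an illustration only, the sketch below reproduces the flavor of these analyses in Python rather than SPSS, which the study actually used. The column names, the toy Likert data (scored 0–4), and the grouping variables are all hypothetical; the sketch simply demonstrates a Cronbach’s alpha computation, a Kruskal–Wallis test, a one-way ANOVA, and an independent t-test of the kind described above.

```python
# Illustrative analysis sketch under assumed data; the study used SPSS v24.
import pandas as pd
from scipy import stats

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Internal consistency of a set of Likert items (rows = respondents)."""
    item_vars = items.var(axis=0, ddof=1)
    total_var = items.sum(axis=1).var(ddof=1)
    k = items.shape[1]
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical survey data: three attitude items plus demographic columns.
df = pd.DataFrame({
    "q1": [4, 3, 4, 2, 4, 3], "q2": [3, 3, 4, 2, 4, 4], "q3": [4, 2, 3, 1, 4, 3],
    "group": ["resident", "resident", "resident",
              "instructor", "instructor", "instructor"],
    "age_band": ["25-34", "25-34", "35-44", "25-34", "35-44", "45-54"],
})
df["belief_score"] = df[["q1", "q2", "q3"]].sum(axis=1)

print("Cronbach's alpha:", cronbach_alpha(df[["q1", "q2", "q3"]]))

# Kruskal-Wallis test across age bands (non-parametric comparison).
by_age = [g["belief_score"].values for _, g in df.groupby("age_band")]
print("Kruskal-Wallis:", stats.kruskal(*by_age))

# One-way ANOVA across age bands (as used for the age and training variables).
print("ANOVA:", stats.f_oneway(*by_age))

# Independent t-test between the two classifications
# (as used for the gender and classification variables).
res = df.loc[df["group"] == "resident", "belief_score"]
ins = df.loc[df["group"] == "instructor", "belief_score"]
print("t-test:", stats.ttest_ind(res, ins))
```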

Results

Of the 141 participants, 136 provided complete responses to all survey questions (96.5%). There was a small number of missing responses to certain questions; the analysis was based on the total number of responses, adjusted for missing data. Cronbach’s alpha was computed to test the internal consistency of the attitude questions and was found to be excellent (α = 0.922). Among the 141 participants, almost half (51.45%) were residents from all levels; the remaining participants were instructors of varying levels (registrar, senior registrar, and consultant).

Only two registrars (board eligible) responded to the questionnaire. For 5 of the 136 responses (3.5%), the classification level was unknown; the analysis was adjusted for these missing responses. Their responses to the other questions were not omitted, however, as there was no difference in agreement among the categories of study participants. Of the 141 total responses, 94 (67.1%) were from males and 46 (32.9%) were from females; the majority of the study participants belonged to the adult EM practice group. Most respondents were from the younger age group (even among the instructors). The age category with the most responses was 25–34 years, with 98 responses (70.0%), followed by 35–44 years, with 37 responses (26.4%). Finally, the 45–54 years age group had only 5 responses (3.6%) (Table 1).

Table 1 Descriptive Statistics of Demographics of Participants

Regarding previous exposure to simulation, residency program training was the most common setting in which members of both groups had encountered simulation, with 44 responses (36.1%). Most instructors were Saudi board trained, followed by those who were Arab board trained.

Among the 37 consultants who responded, 8 had previously served as examiners for the final OSCE; 7 of these (87.5%) exhibited a favorable attitude towards the implementation of multimodality simulation in the OSCE for EM board certification. Most respondents (67.1%) had not been involved in any OSCE where simulation was applied within the past 5 years. However, 16 respondents had been exposed to simulation during their residency in-service OSCEs; half of these exposures occurred during Saudi board residency training. Interestingly, five senior registrars (new board graduates) had been exposed to simulation in their Saudi final EM board certification OSCEs; two of these specified the year as 2016. Most likely, they were exposed to standardized patients, as that was the modality introduced to the exam by that time. Also, one of the consultants had been exposed to simulation in her pediatric emergency fellowship final OSCE.

The principal theme of this study concerned attitudes towards the general concept of implementing simulation (Table 2). The first question was the key question, and the majority (75.7%) of respondents wanted to see simulation implemented in final EM OSCEs. Overall, examining the responses to all the questions in this section, there was a positive attitude towards implementing simulation as a valid strategy to assess residents’ competencies. Still, there was a small degree of variability in the responses. For example, only 55.4% of the instructors exhibited a positive attitude towards implementing simulation in all OSCE stations, vs 74.3% of the residents. Furthermore, there was an attitude of uncertainty towards certain variables. For example, around 23% of the instructors were unsure whether simulation-based examinations would be superior to structured oral examinations in reflecting residents’ true performance, in contrast to only 7% of the residents.

Table 2 Percentages of Belief Scores Among Instructors and Residents Regarding Simulation-Related Questions

The second major theme of this study concerned the competencies that residents and instructors think can be assessed using simulation (Table 3). Participants’ positive attitudes towards the ability of simulation to assess EM competencies were apparent in their answers for all the competencies, including both technical and non-technical skills. Although there was consistent agreement on the ability of simulation to assess all competencies, the tendency was towards agreement rather than strong agreement.

Table 3 Percentages of Belief Scores Among Instructors and Residents Regarding the Probability of Applying Simulation to Assess the Following Domains of EM Competencies in the Final EM Board Exams

Another essential element of this study was to identify the simulation modalities on which both parties agree (Table 4). The results indicate a positive attitude towards implementing all of the common modalities of simulation: standardized patients, part-task trainers, high-fidelity simulation, virtual simulation, and hybrid simulation.

Table 4 Percentages of Belief Scores Among Instructors and Residents Regarding Possible Implementation of the Following Modalities of Simulation in the Final EM Board Examinations

The positive attitudes expressed in the three main sections of the survey (general attitudes towards using simulation for competency assessment, the types of competencies to be assessed, and the modalities of simulation to be used) reflect an overall acceptance of the idea of implementing simulation.

Discussion

This study indicates an overall acceptance of the idea of implementing multimodality simulation in the EM final board certification OSCE. Both residents and instructors expressed agreement regarding the three principal themes of this study: general acceptance of the concept, the major competencies that can be assessed using simulation, and the simulation modalities that could be successfully used in the exam. Although there was a small degree of variability in responses to the questionnaire, this did not detract from the overall impression of general agreement. Because the written final exam, which consists of multiple-choice questions (MCQs), assesses the knowledge domain, the final OSCE usually targets the psychomotor and affective domains.1,20 Such triangulation across multiple assessment methods is preferable in high-stakes assessments to achieve the desired level of quality. Although providing a quality assessment of the psychomotor and affective domains is challenging, especially with new, innovative simulation models, the respondents to this study believe it is achievable.

Regarding demographics, the low number of registrars participating in the survey could reflect their underrepresentation within the population of emergency physicians. This can be explained by the high pass rate for the OSCE, after which most graduates are immediately classified by the SCFHS as senior registrars. Male participants were almost double the number of female participants; however, there was no statistically significant difference between genders in agreement on incorporating multiple modalities into the EM final board examination. This finding is consistent with the literature, which describes no general gender difference in preferences for using simulation in education.19 The overrepresentation of male emergency physicians in the survey (67.1%) could reflect their predominance in the specialty of EM in Saudi Arabia.

Most respondents belonged to the younger age group (25–34 years), even among the board-certified instructors. This could be related to the relatively recent establishment of the specialty in Saudi Arabia and, indeed, worldwide.21 It is also not unexpected to see newly board-certified physicians who are motivated to capitalize on opportunities for career development, such as serving as examiners for high-stakes exams. Such motives are compatible with those of the exam organizing committees, which seek continuous development each year, including the integration of new examiners. Nevertheless, this does not diminish the positive response obtained in this study from the older group of instructors (35–54 years of age).

Experiences with implementing various modalities of simulation in life support courses have been described in the literature.6,22,23 Although all participants had at least Basic Life Support (BLS) certification, only 18% reported previous participation in life support courses that utilized simulation. This reflects a low perceived utilization of simulation in these settings and could represent an area for further study. On the other hand, participants perceived a strong presence of simulation in residency training (including in-service OSCEs). The newly graduated residents (senior registrars) who had experienced simulation in the final board certification OSCE exhibited a positive attitude towards such implementation. The perceptions of those who have experienced the introduction of a new modality of simulation in the exam could be further evaluated in a future study.

When looking at the first main construct, concerning general acceptance of the concept, many items showed a positive attitude with a statistically significant difference. For example, there was a positive attitude towards seeing simulation implemented (including in all stations), towards the possibility of better performance assessment, and towards the possibility of conducting a high-quality examination. However, some variables (concerning comfort, fairness, and standardization) did not show a statistically significant difference. The presence of some concerns among participants does not preclude acceptance of the concept of such an implementation for improvement.

Our respondents viewed multimodality simulation as a positive development that might provide a fair assessment of residents’ competencies. This attitude is unsurprisingly in alignment with the growing trend towards incorporating simulation into certification and high-stakes examinations.10,14 Nevertheless, such implementation and the possible assessment paradigm shift present certain concerns, challenges, and limitations. Training and certifying OSCE examiners, standardizing and validating the scenarios, and validating assessment tools represent major challenges. Additionally, cheating is a well-known concern when conducting OSCEs.22,24,25 The costs and logistics associated with conducting such exams represent another major challenge. How to overcome or mitigate these challenges, either prior to or following implementation, is beyond the scope of this study; we believe this would constitute an interesting topic for further research.

The standardized patient (SP) is the most commonly used modality of simulation in high-stakes exams,26,27 yet it has inherent limitations in that it fails, in practice, to assess certain major competency domains. For example, assessing an emergency medicine resident’s competency in performing a technical procedure is usually not possible using an SP alone. Supplementing SPs with other modalities of simulation would widen the spectrum and improve the quality of the assessment achieved by the exam. This study shows an acceptance of, and belief in, the potential for implementing various types of simulation in this type of high-stakes assessment. This can signal a call for clinical simulation experts to apply their expertise to facilitating the implementation of such a modification to this high-stakes exam.

Limitations

This study has some limitations. There was no available registry from which to obtain the data of all emergency residents and board-certified physicians in Saudi Arabia, which explains the challenges associated with reaching the desired total sample size and determining the response rate. Nevertheless, we started with the most recent available list of examiners and current residents. In addition, as an inherent limitation of the study design, the attitudes of respondents were affected by many influencing factors related to their previous experiences with simulation and their current motives. This presents a potential direction for future studies based on observing real-life implementation rather than merely surveying attitudes.

Conclusion

From this study, we can conclude that both residents and instructors have a positive attitude towards the implementation of multiple modalities of simulation in the EM final board certification OSCE. This agreement should encourage stakeholders to take the exam a step further by initiating such an implementation. Input from both groups should be considered when planning for this major change in such a high-stakes exam.

Abbreviations

OSCE, Objective Structured Clinical Examination; EM, emergency medicine; ANOVA, analysis of variance; MCQs, multiple-choice questions; SP, standardized patient; ECGs, electrocardiograms; SCFHS, Saudi Commission for Health Specialties.

Data Sharing Statement

The datasets used and/or analyzed during the current study are available from the corresponding author on reasonable request.

Ethical Considerations and Consent Statement

This research received ethical approval from the Unit of Biomedical Ethics Research Committee at King Abdulaziz University, Faculty of Medicine, Jeddah, Saudi Arabia (IRB No. 69-17). Informed consent was obtained from all participants before they began the survey. This study complied with the Declaration of Helsinki, and participant data confidentiality was maintained.

Acknowledgments

The authors thank Ahmed Bukhari, PhD, for his support with the statistical analysis.

English editing was done by Cambridge Proofreading & Editing LLC (https://proofreading.org/).

Author Contributions

All authors made substantial contributions to conception and design, acquisition of data, or analysis and interpretation of data; took part in drafting the article or revising it critically for important intellectual content; agreed to submit to the current journal; gave final approval of the version to be published; and agree to be accountable for all aspects of the work.

Funding

This study was self-funded by the authors. There was no other funding source.

Disclosure

The authors report no conflicts of interest in this work. There are no financial or non-financial competing interests to be declared.

References

1. O’Leary F. Simulation as a high stakes assessment tool in emergency medicine. Emerg Med Australas. 2015;27(2):173–175. doi:10.1111/1742-6723.12370

2. Objective Structured Clinical Examination, OSCE MANUAL. Saudi Commission for Health Specialties; 2014. Available from: https://www.scfhs.org.sa/en/Media/OtherPublications/Documents/OSCE%20MANUAL.pdf. Accessed October 26, 2020.

3. Gordon JA, Tancredi DN, Binder WD, Wilkerson WM, Shaffer DW. Assessment of a clinical performance evaluation tool for use in a simulator-based testing environment: a pilot study. Acad Med. 2003;78(10 Suppl):S45–7. doi:10.1097/00001888-200310001-00015

4. Wilkes M, Bligh J. Evaluating educational interventions. BMJ. 1999;318(7193):1269–1272. doi:10.1136/bmj.318.7193.1269

5. Healthcare Simulation Dictionary. Agency for healthcare research and quality; 2016. Available from: https://www.ahrq.gov/sites/default/files/publications/files/sim-dictionary.pdf. Accessed October 26, 2020.

6. Dagnone JD, Hall AK, Sebok-Syer S, et al. Competency-based simulation assessment of resuscitation skills in emergency medicine postgraduate trainees - a Canadian multi-centred study. Can Med Educ J. 2016;7(1):e57–67. doi:10.36834/cmej.36682

7. Nunnink L, Venkatesh B, Krishnan A, Vidhani K, Udy A. A prospective comparison between written examination and either simulation-based or oral viva examination of intensive care trainees’ procedural skills. Anaesth Intensive Care. 2010;38(5):876–882. doi:10.1177/0310057X1003800511

8. Calhoun AW, Bhanji F, Sherbino J, Hatala R. Simulation for high-stakes assessment in pediatric emergency medicine. Clin Pediatr Emerg Med. 2016;17(3):212. doi:10.1016/j.cpem.2016.05.001

9. Boulet JR, Murray D. Review article: assessment in anesthesiology education. Can J Anaesth. 2011;59(2):182–192. doi:10.1007/s12630-011-9637-9

10. Tavakol M, Mohagheghi MA, Dennick R. Assessing the skills of surgical residents using simulation. J Surg Educ. 2008;65(2):77–83. doi:10.1016/j.jsurg.2007.11.003

11. McMahon GT, Monaghan C, Falchuk K, Gordon JA, Alexander EK. A simulator-based curriculum to promote comparative and reflective analysis in an internal medicine clerkship. Acad Med. 2004;80(1):84–89. doi:10.1097/00001888-200501000-00021

12. Ziv A, Rubin O, Sidi A, Berkenstadt H. Credentialing and certifying with simulation. Anesthesiol Clin. 2007;25(2):261–269. doi:10.1016/j.anclin.2007.03.002

13. Orledge J, Phillips WJ, Murray WB, Lerant A. The use of simulation in healthcare: from systems issues, to team building, to task training, to education and high stakes examinations. Curr Opin Crit Care. 2012;18(4):326–332. doi:10.1097/MCC.0b013e328353fb49

14. Berkenstadt H, Ziv A, Gafni N, Sidi A. The validation process of incorporating simulation-based accreditation into the anesthesiology Israeli national board exams. Isr Med Assoc J. 2006;8(10):728–733.

15. Sidi A, Gravenstein N, Lampotang S. Construct validity and generalizability of simulation-based objective structured clinical examination scenarios. J Grad Med Educ. 2015;6(3):489–494. doi:10.4300/JGME-D-13-00356.1

16. Beeson MS, Vozenilek JA. Specialty milestones and the next accreditation system. Simul Healthc. 2014;9(3):184. doi:10.1097/SIH.0000000000000006

17. Quinn SM, Worrilow CC, Jayant DA, et al. Using milestones as evaluation metrics during an emergency medicine clerkship. J Emerg Med. 2016;51(4):426–431. doi:10.1016/j.jemermed.2016.06.014

18. The executive regulations of Professional Classification and Registration. Saudi Commission for Health Specialties; 2014. Available from: https://www.scfhs.org.sa/en/registration/Regulation/Documents/The%20executive%20regulations%20of%20Professional%20Classification%20and%20Registration%20aims.pdf. Accessed October 26, 2020.

19. Ahmed S, Al-Mously N, Al-Senani F, Zafar M, Ahmed M. Medical teachers’ perception towards simulation-based medical education: a multicenter study in Saudi Arabia. Med Teach. 2016;38(Suppl 1):S37–44. doi:10.3109/0142159X.2016.1142513

20. Collins JP, Harden RM. AMEE Medical Education Guide No. 13: real patients, simulated patients and simulators in clinical examinations. Med Teach. 1998;20(6):508. doi:10.1080/01421599880210

21. Khattab E, Sabbagh A, Aljerian N, et al. Emergency medicine in Saudi Arabia: a century of progress and a bright vision for the future. Int J Emerg Med. 2019;12(1):16. doi:10.1186/s12245-019-0232-0

22. Srinivasan M, Hwang JC, West D, Yellowlees PM. Assessment of clinical skills using simulator technologies. Acad Psychiatry. 2006;30(6):505–515. doi:10.1176/appi.ap.30.6.505

23. Sawyer T, Laubach V, Yamamura K, Hudak J, Pocrnich A. Interprofessional Teamwork Training in Neonatal Resuscitation using TeamSTEPPS and Event-based Approach Simulation. MedEdPORTAL Publications. 2013. doi:10.15766/mep_2374-8265.9583

24. Berkenstadt H, Ziv A, Gafni N, Sidi A. Incorporating simulation-based objective structured clinical examination into the Israeli National Board Examination in Anesthesiology. Anesth Analg. 2006;102(3):853–858. doi:10.1213/01.ane.0000194934.34552.ab

25. Glavin RJ, Gaba DM. Challenges and opportunities in simulation and assessment. Simul Healthc. 2008;3(2):69–71. doi:10.1097/SIH.0b013e31817bb8f6

26. Harvey P, Radomski N. Performance pressure: simulated patients and high-stakes examinations in a regional clinical school. Aust J Rural Health. 2011;19(6):284–289. doi:10.1111/j.1440-1584.2011.01231.x

27. Adamo G. Simulated and standardized patients in OSCEs: achievements and challenges 1992-2003. Med Teach. 2003;25(3):262–270. doi:10.1080/0142159031000100300
