Advances in Medical Education and Practice, Volume 13

Evaluation of the Utility of Online Objective Structured Clinical Examination Conducted During the COVID-19 Pandemic

Authors: Arekat M, Shehata MH, Deifalla A, Al-Ansari A, Kumar A, Alsenbesy M, Alshenawi H, El-Agroudy A, Husni M, Rizk D, Elamin A, Ben Salah A, Atwa H

Received 19 January 2022

Accepted for publication 28 March 2022

Published 28 April 2022. Volume 2022:13, Pages 407–418

DOI https://doi.org/10.2147/AMEP.S357229

Checked for plagiarism Yes

Review by Single anonymous peer review

Peer reviewer comments 5

Editor who approved publication: Dr Md Anwarul Azim Majumder



Mona Arekat,1 Mohamed Hany Shehata,2,3 Abdelhalim Deifalla,4,5 Ahmed Al-Ansari,6 Archana Kumar,6 Mohamed Alsenbesy,1,7 Hamdi Alshenawi,8 Amgad El-Agroudy,1 Mariwan Husni,9,10 Diaa Rizk,11 Abdelaziz Elamin,12 Afif Ben Salah,2 Hani Atwa6,13

1Internal Medicine Department, College of Medicine and Medical Sciences, Arabian Gulf University, Manama, Kingdom of Bahrain; 2Family and Community Medicine Department, College of Medicine and Medical Sciences, Arabian Gulf University, Manama, Kingdom of Bahrain; 3Family Medicine Department, Faculty of Medicine, Helwan University, Cairo, Egypt; 4Anatomy Department, College of Medicine and Medical Sciences, Arabian Gulf University, Manama, Bahrain; 5Human Anatomy and Embryology Department, Faculty of Medicine, Suez Canal University, Ismailia, Egypt; 6Medical Education Unit, College of Medicine and Medical Sciences, Arabian Gulf University, Manama, Bahrain; 7Internal Medicine Department, Faculty of Medicine, South Valley University, Qena, Egypt; 8Surgery Department, College of Medicine and Medical Sciences, Arabian Gulf University, Manama, Bahrain; 9Psychiatry Department, College of Medicine and Medical Sciences, Arabian Gulf University, Manama, Bahrain; 10Psychiatry Department, Northern Ontario School of Medicine (NOSM), Ontario, Canada; 11Obstetrics and Gynecology Department, College of Medicine and Medical Sciences, Arabian Gulf University, Manama, Bahrain; 12Pediatrics Department, College of Medicine and Medical Sciences, Arabian Gulf University, Manama, Bahrain; 13Medical Education Department, Faculty of Medicine, Suez Canal University, Ismailia, Egypt

Correspondence: Mohamed Hany Shehata, Family and Community Medicine Department, College of Medicine and Medical Sciences, Arabian Gulf University, Manama, Kingdom of Bahrain, Tel +97333918183, Email [email protected]

Background: The COVID-19 pandemic imposed profound restrictions on face-to-face learning and assessment in all educational institutions, particularly medical schools. The College of Medicine and Medical Sciences of the Arabian Gulf University (CMMS-AGU) conducted the final exams, both the theoretical and clinical components, for its MD students online. This study was conducted to evaluate the utility of the online clinical exams held at CMMS-AGU.
Methods: This is a cross-sectional, mixed-methods study that included samples of final-year medical students, examiners, and heads of clinical departments. Data were collected through surveys, structured interviews, document review, and calculation of the online examination’s psychometrics. Descriptive statistics were used; quantitative data were presented as means and standard deviations. Responses of the heads of clinical departments in the structured interviews were transcribed and analyzed thematically based on three pre-established themes.
Results: Quantitative and qualitative data on the utility (validity, reliability, acceptability, educational impact, and cost and feasibility) of online objective structured clinical examination (OSCE) were collected. Content validity of the online clinical examination was established through high mean scores of content representativeness, which was confirmed by the heads of clinical departments regarding the proper coverage of clinical skills. Criterion validity was established through a high correlation between clinical and theoretical exam results (r = 0.75). Reliability of the exam was established through an acceptable Cronbach’s alpha value (0.70 to 0.78) over the four days of the examinations. The examinations were perceived as highly acceptable by both students and examiners. High educational impact was inferred from students’ responses and review of documents. The examination was found to be feasible and of reasonable cost.
Conclusion: Online OSCE might be a good alternative to conventional clinical assessment in times of crisis, when in-person contact between students, examiners, and patients is impossible. A major drawback of such initiatives, however, remains the inability to assess students’ physical examination skills.

Keywords: online clinical assessment, exam utility, OSCE, COVID-19

Introduction

The coronavirus disease 2019 (COVID-19) outbreak swiftly transformed into an ongoing global pandemic, exerting profound restrictions on face-to-face learning in all educational institutions, particularly medical schools.1 Medical training was among the most disturbed because of the intrinsic nature of the program, which requires working in teams, close contact with patients, and inevitable communication with patients and their families.2,3

Not only was medical training disturbed, but clinical examinations were also disrupted. Medical schools were unsure whether to conduct the final-year clinical examinations for clerkship students or defer them until the COVID-19 pandemic resolved. It soon became apparent that the pandemic would continue for an indefinite duration, and many medical schools around the world made the bold decision to graduate their senior students with unconventional exams.4,5 This was also necessitated by the crucial need to speed up the graduation of more doctors and other health workers to bridge imminent deficiencies in the health-care sector during the ongoing pandemic.4–7

This motivated the College of Medicine and Medical Sciences, Arabian Gulf University (CMMS-AGU) to decide to conduct online exams for its final-year students.8 The theoretical component of the clinical courses was organized and remotely proctored through an online student assessment platform. For the clinical component, it was decided to implement ten online objective structured clinical examination (OSCE) stations. The Zoom Meetings® platform was used for its favorable features, including ease of use and the Breakout Rooms and Waiting Rooms features. This experience was documented as a toolbox for conducting online OSCE in a publication by Shehata et al.9

In online clinical examinations, the real challenge is to redesign the exam to match the pedagogy of student assessment while keeping in mind important elements such as secure identity, academic integrity, and capacity building, as well as the fundamental utility features of student assessment, namely validity, reliability, acceptability, educational impact, and feasibility.10–12

Although a few studies capture the nuances of planning and implementing online OSCE examinations, information on the utility of such exams is very scarce. This study aims to evaluate the utility of the online OSCE exams held at CMMS-AGU in terms of their validity, reliability, educational impact, acceptability, and cost and feasibility.

Subjects and Methods

Study Design

This was a cross-sectional, mixed-methods study conducted from June to August 2020. Triangulation was used to obtain different but complementary data on the same topic from different stakeholders.

Study Setting

The 6-year program at the CMMS-AGU is divided into 3 phases: Phase 1 (Year 1; Basic Sciences), Phase 2 (Years 2 to 4; Pre-clerkship Phase of Basic Medical Sciences), and Phase 3 (Years 5 and 6; Clerkship Phase of Clinical Sciences). At the end of Year 6, students sit a comprehensive exit exam (the MD exam) composed of a written component and a clinical component. Before the COVID-19 pandemic, the exams were conducted entirely on campus at prepared venues of the Arabian Gulf University and in AGU-affiliated hospitals.

Population and Sample Size

The target population was the Year 6 students sitting the MD exam (158 students); however, only 124 students (78.5%) responded to the survey. In addition, 60 full- and part-time faculty who were involved in the MD exam (40 examiners and 20 exam writers/coordinators) were included in the study.

Data Collection

Data were collected through three tools as follows.

Surveys

After the exam, evaluation data were collected through two short, semi-structured surveys designed by the authors: one for the students and another for the examiners. The surveys were developed and revised by all authors after reviewing the relevant literature and similar studies, then piloted on a few participants.

The surveys employed 5-point Likert scales (Strongly Agree = 5, Agree = 4, Neutral = 3, Disagree = 2, and Strongly Disagree = 1).

Structured Interview for Heads of Clinical Departments

A structured interview was conducted with the heads of clinical departments (n = 6). The interview explored their viewpoints on three pre-established themes: (1) the extent of the online exam’s ability to assess students’ clinical skills, (2) the exam’s ability to replace in-person clinical exams, and (3) the extent to which the exam covered the clinical content of the rotations.

Document Review and Calculation of Examination Psychometrics

The documents related to planning and conducting the clinical examinations were reviewed to obtain data on the preparation of the students and examiners and on the mock exams conducted. In addition, validity and reliability studies were conducted through calculation of correlation coefficients and internal consistency of exam results.

Statistical Analysis

Quantitative data were analyzed using SPSS v.26. Simple descriptive statistics were used, and data were presented as means and standard deviations. Responses of the heads of clinical departments in the structured interviews were transcribed and analyzed thematically, based on three pre-established themes, by four of the authors (MA, MHK, HA, and AK).

Ethical Approval

Informed consent was obtained from all the participants after explaining the process in detail. All the data regarding OSCE stations of the online exam, students, and assessors were maintained confidential. The study was approved by the Research and Ethics Committee of CMMS-AGU (E002-PI-6/20).

The Exam

Preparatory Phase

The process of digital adaptation of the clinical exam was a complex task consisting of a series of interconnected steps, facilitated by teamwork and collaboration, as described below:

  1. Institutional readiness: The availability of digital infrastructure, a functional e-learning unit, and dedicated faculty empowered all stakeholders to plan ahead with confidence.
  2. Getting everyone on board: The transition required consensus among all stakeholders; it is advisable to remain transparent and spell out the expectations and limitations of online exams. Individual meetings were held with college leaders, the curriculum committee, department heads, the technical team, support staff, and student representatives.
  3. Effective utilization of the infrastructure: A venue with all assessment rooms on the same corridor and floor was selected to facilitate logistics and communication. High-speed, stable internet connectivity (preferably a wired local area network), a large LCD screen for monitoring all stations by the exam team, an electric bell, a stopwatch, additional backup rooms, computers, web cameras, speakers, extension boxes, additional furniture, etc. were procured accordingly.
  4. Selection of a task force: Members were selected from all departments and units and grouped into three teams: faculty, secretarial staff, and technical support. Formal assignment was guaranteed by the college leadership to ensure staff availability on exam dates.
  5. Capacity building: Taking into consideration the diversity of the team members and the required competencies of faculty and students, several training sessions with focused hands-on training and instructional videos were conducted. Complex protocols were translated into simple checklists specific to each role: host, co-host, faculty invigilators, pre-test invigilators, post-test invigilators, secretaries, assessment unit staff, technical support staff, and so on.
  6. Customization of the tool: An educational institutional license for Zoom® was purchased, and its settings were customized to match the requirements of the online exams.
  7. Mock exams: Multiple mock sessions were conducted to refine the checklists, streamline student allotment, recognize potential troubleshooting areas, and identify students with poor connectivity or faulty devices.
  8. Development of OSCE stations: A blueprint guided the selection and development of the stations. Simulated patients were used wherever needed. The OSCE stations focused on a range of clinical skills, such as history taking, decision-making, communication, interpretation, diagnostic skills, and patient management. Assessing physical examination skills remained a challenge. Assessors used validated checklists.
  9. Communication: Constant and clear communication was maintained with all stakeholders through formal emails, assessors’ briefings, printed checklists, pre- and post-exam briefings, and informal WhatsApp groups.

Implementation Phase

  1. Two hours before each exam, the technical team prepared the virtual panels (renaming breakout rooms, disabling live chat among students, enabling screen sharing, enabling video recording for examiners, checking audio/video, etc.).
  2. A small cohort of students received the exam link sequentially, just 15 minutes before the scheduled time.
  3. Once all students were admitted to the meeting, they were transferred to a virtual pre-exam room for an identity check before being assigned to the virtual exam panels.
  4. After the exam, the examined cohort was moved to a virtual post-test room to share their feedback, while the next cohort (coming from the pre-exam virtual rooms) was assigned to the exam panels.
  5. The same cycle was repeated until all students were examined. The OSCE questions were changed every two cohorts to ensure exam confidentiality.

The comprehensive data regarding the schedule of the online OSCE exams (final MD, 2019) are shown in Box 1.

Box 1 Online OSCE Schedule

Potential Challenges and Feasible Solutions

Despite careful planning, a few challenges were encountered during the implementation of the online OSCE exams. These challenges, along with the strategies used to overcome them, are shown in Box 2.

Box 2 List of Challenges Encountered During Online OSCE Exams and Strategies Followed to Overcome Them

Results

Demographic Characteristics

Most of the students who participated in the study (60%) were Bahraini nationals, followed by Kuwaitis (21%) and Saudis (13%), while other nationalities constituted the remaining small percentage (Figure 1).

Figure 1 Demographic distribution of the students (n = 124).

Most of the examiners were male (70%). The examiners came from six clinical departments, with equal percentages (20%) from each of the Internal Medicine, Surgery, Pediatrics, and Obstetrics and Gynecology departments and smaller percentages (10%) from each of the Family Medicine and Community Medicine and Psychiatry departments. The examiners’ experience ranged from one year to more than thirty years (Table 1).

Table 1 Demographic Profile of Examiners (n = 40)

Utility of Online OSCE

To assess the utility of online OSCE, quantitative and qualitative data were collected on five criteria of utility, namely, validity, reliability, acceptability, educational impact, and cost and feasibility.

Validity

Content Validity

Content validity was established through a short survey that targeted the examiners. The results of that survey are presented in Table 2.

Table 2 Means and Standard Deviations of Examiners’ Responses to Items Evaluating the Content of the Online Clinical Exam*

Table 3 Qualitative Analysis of the Structured Interviews with the Heads of Clinical Departments

Table 4 Means and Standard Deviations of Students’ Responses to Items Evaluating the Online Clinical Exam Experience

Table 5 Means and Standard Deviations of Examiners’ Responses to Items Evaluating the Online Clinical Exam Experience

Examiners participating in the online clinical exam highly valued the representativeness of the exam and the degree to which it reflected real practice. They thought that the exam properly sampled the content of the curriculum. However, the lowest mean score was for the ability of the online OSCE to assess clinical skills (4.20±0.88).

The heads of clinical departments expressed their enthusiasm for and understanding of the new online exam. They agreed on its effectiveness in evaluating communication skills, content-specific knowledge, and history-taking skills as a partial reflection of clinical skills. However, they noted that it is not a perfect replacement for in-person assessment of psychomotor skills.

Moreover, the heads of clinical departments, except Psychiatry, estimated the percentage of clinical skills covered by the online exam at 70–80%. The head of the Psychiatry department argued that the exam covered more than 90% of the content, owing to the uniqueness of the mental state examination, which relies on communication skills and does not depend on the physical examination skills that require in-person contact between student and patient (and that were sorely missed in other specialties). Overall, the heads of clinical departments concluded that the online OSCE covered a broad range of course content and included a representative array of common problems (Table 3).

Criterion Validity

The online OSCE correlated strongly with the online written assessment of the same batch of students (r = 0.75). However, it correlated poorly with the previous conventional clinical assessment marks of the same batch (r = 0.27).
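For readers who wish to reproduce such a criterion validity check, the coefficient in question is the Pearson correlation between paired marks. The sketch below computes it from first principles; the five pairs of marks are hypothetical, for illustration only, and are not the study's data:

```python
def pearson_r(x, y):
    """Pearson correlation coefficient between two paired lists of marks."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = (sum((a - mx) ** 2 for a in x) * sum((b - my) ** 2 for b in y)) ** 0.5
    return num / den

# Hypothetical paired marks for five students (illustrative only)
osce_marks = [72, 65, 80, 58, 90]
written_marks = [70, 68, 78, 60, 88]
print(round(pearson_r(osce_marks, written_marks), 2))  # → 0.99
```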

Reliability (Internal Consistency)

Across the four days of clinical exams, the Cronbach’s alpha of students’ marks on the OSCE stations ranged between 0.70 and 0.78, with a standard error of measurement that ranged between 2.69 and 2.99 at a 95% confidence level.
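These two statistics follow the standard formulas: alpha = k/(k-1) × (1 - sum of station variances / variance of total scores), and SEM = SD of total scores × sqrt(1 - alpha). A minimal sketch of both calculations, using hypothetical per-station marks for four students rather than the study's data:

```python
import statistics as st

def cronbach_alpha(stations):
    """stations: one list of student marks per OSCE station (same student order)."""
    k = len(stations)
    n = len(stations[0])
    totals = [sum(s[i] for s in stations) for i in range(n)]
    item_var = sum(st.pvariance(s) for s in stations)  # sum of per-station variances
    return (k / (k - 1)) * (1 - item_var / st.pvariance(totals))

def sem(stations):
    """Standard error of measurement: SD of total scores * sqrt(1 - alpha)."""
    n = len(stations[0])
    totals = [sum(s[i] for s in stations) for i in range(n)]
    return st.pstdev(totals) * (1 - cronbach_alpha(stations)) ** 0.5

# Hypothetical marks of four students on three stations (illustrative only)
scores = [[7, 5, 8, 6], [6, 5, 9, 7], [8, 4, 9, 6]]
print(round(cronbach_alpha(scores), 2), round(sem(scores), 2))  # → 0.92 1.2
```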

Acceptability

Acceptability of the exam among the medical students and examiners has been explored through two survey questionnaires that were distributed just after the exam: one for the students and the other for examiners. The results of the survey are presented in Tables 4 and 5.

Table 4 shows high mean scores for student responses to all evaluation items. The highest mean score was for the item on informing the students about the rules, regulations, and instructions of the online clinical exams (4.93±0.32). The lowest mean score was for the item on the benefit of the mock exam in preparing students for the real exam (4.39±0.72). Importantly, the students tended to believe that the online exam was a suitable method for assessing their clinical skills other than physical examination skills (4.60±0.84).

Overall, the students were highly satisfied by the overall organization and implementation of the online clinical exams (4.70±0.58).

Table 5 shows high mean scores for all evaluation items. The highest mean score was for the item on the adequacy of technical support (4.83±0.45), while the item on respecting physical distancing instructions had the lowest mean score (4.25±1.13).

Overall, the examiners were highly satisfied by the overall organization and implementation of the online clinical exams (4.68±0.57).

Educational Impact

Because the students were introduced to the purpose of this clinical assessment, it positively affected their approach to learning by focusing them on achieving the required clinical competencies, whether in clinical training or while preparing for the online OSCE. Students revealed that having multiple mock exams using the same method, together with the orientation sessions conducted by the clinical departments, clarified the expected level of performance. Students focused on the approach to common clinical presentations, interpretation of laboratory and radiology findings, and using communication skills for gathering data from and providing explanations to role players.

Cost and Feasibility

Cost and feasibility of the online OSCE were acceptable, as the planning team managed to use the available resources to conduct the exams. Preparation for the online exam consumed two months of almost daily work by the planning team, heads of clinical departments, and case writers. Additional time was mainly spent on arranging the venue for the online exam and, most importantly, training the examiners on using the technology to conduct the online OSCE. The cost could be estimated by converting the total faculty time expenditure into US Dollars (USD), which was around 50,000 USD. Other costs were purchasing four laptop computers for the meeting hosts (6000 USD) and an annual educational subscription to an online meeting platform for hosting the meetings (3600 USD).

Discussion

The CMMS-AGU clinical departments held online OSCE MD exams during the COVID-19 lockdown. The OSCE stations were inclusive of various competencies including history taking, interpretation of investigation results, clinical communication, patient management, and clinical reasoning. Each student had to pass through two stations in each of Internal Medicine, Surgery, Pediatrics, and Obstetrics and Gynecology as well as one station in each of Psychiatry and Family Medicine.

This study aims to evaluate the utility of the CMMS-AGU online MD OSCE exam. A sample of diverse stakeholders of this exam was surveyed, including students, examiners, medical education experts, and heads of clinical departments. Psychometrics of the exam along with data from the final exam report were also used to complement the evaluation on the five criteria of utility, namely: validity, reliability, acceptability, educational impact, and cost and feasibility.

Discussion of the results of this study is presented here under five sections representing the five criteria of utility:

Validity

Medical education experts and representatives from all clinical departments reported that almost all the targeted important clinical competencies were tested in this online OSCE, except for physical examination skills. This was validated by the input of the heads of clinical departments who confirmed the proper coverage of most of the clinical skills by online OSCE. Similar results were reported by Shaban et al,13 who reported high content and face validity of their online exams that covered the most important clinical competencies except physical examination.

Furthermore, the criterion validity study of the CMMS exam supports online OSCE as a valid tool for clinical assessment: it showed good correlation with the online written assessment of the same batch of students. Similar results were reported by Hamdy et al,14 who found high convergent validity between their online clinical examination tool and another previously validated tool measuring the same clinical competencies. However, a weak correlation was detected between students’ results in the online OSCE and their previous conventional clinical assessments. This is congruent with the results reported by Hasani et al15 in a similar study. One explanation for this weak correlation might be the difference in the clinical assessment methods used.

Reliability (Internal Consistency)

Cronbach’s alpha levels indicated an acceptable level of reliability for this online OSCE. Similar results were reported by Shaban et al,13 who found comparable Cronbach’s alpha values between online and conventional OSCE at their institution. Furthermore, Felthun et al16 reported in their systematic scoping review that teleOSCE (online OSCE) could improve the reproducibility of clinical assessments.

Acceptability

Students, examiners, and heads of clinical departments indicated that the online OSCE is an acceptable tool for assessing clinical skills other than physical examination skills. In a similar study in Saudi Arabia by Shaiba et al,17 an electronic OSCE (e-OSCE) was conducted in a Pediatrics rotation, and most respondents were very comfortable with the new virtual experience. More than half of the participants in that study even preferred the e-OSCE over the classic face-to-face clinical OSCE during the pandemic, a finding also supported by a similar study by Elnaem et al.18

In another study, Shaban et al13 also found that implementing online OSCE was acceptable by the students and faculty members. This was also highlighted by Palmer et al19 in a pre-COVID study on the feasibility and acceptability of an online OSCE as well as Kakadia et al20 in a study on implementing online OSCE during the COVID-19 pandemic. Furthermore, Hamdy et al14 reported high satisfaction by both medical students and faculty members by the model of virtual clinical encounter examination they designed and implemented as an alternative for traditional OSCE.

Educational Impact

This online OSCE experience had a high educational impact. By introducing the students to the purpose of this online assessment and the specific clinical competencies they needed to achieve, the experience positively affected their approach to learning and guided them toward paying more attention to the important components of common clinical presentations (such as interpreting laboratory and radiological findings and using communication skills to elicit and provide information to simulated patients). This effect was mainly achieved through the multiple mock exams and orientation sessions conducted prior to the final MD exams. This is congruent with the results of a study by Ganesananthan et al,21 who reported that the overall performance and learning of their students was positively affected by a mock exam conducted prior to the final assessment.

We think that an added positive educational impact of this experience is training students to use technology in learning and, later after graduation, in health care. This is supported by the findings of Felthun et al16 in their systematic scoping review, which reported that teleOSCE could equip students with the requisite skills for practicing telemedicine in the future.

Cost and Feasibility

Given the tangible and intangible resources available on the institution’s campus, the online OSCE conducted for the whole batch of students was affordable. The available human resources (faculty and assistant staff), campus offices, internet line, and computers were almost sufficient for successfully conducting the exam; only a few computers and an annual educational subscription to an online meeting platform were needed to complete the online exam process. This is congruent with the results of a study by Shaban et al,13 who reported feasibility, cost-effectiveness, practicality, and availability of the needed resources locally within the institution. They reported that the only drawback was the inability to assess students’ physical examination skills, an understandable and unavoidable drawback of online clinical assessment in general. We encountered the same drawback in our initiative. To partially compensate for it, our examiners asked the students to describe what they would examine, and how, if they had in-person contact with the patients, as was done by Luke et al22 in their trial of virtual evaluation of clinical competence in nursing.

Conclusion

The stakeholders’ evaluation of the online OSCE at CMMS-AGU was reassuring across all five components of utility (validity, reliability, acceptability, educational impact, and cost and feasibility). This indicates that online OSCE might be a good alternative to conventional clinical assessment in times of crisis, when in-person contact between students, examiners, and patients is impossible. However, a major drawback of online OSCE is its inability to assess students’ physical examination skills, which require physical contact between students and patients. We believe that future advances in haptic technology, as well as virtual and augmented reality, may provide a reasonable alternative for assessing physical examination skills, allowing students to remotely touch and feel patients in order to examine them.

Acknowledgment

We would like to thank all the students, examiners, and heads of clinical departments for their participation in this study. We would also like to thank all the CMMS-AGU faculty and staff who participated in planning and conducting the online OSCE during the pandemic.

Author Contributions

All authors made a significant contribution to the work reported, whether that is in the conception, study design, execution, acquisition of data, analysis and interpretation, or in all these areas; took part in drafting, revising or critically reviewing the article; gave final approval of the version to be published; have agreed on the journal to which the article has been submitted; and agreed to be accountable for all aspects of the work.

Funding

No funding was obtained for this study.

Disclosure

The authors declare that they have no conflicts of interest for this work.

References

1. Ahmed H, Allaf M, Elghazaly H. COVID-19 and medical education. Lancet Infect Dis. 2020;20(7):777–778. doi:10.1016/s1473-3099(20)30226-7

2. Mustafa N. Impact of the 2019–20 coronavirus pandemic on education. Int J Health Preferences Res. 2020:1–2. doi:10.13140/RG.2.2.27946.98245

3. Liang ZC, Ooi SB, Wang W. Pandemics and their impact on medical training: lessons from Singapore. Acad Med. 2020;95(9):1359–1361. doi:10.1097/acm.0000000000003441

4. Alexander L, Ashcroft J, Byrne MH, Wan J. All hands on deck: early graduation of senior medical students in the COVID-19 pandemic. MedEdPublish. 2020;13:9.

5. Sharif SP. UK medical students graduating early to work during the COVID-19 pandemic. Psychol Med. 2021;51(11):1951. doi:10.1017/s0033291720001488

6. Shehata MH, Abouzeid E, Wasfy NF, Abdelaziz A, Wells RL, Ahmed SA. Medical education adaptations post COVID-19: an Egyptian reflection. J Med Educ Curric Dev. 2020;7:2382120520951819. doi:10.1177/2382120520951819

7. Amin H, Shehata M, Ahmed S. Step-by-step guide to create competency-based assignments as an alternative for traditional summative assessment. MedEdPublish. 2020;9(1):120. doi:10.15694/mep.2020.000120.1

8. Kumar AP, Al Ansari AM, Shehata MH, et al. Evaluation of curricular adaptations using digital transformation in a medical school in Arabian Gulf during the COVID-19 pandemic. J Microsc Ultrastruct. 2020;8(4):186–192. doi:10.4103/jmau.jmau_87_20

9. Shehata MH, Kumar AP, Arekat MR, et al. A toolbox for conducting an online OSCE. Clin Teach. 2021;18(3):236–342. doi:10.1111/tct.13285

10. Norcini J, Anderson B, Bollela V, et al. Criteria for good assessment: consensus statement and recommendations from the Ottawa 2010 Conference. Med Teach. 2011;33(3):206–214. doi:10.3109/0142159x.2011.551559

11. Jones SH. Benefits and challenges of online education for clinical social work: three examples. Clin Soc Work J. 2015;43(2):225–235. doi:10.1007/s10615-014-0508-z

12. Xiong Y, Suen HK. Assessment approaches in massive open online courses: possibilities, challenges and future directions. Int Rev Educ. 2018;64(2):241–263. doi:10.1007/s11159-018-9710-5

13. Shaban S, Tariq I, Elzubeir M, Alsuwaidi AR, Basheer A, Magzoub M. Conducting online OSCEs aided by a novel time management web-based system. BMC Med Educ. 2021;21(1):508. doi:10.1186/s12909-021-02945-9

14. Hamdy H, Sreedharan J, Rotgans JI, et al. Virtual Clinical Encounter Examination (VICEE): a novel approach for assessing medical students’ non-psychomotor clinical competency. Med Teach. 2021;43(10):1203–1209. doi:10.1080/0142159x.2021.1935828

15. Hasani H, Khoshnoodifar M, Khavandegar A, et al. Comparison of electronic versus conventional assessment methods in ophthalmology residents; a learner assessment scholarship study. BMC Med Educ. 2021;21(1):342. doi:10.1186/s12909-021-02759-9

16. Felthun JZ, Taylor S, Shulruf B, Allen DW. Assessment methods and the validity and reliability of measurement tools in online objective structured clinical examinations: a systematic scoping review. J Educ Eval Health Prof. 2021;18:11. doi:10.3352/jeehp.2021.18.11

17. Shaiba LA, Alnamnakani MA, Temsah MH, et al. Medical faculty’s and students’ perceptions toward pediatric electronic OSCE during the COVID-19 pandemic in Saudi Arabia. Healthcare. 2021;9(8):950. doi:10.3390/healthcare9080950

18. Elnaem MH, Akkawi ME, Nazar NI, Ab Rahman NS, Mohamed MH. Malaysian pharmacy students’ perspectives on the virtual objective structured clinical examination during the coronavirus disease 2019 pandemic. J Educ Eval Health Prof. 2021;18:6. doi:10.3352/jeehp.2021.18.6

19. Palmer RT, Biagioli FE, Mujcic J, Schneider BN, Spires L, Dodson LG. The feasibility and acceptability of administering a telemedicine objective structured clinical exam as a solution for providing equivalent education to remote and rural learners. Rural Remote Health. 2015;15(4):3399.

20. Kakadia R, Chen E, Ohyama H. Implementing an online OSCE during the COVID-19 pandemic. J Dent Educ. 2021;85(Suppl.1):1006–1008. doi:10.1002/jdd.12323

21. Ganesananthan S, Li C, Donnir A, et al. Changing student perception of an online integrated structured clinical examination during the COVID-19 pandemic. Adv Med Educ Pract. 2021;12:887–894. doi:10.2147/AMEP.S325364

22. Luke S, Petitt E, Tombrella J, McGoff E. Virtual evaluation of clinical competence in nurse practitioner students. Med Sci Educ. 2021;24:1–5. doi:10.1007/s40670-021-01312-z

Creative Commons License © 2022 The Author(s). This work is published and licensed by Dove Medical Press Limited. The full terms of this license are available at https://www.dovepress.com/terms.php and incorporate the Creative Commons Attribution - Non Commercial (unported, v3.0) License. By accessing the work you hereby accept the Terms. Non-commercial uses of the work are permitted without any further permission from Dove Medical Press Limited, provided the work is properly attributed. For permission for commercial use of this work, please see paragraphs 4.2 and 5 of our Terms.