
Innovative Method to Digitize a Web-Based OSCE Evaluation System for Medical Students: A Cross-Sectional Study in University Hospital in Saudi Arabia


Received 24 November 2021

Accepted for publication 18 January 2022

Published 3 February 2022, Volume 2022:15, Pages 1085–1095

DOI https://doi.org/10.2147/IJGM.S351052




Abdullah A Yousef,1,2 Bassam H Awary,1 Faisal O AlQurashi,1 Waleed H Albuali,1 Mohammad H Al-Qahtani,1 Syed I Husain,2 Omair Sharif2

1Department of Pediatrics, College of Medicine, Imam Abdulrahman Bin Faisal University, Dammam, Saudi Arabia; 2Vice Deanship for e-learning, College of Medicine, Imam Abdulrahman Bin Faisal University, Dammam, Saudi Arabia

Correspondence: Faisal O AlQurashi, Department of Pediatrics, College of Medicine, Imam Abdulrahman Bin Faisal University, P.O. Box 11286, Dammam 31453, Saudi Arabia, Tel +966555078804, Fax +966138955088, Email [email protected]

Purpose: The Objective Structured Clinical Examination (OSCE) is a standard academic assessment tool in the field of medical education. This study presents an innovative method for digitizing the OSCE evaluation system for medical students and explores its efficacy compared to the traditional paper-based system through the analysis of a User Satisfaction Survey.
Methods: We conducted a cross-sectional, questionnaire-based study involving a User Satisfaction Survey to evaluate assessors’ attitudes toward, and acceptance of, the Computerized Web-based OSCE Evaluation System (COES). Fifth-year medical students at a College of Medicine were assessed clinically through their 2019 end-of-year OSCE examination by 30 examiners in five different OSCE stations. The traditional paper-based stations were converted into an online electronic version using QuestionPro software, and answers were filled in using smart tablets (iPads). QR codes were used for student identification at each station to fully digitize the process and save time. After completion of the exam, a User Satisfaction Survey was sent electronically to all examiners to evaluate their experience with the new system.
Results: The response rate for the survey was 100%, with an internal consistency of 0.83. Almost all the examiners (29, 97%) were satisfied with the application of the COES. Further, 72% of the examiners indicated that the electronic system facilitated the evaluation of the students’ skills, and 84% found that using a smart device (iPad) was easier than using a paper form. All examiners expressed a preference for using the electronic system in the future.
Conclusion: Users were satisfied with the utilization of the customized COES. This concept of fully digitizing the OSCE assessment process shortened the time needed for both the analysis of results and providing students with feedback. Further observational studies are needed to assess examiners’ behaviors when using this methodology.

Keywords: academic performance, clinical competency, clinical skills, medical education, undergraduate, Saudi Arabia

Introduction

The Objective Structured Clinical Examination (OSCE) is a standard academic assessment tool in the field of medical education. It is designed to evaluate practical and communicative skills among medical students.1 Some of its advantages include building examinee confidence and evaluating their clinical sense in different settings. Among its disadvantages are the significant time expenditure incurred and possible individual documentation errors.

In 1975, Harden et al introduced the OSCE as a method to evaluate medical students’ skills.2 Since then, it has been used as a mode for assessing the skills and clinical competence of almost all healthcare practitioners.3 It is a timed examination in which students move systematically through a set of pre-determined stations and are evaluated by a qualified examiner using well-structured marking criteria.4 Historically, the concept of the OSCE examination has been subjected to numerous modifications throughout the years to better suit specific academic purposes.5,6 In most well-known colleges of medicine worldwide, the OSCE is the standard tool for the evaluation of competency, clinical skills, communication skills, psychomotor skills, cognitive knowledge, and attitude through oral examination, counseling, data interpretation, and history and physical examination stations.7–10

The traditional clinical exam focused on the patient’s history and a demonstration of physical examination skills, with minimal assessment of technical skills. It is broadly unreliable and unfair in evaluating students’ performance because of the wide variability in both the examiners and the selected patients.11 “The luck of the draw” in the selection of examiner and patient plays a significant negative role in the outcome of the traditional method.12 Since the introduction of the OSCE in 1975, researchers have reported it to be reliable, objective, and valid, with cost and the requirement for human resources being its main disadvantages.7–12 In an OSCE, all students are examined against preselected criteria determined by a team of faculty teachers, and are scored on similar clinical scenarios or tasks against those criteria. The diversity of stations, performance outcomes, degree of difficulty of questions, and overall student organization are some of the important parameters that can be used to analyze the teaching standards of an institution objectively. During the exam, students’ performance is judged by a team of examiners in charge of the various stations. Furthermore, the OSCE is time-efficient, examining more students at any given time over a broader range of subjects than the traditional clinical exam.13

Classically, a paper-based methodology was the standard when executing OSCEs. However, several issues have been linked to this method, including illegible handwriting, missing student details, lost assessment sheets, inaccuracies in individual manual calculations, data entry errors, and time consumption. Additionally, feedback and prospective input regarding students’ performance are not usually shared with students due to time limitations.14 Few published papers highlight practical approaches to overcome these shortfalls. Most of these efforts addressed the core deficits of the traditional paper-based OSCE assessment through the use of different software programs,14,15 computers and electronic handheld devices,16–19 and web-based evaluation systems.20

To our knowledge, this is the first report of OSCE assessment of students using an electronic strategy involving the design and implementation of online, handheld, digital OSCE assessment software. This study aims to present a novel academic assessment tool, a Computerized Web-based OSCE Evaluation System (COES) for medical students, and to explore its efficacy in comparison with the traditional paper-based evaluation system using a User Satisfaction Survey.

Materials and Methods

This is a web-based, cross-sectional study that utilized a previously developed User Satisfaction Survey to explore the efficacy of the digitized OSCE system compared to the traditional paper-based system.16,22 As part of the Children’s Health academic requirement, fifth-year medical students at the College of Medicine must be assessed clinically through an end-of-year OSCE examination following their Children’s Health course. The e-learning unit at the College of Medicine decided to digitize the OSCE examination for all medical students to meet its strategic academic plans, and the Department of Pediatrics was chosen for the trial. Numerous meetings and brainstorming sessions were held to establish the requirements and to determine how effectively and efficiently they could be met with the available resources, customized to the needs of the department.

QuestionPro is web-based software for creating and distributing surveys. It consists of an interface for creating survey questions, tools for distributing surveys via email or website, and tools for analyzing and viewing the results.21 To fit our purpose as an OSCE management solution, we created a full exam platform by utilizing its survey features and went one step further by using QR codes. QuestionPro has been “globally recognized by multiple educational, business, research, and marketing institutes for over ten years.”21

Assessment documentation, station selection, and scoring criteria were chosen, formulated, reviewed, and agreed on by the OSCE committee faculty members of the department. Subsequently, all information was handed to the e-learning support team to be uploaded to the newly generated assessment system using QuestionPro software. Prior to the OSCE date, OSCE assessors, circuit coordinators, and student invigilators were trained to use the electronic system, and technical support was available at the time of the OSCE assessments. Further, an introductory session was held to introduce the new electronic system to the students.

Regarding OSCE scoring, we provided assessors with 3–5 scoring options for each question of the assessment with which to rate each student’s performance: Not Done, Inadequately Done, Partially Done, Adequately Done, and Well Done. We assigned a different scoring weight to each question based on its difficulty, complexity, and number of possible answers (Figure 1).

Figure 1 Software Question Logic page, showing the question (at the top) and the answer options with their score values.
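For illustration, the weighted scoring described above can be expressed as a mapping from rating options to fractions of each question’s weight. The following is a minimal sketch with hypothetical weights and fractions; the actual values were configured by the OSCE committee within QuestionPro’s question-logic settings.

```python
# Minimal sketch of weighted OSCE scoring. The rating fractions and
# per-question weights below are hypothetical; the real values were set
# by the OSCE committee in QuestionPro's question-logic configuration.

RATING_FRACTION = {
    "Not Done": 0.0,
    "Inadequately Done": 0.25,
    "Partially Done": 0.5,
    "Adequately Done": 0.75,
    "Well Done": 1.0,
}

def station_score(ratings: dict, weights: dict) -> float:
    """Each question contributes its weight scaled by the fraction earned
    for the selected rating option."""
    return sum(weights[q] * RATING_FRACTION[r] for q, r in ratings.items())

# Example: a three-item checklist with assumed weights.
weights = {"opening": 2.0, "presenting_complaint": 4.0, "summary": 2.0}
ratings = {"opening": "Well Done",
           "presenting_complaint": "Partially Done",
           "summary": "Adequately Done"}
print(station_score(ratings, weights))  # 2.0 + 2.0 + 1.5 = 5.5
```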

This COES, which was customized “in-house” at the e-learning unit of the College of Medicine, was used to store and analyze data electronically. Moreover, student feedback was sent to students electronically using the student email system. This study was approved by the Institutional Review Board (IRB) of Imam Abdulrahman Bin Faisal University (IRB-2020-01-048). The datasets used and/or analyzed in the current study are available from the corresponding author on request. All participants provided informed consent to take part in this study, in accordance with the Declaration of Helsinki.

OSCE Layout for the Students

A total of 139 fifth-year medical students were assessed using the COES in December 2019. They were assessed by 30 examiners from the faculty board of the Department of Pediatrics using portable tablets (iPads) provided by the Deanship of e-learning. The examiners are faculty members of the Department of Pediatrics with a cumulative academic and student-assessment experience of more than 80 years; all are consultants from various pediatric sub-specialties with local and/or international fellowship training certifications. The selection of 30 examiners was dictated by the layout of the OSCE, namely the number of stations, circuits, and rotations. The OSCE comprised five separate stations. Students were divided into three parallel circuits (A, B, and C) operating simultaneously to accommodate the large number of examinees, and each circuit comprised the same five stations in the same systematic order. Before the exam, 12–14 students were assigned to each of four rotations per circuit. This distribution of students was meticulously generated using Microsoft Excel and reviewed by three different members of the OSCE exam committee to eliminate any individual or technical errors. Each student completed a history taking and discussion, pediatric surgery case scenario, data interpretation, physical examination, and counseling station, each of which was allocated eight minutes. In addition to the examiner, the history taking and counseling stations were each supported by a pediatric resident acting as a simulated parent. For the physical examination station, a selected group of pediatric patients was approached by the OSCE coordination committee for participation and enrollment into the examination.
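As an illustration of the allocation step described above (which was carried out in Microsoft Excel), the sketch below distributes a list of student IDs across three circuits and four rotations in round-robin order. The placeholder IDs and the round-robin rule are assumptions for demonstration only; the actual distribution was prepared and cross-checked manually by the OSCE committee.

```python
# Illustrative round-robin allocation of students to circuits and rotations.
# The authors generated their distribution in Microsoft Excel; this logic
# is an assumption for demonstration, not their actual procedure.
import csv
from itertools import cycle

def allocate(student_ids, circuits=("A", "B", "C"), rotations=4):
    """Assign each student ID to a (circuit, rotation) slot in round-robin
    order, producing 12 groups of roughly equal size."""
    slots = cycle([(c, r) for c in circuits for r in range(1, rotations + 1)])
    return [(sid, *next(slots)) for sid in student_ids]

if __name__ == "__main__":
    students = [f"STU{i:03d}" for i in range(1, 140)]  # 139 placeholder IDs
    with open("osce_allocation.csv", "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["university_number", "circuit", "rotation"])
        writer.writerows(allocate(students))
```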

The Computerized Web-Based OSCE Evaluation System

The duration of each station was determined by the time required for the student’s assessment, the time needed to scan the student’s QR code, and a two-minute safety margin in case any electronic issue arose. A color-coded QR ID card was given to each student before entering the exam, to be scanned by each assessor using their iPad at the beginning of the station (Figures 2 and 3).

Figure 2 Flowchart of the OSCE exam using QuestionPro, starting from scanning the student’s QR code and ending at submission of the electronic assessment form.

Figure 3 Differences between QuestionPro and other online OSCE software. With other systems, human error may occur when logging into the examiner’s page, when selecting the student from a dropdown list, or when the student’s details are not cross-checked before the electronic assessment form is submitted.

The coded card showed the student’s data (name, university number) and was encrypted to match each student’s assigned circuit, assessor, and rotation. Once the assessor scanned the QR code on a student’s ID card, a purpose-designed online page opened on the assessor’s tablet showing the student’s data and circuit and rotation numbers for second-step verification (Figure 4). Subsequently, the assessor was asked to choose their assigned station number from the five stations shown on that page, which also indicated the relevant assessors’ names under each station. After selecting a station, the assessor graded the student’s performance within the given period (Figure 5).

Figure 4 User interface introduction page, showing the student’s details, circuit and rotation numbers, and the examiners’ names.

Figure 5 An example of the OSCE assessment page (Questionnaire sheet).
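The QR workflow in Figures 2–4 can be sketched as follows: each student’s identity and exam slot are serialized and encoded into a QR image printed on the ID card, which the assessor’s tablet scans to open the matching assessment page. The JSON payload and field names below are assumptions; the paper does not specify how the COES encodes or encrypts the card data.

```python
# Sketch of producing a QR-coded student ID payload for the workflow in
# Figures 2-4. The JSON payload and field names are assumptions; the
# actual encoding/encryption used by the COES is not described in detail.
import json
import qrcode  # pip install "qrcode[pil]"

def make_id_card(name, university_number, circuit, rotation, out_path):
    """Encode the student's identity and exam slot into a QR image that the
    assessor's tablet scans to open the matching assessment page."""
    payload = json.dumps({
        "name": name,
        "university_number": university_number,
        "circuit": circuit,
        "rotation": rotation,
    })
    qrcode.make(payload).save(out_path)

make_id_card("Example Student", "STU001", "A", 2, "STU001_qr.png")
```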

Once the time was up, the assessor submitted the form, and the data were recorded in the system. After submission of the performance questionnaire, an automated message appeared containing the student’s data and confirming the submission. Notably, submission of performance forms was only allowed when all questionnaire items had been completed, to prevent missing data. Finally, the software can be used to download the raw data in different formats (MS Excel, MS PowerPoint, Adobe PDF) during or after the completion of the exam. The serial number can then be used to merge each student’s data into a single MS Excel file, after which sums and averages can be added manually using Excel formulas once the records are sorted.
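A minimal sketch of this export-and-merge step, assuming the raw data are downloaded as one CSV file per station with a serial-number column and per-question score columns (the file and column names are hypothetical):

```python
# Sketch of merging per-station exports and computing each student's total
# and average score. File and column names are hypothetical; the actual
# QuestionPro export layout may differ.
import pandas as pd

station_files = [f"station_{i}.csv" for i in range(1, 6)]  # one export per station

frames = []
for path in station_files:
    df = pd.read_csv(path)
    score_cols = [c for c in df.columns if c.startswith("q")]  # per-question scores
    df["station_total"] = df[score_cols].sum(axis=1)
    frames.append(df[["serial_number", "station_total"]])

# One row per student: the sum and mean of the five station totals.
results = (pd.concat(frames)
           .groupby("serial_number")["station_total"]
           .agg(["sum", "mean"]))
results.to_excel("osce_results.xlsx")
```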

Overall User Satisfaction Survey

After the completion of the exam, we asked the assessors to complete an overall satisfaction survey about their experience with the COES. The survey was adapted from previous work that was based on an extensive literature review in the field of electronic OSCE management, expert review, and agreement among the educationalists involved in OSCE preparation.16,22 The 25-item questionnaire was divided into three sections: the OSCE software user evaluation (3 items); usage of the electronic OSCE system and its training (10 items); and the OSCE assessment process itself (12 items). For the questions related to the usage of the COES and the assessment process itself (22 items), assessors chose from the following options: Strongly agree, Agree, Disagree/strongly disagree, or No judgment. For the OSCE software user evaluation (3 items), the choices were: Excellent, Good, Fair, or Poor. Additionally, three questions were added for a better analysis of the process: the assessor’s age, gender, and whether they possessed a tablet device at home. The assessors’ answers were recorded and analyzed.

Results

Most of the assessors were male (22, 73.3%). The largest group (13, 43.3%) was above the age of 40 years, six examiners (20%) were above the age of 50 years, and the rest were between the ages of 30 and 39 years. Most assessors had personal tablets at home (23, 76.6%), with no statistically significant association with age or gender. The overall average time for completing each student’s assessment was five minutes (out of a possible eight minutes).

User Satisfaction Survey for Assessors

All the assessors who received the survey electronically completed it, giving a response rate of 100% (n = 30). The internal consistency (Cronbach’s alpha) of the survey was 0.83, which in classical psychometric terms is considered good.23
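For reference, Cronbach’s alpha can be computed directly from the item-response matrix (one row per assessor, one column per survey item). The sketch below uses simulated responses purely to show the calculation; it will not reproduce the study’s value of 0.83.

```python
# Cronbach's alpha from an item-response matrix (rows = assessors,
# columns = items). The simulated data are for demonstration only and
# will not reproduce the reported value of 0.83.
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    k = scores.shape[1]                          # number of items
    item_vars = scores.var(axis=0, ddof=1)       # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of respondents' totals
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

rng = np.random.default_rng(0)
demo = rng.integers(1, 5, size=(30, 22)).astype(float)  # 30 assessors x 22 items
print(round(cronbach_alpha(demo), 2))
```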

All examiners had previous experience of assessing students using the paper-based OSCE methodology. Answers of “No judgment” were excluded from the descriptive analysis, since they expressed no preference for either agreement or disagreement and were not considered significant when forming conclusions from a satisfaction survey. The results of the satisfaction survey for assessors are displayed in Table 1.

Table 1 User Satisfaction Survey Results (N= 30)

Almost all the assessors (29, 97%) were satisfied with the application of the COES to assess the medical students’ performance. Most examiners (18/25, 72%) indicated that the electronic system facilitated the evaluation of the students’ skills. Most examiners (22/26, 84%) found that using a smart device (iPad) was easier than using a paper form; the remainder (4/26, 16%) found the paper form easier. More than half of the examiners (16/26, 61.5%) felt it necessary to include a comment on students’ performance in the comments section at the end of the assessment. Over two-thirds of the examiners answered the questions related to the usage of the COES (Section A) positively, with few exceptions. All examiners (28/28, 100%) expressed a preference for using the electronic system in the future.
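The varying denominators above (eg, 18/25, 22/26) follow from dropping “No judgment” answers before computing agreement percentages. The sketch below illustrates that counting rule on a made-up response vector; the exact analysis logic is an assumption about how the descriptive statistics were derived.

```python
# Agreement percentage for one survey item after excluding "No judgment"
# answers, illustrating why denominators vary (eg, 18/25 = 72%). The
# counting rule is an assumption about how the percentages were derived.
AGREE = {"Strongly agree", "Agree"}
EXCLUDE = {"No judgment"}

def agreement(responses):
    """Return (n agreeing, n valid responses, percent agreeing)."""
    valid = [r for r in responses if r not in EXCLUDE]
    agreeing = sum(r in AGREE for r in valid)
    return agreeing, len(valid), round(100 * agreeing / len(valid), 1)

item = ["Agree"] * 18 + ["Disagree/strongly disagree"] * 7 + ["No judgment"] * 5
print(agreement(item))  # (18, 25, 72.0)
```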

Discussion

Overall, most of the assessors recorded good to excellent feedback on the User Satisfaction Survey. Our Cronbach’s alpha value is much higher than that reported by a similar Irish study (0.83 vs 0.14).16 This could be due to our larger number of examiners (30 vs 18) or to a social desirability bias in that study. More than 95% of the examiners were satisfied with implementing the QR-coded COES to assess the fifth-year medical students. This observation supports the findings of various studies showing that the application of an electronic and/or online OSCE system was satisfactory.16,24–26 Furthermore, all our examiners agreed to use the same methodology, with some slight modifications, in the future. Suggested interventions included a better internet connection and adding a question assessing the student’s Global Rating Score. No examiner was required to manage network connection problems by themselves, as we ensured the continuous presence and support of our technical team. Additionally, Wi-Fi connectivity was tested prior to the application of the online assessments.

Multiple factors supported and augmented the new experience, including the preparatory training sessions; the system’s easy accessibility; the clarity of its layout and instructions; easy navigation of the multiple-choice scoring; and the availability of technical support. The newly tailored functionalities provided by the COES using QuestionPro enabled the examiners to easily evaluate, track, intervene in, and comment on each student’s performance. To our knowledge, this work represents the first trial of a fully integrated QR-coded COES worldwide. A relatively recent Irish study by Meskell et al tracked a cohort of first-year nursing students over two consecutive years (n=203) using a “built-in” online OSCE management information system; it found that the electronic software facilitated the analysis of overall results, thereby offering considerable time savings.16 Similarly, the application of our novel electronic system significantly shortened the time needed for the analysis of the results, allowing more time for data interpretation, better curriculum development, and clinical teaching improvements.

As the survey was adapted from previous work,16 we decided to limit our modifications of its items to allow comparison. For example, items such as “Absent students were easy to manage” or “Late arrival students were easy to manage” could not be interpreted adequately, as absentees and late arrivals were managed by the OSCE coordinators rather than the examiners themselves. Other survey items that assessed the COES, such as “I felt that I had to include a comment on student performance for each student” or “I included a comment when I judged that a student had not completed an element of the skill appropriately”, showed significant variability between the assessors and were therefore difficult to interpret accurately. However, to better suit the electronic system, we tried to anticipate and overcome all the obstacles that might affect students’ assessment. As the traditional paper-based method for OSCE assessment was easy to follow and allowed the examiners to add their comments on each student’s performance,26 the COES preserved this by adding an optional “Comment section” at the end of each student encounter. The use of QR coding in OSCE assessment has not previously been reported in the literature.
A qualified e-learning team is required to deliver high-end technology for numerous simultaneous encounters. Approximately 695 QR-coded interactions between the students and their examiners were recorded through the COES. This requires good preparatory technical support as well as excellent cooperation between the academic and technical teams. Our concept of fully digitizing the OSCE assessment process shortened the time needed both to analyze the results and to provide students with feedback. Based on our preliminary exploration, we estimate that the new approach saved more than 48 hours of data entry, data analysis, and final student grading.

This study had its limitations. First, the relatively small number of stations (n=5) might limit the feasibility of applying such a fully electronic OSCE assessment system on a larger scale. However, we believe that completing more than 650 student encounters with the new system without any major shortcomings demonstrated its safety and applicability. Second, the small number of examiners (n=30) could negatively impact the User Satisfaction Survey results, although our number of examiners is larger than in other relevant studies in the same field, with better consistency values. In addition, not assessing the students’ opinions of, and satisfaction with, the new intervention is a further limitation. The relatively high cost of such a computer-based assessment approach may also limit its feasibility and convenience in some settings. Finally, the possibility of assessors becoming distracted while using the COES tool is plausible; however, we did not perceive any negative effects in this regard beyond what has been reported in the User Satisfaction Survey results. Additional observational studies of examiners’ behaviors during the online OSCE methodology are required.

Conclusion

In conclusion, our COES for medical students showed promising results for transitioning from traditional paper-based OSCE assessment to a fully digitized, online system. Users of this novel digital online assessment tool demonstrated a high level of satisfaction. Preparatory training sessions, the system’s easy accessibility, the clarity of its layout and instructions, easy navigation of the multiple-choice scoring, and the availability and readiness of technical support are integral to the success of this approach. Further observational studies are needed to assess examiners’ behaviors when using this new methodology.

Acknowledgments

The authors wish to thank the University & College of Medicine administration for their supervision of the process, and the Deanship of E-learning, Imam Abdulrahman Bin Faisal University, and the e-learning unit at the College of Medicine for their continuous technical support. Further, the authors express their gratitude to all the staff members of the Department of Pediatrics and the fifth-year medical students for their cooperation and thoughtfulness.

Disclosure

The authors report no conflicts of interest in relation to this work.

References

1. Patricio MF, Juliao M, Fareleira F, Carneiro AV. Is the OSCE a feasible tool to assess competencies in undergraduate medical education? Med Teach. 2013;35(6):503–514. doi:10.3109/0142159X.2013.774330

2. Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. Br Med J. 1975;1(5955):447–451. doi:10.1136/bmj.1.5955.447

3. Oranye NO, Ahmad C, Ahmad N, Bakar RA. Assessing nursing clinical skills competence through objective structured clinical examination (OSCE) for open distance learning students in Open University Malaysia. Contemp Nurse. 2012;41(2):233–241. doi:10.5172/conu.2012.41.2.233

4. Pugh D, Touchie C, Wood TJ, Humphrey-Murto S. Progress testing: is there a role for the OSCE? Med Educ. 2014;48(6):623–631. doi:10.1111/medu.12423

5. Hodges B. OSCE! Variations on a theme by Harden. Med Educ. 2003;37(12):1134–1140. doi:10.1111/j.1365-2923.2003.01717.x

6. Stillman PL, Wang Y, Ouyang Q, Zhang S, Yang Y, Sawyer WD. Teaching and assessing clinical skills: a competency-based programme in China. Med Educ. 1997;31(1):33–40. doi:10.1111/j.1365-2923.1997.tb00040.x

7. Novack DH, Volk G, Drossman DA, Lipkin M Jr. Medical interviewing and interpersonal skills teaching in US medical schools. Progress, problems, and promise. JAMA. 1993;269(16):2101–2105. doi:10.1001/jama.1993.03500160071034

8. Jain SS, DeLisa JA, Eyles MY, Nadler S, Kirshblum S, Smith A. Further experience in development of an objective structured clinical examination for physical medicine and rehabilitation residents. Am J Phys Med Rehabil. 1998;77(4):306–310. doi:10.1097/00002060-199807000-00009

9. Baid H. The objective structured clinical examination within intensive care nursing education. Nurs Crit Care. 2011;16(2):99–105. doi:10.1111/j.1478-5153.2010.00396.x

10. Barry M, Noonan M, Bradshaw C, Murphy-Tighe S. An exploration of student midwives’ experiences of the Objective Structured Clinical Examination assessment process. Nurse Educ Today. 2012;32(6):690–694. doi:10.1016/j.nedt.2011.09.007

11. Barman A. Critiques on the objective structured clinical examination. Ann Acad Med Singapore. 2005;34(8):478–482.

12. Zayyan M. Objective structured clinical examination: the assessment of choice. Oman Med J. 2011;26(4):219–222. doi:10.5001/omj.2011.55

13. Hamann C, Volkan K, Fishman MB, Silvestri RC, Simon SR, Fletcher SW. How well do second-year students learn physical diagnosis? Observational study of an objective structured clinical examination (OSCE). BMC Med Educ. 2002;2(1):1186–1188. doi:10.1186/1472-6920-2-1

14. Treadwell I. The usability of personal digital assistants (PDAs) for assessment of practical performance. Med Educ. 2006;40(9):855–861. doi:10.1111/j.1365-2929.2006.02543.x

15. Segall N, Doolen TL, Porter JD. A usability comparison of PDA-based quizzes and paper-and-pencil quizzes. Comput Educ. 2005;45(4):417–432. doi:10.1016/j.compedu.2004.05.004

16. Meskell P, Burke E, Kropmans TJ, Byrne E, Setyonugroho W, Kennedy KM. Back to the future: an online OSCE management information system for nursing OSCEs. Nurse Educ Today. 2015;35(11):1091–1096. doi:10.1016/j.nedt.2015.06.010

17. Judd T, Ryan A, Flynn E, McColl G. If at first you don’t succeed … adoption of iPad marking for high-stakes assessments. Perspect Med Educ. 2017;6(5):356–361. doi:10.1007/s40037-017-0372-y

18. Dearnley C, Haigh J, Fairhall J. Using mobile technologies for assessment and learning in practice settings: a case study. Nurse Educ Pract. 2008;8(3):197–204. doi:10.1016/j.nepr.2007.07.003

19. Schmidts MB. OSCE logistics—handheld computers replace checklists and provide automated feedback: objective structured clinical examination. Med Educ. 2000;34(11):957–958.

20. Nackman GB, Griggs M, Galt J. Implementation of a novel web-based objective structured clinical evaluation. Surgery. 2006;140(2):206–211. doi:10.1016/j.surg.2006.05.004

21. QuestionPro. How it works. Available from: https://www.questionpro.com/home/howItWorks.html. Accessed September 14, 2020.

22. Olsen K. An examination of questionnaire evaluation by expert reviewers. Field Methods. 2010;22(4):295–318. doi:10.1177/1525822X10379795

23. Cortina JM. What is coefficient alpha? An examination of theory and applications. J Appl Psychol. 1993;78(1):98. doi:10.1037/0021-9010.78.1.98

24. Monteiro S, Sibbald D, Coetzee K. i-Assess: evaluating the impact of electronic data capture for OSCE. Perspect Med Educ. 2018;7(2):110–119. doi:10.1007/s40037-018-0410-4

25. Luimes JD, Labrecque ME. Implementation of electronic objective structured clinical examination evaluation in a nurse practitioner program. J Nurs Educ. 2018;57(8):502–505. doi:10.3928/01484834-20180720-10

26. Lim AS, Lee SWH. Is technology enhanced learning cost-effective to improve skills?: the monash objective structured clinical examination virtual experience. Simul Healthc. 2020. doi:10.1097/SIH.0000000000000526

Creative Commons License © 2022 The Author(s). This work is published and licensed by Dove Medical Press Limited. The full terms of this license are available at https://www.dovepress.com/terms.php and incorporate the Creative Commons Attribution - Non Commercial (unported, v3.0) License. By accessing the work you hereby accept the Terms. Non-commercial uses of the work are permitted without any further permission from Dove Medical Press Limited, provided the work is properly attributed. For permission for commercial use of this work, please see paragraphs 4.2 and 5 of our Terms.