
An evaluative study of objective structured clinical examination (OSCE): students and examiners perspectives

Authors Majumder MAA, Kumar A, Krishnamurthy K, Ojeh N, Adams OP, Sa B

Received 6 December 2018

Accepted for publication 25 April 2019

Published 5 June 2019 Volume 2019:10 Pages 387–397

DOI https://doi.org/10.2147/AMEP.S197275

Checked for plagiarism Yes

Review by Single anonymous peer review

Peer reviewer comments 3

Editor who approved publication: Professor Sam Leinster



Md Anwarul Azim Majumder,1 Alok Kumar,1,2 Kandamaran Krishnamurthy,1,2 Nkemcho Ojeh,1 Oswald Peter Adams,1 Bidyadhar Sa3

1Faculty of Medical Sciences, The University of the West Indies, Cave Hill, Barbados; 2Department of Pediatrics, The Queen Elizabeth Hospital, Bridgetown, Barbados; 3Faculty of Medical Sciences, The University of the West Indies, St. Augustine, Trinidad and Tobago

Background: The objective structured clinical examination (OSCE) is the gold standard and a universal format for assessing the clinical competence of medical students in a comprehensive, reliable, and valid manner. Clinical competence is assessed by a team of examiners across multiple examination stations, which makes the OSCE a more complex, resource- and time-intensive assessment exercise than traditional examinations.
Purpose: The objective of this study was to determine final-year MBBS students’ and OSCE examiners’ perceptions of the attributes, quality, validity, reliability, and organization of the Medicine and Therapeutics exit OSCE held at The University of the West Indies (Cave Hill) in June 2017.
Methods: At the end of the OSCE, students and examiners were given a questionnaire to obtain their views and comments about the OSCE. Because the Likert-scale survey produced ordinal data, statistical analysis was performed using the median, IQR, and chi-square tests.
Results: A total of 52 students and 22 examiners completed the questionnaire. The majority of students provided positive views regarding the attributes (eg, fairness, administration, structure, sequence, and coverage of knowledge/clinical skills), quality (eg, awareness, instructions, tasks, and sequence of stations), validity and reliability (eg, true measure of essential clinical skills, standardized, practical and useful experience), and organization (eg, orientation, timetable, announcements, and quality of examination rooms) of the OSCE. Similarly, the majority of examiners expressed satisfaction with the organization, administration, and process of the OSCE. However, students expressed certain concerns, such as the stressful environment and the difficulty level of the OSCE.
Conclusion: Overall, the OSCE was perceived very positively and was welcomed by both students and examiners. The concerns and challenges regarding the OSCE can be addressed through better orientation of the faculty and better preparation of the students for the OSCE.

Keywords: OSCE, undergraduate medical education, students’ perception, examiners’ perception, medicine and therapeutics, Barbados

Introduction

Several methods of assessing the clinical competence of medical students exist. Traditional methods include short and long cases and the viva voce examination, all of which have been criticized for lacking structure and standardization, having poor inter-rater reliability, and not minimizing examiner bias.1 Harden et al2 proposed the objective structured clinical examination (OSCE) in medical school as a means of overcoming these issues and improving the quality of students’ clinical performance. They described an OSCE as “a timed examination in which medical students interact with a series of simulated patients in stations that may involve history-taking, physical examination, counselling or patient management.”2 The OSCE, with multiple standardized cases and scoring rubrics, is now widely used as the gold standard and universal format for assessing the clinical competence of medical students.3 Over the past few decades, it has proven to be a valid and reliable tool that can assess all three learning domains (cognitive, affective, and psychomotor)4–6 and the “shows how” level described in the Miller pyramid.7 It has been widely adopted as the examination of clinical competence all over the world in various medical disciplines and in undergraduate, postgraduate, and licensing examinations.4 However, it has also been found to be a complex, resource- and time-intensive assessment exercise.7,8

The University of the West Indies (UWI), a regional indigenous university with campuses and medical faculties (each with its own Dean) in Barbados, Jamaica, and Trinidad, and an additional clinical site in the Bahamas, has a total enrollment of approximately 3,000 medical students in a 5-year MBBS degree. The degree is divided into a mainly preclinical phase 1 (years 1–3) and a clinical phase 2 (years 4–5). The curriculum is integrated and modular, with continuous in-course and final summative assessments. On completion of the final (fifth) year of the MBBS, students sit a final exit examination in the three major disciplines of clinical medicine: Medicine and Therapeutics (which includes internal medicine, pediatrics, psychiatry, and family medicine), obstetrics and gynecology, and surgery. Students who pass this examination are eligible to be provisionally licensed as interns in most English-speaking Caribbean countries.

The final-year MBBS exit examination in Medicine and Therapeutics has two written papers (both a mixture of single-best-answer and extended matching questions) and an OSCE. It is held twice a year, in May/June (when the majority of students sit) and November/December. For 52 years following the inception of UWI in 1948, the final MBBS clinical examination in Medicine and Therapeutics used traditional assessment methods.9 In 2000, an OSCE was introduced to examine clinical competence.9 Although OSCEs are also used in the individual clerkship examinations of the specialties included in the final Medicine and Therapeutics exit examination (internal medicine, pediatrics, psychiatry, and family medicine), these clerkship examinations in Barbados have fewer than eight testing stations, under 15 candidates, and local examiners. In contrast, the final MBBS Medicine and Therapeutics OSCE lasts about 4 hours, has 17 testing stations (not including four to six rest stations), and involves 17 examiners, many of whom come from other UWI campuses. In Barbados, students are assigned to one of two concurrent circuits, each with its own pool of examiners, and each circuit examines over 20 students. The same process is repeated sequentially at all four campuses of the UWI, with the total number of students sitting this examination exceeding 600 in any given examination year. Two external examiners (usually an internist and a pediatrician) move from campus to campus and observe the examination process to ensure an adequate standard and uniformity.

Above all, maintaining harmonization and uniformity of the testing materials and methods used at all four sites, each located in an independent island nation, requires careful attention and organization. As a result, the process was found to be very stressful and tiring for both students and examiners (most examiners move from one campus to another). Even in formative and low-stakes settings, OSCEs have been found to be an anxiety-inducing experience for students and a challenging, labor-intensive exercise for examiners and organizers.10–19 We therefore reasoned that positive or negative perceptions of the implementation of the Medicine and Therapeutics exit OSCE may affect students’ anxiety and stress levels as well as their performance in such a high-stakes examination. Very few studies have investigated examiners’ perceptions of the OSCE process and of their role as examiners.11,18,19 Examiners’ opinions should be sought because examiners play a critical role in executing the OSCE, contribute to the design of the OSCE stations, identify the competences to be tested, and provide individual or group feedback to students after the examination.20 There is therefore a need to investigate both student and examiner perceptions of this complex and potentially overwhelming process, to identify areas that need further attention and thereby improve the standard and quality of the examination. Medical schools in other countries would benefit from our findings on the organization of such a complex, time-consuming, resource-intensive assessment exercise. In addition, there are significant financial costs associated with implementing the exit OSCE, and all four campuses incur considerable expenses for examiner travel between islands, recruitment of standardized patients, external examiners, support staff, and other administrative items. Although a number of studies have been published on OSCEs at other UWI campuses,9,10,21–24 this study was the first to evaluate the UWI exit OSCE.

Against this background, we conducted a study to determine examiner and student perceptions of the attributes, quality, validity, reliability, and organization of the Medicine and Therapeutics exit OSCE held at the Cave Hill Campus in 2017, and to identify areas that need further attention to improve the standard and quality of this examination.

Methods

OSCE setting

The exit MBBS Medicine and Therapeutics OSCE consists of 17 examiner-observed active stations and seven rest stations, for a total of 24 stations. Stations are 7 minutes long, except for the history-taking stations, which last 23 minutes for the adult history and 15 minutes for the pediatric history. There is a one-minute transit time between stations. Of the 17 active stations, five are devoted to pediatrics, two to psychiatry, one to community health, and the remaining nine to adult medicine (Table 1). At each station other than the history-taking stations, the examination lasted 5 minutes, with an additional 2 minutes for presentation of findings and discussion. At the pediatric history-taking station, candidates took a history for 10 minutes and then had 5 minutes to collect their thoughts, make a presentation, and answer questions; at the adult history station, they had 15 minutes to take the history and 8 minutes to collect their thoughts, make a presentation, and answer questions. Performance of the required tasks was scored against specifically designed checklists containing 10 to 20 items, each with an assigned score corresponding to the key skills.
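To make the overall circuit length concrete, the station timings above can be added up as in the short sketch below. This is a rough check only: the duration of the rest stations and the exact number of one-minute transits are not stated in the text and are assumed here.

```python
# Back-of-the-envelope check of the OSCE circuit length described above.
# Assumptions (not stated in the text): rest stations last 7 minutes,
# like a standard station, and there are 23 one-minute transits for 24 stations.

standard_stations = 15      # 7-minute active stations (5 min task + 2 min discussion)
standard_minutes = 7
pediatric_history = 15      # 10 min history + 5 min presentation/questions
adult_history = 23          # 15 min history + 8 min presentation/questions
rest_stations = 7           # assumed 7 minutes each
transits = 24 - 1           # assumed one-minute moves between the 24 stations

total_minutes = (standard_stations * standard_minutes
                 + pediatric_history + adult_history
                 + rest_stations * standard_minutes
                 + transits)
print(f"{total_minutes} minutes (about {total_minutes / 60:.1f} hours)")
# -> 215 minutes (about 3.6 hours), consistent with the "about 4 hours" quoted earlier
```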

Table 1 OSCE stations in medicine and therapeutics

Research design

To realize the study aims, a cross-sectional survey approach was adopted and the participants were asked to complete a self-administered questionnaire at the end of the examination.

Participants

All students and examiners who participated in the May/June OSCE held at the Cave Hill campus, UWI on May 20, 2017 formed the study population, recruited through a voluntary response sampling technique.

Study instrument

Study instruments for students10,21,25 and examiners11,19 were designed based on questionnaires used in previous studies. The student questionnaire evaluated perceptions of the recently completed OSCE in the following domains, using Likert scales as follows: attributes, from strongly disagree (score of 1) to strongly agree (score of 5); quality, from not at all (score of 1) to a greater extent (score of 4); validity and reliability, from not at all (score of 1) to a greater extent (score of 4); and organization, from very poor (score of 1) to excellent (score of 5). It also compared the OSCE, multiple-choice question (MCQ), essay/short-answer question (SAQ), and clerkship assessment formats on 3-point scales in terms of level of difficulty, fairness, learning opportunities, and preferred usage during the clinical years. Additionally, there were three open-ended questions.
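For illustration only, the response-to-score mappings described above could be encoded as in the sketch below. The intermediate anchor labels (eg, “to some extent”) are assumptions, as only the scale endpoints and score ranges are given in the text.

```python
# Hypothetical encoding of the questionnaire scales; only the endpoints and
# score ranges come from the text, the intermediate labels are assumptions.
attribute_scale = {        # 5-point scale used for the attributes domain
    "strongly disagree": 1, "disagree": 2, "neutral": 3,
    "agree": 4, "strongly agree": 5,
}
quality_scale = {          # 4-point scale used for quality and validity/reliability
    "not at all": 1, "to a little extent": 2,
    "to some extent": 3, "to a greater extent": 4,
}
organization_scale = {     # 5-point scale used for the organization domain
    "very poor": 1, "poor": 2, "average": 3,
    "above average": 4, "excellent": 5,
}

def encode(response: str, scale: dict) -> int:
    """Map a verbatim questionnaire response to its numeric score."""
    return scale[response.strip().lower()]

print(encode("Agree", attribute_scale))  # -> 4
```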

The examiner questionnaire evaluated perceptions of the overall fairness of the OSCE; the range of clinical skills and knowledge tested; the validity of the measure of clinical competence; the administration of the examination; whether each station provided the level of information required and clear instructions; the adequacy of the time allocated to each station; the level of stress experienced by students; and whether the format minimized their chances of failing. In addition, examiners were asked to rate the impact of the OSCE on student learning, whether it is preferable to other formats of clinical examination, and whether it should be used more often in the clinical years of the undergraduate programme. All questions were answered on a 5-point Likert scale (1 “strongly disagree” to 5 “strongly agree”). There was one open-ended question allowing examiners to express their opinion about the OSCE.

Data collection

Participants (students and examiners) were asked to fill out the questionnaire at the end of the OSCE. Data collection was guided by voluntary participation, anonymity, and assured confidentiality of the collected data. A printed sheet giving a brief description of the study and a request for consent was attached to the questionnaire. Those who consented to participate signed the consent form and filled out the questionnaire. This process was completed on site at the conclusion of the OSCE for all candidates.

Data analysis

The data were entered into the Statistical Package for the Social Sciences (SPSS) version 23.0 (IBM Corporation, Armonk, NY, USA). Means and SDs were calculated for students’ age and for examiners’ years of experience as an OSCE examiner. Cronbach’s alpha was calculated to establish the reliability of the instruments used. Percentages, medians, IQRs, and chi-squared analysis with Yates’ correction were used to determine whether the distribution of response frequencies for each item was sufficiently different from chance to reject the null hypothesis. The critical value used to reject the null hypothesis was P≤0.01. The information collected from the open-ended questions was collated and presented thematically.
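As a minimal illustration of this item-level analysis (median, IQR, a chi-square test against a “chance” distribution, and Cronbach’s alpha), the sketch below runs the same statistics on synthetic Likert responses. The uniform expected distribution is an assumption, the data are fabricated for demonstration, and Yates’ correction as applied in the published SPSS analysis is not reproduced here.

```python
import numpy as np
from scipy import stats

# Synthetic data: 54 students answering 10 five-point Likert items.
rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(54, 10))

# Median and IQR for one item.
item = responses[:, 0]
median = np.median(item)
q1, q3 = np.percentile(item, [25, 75])
iqr = q3 - q1

# Chi-square test of whether the observed response frequencies depart from a
# uniform ("chance") distribution; expected counts default to uniform.
observed = np.bincount(item, minlength=6)[1:]   # counts for scores 1..5
chi2, p = stats.chisquare(observed)
print(f"median={median}, IQR={iqr}, chi2={chi2:.2f}, p={p:.3f}, "
      f"reject H0 at P<=0.01: {p <= 0.01}")

# Cronbach's alpha for the internal consistency of the whole scale.
k = responses.shape[1]
item_vars = responses.var(axis=0, ddof=1)
total_var = responses.sum(axis=1).var(ddof=1)
alpha = (k / (k - 1)) * (1 - item_vars.sum() / total_var)
print(f"Cronbach's alpha = {alpha:.2f}")
```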

Ethical approval

Ethical approval was obtained from the Institutional Review Board, The University of the West Indies, Faculty of Medical Sciences, Cave Hill Campus, Barbados (IRB No: 180209-B).

Results

Fifty-four students completed the questionnaire. The response rate was 100%; of the respondents, 66.7% were female, the mean age was 24.4 (SD ±1.6) years, 48.1% were from Barbados, and 44.4% were from Trinidad. There were 22 examiner respondents (response rate: 55%); more than half were male (54.5%). The mean number of years of experience as an OSCE examiner was 7.60 (SD ±6.80).

Median, IQR, and chi-squared values are provided for each survey question in Tables 2–5. The frequency distribution and percentage of student and examiner responses are also shown in each table. The internal consistency of the survey was good, with Cronbach’s alpha values of 0.89 and 0.90 for the student and examiner questionnaires, respectively.

Table 2 Student perception of attributes and organization of OSCE (N=54)

Table 3 Student perception of quality and validity/reliability of OSCE performance (N=54)

Table 4 Students’ perception of assessment format

Table 5 Examiners' perception of OSCE (N=22)

Students’ perception of OSCE

Table 2 reveals that the majority of students (63–91%) perceived the attributes of the OSCE positively, ie, as a fair examination that covered the required knowledge and competencies and had well-administered and well-sequenced stations. However, most students (67–79%) found the OSCE stressful and intimidating and constrained by time. The majority of students remained neutral on the following statements: that the examination process minimized their chance of failing and that it allowed students to compensate in some areas. Approximately 80% of students did not agree that the OSCE was less stressful than other examinations.

Table 2 also shows that a majority of respondents rated the organization of the OSCE as “excellent” or “above average”. However, more than half of the students were dissatisfied with the revision of clinical procedures done before the examination.

In relation to the quality of the OSCE, the majority of students were satisfied; however, approximately one quarter or fewer of the respondents expressed dissatisfaction with the allotted time, the authenticity of the setting and context at each OSCE station, and the learning opportunities provided by the OSCE (Table 3). More than half of the students thought that the tasks included in the OSCE only “somewhat” reflected those taught in the clerkships.

Although the majority of respondents were satisfied with the OSCE performance criteria, one quarter to one third of students expressed concern about whether the OSCE scores provided a true measure of essential clinical skills (29.7%) and whether the OSCE was a practical and useful experience (22.2%). Three quarters of the students believed that factors such as gender, ethnicity, and personality did affect OSCE results.

When asked to compare assessment instruments, the majority of students identified the OSCE as the most difficult format and the MCQ as the fairest. Students also reported that they learnt most from the essay/SAQ format and suggested that clerkship ratings be used more in the clinical years (Table 4).

Answers to the open-ended questions showed that students were satisfied with all aspects of the organization of the OSCE and suggested a number of ways to improve future OSCE experiences. The positive aspects highlighted by students included a well-structured OSCE; the cooperation of examiners, staff, and patients; the inclusion of a wide variety of stations, including multiple rest stations; the availability of adequate staff with a good mix of local and foreign examiners; and coverage of a wide range of knowledge and skills. Respondents felt that the time allocated to perform the expected tasks was insufficient and that the procedure was stressful and intimidating. Suggestions for improvement included more mock examination sessions, more time per station, splitting the OSCE into pediatrics and internal medicine, feedback after OSCE clerkships, and a better layout of the OSCE to avoid confusion.

Examiners' perception of OSCE

With regard to examiners’ perceptions of their respective stations, the majority of examiners agreed or strongly agreed that the examination was fair, covered a wide range of clinical skills and knowledge, was well organized and well administered, ensured that students were aware of the level of information needed, set fair tasks at each station, and was a standardized examination for all students (Table 5). However, some examiners felt that the OSCE failed to create a positive impact on student learning and that OSCE scores did not truly reflect competence in clinical skills.

Discussion

In the present study, the response rates for students and examiners were adequate,26 and the number of stations (N=17) and their duration (>5 minutes) exceeded the requirements for a reliable OSCE.27 Both students and examiners expressed satisfaction with the attributes, quality indicators, validity/reliability criteria, and organization/settings of the Medicine and Therapeutics exit OSCE. At the Faculty of Medical Sciences (FMS), UWI, Mona Campus (Jamaica), Pierre et al10 used the OSCE as an assessment instrument during the year-five pediatric clerkship to examine student acceptance of the OSCE and evaluation of the clerkship. Similar to our study, the authors recorded “overwhelming acceptance of the OSCE”: comprehensiveness (90%), transparency (87%), fairness (70%), and authenticity of the required tasks (58–78%). At both campuses, students expressed some concerns about the OSCE, eg, that it was a stressful and intimidating event and that there was insufficient time to complete the stations. Similar acceptance and concerns have been described previously by students in other medical schools worldwide.11,12,28,29

In the present study, more than three-quarters of the students felt that the OSCE induced higher levels of stress than other examination formats. Similar concerns have been reported in other studies of medical and other health professional students.10,12–14,30,31 Examinations and assessment procedures in medicine have been found to be anxiety provoking and stressful. Brand and Schoonheim-Klein13 reported that students had to prepare better for OSCEs and that their expectation of passing was also significantly higher, which might be a cause of stress. Moreover, in OSCEs, the timed test, interaction with patients and examiners, close monitoring and observation by examiners, and the rude and apathetic approach of some examiners were responsible for students’ increased stress levels.14,15,31–33 Stress levels have been shown not to decrease with increasing exposure to OSCEs.33,34 These increased stress levels, as Marshall and Jones14 noted, might be due to the different material tested rather than the assessment methods themselves. As high levels of stress may interfere with performance,35 careful preparation of students before the OSCE is required to minimize their anxiety.16 In the present study, 70% of students identified the OSCE as intimidating, as has been reported in other studies.10,36 Approximately 80% of students disagreed that an OSCE was less stressful than other examinations, and half of the students (51%) identified the OSCE as more difficult than other assessment instruments such as MCQs, essay/SAQs, and clerkship ratings. Brand and Schoonheim-Klein13 reported greater stress among dental students during an OSCE than during written and practical assessments. In a study conducted in Pakistan,12 more than 90% of students found OSCEs “more stressful and mentally tougher” than other traditional examination formats. These views contrast with findings from a study conducted in Ethiopia, in which students found long case and short case examinations in the clinical years more stressful than the OSCE.17 This may be due to “an unsympathetic interaction between examiner and examinee” along with other factors,17 which include “poor briefing for the student or training for the examiners, poorly designed stations or a mismatch between what is tested in the OSCE and the curriculum and teaching and learning programme”.20 At the Cave Hill Campus, we do not use long and short cases in the Medicine and Therapeutics examinations, and students still perceived the OSCE as more stressful than written and other examinations. The inclusion of long and short cases needs to be considered, as Khan37 questioned the validity of the OSCE in clinical examinations and recommended reducing the role of OSCEs in favour of workplace-based assessments “to put the ‘art’ back into medicine”. The author argued that students prepare strategically to pass the OSCE and adopt a robotic “tickbox” approach, but “struggle to translate this into skills which are practically useful in the dynamic, ever changing environment of real frontline medicine”. These concerns are supported by the views of examiners in the present study: 36% of examiners felt that the OSCE failed to create a positive impact on student learning, and 45% disagreed that OSCE scores truly reflect competence in clinical skills. These areas need further investigation.

Although students and examiners agreed on various aspects of the OSCE, their opinions diverged on the timing of the OSCE stations. The majority of students (66.7%) reported that they needed more time at stations, whereas most examiners felt that the time at stations was adequate (59.1%). A similar difference of opinion between students (41.7%) and examiners (11.1%) was observed in a study conducted by Omu et al11 in Kuwait. Some studies have raised concerns that time was a problem and suggested that examiners should not focus on restricting time but rather concentrate on how well students can perform.38,39 However, Schoonheim-Klein et al40 demonstrated that increased time per station had no impact on students’ performance. Stowe and Gardner41 suggested that the instructions on the OSCE form should be short and clear to give students time to complete the task.

The present study has a number of limitations. This cross-sectional study involved only one of the four campuses and had a small sample size; therefore, caution is needed in generalizing the findings to other settings. One of the strengths of the study is that feedback was sought both from students and from examiners; however, examiners’ comments may be biased because each examiner assessed only one station.

Conclusion

In this study, students and examiners reported favourable opinions of the process and organization of the OSCE conducted as part of the Medicine and Therapeutics exit examination. However, students felt that the OSCE was stressful and intimidating and that the time allocated for the assigned tasks was inadequate. More practice sessions/mock examinations with adequate feedback may better prepare students and create a better environment for assessing the skills expected of a doctor in clinical practice. Further multicentred studies are required to assess whether there is any difference in actual clinical performance between students assessed by traditional formats and those assessed by an OSCE, and to ascertain the long-term impact of the OSCE on the clinical management of patients later in students’ professional lives.

Availability of data and materials

The datasets of the current study are available from the corresponding author on reasonable request.

Acknowledgments

The authors would like to thank the students and the examiners who completed the questionnaire.

Disclosure

Dr Md Anwarul Azim Majumder is the Editor-in-Chief of the journal Advances in Medical Education and Practice. The other authors Prof. Dr Alok Kumar, Dr Kandamaran Krishnamurthy, Dr Nkemcho Ojeh, Dr Oswald Peter Adams, and Dr Bidyadhar Sa report no conflicts of interest in this work.

References

1. Khan KZ, Ramachandran S, Gaunt K, Pushkar P. The objective structured clinical exam (OSCE): AMEE guide no. 81—part I: an historical and theoretical perspective. Med Teach. 2013;35(9):e1437–e1446. doi:10.3109/0142159X.2013.818634

2. Harden RM, Stevenson M, Downie WW, Wilson GM. Assessment of clinical competence using objective structured examination. Br Med J. 1975;1:447–451. doi:10.1136/bmj.1.5955.447

3. Patricio M, Juliao M, Fareleira F, Young M, Norman G, Vaz Carneiro A. A comprehensive check list for reporting the use of OSCEs. Med Teach. 2009;31:112–124. doi:10.1080/01421590802578277

4. Sloan D, Donnelly MB, Schwartz R, Strodel W. The objective structured clinical examination – the new gold standard for evaluating postgraduate clinical performance. Ann Surg. 1995;222(6):735–742.

5. Newble D. Techniques for measuring clinical competence: objective structured clinical examinations. Med Educ. 2004;38:199–203. doi:10.1111/j.1365-2923.2004.01755.x

6. Carraccio C, Englander R. The objective structured clinical examination, a step in the direction of competency-based evaluation. Arch Pediatr Adolesc Med. 2000;154:736–741.

7. Miller GE. The assessment of clinical skills/competence/performance. Acad Med. 1990;65(suppl 9):63–67. doi:10.1097/00001888-199009000-00045

8. Khan KZ, Gaunt K, Ramachandran S, Pushkar P. The objective structured clinical exam (OSCE): AMEE guide no. 81—part II: organisation & administration. Med Teach. 2013;35(9):e1447–e1463. doi:10.3109/0142159X.2013.818635

9. Teelucksingh S, Ali Z, Fraser HS, Denbow CE, Nicholson GD. Fifty years of clinical examinations at the University of the West Indies. West Indian Med J. 2001;50(Suppl 4):50–52.

10. Pierre RB, Wierenga A, Barton M, Branday JM, Christie CD. Student evaluation of an OSCE in paediatrics at the University of the West Indies, Jamaica. BMC Med Educ. 2004;4:22. doi:10.1186/1472-6920-4-6

11. Omu AE, Al-Azemi MK, Omu FE, Al-Harmi J, Diejomaoh MFE. Attitudes of academic staff and students towards the Objective structured clinical examination (OSCE) in obstetrics and gynaecology. Creat Educ. 2016;7:886–897. doi:10.4236/ce.2016.76093

12. Khan A, Ayub M, Shah Z. An audit of the medical students’ perceptions regarding objective structured clinical examination. Educ Res Int. 2016:Article ID 4806398:4. doi:10.1155/2016/4806398.

13. Brand HS, Schoonheim-Klein M. Is the OSCE more stressful? Examination anxiety and its consequences in different assessment methods in dental education. Eur J Dent Educ. 2009;13:147–153. doi:10.1111/j.1600-0579.2008.00554.x

14. Marshall G, Jones N. A pilot study into anxiety induced by various assessment methods. Radiography. 2003;9:185–191. doi:10.1016/S1078-8174(03)00062-2

15. Zartman RR, McWhorter AG, Seale S, et al. Using OSCE-based evaluation: curricular impact over time. J Dent Educ. 2002;66:1323–1330.

16. Race P, Pickford R. Making Teaching Work: Teaching Smarter in Post-Compulsory Education. London: SAGE Publications; 2007.

17. Shitu B, Girma T. Objective structured clinical examination (OSCE): examinee’s perception at department of paediatrics and child health, Jimma University. Ethiop J Health Sci. 2008;18(2):47–52.

18. Moeen-Uz-Zafar A, Shammari O, Aljarallah B. Evaluation of interactive OSCE for medical students in the subject of medicine; reliability and validity in the setting of internal vs. external examiners. Ann Public Health Res. 2015;2(4):1030.

19. Chong L, Taylor S, Haywood M, Adelstein BA, Shulruf B. The sights and insights of examiners in objective structured clinical examinations. J Educ Eval Health Prof. 2017;14:34. doi:10.3352/jeehp.2017.14.34

20. Harden RM. Misconceptions and the OSCE. Med Teach. 2015;37(7):608–610. doi:10.3109/0142159X.2015.1042443

21. De Lisle J. Phase 2, OSCE Student Evaluation Form. Trinidad: The Centre for Medical Science Education, Faculty of Medical Sciences; 2001.

22. Bodkyn C, Sa B, Omer MI. An appraisal of Objective structured clinical examination (OSCE) examination at the University of West Indies. Ambikeya J Educ. 2011;2(2):295–302.

23. Pierre RB, Wierenga A, Barton M, Thame K, Branday JM, Christie CDC. Student self-assessment in a paediatric objective structured clinical examination. West Indian Med J. 2005;54(2):144–148.

24. Hickling FW, Morgan KAD, Abel W, et al. A comparison of the objective structured clinical examination results across campuses of the University of the West Indies (2001 and 2002). West Indian Med J. 2005;54(2):139–143. doi:10.1590/S0043-31442005000200011

25. Ali GAE, Mehdi AY, Ali AH. Objective structured clinical examination (OSCE) as an assessment tool for clinical skills in Sohag University. Nursing students perspective. J Environ Stud. 2012;8:59–69.

26. Nulty DD. The adequacy of response rates to online and paper surveys: what can be done? Assess Eval High Educ. 2008;33:301–314. doi:10.1080/02602930701293231

27. Epstein RM. Assessment in medical education. N Engl J Med. 2007;356:387–396. doi:10.1056/NEJMra054784

28. Elfaki OA, Al-Humayed S. Medical students’ perception of OSCE at the Department of Internal Medicine, College of Medicine, King Khalid University, Abha, KSA. J Coll Physicians Surg Pak. 2016;26(2):158–159. doi:02.2016/JCPSP.158159

29. Skrzypek A, Szeliga M, Stalmach-Przygoda A, et al. The Objective structured clinical examination (OSCE) from the perspective of 3rd year’s medical students - a pilot study. Folia Med Cracov. 2017;57(3):67–75.

30. Furlong E, Fox P, Lavin M, Collins R. Oncology nursing students’ views of a modified OSCE. Eur J Oncol Nurs. 2005;9:351–359. doi:10.1016/j.ejon.2005.03.001

31. Sarid O, Anson O, Bentov Y. Students’ reactions to three typical examinations in health sciences. Adv Health Sci Educ. 2005;10:291–302.

32. Anderson M, Stickley T. Finding reality: the use of objective structured clinical examination (OSCE) in the assessment of mental health nursing students’ interpersonal skills. Nurse Educ Pract. 2002;2:160–168. doi:10.1054/nepr.2002.0067

33. Allen R, Heard J, Savidge M, Bittengle J, Cantrell M, Huffmaster T. Surveying students’ attitudes during the OSCE. Adv Health Sci Educ. 1998;3:197–206. doi:10.1023/A:1009796201104

34. Troncon LE. Clinical skills assessment: limitations to the introduction of an “OSCE” (objective structured clinical examination) in a traditional Brazilian medical school. Sao Paulo Med J. 2004;112:12–17. doi:10.1590/S1516-31802004000100004

35. Yerkes RM, Dodson JD. The relation of strength of stimulus to rapidity of habit-formation. J Comp Neurol Psychol. 1908;18:459–482. doi:10.1002/(ISSN)1550-7149

36. Duffield KE, Spencer JA. A survey of medical students’ views about the purpose and fairness of assessment. Med Educ. 2002;36:879–886.

37. Khan H. OSCEs are outdated: clinical skills assessment should be centred around workplace-based assessments (WPBAS) to put the ‘art’ back into medicine. MedEdPublish. 2017. doi:10.15694/mep.2017.000189

38. Al-Mously N, Nabil NM, Salem R. Students feedback on OSCE: an experience of a new medical college in Saudi Arabia. J Int Assoc Med Sci Educ. 2012;22:10–16.

39. Jindal P, Khurana G. The opinion of post graduate students on objective structured clinical examination in anaesthesiology: a preliminary report. Indian J Anaesth. 2016;60(3):168–173. doi:10.4103/0019-5049.177869

40. Schoonheim-Klein ME, Hoogstraten J, Habets L, et al. Language background and OSCE performance: a study of potential bias. Eur J Dent Educ. 2007;11:222–229. doi:10.1111/j.1600-0579.2007.00459.x

41. Stowe CD, Gardner SF. Real-time standardized participant grading of an objective structured clinical examination. Am J Pharm Educ. 2005;69(3):272–276. doi:10.5688/aj690341
