
Perception and Satisfaction of Undergraduate Medical Students of the Mini Clinical Evaluation Exercise Implementation in Orthopedic Outpatient Setting

Author: Alomar AZ

Received 21 May 2022

Accepted for publication 19 September 2022

Published 23 September 2022, Volume 2022:13, Pages 1159–1170

DOI https://doi.org/10.2147/AMEP.S375693


Editor who approved publication: Dr Md Anwarul Azim Majumder



Abdulaziz Z Alomar

Department of Orthopaedic Surgery, College of Medicine, King Saud University, Riyadh, Kingdom of Saudi Arabia

Correspondence: Abdulaziz Z Alomar, Department of Orthopaedic Surgery, College of Medicine, King Saud University, Riyadh, Kingdom of Saudi Arabia, Tel +966554655665, Email [email protected]

Purpose: The Mini Clinical Evaluation Exercise (mini-CEX) is a brief and direct observational assessment of trainee-patient interactions that helps to assess several clinical domains. There is limited evidence of mini-CEX implementation in orthopedics and undergraduate perceptions toward such an approach. This study investigated the perception of mini-CEX among undergraduate medical students through a questionnaire-based survey in an orthopedic outpatient setting.
Patients and Methods: Undergraduate medical students completing their orthopedic clinical posting were invited to complete an anonymous, self-administered questionnaire, written in English, evaluating their perceptions of mini-CEX implementation in the orthopedic outpatient setting during the 2016–2017 academic session. The questionnaire comprised 28 closed-ended questions rated on a five-point Likert scale and five open-ended questions. The survey responses were analyzed for reliability and validity and subjected to quantitative and qualitative analyses.
Results: A total of 350 students completed the questionnaire, which was found to be valid and reliable. The closed-ended questions were designed to assess knowledge of the mini-CEX as an assessment tool. Participants demonstrated a satisfactory understanding of the mini-CEX methodology, purpose, clarity, and comprehensiveness, and of its value as a self-assessment tool for undergraduate medical students. Instructor support for the implementation of the mini-CEX, however, appeared inadequate, and most students expressed a lack of confidence in it. Most participants reported improved clinical skills, reflected in better clinical exam preparation, Objective Structured Clinical Examination performance, and clinical judgment.
Conclusion: Undergraduate medical students perceived the mini-CEX as an effective tool for clinical teaching in an outpatient orthopedic setting. However, most students indicated suboptimal instructor involvement in the teaching and assessment process, raising concerns about inadequate direct observation and limited feedback on student performance. Additional measures are needed to ensure high-quality clinical encounters, teacher training, integration with other assessment tools, and standardized coverage of mini-CEX implementation in orthopedics.

Keywords: mini clinical evaluation exercise, orthopedics, outpatient, undergraduate

Introduction

Orthopedic training for undergraduate medical students is challenging owing to limited clinical exposure and the complexity of musculoskeletal examination skills, which are often difficult to learn.1 Conventionally, students observe and learn clinical skills during bedside teaching and observation in outpatient clinics.2–4 However, it is difficult to assess students on each topic covered in individual teaching sessions. Formative and summative assessments are based on examiner-selected clinical topics that, although relevant, may not reflect the overall clinical proficiency of the students. Thus, a student may be inadequately trained in orthopedic clinical skills yet remain unassessed through conventional assessment methods.

Several methods have been proposed for orthopedic clinical skills assessment.4 Written exams containing context-based multiple-choice, short-answer, and structured long-answer questions may be appropriate for knowledge assessment; however, performance assessment of clinical skills requires observation of students in actual or simulated clinical scenarios.4 The objective structured clinical examination (OSCE) and similar tools, such as the objective structured assessment of technical skills, are widely used to assess orthopedic skills.4 OSCE-based assessment was introduced to observe students' skills efficiently in a clinically relevant and objective manner, which is not feasible through traditional clinical case presentations.5 Multi-station OSCEs are helpful for assessing the psychomotor skills involved in orthopedic clinical examinations.6 However, the OSCE evaluates the methodology of clinical skills rather than their implementation in real-life scenarios, such as a clinic-based setting.

This knowledge gap is best addressed from the start of clinical training. Moreover, students are often assessed through a summative assessment only after clinical teaching has been completed. Consequently, students may complete their medical training without being assessed on the clinical skills learned during individual clinical teaching sessions. Orthopedics, as a clinical specialty, requires substantial face-to-face patient interaction and applied clinical skills and judgment, which may be difficult to assess through the OSCE. Better assessment methods are needed to observe trainee performance in a live clinical setting and create timely opportunities for constructive feedback and effective development of clinical skills. Workplace-based assessments are potential alternatives, since the examiner can directly observe student interaction with patients and assess their clinical skills.7 Direct observation of procedural skills (DOPS) has also been adopted globally in competency-based curricula, which specify the target level of expertise along with the relevant settings, including real clinical settings.7

DOPS is helpful in assessing multiple aspects of orthopedic training, including the knowledge and psychomotor skills required for procedural skill assessments.8 However, it does not apply to clinical skill assessments. Furthermore, a short and relevant clinical assessment tool would be more appropriate for an outpatient setting, as it would allow quick observation and assessment of student skills and timely feedback from the teacher.9 Such a tool can also help students refine their time-management skills, as the evaluation time available in outpatient clinics is often limited, while improving formative assessment outcomes and helping students address areas of deficiency.10 The Mini Clinical Evaluation Exercise (mini-CEX) is a method that can assess student performance in a clinical setting within these parameters: it is brief, clinically relevant, objective, and checklist-based, and it provides a simultaneous feedback opportunity.11 The mini-CEX was developed by the American Board of Internal Medicine as a clinical skills assessment tool for graduate medical education. It involves direct observation and assessment of trainee-patient interactions to evaluate student clinical skills, attitudes, and behaviors. The exercise is followed by constructive feedback on trainee performance and areas for improvement.10,11 The mini-CEX has been used worldwide in undergraduate medical education.11 It is a valid and reliable performance indicator for many vital parameters of clinical training (such as patient interviews, physical examinations, empathy, professionalism, clinical judgment, counseling, and efficiency).11,12 While the mini-CEX has been demonstrated to be a satisfactory assessment tool in various clinical disciplines for both inpatient and outpatient settings, there is very limited evidence of its role in orthopedic teaching and assessment. Moreover, the tool has mostly been studied at the postgraduate and resident levels of medical training. Little is known about its effectiveness in undergraduate medical teaching and assessment, both in general and in orthopedic specialties.

In orthopedics, most diagnoses are based on accurate and relevant clinical evaluations through diagnostic procedural skills.13 Moreover, it is imperative to make correct observations, using the correct examination methodology, in a manner that is convenient for the patient. Thus, it is important to know how students reach a diagnosis, in addition to the accuracy of their diagnoses. Additionally, essential aspects of clinical evaluation, such as professionalism and empathetic behavior, may be missed if the evaluation focuses solely on objective skills. The mini-CEX would therefore be helpful in situations that require a precise, short, and relevant multi-domain assessment.

Additional factors should be considered before implementing the mini-CEX in orthopedics. In most outpatient settings, the time allotted to an individual patient is limited and must therefore be used wisely, with a short, relevant clinical history, a focused clinical examination, and appropriate special tests. Clinical examination in orthopedics is often complex and poorly understood by medical students;14–16 it requires a strong grasp of anatomy, of the mechanisms behind special tests, and of their execution, in addition to general examination skills.14 The mini-CEX can potentially provide a customized experience for every student who is aware of the basic steps of patient assessment, within a short period. In addition, every specialty differs in the competencies covered, the settings, the perceptions of students and teachers, and the resources available. Moreover, within individual specialties, the competencies covered differ across learning domains and teaching methods. Considerable inter-specialty variation has been observed in mini-CEX implementation, highlighting the multiple specialty-specific factors involved.17–34 Currently, the evidence concerning barriers to mini-CEX implementation in orthopedics is limited and must be investigated. Such evidence can potentially improve preparedness and resource development for mini-CEX implementation in orthopedics. Furthermore, the duration of orthopedic exposure during undergraduate medical training is limited, which hinders the development of students' orthopedic clinical skills unless teaching is standardized. Thus, it is essential to determine how the mini-CEX approach helps to tackle this shortage of time.

In our institute, the mini-CEX has been an integral part of the undergraduate medical curriculum, including in orthopedics. Multiple curriculum-guided mini-CEX encounters take place in outpatient settings to assess the relevant competencies. While multiple studies have assessed the usefulness of the mini-CEX in evaluating undergraduate students, student perceptions of such an approach have not been well studied. Student perceptions may also vary between medical disciplines, raising concerns about the true quality of specialty training. Therefore, we conducted a cross-sectional survey to investigate undergraduate medical student perceptions of mini-CEX-based assessments in orthopedic outpatient clinical settings.

To our knowledge, no study has comprehensively examined the implementation of the mini-CEX in undergraduate orthopedic teaching. Ours is the first to investigate the efficacy of the mini-CEX from the students' point of view, and it will help clarify its general acceptability, strengths, limitations, areas requiring improvement, and scope for wider implementation in an orthopedic outpatient clinical setting.

Materials and Methods

Study Design

This cross-sectional study was conducted in a tertiary care institute with associated teaching medical colleges. Approval was secured from the institutional review board of King Saud University for one academic session, from 2016 to 2017. During this period, we surveyed undergraduate medical students who had completed their orthopedic clinical training to understand their perceptions of mini-CEX implementation in an outpatient setting.

Study Setting

In our institution, the undergraduate medical curriculum extends over 7 years, with a 4-week orthopedic course in the fifth year during which students are taught orthopedics exclusively.

For the first 2 weeks, six teaching sessions covered clinical skills, including history taking and physical examination, through didactic bedside teaching and simulated patients; these were paired with an additional six sessions teaching patient interaction, interpretation of investigations, and decision making through didactic case-based learning. In the latter 2 weeks, students were required to attend three outpatient clinics and undergo at least three mini-CEX sessions (a mini-CEX assessment sheet is provided in Supplementary File 1). The individuals conducting the mini-CEX sessions had undergone prior training and had over one year of experience. The summative assessment was conducted using the OSCE at the end of the 4-week training period.

Participants

Immediately following completion of the training period, students were asked to anonymously complete a paper questionnaire assessing their perceptions of mini-CEX implementation in orthopedic outpatient teaching. Only students who had attended all the mini-CEX encounters were included in the survey. Informed consent, including consent to publication, was obtained prior to participation. Answers to all questions were required for the questionnaire to count as complete.

Questionnaire

The questionnaire was developed and validated by the teaching faculty of the Department of Medical Education and approved by the institutional review board, which comprised experts in medical education and orthopedic faculty. The questionnaire was written in English and based on a literature review of mini-CEX implementation in orthopedics and other specialties for medical teaching. It comprised 28 closed-ended questions covering multiple aspects, such as understanding of the mini-CEX, feasibility, reliability, the teachers' role, feedback, satisfaction, clinical skills improvement, and personality development, plus five open-ended questions on additional advantages, shortcomings, less useful aspects, and suggestions for better implementation and improvement of the mini-CEX from the students' perspective. Details of the closed- and open-ended questions are provided in Tables 1 and 2. The closed-ended questions assessed agreement on a five-point Likert scale (5 = strongly agree, 4 = agree, 3 = unsure, 2 = disagree, 1 = strongly disagree).

Table 1 Likert Scale Scores of Closed-Ended Questions on Undergraduate Medical Student Perceptions of the Mini-CEX Assessment Model in an Outpatient Orthopedic Setting. Likert Scale Scores: 1 = Strongly Disagree, 2 = Disagree, 3 = Unsure, 4 = Agree, 5 = Strongly Agree

Table 2 Undergraduate Medical Student Responses to Open-Ended Questions Regarding Their Perception of the Mini-CEX Assessment Model in an Outpatient Orthopedic Setting

Questionnaire Validation

Prior to implementation, the closed-ended segment was piloted with a group of 30 undergraduate medical students; no issues were noted regarding the content or comprehension of the questionnaire, and high internal consistency was observed (Cronbach's alpha of 0.87). The internal consistency of the final survey was also assessed using Cronbach's alpha across all closed questions. Factor analysis using the principal components method with varimax rotation was performed to group the Likert-scale items while assessing validity. The Kaiser-Meyer-Olkin (KMO) measure, together with Bartlett's test of sphericity, was used to indicate the data's suitability for factor analysis.35,36 Factors with eigenvalues > 1.0 were extracted, and items with loadings > 0.4 were retained. The open-ended questions were validated through the Delphi method after consensus was reached among the review board members, as these questions provided further insights that might have been missed in the closed questions. Anonymized feedback was received from the review board members after the closed questions were finalized. Based on this feedback, discussions were held across three serial meetings, after which the open-question framework was finalized. The mean agreement on the questionnaire items was 92.3% among the review board members.
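For readers who wish to reproduce this style of analysis, the following is a minimal Python sketch of the reliability and validity computations described above, assuming the Likert responses are arranged one row per student and one column per question. The study itself used SPSS, so this is only an illustrative equivalent; the third-party factor_analyzer package and the CSV file name are this sketch's assumptions, not part of the study.

```python
# Minimal sketch of the reliability and validity steps described above,
# assuming responses are stored one row per student and one column per
# closed-ended question (Likert values 1-5). The study used SPSS; this
# Python equivalent relies on the third-party "factor_analyzer" package,
# and the CSV file name is hypothetical.
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import (
    calculate_bartlett_sphericity,
    calculate_kmo,
)

def cronbach_alpha(items: pd.DataFrame) -> float:
    """alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

responses = pd.read_csv("likert_responses.csv")  # hypothetical file with 28 question columns

print(f"Cronbach's alpha: {cronbach_alpha(responses):.3f}")  # study reported 0.944

chi_square, p_value = calculate_bartlett_sphericity(responses)  # tests inter-item correlation
_, kmo_overall = calculate_kmo(responses)                       # sampling adequacy (study: 0.922)
print(f"Bartlett p = {p_value:.4g}, KMO = {kmo_overall:.3f}")

# Principal-components extraction with varimax rotation; the study kept
# factors with eigenvalues > 1 (six in total) and retained items whose
# loading on a factor exceeded 0.4.
fa = FactorAnalyzer(n_factors=6, method="principal", rotation="varimax")
fa.fit(responses)
loadings = pd.DataFrame(fa.loadings_, index=responses.columns)
print(loadings.round(3))
```

In practice, one would first inspect fa.get_eigenvalues() to choose the number of factors rather than fixing it at six in advance.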

Data Collection and Analysis

The response rate was measured as the percentage of invited students who completed the questionnaire. The Likert scale scores of the closed-ended questions were entered into IBM SPSS Statistics for Windows, version 21.0, and the frequency and mean were calculated for each question. For the open-ended questions, an automated approach was used: the free text of the responses was analyzed for content using an online free-text analysis tool (Textalyser, https://seoscout.com). Open-ended responses were categorized based on the repetition of keywords. Two investigators independently categorized the responses and then reconciled their categorizations through discussion and mutual consensus. The frequency of each categorical response was then calculated.
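As an illustration of the keyword-based grouping step, the short Python sketch below tallies how often each category appears across free-text responses. The study used the Textalyser tool plus manual consensus between two investigators; the category names, keyword lists, and sample responses here are purely hypothetical examples.

```python
# Minimal sketch of keyword-based categorization of open-ended
# responses. The category names and keyword lists are hypothetical
# stand-ins for the keywords identified in the study.
from collections import Counter

CATEGORY_KEYWORDS = {  # hypothetical categories and keywords
    "feedback quality": ["feedback", "comment"],
    "time constraints": ["time", "short", "rushed"],
    "teacher involvement": ["teacher", "instructor", "observe"],
}

def categorize(response: str) -> list[str]:
    """Return every category whose keywords appear in a free-text response."""
    text = response.lower()
    return [category for category, keywords in CATEGORY_KEYWORDS.items()
            if any(keyword in text for keyword in keywords)]

# Toy responses standing in for the survey's free-text answers.
responses = [
    "The teacher gave very limited feedback after my encounter.",
    "Time in the clinic felt too short for a full examination.",
]
frequencies = Counter(cat for r in responses for cat in categorize(r))
for category, count in frequencies.most_common():
    print(f"{category}: {count}")
```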

The author performed a qualitative analysis by triangulating the findings of the closed- and open-ended responses. A narrative analysis was conducted, in which the investigator's perspective was applied to the closed- and open-ended survey responses.

Results

A total of 350 students (187 males, 163 females) responded to the questionnaire out of 480 invited students (response rate: 72.9%).

The reliability test (internal consistency) revealed a Cronbach's alpha of 0.944 for the 28 items in the questionnaire. The KMO measure was 0.922, with a significant Bartlett's test of sphericity (p < 0.001). Six factors with eigenvalues > 1 were extracted through principal component analysis, explaining 70.3% of the cumulative variance (Supplementary File 2). On interpretation, the six factors were determined to represent the following components: a) basic understanding of the mini-CEX, b) teacher's support, c) skills development, d) overall satisfaction, e) professional development, and f) personality development. The questions were grouped into these six components based on their factor loadings (Supplementary File 3). Construct validity was confirmed, as the loadings of all closed Likert-scale questions were > 0.4, ranging from 0.422 to 0.879 (Supplementary File 3).

The mean Likert scale ratings for the closed-ended questions are listed in Table 1, while the categorical responses to the open-ended questions and their frequencies are listed in Table 2.

Basic Understanding of the Mini-CEX (Questions 1–6)

Most students had a satisfactory basic understanding of the mini-CEX, with mean Likert ratings ranging from 3.14 to 3.44. However, approximately 25% of students disagreed across the different aspects of basic understanding, citing a lack of proper communication of the methodology and inadequate handling of queries related to the mini-CEX.

Teacher’s Support (Questions 7–12)

The overall agreement on teacher's support in the mini-CEX was borderline, with mean Likert ratings ranging from 2.96 to 3.19. Disagreement on teacher's support, in terms of feedback, commitment, interest, helpfulness, and observation, reached approximately one-third of all responses.

Skills Development (Questions 13–17)

A generally satisfactory response was received for questions related to clinical skills development, with mean Likert ratings ranging from 3.25 to 4.13. The highest mean rating (4.13) was observed for improvement in clinical examination skills. However, almost 25% of students disagreed that the mini-CEX improved patient communication, exam preparation, or OSCE performance.

Overall Satisfaction (Questions 18–22)

Overall satisfaction with the mini-CEX as an assessment tool was mixed, with mean ratings ranging from 2.91 to 3.40. Approximately one-third of the students disagreed that the mini-CEX was a satisfactory assessment tool and that it reflected their clinical performance.

Professional Development (Questions 23–25)

High agreement was observed across all questions in the professional development component, with mean Likert ratings ranging from 3.55 to 4.00. Disagreement regarding the mini-CEX's role in improving patient interviewing was as low as 8.30%.

Personality Development (Questions 26–28)

There was general agreement on the mini-CEX's role in personality development, with mean Likert ratings ranging from 3.44 to 3.63. The proportion of disagreeing students was low (15.5–21.2%).

Qualitative Analysis

The closed- and open-ended responses appeared to be interlinked. The closed-ended questions addressed the basic understanding of the mini-CEX as a clinical assessment tool. The results indicated a satisfactory understanding among students of the methodology, purpose, clarity, and comprehensiveness of the mini-CEX, and of its value as a self-assessment tool, as reflected by the Likert ratings of questions 1–6. In the open-ended responses, students highlighted the advantages of the mini-CEX's well-structured format and its potential for feedback. The following are some related comments:

I like the objective format where each component contributed to the assessment, be it history, examination, patient interaction, etc.

Mini-CEX help me know in what domain of clinical assessment I am deficient; I can now focus on my weak domains

It is difficult to know whether I am doing the clinical examination correctly or not, teachers’ feedback helps a lot in improving my skills

Instructor support while administering the mini-CEX, however, appeared inadequate, with most students giving low ratings, as seen in questions 7–12. Students felt that the interest, support, feedback, and observation from teachers, which are integral to the mini-CEX, were somewhat limited. In the open-ended responses, students highlighted the subjectivity of the teachers and the inappropriate conduct of the mini-CEX by some of them. Some related comments were as follows:

the level of knowledge expected from us should be the basic and according to the curriculum defined competencies, however, sometimes teachers expect a detailed assessment from us which differed from teacher to teacher

there should be some uniformity in the assessment process to counter the teacher bias

my teacher was unaware of the domains to be assessed in the mini-CEX and could not provide specific feedback

my teacher was busy on phone for some inpatient discussion with his resident, I doubt he carefully observed my mini-CEX

Most students reported developing better clinical skills, reflected in improvements in their clinical exam preparation, OSCE performance, and clinical judgment (questions 13–17). Feedback was viewed as a tool for improving clinical skills. Students also cited these points among the advantages of the mini-CEX in the open-ended responses. Some comments were as follows:

several special tests had been my weak domain, now by performing and correcting those under teacher’s observation, I can perform those much more confidently

the feeling of being observed helps in learning and performing clinical examination properly

Overall satisfaction with the mini-CEX was borderline among students, and satisfaction with the assessment scores was even lower (questions 18–22).

Discussion

The current study findings suggest that the role of the mini-CEX in clinical assessment is well appreciated by students, with the potential for wider implementation to help students become more aware of real-life scenarios. The analysis of the questionnaire responses, however, suggests that the mere idea and implementation of the mini-CEX are not adequate; further strategic planning is required, involving a tailored experience and improved teacher support. Our questionnaire covered aspects related to inclusiveness, comprehensiveness, the instructor's role, skill improvement, satisfaction, and professional development among undergraduate medical students in orthopedic outpatient settings. The questionnaire was reliable, consistent, and validated through factor analysis. The positive perceptions of the mini-CEX observed in this study suggest an overall agreement on the basic understanding of mini-CEX assessments, a desire for wider implementation, and a role in professional and personality development. Still, general satisfaction regarding the mini-CEX in orthopedics was borderline. Its perceived limitations in orthopedics were its inappropriateness for detailed orthopedic examinations and discussion; its limited role in clinical judgment, advanced diagnostic tests, patient management, and patient counseling skills at an undergraduate level; inconsistencies in instructor assessments, observation, and attention; and limited feedback from tutors.

Previously, the mini-CEX has been used in various specialties and settings among medical students and trainees. However, mini-CEX implementation in orthopedics in an undergraduate medical setting has seldom been investigated. Our study explores critical aspects that require further attention for effective implementation of the mini-CEX in orthopedics.

Like previous studies,17–25 this study shows a general awareness of the mini-CEX among medical students. The desire for a higher number of mini-CEX encounters, expressed by more than half of the surveyed students, could be attributed to its brief and relevant nature. Many students also felt that the mini-CEX need not be limited to the outpatient setting and should be paired with other assessment methods such as case-based discussion, OSCE, and long cases. This perception is possibly due to the curriculum-based implementation and students' familiarity with such an approach through rotations in other specialties, which would have produced a similar perspective in the orthopedic setting. The limited role of clinical judgment, advanced diagnostic tests, patient management, and patient counseling skills at an undergraduate level reflects the preliminary level of skill expected of undergraduates. Furthermore, advanced diagnostic tests require detailed stepwise examination and counseling skills built on considerable clinical experience, which may be difficult to achieve at the undergraduate level. Therefore, mini-CEX execution should be individualized based on the desired competence of the target students. A few students raised concerns regarding the limited time for the mini-CEX. Better organization and student guidance on time management for the different steps of a patient visit would likely help mini-CEX implementation.

Teacher's support had the least agreement among students in our orthopedic setting. Unfortunately, less than 20% of the surveyed students agreed that teachers provided constructive feedback, showed commitment to mini-CEX activities, explained the exercise, were helpful, and observed the mini-CEX activities. Evidence has highlighted constraints related to teachers' roles in several specialties. Chang et al31 studied the quality of feedback given by teachers to first-year postgraduates in emergency medicine and observed similar issues related to limited teacher observation, interest, explanation, helpfulness, feedback, and handling of queries during mini-CEX implementation. Likewise, Liao et al30 observed that in mini-CEX encounters in internal medicine training, feedback was either not given or was incomplete with respect to the individual exercises covered in each encounter. Fernando et al28 found similar limitations in feedback in undergraduate medical teaching, which included surgery- and medicine-related mini-CEX encounters.

In addition, those authors pointed out inter-assessor variability in assessment scores, which possibly reduced the quality of feedback. Hill et al27 and Yanting et al23 also evaluated mini-CEX implementation in undergraduate settings and suggested that assessor variation can negatively affect mini-CEX-based assessment. Inter-assessor variation in scoring was not reported in this study, probably because the limited support from most teachers prevented students from perceiving assessor-related variations. Moreover, Martinsen et al26 found no difference in feedback between conventional workplace-based assessment and the mini-CEX in an undergraduate medical setting covering multiple specialties. Castanelli et al24 investigated perceptions of the mini-CEX among anesthesia residents and supervisors and found a lack of clarity regarding the understanding and purpose of mini-CEX assessments; additionally, feedback was not provided in a timely manner. Herein, most students showed a basic understanding of the mini-CEX, probably owing to their familiarity with such assessments in other specialties. Meanwhile, Weller et al22 highlighted another limitation of feedback in the mini-CEX: the reluctance of assessors to award failing grades, which could conceal underperforming trainees. Further research on feedback quality will provide deeper insights into this issue. Recently, Eltayar et al37 observed that inter-rater reliability was highest when entrustment scales were used for workplace-based assessments, indicating that entrustment scales can achieve good psychometric properties with regard to consistency among different raters. The authors thereby decreased the confounding effect of differences between assessors, providing a clearer picture of the actual academic level of the learners.

Studies of mini-CEX implementation exist in other specialties, such as pediatrics, obstetrics and gynecology, cardiology, and endocrinology.32,34,38,39 However, these studies did not highlight limitations attributable to a lack of teacher support. The critical parameter underlying these differences is faculty training prior to conducting the mini-CEX.30 Unless teachers are well trained in how to address the individual mini-CEX components, assessment will lack standardization and the quality of feedback will be suboptimal. However, the quality of training and adherence to the guidelines remain questionable unless students' perceptions are understood.

The literature suggests that the primary reasons for dissatisfaction with teacher support are a lack of assessor training, reduction of the mini-CEX to a mere tick-box tool owing to a lack of assessor observation, a lack of well-defined evaluation criteria, feedback discrepancies (limited, absent, or inappropriate feedback), difficulties in arranging prearranged encounters, a lack of available senior physicians, unrealistic situations, and a lack of expertise in the assessment methods.18,27,31 Inter-teacher differences in assessment can bias the assessment process. Therefore, previous studies have highlighted the need for multiple assessors to provide reliable assessment and feedback.17 A lack of adequate time among teachers and of specific domain coverage has also been suggested.25,26

Clinical evaluation priorities may vary among specialties; different specialties emphasize different skills depending on the extent of the psychomotor domain involved in clinical assessment. Additionally, challenges remain with variable assessor stringency across specialties. Subspecialty-related differences can affect mini-CEX validity and feasibility.27 These differences relate to the competency domains covered in the mini-CEX encounter, which can be affected by the specialty-related complexity of the case, the resources involved, and assessor variation.11,26–28 Without knowing students' perceptions of mini-CEX implementation in orthopedics, it is difficult to ensure that all its components satisfactorily contribute to the desired improvement in students' patient-assessment skills.

Previous studies have reported limitations regarding the mismatch between the anticipated and perceived benefits of the mini-CEX in different specialties.26–31,40 Unless the shortcomings are known, the scope for improvement cannot be understood. For example, Malhotra et al33 found that observation and assessment of the clinical performance of internal medicine residents by assessors generated anxiety.

The current study strengthens the potential role of the mini-CEX in the orthopedic outpatient setting. Most perceived benefits and limitations are similar to those reported for other specialties and settings of mini-CEX implementation. Preliminary training of orthopedic assessors for mini-CEX encounters alone may not ensure the anticipated student satisfaction. Additional specialty-specific training of orthopedic teachers, along with methods to scrutinize it at frequent intervals, is warranted to ensure the quality of assessment and reduce assessment bias. Dedicating faculty to mini-CEX encounters, free from the interference of other workloads, could also improve mini-CEX implementation, with an emphasis on constructive feedback, time organization, and activities relevant to undergraduate understanding. Domains related to clinical judgment, multiple advanced special clinical tests, and disease-related counseling are perceived to be of limited relevance at the undergraduate level and probably relate to postgraduate expertise. Therefore, clinical encounters should be framed with the most attention paid to patient interviews, history, and basic physical examinations, in line with the undergraduate curriculum.

This study has some limitations. First, it observed only undergraduate medical student perceptions in orthopedic outpatient settings; the influence of the mini-CEX on student performance or grades could not be determined. Second, the study does not establish the effectiveness of the mini-CEX as an assessment tool or compare it with other assessments; comparative studies are warranted for further insight. Third, student perceptions of mini-CEX implementation in orthopedic outpatient settings reflect the characteristics of one department of a single medical institute and may vary between subjects and institutes implementing the mini-CEX. Fourth, the questionnaire was bound to have some subjective deficiencies, and some critical aspects might have been missed in its current form.

Fifth, the study did not analyze the cost-effectiveness of mini-CEX implementation for undergraduate teaching. Finally, our study's major findings suggest that teacher support has been a limiting factor for the mini-CEX in orthopedic settings; it is therefore important to understand the mini-CEX from the instructors' perspective, which was not investigated in this study. Nevertheless, the important aspects highlighted by the closed- and open-ended questions will be helpful in designing mini-CEX sessions.

Conclusion

Undergraduate medical students perceive the mini-CEX as an effective tool for clinical teaching in an outpatient orthopedic setting. However, there are concerns regarding suboptimal instructor involvement in the teaching and assessment process, which can lead to inadequate direct observation and limited feedback. From the undergraduate medical student perspective, clinical judgment, advanced clinical tests, and counseling skills have a limited role in mini-CEX assessment. Additionally, there is a need to clarify the scoring criteria, improve the time distribution of clinical encounters, and reduce inter-teacher assessment variability. Further measures are needed to ensure the quality of clinical encounters, teacher training, integration with other assessment tools, and wider coverage of mini-CEX implementation in orthopedics.

Acknowledgments

The author would like to thank the College of Medicine Research Center, Deanship of Scientific Research, King Saud University, for supporting this project.

Disclosure

The author declares no conflicts of interest.

References

1. Menon J, Patro DK. Undergraduate orthopedic education: is it adequate? Indian J Orthop. 2009;43(1):82–86. doi:10.4103/0019-5413.45328

2. Salam A, Siraj HH, Mohamad N, et al. Bedside teaching in undergraduate medical education: issues, strategies, and new models for better preparation of new generation doctors. Iran J Med Sci. 2011;36(1):1–6.

3. Sultan AS. Bedside teaching: an indispensable tool for enhancing the clinical skills of undergraduate medical students. J Pak Med Assoc. 2019;69(2):235–240.

4. McGee SR, Irby DM. Teaching in the outpatient clinic. Practical tips. J Gen Intern Med. 1997;12(Suppl 2):S34–S40.

5. James HK, Chapman AW, Pattison GTR, et al. Analysis of tools used in assessing technical skills and operative competence in trauma and orthopaedic surgical training: a Systematic Review. JBJS Rev. 2020;8(6):e1900167.

6. Khan KZ, Ramachandran S, Gaunt K, et al. The objective structured clinical examination (OSCE): AMEE guide no. 81. Part I: an historical and theoretical perspective. Med Teach. 2013;35(9):e1437–e1446.

7. Beard J, Rowley D, Bussey M, et al. Workplace-based assessment: assessing technical skill throughout the continuum of surgical training. ANZ J Surg. 2009;79(3):148–153.

8. Erfani Khanghahi M, Ebadi Fard Azar F. Direct observation of procedural skills (DOPS) evaluation method: systematic review of evidence. Med J Islam Repub Iran. 2018;32:45.

9. Sethi S, Srivastava V, Verma P. Mini-clinical evaluation exercise as a tool for formative assessment of postgraduates in psychiatry. Int J Appl Basic Med Res. 2021;11(1):27–31.

10. Hill F, Kendall K. Adopting and adapting the mini-CEX as an undergraduate assessment and learning tool. Clin Teach. 2007;22(4):242–248.

11. Norcini JJ, Blank LL, Arnold GK, et al. The mini-CEX (clinical evaluation exercise): a preliminary investigation. Ann Intern Med. 1995;123(10):795–799. doi:10.7326/0003-4819-123-10-199511150-00008

12. Kroboth FJ, Hanusa BH, Parker S, et al. The inter-rater reliability and internal consistency of a clinical evaluation exercise. J Gen Intern Med. 1992;7(2):174–179. doi:10.1007/BF02598008

13. Yu JC, Guo Q, Hodgson CS. Deconstructing the joint examination: a novel approach to teaching introductory musculoskeletal physical examination skills for medical students. MedEdPORTAL. 2020;16:10945. doi:10.15766/mep_2374-8265.10945

14. Hendrick P, Bond C, Duncan E, et al. Clinical reasoning in musculoskeletal practice: students’ conceptualizations. Phys Ther. 2009;89(5):430–442. doi:10.2522/ptj.20080150

15. Hauer KE, Teherani A, Kerr KM, et al. Student performance problems in medical school clinical skills assessments. Acad Med. 2007;82(10 Suppl):S69–S72. doi:10.1097/ACM.0b013e31814003e8

16. Faustinella F, Jacobs RJ. The decline of clinical skills: a challenge for medical schools. Int J Med Educ. 2018;9:195–197. doi:10.5116/ijme.5b3f.9fb3

17. Weller JM, Jolly B, Misur MP, et al. Mini-clinical evaluation exercise in anaesthesia training. Br J Anaesth. 2009;102(5):633–641. doi:10.1093/bja/aep055

18. Jackson D, Wall D. An evaluation of the use of the mini-CEX in the foundation programme. Br J Hosp Med. 2010;71(10):584–588. doi:10.12968/hmed.2010.71.10.78949

19. Singh T, Sharma M. Mini-clinical examination (CEX) as a tool for formative assessment. Natl Med J India. 2010;23(2):100–102.

20. Weston PS, Smith CA. The use of mini-CEX in UK foundation training six years following its introduction: lessons still to be learned and the benefit of formal teaching regarding its utility. Med Teach. 2014;36(2):155–163. doi:10.3109/0142159X.2013.836267

21. Joshi MK, Singh T, Badyal DK. Acceptability and feasibility of mini-clinical evaluation exercise as a formative assessment tool for workplace-based assessment for surgical postgraduate students. J Postgrad Med. 2017;63(2):100–105. doi:10.4103/0022-3859.201411

22. Weller JM, Jones A, Merry AF, et al. Investigation of trainee and specialist reactions to the mini-clinical evaluation exercise in anaesthesia: implications for implementation. Br J Anaesth. 2009;103(4):524–530. doi:10.1093/bja/aep211

23. Yanting SL, Sinnathamby A, Wang D, et al. Conceptualizing workplace based assessment in Singapore: undergraduate mini-clinical evaluation exercise experiences of students and teachers. Ci Ji Yi Xue Za Zhi. 2016;28(3):113–120. doi:10.1016/j.tcmj.2016.06.001

24. Castanelli DJ, Jowsey T, Chen Y, et al. Perceptions of purpose, value, and process of the mini-clinical evaluation exercise in anesthesia training. Can J Anaesth. 2016;63(12):1345–1356. doi:10.1007/s12630-016-0740-9

25. Norcini JJ, Blank LL, Duffy FD, et al. The mini-CEX: a method for assessing clinical skills. Ann Intern Med. 2003;138(6):476–481. doi:10.7326/0003-4819-138-6-200303180-00012

26. Martinsen SSS, Espeland T, Berg EAR, et al. Examining the educational impact of the mini-CEX: a randomised controlled study. BMC Med Educ. 2021;21(1):228.

27. Hill F, Kendall K, Galbraith K, et al. Implementing the undergraduate mini-CEX: a tailored approach at Southampton University. Med Educ. 2009;43(4):326–334.

28. Fernando N, Cleland J, McKenzie H, et al. Identifying the factors that determine feedback given to undergraduate medical students following formative mini-CEX assessments. Med Educ. 2008;42(1):89–95.

29. Holmboe ES, Yepes M, Williams F, et al. Feedback and the mini clinical evaluation exercise. J Gen Intern Med. 2004;19(5 Pt 2):558–561.

30. Liao KC, Pu SJ, Liu MS, et al. Development and implementation of a mini-Clinical Evaluation Exercise (mini-CEX) program to assess the clinical competencies of internal medicine residents: from faculty development to curriculum evaluation. BMC Med Educ. 2013;13:31.

31. Chang YC, Lee CH, Chen CK, et al. Exploring the influence of gender, seniority and specialty on paper and computer-based feedback provision during mini-CEX assessments in a busy emergency department. Adv Health Sci Educ Theory Pract. 2017;22(1):57–67.

32. Gupta S, Sharma M, Singh T. The acceptability and feasibility of mini-Clinical Evaluation Exercise as a learning tool for pediatric postgraduate students. Int J Appl Basic Med Res. 2017;7(Suppl 1):S19–S22.

33. Malhotra S, Hatala R, Courneya CA. Internal medicine residents’ perceptions of the mini-Clinical Evaluation Exercise. Med Teach. 2008;30(4):414–419.

34. Johnson NR, Pelletier A, Berkowitz LR. Mini-clinical evaluation exercise in the era of milestones and entrustable professional activities in obstetrics and gynaecology: resume or reform? J Obstet Gynaecol Can. 2020;42(6):718–725.

35. Bartlett MS. A note on the multiplying factors for various χ2 approximations. J R Stat Soc Series B Methodol. 1954;16(2):296–298.

36. Tabachnick BG, Fidell LS, Ullman JB. Using multivariate statistics. Boston MA Pearson. 2007;5(481):498.

37. Eltayar AN, Aref SR, Khalifa HM, et al. Do entrustment scales make a difference in the inter-rater reliability of the workplace-based assessment? Med Educ Online. 2022;27(1):2053401.

38. Alves de Lima A, Barrero C, Baratta S, et al. Validity, reliability, feasibility and satisfaction of the mini-clinical evaluation exercise (Mini-CEX) for cardiology residency training. Med Teach. 2007;29(8):785–790.

39. He Y, Wen S, Zhou M, et al. A pilot study of modified mini-clinical evaluation exercises (mini-CEX) in rotation students in the department of endocrinology. Diabetes Metab Syndr Obes. 2022;15:2031–2038.

40. Cieślak I, Panczyk M, Musik S, et al. The impact of the evaluation of summer internships on student self-assessment of and opinion on educational outcomes obtained by the second students of Public Health. Critical Care Innovat. 2022;5(1):1–4.
