
Using systematically observed clinical encounters (SOCEs) to assess medical students’ skills in clinical settings

Authors Bergus G, Woodhead J, Kreiter C 

Published 19 November 2010 Volume 2010:1 Pages 67–73

DOI https://doi.org/10.2147/AMEP.S12962

Peer review: single anonymous

Peer reviewer comments 3



George R Bergus1–3, Jerold C Woodhead4, Clarence D Kreiter2,5
1Performance Based Assessment Program, Office of Student Affairs and Curriculum, 2Department of Family Medicine, 3Department of Psychiatry, 4Department of Pediatrics, 5Office of Consultation and Research in Medical Education, Roy J and Lucille A Carver College of Medicine, The University of Iowa, Iowa City, IA, USA

Introduction: The Objective Structured Clinical Examination (OSCE) is widely used to assess the clinical performance of medical students. However, concerns related to cost, availability, and validity have led educators to investigate alternatives to the OSCE. Some alternatives involve assessing students while they provide care to patients – the mini-CEX (mini-Clinical Evaluation Exercise) and the Long Case are examples. We investigated the psychometrics of systematically observed clinical encounters (SOCEs), in which physician assessors are supplemented by trained lay observers, as a means of assessing the clinical performances of medical students.
Methods: During the pediatrics clerkship at the University of Iowa, trained lay observers assessed the communication skills of third-year medical students using a communication checklist while the students interviewed and examined pediatric patients. Students then verbally presented their findings to faculty, who assessed students’ clinical skills using a standardized form. The reliability of the combined communication and clinical skills scores was calculated using generalizability theory.
Results: Fifty-one medical students completed 199 observed patient encounters. The mean combined clinical and communication skills score (out of a maximum 45 points) was 40.8 (standard deviation 3.3). Using generalizability theory, the estimated reliability of SOCE scores based on 10 observed patient encounters was 0.81. Students reported receiving helpful feedback from faculty after 97% of their observed clinical encounters.
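The reliability figure reported above comes from a generalizability-theory analysis. As a rough sketch, in a persons-crossed-with-encounters (p × e) design, the generalizability coefficient of a mean score over n encounters is σ²p / (σ²p + σ²res/n). The variance components below are hypothetical, chosen only to illustrate how a coefficient of 0.81 at 10 encounters could arise; the study's actual variance components are not given in the abstract.

```python
def g_coefficient(var_person: float, var_residual: float, n_encounters: int) -> float:
    """Generalizability coefficient for the mean of n encounter scores in a
    persons-crossed-with-encounters (p x e) design: the proportion of
    observed-score variance attributable to true differences between students."""
    return var_person / (var_person + var_residual / n_encounters)


# Hypothetical variance components chosen so that 10 encounters yield ~0.81,
# matching the figure reported in the Results.
var_person, var_residual = 1.0, 2.346
print(round(g_coefficient(var_person, var_residual, 10), 2))  # -> 0.81
```

As the formula shows, adding encounters raises reliability with diminishing returns, which is why decision studies of this kind are used to pick a practical number of observations.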
Conclusion: The SOCE can reliably assess the clinical performances of third-year medical students on their pediatrics clerkship. The SOCE is an attractive addition to other methods that use real patient encounters to assess learners' skills.

Keywords: performance assessment, clinical skills, medical education

Creative Commons License © 2010 The Author(s). This work is published and licensed by Dove Medical Press Limited. The full terms of this license are available at https://www.dovepress.com/terms.php and incorporate the Creative Commons Attribution - Non Commercial (unported, v3.0) License. By accessing the work you hereby accept the Terms. Non-commercial uses of the work are permitted without any further permission from Dove Medical Press Limited, provided the work is properly attributed. For permission for commercial use of this work, please see paragraphs 4.2 and 5 of our Terms.