
Reliability of a seminar grading rubric in a grand rounds course

Authors MacLaughlin E, Fike D, Alvarez C, Seifert C, Blaszczyk A

Published 9 September 2010 Volume 2010:3 Pages 169–179

DOI https://doi.org/10.2147/JMDH.S12346

Review by Single anonymous peer review

Peer reviewer comments 2



Eric J MacLaughlin1, David S Fike1, Carlos A Alvarez2, Charles F Seifert3, Amie T Blaszczyk2
Texas Tech University Health Sciences Center School of Pharmacy, Department of Pharmacy Practice, 1Amarillo, 2Dallas, 3Lubbock, Texas, USA

Purpose: Formal presentations are a common requirement for students in health professional programs, and evaluations are often viewed as subjective. To date, literature describing the reliability or validity of seminar grading rubrics is lacking. The objectives of this study were to characterize inter-rater agreement and internal consistency of a grading rubric used in a grand rounds seminar course.
Methods: This was a retrospective study of 252 student presentations given from fall 2007 to fall 2008. Data collected included student and faculty demographics, overall content scores, overall communication scores, subcomponent scores for content and communication, and total presentation scores. Statistical analyses were performed using SPSS version 16.0.
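The study's analyses were performed in SPSS version 16.0, and no analysis code accompanies the article. Purely as an illustration of the kinds of statistics reported below (Cronbach's alpha across rubric subcomponents, the between-grader score difference, and the faculty versus student Pearson correlation), the following Python sketch uses hypothetical toy scores; the variable names and values are not from the study.

```python
# Illustrative sketch only; the study's analyses were performed in SPSS 16.0.
# All scores below are hypothetical toy values, not data from the study.
import numpy as np
from scipy import stats

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_presentations x n_rubric_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                         # number of rubric subcomponents
    item_vars = items.var(axis=0, ddof=1)      # variance of each subcomponent
    total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical rubric subcomponent scores for five presentations
rubric_scores = np.array([
    [4, 5, 4, 3, 5],
    [3, 3, 4, 3, 4],
    [5, 5, 5, 4, 5],
    [2, 3, 3, 2, 3],
    [4, 4, 5, 4, 4],
])
print(f"Cronbach's alpha: {cronbach_alpha(rubric_scores):.3f}")

# Hypothetical total scores (percentage points) from two faculty graders
faculty_a = np.array([92.0, 85.0, 97.0, 74.0, 90.0])
faculty_b = np.array([90.0, 88.0, 94.0, 82.0, 89.0])
diff = np.abs(faculty_a - faculty_b)
print(f"Mean grader difference: {diff.mean():.2f} (SD = {diff.std(ddof=1):.3f})")
print(f"Evaluations within 10 points: {100 * (diff <= 10).mean():.1f}%")

# Hypothetical student self-evaluations vs. the mean faculty score
faculty_mean = (faculty_a + faculty_b) / 2
student_self = np.array([93.0, 90.0, 95.0, 85.0, 88.0])
r, p = stats.pearsonr(faculty_mean, student_self)
print(f"Pearson r = {r:.3f}, P = {p:.3g}")
```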
Results: The rubric demonstrated internal consistency (Cronbach's alpha = 0.826). The mean grade difference between faculty graders was 4.54 percentage points (SD = 3.614), with a difference of ≤10 points for 92.5% of faculty evaluations. Student self-evaluations correlated with faculty scores for content, communication, and overall presentation (r = 0.513, r = 0.455, and r = 0.539, respectively; P < 0.001 for all). When mean faculty scores were compared with students' self-evaluations across quintiles, students with lower faculty evaluations overestimated their performance, whereas those with higher faculty evaluations underestimated theirs (P < 0.001).
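For context on the internal consistency figure, Cronbach's alpha for a rubric with k scored subcomponents, item variances \sigma_i^2, and total-score variance \sigma_X^2 is conventionally defined as

\[
\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k} \sigma_i^{2}}{\sigma_X^{2}}\right)
\]

Values closer to 1 indicate that the subcomponents vary together and thus measure a common construct; by the commonly cited rule of thumb that alpha above roughly 0.7 is acceptable, the reported value of 0.826 is consistent with the internal consistency claimed here.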
Conclusion: The seminar evaluation rubric demonstrated inter-rater agreement and internal consistency.

Keywords: seminar, public speaking, evaluation, grand rounds
