
Medical Students’ Perspectives on an Assessment of Reflective Portfolios [Response to Letter]

Authors Kassab SE, Bidmos M, Nomikos M, Daher-Nashif S, Kane T, Sarangi S, Abu-Hijleh M

Received 3 July 2020

Accepted for publication 3 July 2020

Published 16 July 2020 Volume 2020:11 Pages 495–496

DOI https://doi.org/10.2147/AMEP.S270581



Salah Eldin Kassab,1 Mubarak Bidmos,1 Michail Nomikos,1 Suhad Daher-Nashif,2 Tanya Kane,2 Srikant Sarangi,3 Marwan Abu-Hijleh1

1Department of Basic Medical Sciences, College of Medicine, QU Health, Qatar University, Doha, Qatar; 2Department of Population Medicine, College of Medicine, QU Health, Qatar University, Doha, Qatar; 3Danish Institute of Humanities and Medicine (DIHM), Aalborg University, Aalborg, Denmark

Correspondence: Salah Eldin Kassab
Physiology and Medical Education, College of Medicine, QU Health, Qatar University, PO Box 2713, Doha, Qatar
Tel +974 4403 7843
Email [email protected]


Dear editor

We thank Forenc et al1 for their interest in our study titled Construct Validity of an Instrument for Assessment of Reflective Writing-Based Portfolios of Medical Students.2 In their Letter to the Editor, their main critique concerned the extent to which the nonanonymity of reflective portfolios and the absence of reflection prompts for students may have affected the G-theory analysis. In their view, these two aspects would have reduced the percentage of variance attributed to the object of measurement (students) and thus influenced the variance attributed to the other study facets. They also point out that the study instrument might not be replicable for clinical students, given the increased complexity of the clinical learning environment. We address their concerns in turn.

It is important to clarify that there are currently no universal guidelines for interpreting the magnitude of variance attributable to each component in a G-theory analysis. Naturally, any researcher aims to maximize the percentage of variance attributed to the object of measurement relative to the other facets in the measurement plan. However, the main determinant of what constitutes large or small variance is the purpose of the study and the identified sources of variance.3 For example, we recently reported an acceptable reliability coefficient with only 27% of the variance due to the object of measurement, because that study aimed to measure “soft skills”, which are considered difficult to measure.4 Given that reflection is an enigmatic and complex construct, we believe that the 46.6% of variance attributed to the object of measurement in our study is reasonable.
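To make this point concrete, the sketch below (not part of the published analysis; the variance components are hypothetical) shows how a relative generalizability coefficient for a simple persons-by-raters crossed design is computed from estimated variance components, following the standard G-theory formulation discussed by Briesch et al.3 It illustrates why the percentage of variance attributed to the object of measurement does not by itself determine reliability: averaging over more raters shrinks the relative error term.

```python
# Illustrative sketch only; the variance components below are hypothetical
# and are not taken from the study's G-theory analysis.

def g_coefficient(var_persons: float, var_pr_residual: float, n_raters: int) -> float:
    """Relative G coefficient for a crossed persons x raters (p x r) design:
    person variance divided by person variance plus the rater-averaged
    person-by-rater interaction/residual variance."""
    relative_error = var_pr_residual / n_raters
    return var_persons / (var_persons + relative_error)

# Hypothetical components: persons account for ~27% of the total variance.
var_persons = 0.27        # object of measurement (students)
var_raters = 0.13         # rater main effect (excluded from relative error)
var_pr_residual = 0.60    # person-by-rater interaction + residual

for n in (1, 2, 4):
    print(f"{n} rater(s): G = {g_coefficient(var_persons, var_pr_residual, n):.2f}")
# 1 rater(s): G = 0.31; 2 rater(s): G = 0.47; 4 rater(s): G = 0.64
```

Under these assumed numbers, a design with only 27% person variance still approaches an acceptable coefficient once scores are averaged over several raters, which is why the interpretation of a variance percentage depends on the measurement plan rather than on the percentage alone.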

We fully acknowledge that anonymity in reflective writing–based portfolios could have reduced the student–rater interaction variance and thus potential bias in the assessment. However, the decision concerning whether or not to provide students with reflection prompts is a tradeoff between scaffolding a structured, guided reflection process and affording the unbounded freedom of personal reflections deriving from a rich and varied array of lived experiences. We firmly believe that the absence of reflection prompts optimizes the conditions for individually unique, authentic reflections, which we consider preferable to (re)acting reflectively to a checklist of activities triggered by a set of predetermined prompts. Here, maintaining a distinction between “reflection for learning” and “reflection for assessment” is useful: although prompts are a good device for learning purposes, they are not relevant for assessment purposes. In the latter context, what students choose to reflect on and how they articulate their reflections are both integral to assessment. It is worth noting that, in the absence of explicit prompts, we equipped students with scaffolding measures, such as mentoring, training, and study guides, which help them to select and reflect on their learning experiences. Moreover, in our view, Renner et al’s study, which asserts that prompts stimulate reflection by guiding the process and encouraging students to evaluate their experiences in greater depth, may not be a valid analogy.5 Renner et al addressed the use of computer-supported prompts in collaborative reflections, where such prompts catalyze learners to reflect upon the contributions of others and write down comprehensive and relevant comments.5 Their study and ours are thus conceptually different, as reflecting on others’ experiences, unlike one’s own, is not an easy endeavor and probably requires more nudging through prompts.

Finally, we argue that the study instrument is suitable for use in both the preclinical and clinical phases of the curriculum. The preclinical phase at the College of Medicine, Qatar University has problem-based learning at its core, with horizontal and vertical integration in conjunction with early exposure to clinical practice.4 Clinical activities in the preclinical phase include clinical skills labs, exposure to real patients in primary-care settings, and experiential review sessions in which students reflect on their clinical encounters and discuss issues related to professionalism and ethical practice. These activities begin during the preclinical years and extend into the clerkship phase, providing rich experiences for students to reflect on. Examining the replicability of the study instrument in the clinical phase remains a priority in our future work.

We are grateful to the editor for offering us the opportunity to respond to Forenc et al’s thoughtful comments.

Disclosure

The authors report no conflicts of interest in this communication.

References

1. Forenc KM, Eriksson FM, Malhotra B. Medical Students’ Perspectives on an Assessment of Reflective Portfolios [Letter]. Adv Med Educ Pract. 2020;11:463–464.

2. Kassab S, Bidmos M, Nomikos M, et al. Construct Validity of an Instrument for Assessment of Reflective Writing-Based Portfolios of Medical Students. Adv Med Educ Pract. 2020;11:397–404.

3. Briesch AM, Swaminathan H, Welsh M, Chafouleas SM. Generalizability theory: a practical guide to study design, implementation, and interpretation. J Sch Psychol. 2014;52(1):13–35. doi:10.1016/j.jsp.2013.11.008

4. Kassab S, Du X, Toft E, Cyprian F, Moslih A, Schmidt H, Hamdy H, Abu-Hijleh M. Measuring medical students’ professional competencies in a problem-based curriculum: a reliability study. BMC Med Educ. 2019;19:155. doi:10.1186/s12909-019-1594-y

5. Renner B, Prilla M, Cress U, Kimmerle J. Effects of Prompting in Reflective Learning Tools: Findings from Experimental Field, Lab, and Online Studies. Front Psychol. 2016;7:820. doi:10.3389/fpsyg.2016.00820
