A Paradigm Shift in Assessment of Scientific Skills in Undergraduate Medical Education

Received 5 October 2021

Accepted for publication 23 December 2021

Published 8 February 2022, Volume 2022:13, Pages 123–127

DOI https://doi.org/10.2147/AMEP.S342789




Charlotte Goss, Fiona J Culley, Prabha Parthasarathy, Ken MacLeod, Alison H McGregor, Amir H Sam

Faculty of Medicine, Imperial College London, London, UK

Correspondence: Amir H Sam, Email [email protected]

Abstract: The General Medical Council’s publication ‘Outcomes for Graduates’ places emphasis on doctors being able to integrate biomedical science, research and scholarship with clinical practice. In response, a new paradigm of assessment was introduced for the intercalated Bachelor of Science program at Imperial College School of Medicine in 2019. This innovative approach involves authentic “active learning” assessments analogous to tasks encountered in a research environment and is intended to test a wider range of applied scientific skills than traditional examinations. Written assessments include a “Letter to the Editor”, scientific abstract, and production of a lay summary. A clinical case study titled “Science in Context” presents a real or virtual patient, with evaluation of current and emerging evidence within that field. Another assessment emulates the academic publishing process: groups submit a literature review and engage in reciprocal peer review of another group’s work. A rebuttal letter accompanies the final submission, detailing how feedback was addressed. Scientific presentation skills are developed through tasks including a research proposal pitch, discussion of therapies or diagnostics, or review of a paper. A data management assignment develops skills in hypothesis generation, performing analysis, and drawing conclusions. Finally, students conduct an original research project, which is assessed via a written report in the format of a research paper and an oral presentation involving critical analysis of their project. We aspire to train clinicians who apply scientific principles to critique the evidence base of medical practice and possess the skillset to conduct high-quality research underpinned by the principles of best clinical and academic practice. Assessment drives learning, and active learning has been demonstrated to enhance academic performance and reduce attainment gaps in science education. We therefore believe this strategy will help to successfully shape our students as future scientists and scholars as well as clinical practitioners and professionals.

Keywords: biomedical science, assessment, active learning, professional skills

Introduction

The General Medical Council’s (GMC) publication ‘Outcomes for Graduates’ outlines the knowledge and skills required of newly qualified doctors in the United Kingdom (UK).1 These guidelines focus on three main domains spanning the breadth of good clinical practice: professional values and behaviors, professional skills and professional knowledge.1 Across these domains there is emphasis on doctors having the ability to integrate biomedical science, research and scholarship with clinical practice. The concept of evidence-based medicine underpins modern clinical practice. It is grounded in the principle of implementing current best evidence to provide the highest standards of care and achieve the best outcomes for patients.2,3 Constant advances in medical knowledge demand that all clinicians be equipped with a skillset that enables them to access, interpret and evaluate new evidence as it arises, and appropriately integrate it with their practice. These skills are of paramount importance to any clinician who is directly involved in conducting academic research, as a robust scientific approach assures the quality of the evidence on which we base our practice.4 Medical schools must, therefore, have a curriculum and assessment strategy directed towards students developing the knowledge and skills to practice evidence-based medicine and the potential to conduct high-caliber scientific research.3

Aligned with the GMC’s outcomes relating to the integration of biomedical science with clinical practice, a new paradigm of assessment was introduced for fourth year undergraduate medical students on the intercalated Bachelor of Science (iBSc) program at Imperial College School of Medicine (ICSM) in 2019. Assessment is recognized to have a significant influence over students’ learning behaviors, and assessments that are authentic and have practical application encourage a deeper approach to learning.5–7 Our novel approach therefore moves away from conventional written examinations to an evidence-based system of applied, authentic “active learning” assessments.

In contrast to the traditional learning format, where students form a “passive” audience listening to information delivered by a lecturer, active learning is achieved through completion of meaningful learning exercises (individually or in groups) which require students to think about, and directly apply, the knowledge and concepts that are intended to be taught.8,9 Active learning can therefore be used to simultaneously teach applied skills and cognitive processes such as critical thinking and complex problem-solving, alongside subject-specific knowledge.8–10 These are important skills to foster in future doctors as they are integral to accurate clinical decision making, potentially reducing diagnostic errors and improving patient safety.11,12 Taking an active approach to learning has been shown to improve academic outcomes for students across scientific disciplines, and there is a large pedagogical literature base (including a meta-analysis of 225 studies) supporting improved assessment scores, reduced failure rates and reductions in attainment gaps.8,10,13,14

Cognitive psychology research has demonstrated that active learning tasks should be designed to specifically practice the skills that students need to learn, as this leads to the development of expertise in those areas.9,15 If we wish to equip our students with the ability to search the literature, critically appraise research and apply scientific reasoning, we must provide learning opportunities that specifically support the development of these skills. We therefore designed this new system of assessments with assignments that are analogous to tasks encountered in a research environment (Figure 1).8 Through authentic replication of academic activities such as scientific writing and presentation, literature searches and data analysis, this novel approach intends to teach and test a wider range of applied scientific skills than is possible with traditional essay-based examinations.

Figure 1 Overview of assessments of scientific skills in the iBSc program at Imperial College School of Medicine.

Our Innovative Approach

The iBSc program constitutes the fourth year of the six-year undergraduate medical degree at ICSM. The program includes 15 pathways that are delivered via three modules. “Module 1” develops core research skills alongside specialism-specific knowledge. Assessments include a written task, an oral presentation and a data management assignment. The written assignment involves students writing a “Letter to the Editor” of a journal or a commentary addressing a topic within their chosen field. This letter is based on a topical or contentious subject, a research article or a report of a relevant major clinical study. The presentation assessment focuses on either comparing aspects of their specialism or examining a controversial topic in the field, and these tasks can include discussion of an area of research, therapies or diagnostics, a pitch for a research proposal, or review of a paper. This work is presented via oral or poster presentation, or using digital storytelling. Finally, students perform analysis on a data set they are given or have gathered through their own laboratory work. They generate a hypothesis, perform appropriate analysis, and draw conclusions. This is assessed through three written tasks: a results compendium (a succinct but comprehensive account of their data management and analysis), a scientific abstract and a lay summary. Alongside subject-specific knowledge, these assessments are designed to target the development of fundamental scientific skills including scientific writing, consulting the literature, critical appraisal, understanding of research methodologies, and data analysis and interpretation. Furthermore, the focus on written and verbal communication of scientific information to different audiences is directly transferable to clinical practice: doctors must be able to communicate clinical and scientific information safely and effectively to a wide range of colleagues and patients.
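To make the data management workflow concrete, below is a minimal sketch of the hypothesis, analysis and conclusion cycle it rehearses. The biomarker scenario, data values and choice of statistical test are invented for illustration; real assignments use pathway-specific data sets and whichever analyses those data warrant.

```python
# Hypothetical illustration of the hypothesis -> analysis -> conclusion
# workflow rehearsed in the data management assignment. The data and the
# biomarker scenario are invented; real assignments use pathway-specific data.
from scipy import stats

# Hypothesis: the biomarker level differs between treated and control samples.
control = [4.1, 3.8, 4.4, 4.0, 3.9, 4.2]
treated = [4.9, 5.1, 4.7, 5.3, 4.8, 5.0]

# Analysis: Welch's two-sample t-test (no equal-variance assumption).
t_stat, p_value = stats.ttest_ind(treated, control, equal_var=False)

# Conclusion: report the effect size and whether it reaches significance.
mean_diff = sum(treated) / len(treated) - sum(control) / len(control)
print(f"Mean difference: {mean_diff:.2f}, t = {t_stat:.2f}, p = {p_value:.4g}")
if p_value < 0.05:
    print("Reject the null hypothesis of no difference at the 5% level.")
else:
    print("Insufficient evidence to reject the null hypothesis.")
```

A results compendium would then document each of these steps in full, the scientific abstract would compress them into a structured summary, and the lay summary would restate the conclusion in plain language for a non-specialist audience.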

“Module 2” consists of two written tasks: a group literature review and a patient case report titled “Science in Context”. The group literature review assessment is designed to emulate the academic publishing process and is based on evidence that completing active learning tasks in small groups significantly increases learning.9 Groups perform an in-depth, self-directed literature search on an assigned topic within their specialism and submit a written literature review, plus an accompanying cover letter to “the Editor”. Each initial submission is marked by a faculty member (“the Editor”) and undergoes reciprocal peer review by another student group. In response to peer and “Editor” feedback, groups revise their literature review and re-submit it along with a rebuttal letter detailing how reviewers’ comments were addressed, thus simulating the stages of publication from manuscript preparation through to acceptance.

Marks for teamwork in this assessment are generated using a modified comprehensive assessment of team member effectiveness (CATME) score, which evaluates performance within a team by rating each individual across five behavioral domains that have been shown to correlate with effective teamwork (Figure 2).16,17 Students are required to submit peer scores and feedback comments for each member of their group across all five domains, as well as a self-evaluation of their own performance. An overall group mark for the written review is awarded by the faculty. This mark is subsequently adjusted up or down for each student by their individual modified CATME peer score (awarded by their team) to generate a final individual mark (Figure 2). A high modified CATME peer score reflects good teamwork and results in the group mark being adjusted upwards, giving a higher final individual assessment mark for that student. A student who receives a lower modified CATME peer score has the group mark adjusted down, receiving a lower final mark for this assessment. The CATME system has been shown to be a reliable method of assessment with good interrater agreement, and its scores correlate with final course results.16–18 In addition to building skills in literature searching, scientific writing and critical appraisal, this task also fosters professional skills in providing feedback, teamwork, and accountability, all of which are essential to clinical practice.

Figure 2 Example of generating individual assessment marks from group work using the modified CATME score.17
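The text does not spell out the exact formula of the modified CATME adjustment, so the sketch below shows how a peer-rating adjustment of this general kind could be computed. It assumes the convention from the CATME literature that a student's adjustment factor is their mean peer rating divided by the team's mean rating, with a cap on upward adjustment; the function name, rating scale and cap value are illustrative assumptions, not ICSM's actual implementation.

```python
# Hypothetical sketch of deriving individual marks from a shared group mark
# using a CATME-style peer-rating adjustment. Assumption: adjustment factor
# = a student's mean peer rating / the team's mean rating, capped so that
# generous raters cannot inflate a mark. Not ICSM's actual formula.
from statistics import mean


def individual_marks(group_mark, peer_ratings, cap=1.05):
    """Adjust a shared group mark up or down per student.

    peer_ratings maps each student to the ratings they received across
    the five CATME behavioral domains (e.g. on a 1-5 scale).
    """
    student_means = {s: mean(r) for s, r in peer_ratings.items()}
    team_mean = mean(student_means.values())
    return {
        s: round(group_mark * min(m / team_mean, cap), 1)
        for s, m in student_means.items()
    }


# Example: a group mark of 68 shared by three students with differing
# peer ratings; strong teamwork adjusts up, weaker teamwork adjusts down.
print(individual_marks(68.0, {
    "student_a": [4.8, 4.6, 5.0, 4.7, 4.9],
    "student_b": [4.0, 4.2, 4.1, 4.0, 4.3],
    "student_c": [3.1, 3.0, 3.4, 3.2, 3.0],
}))
```

Capping the factor just above 1 reflects a common design choice in peer-adjusted group marking: effective teamwork is rewarded, but no individual mark can drift far above the quality of the group's shared written work.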

The second assessment of “Module 2”, “Science in Context”, is a written clinical case study based on a real or virtual patient. Students write a report which includes a clinical case summary of the patient’s presentation and a discussion of the current and emerging evidence for either the underlying pathophysiology or the clinical management of the condition, commenting on the level of available evidence. This task serves to highlight the inextricable link between scientific research and evidence-based clinical practice while developing students’ ability to systematically present clinical information.

The aim of Module 3 is to consolidate learning across the iBSc program, with students conducting an original research project in either a clinical or a laboratory-based setting. This is assessed via a written report in the style of a research paper and an oral presentation. In the oral presentation, students present a critical analysis of their project, including limitations and future research directions, alongside a reflective account of their research experience and the skills gained throughout the iBSc program.

Conclusion

At Imperial College School of Medicine, we aspire to train future clinicians who apply scientific principles to critique the evidence base of their medical practice, work as reflective team-players, and possess the skillset to pursue a clinical academic career. Using this innovative approach to assessment in the fourth year undergraduate iBSc degree program, we can assess specialism-specific knowledge within an applied scientific context, while also fostering essential professional skills for practicing safe, evidence-based medicine. We have therefore moved away from traditional assessment of learning to a program using authentic active learning assessments for learning. Supported by extensive pedagogical research, we believe this is an effective approach to achieving the GMC outcomes and training our students to graduate as scholars as well as clinical practitioners and professionals.

Abbreviations

GMC, General Medical Council; UK, United Kingdom; iBSc, intercalated Bachelor of Science; ICSM, Imperial College School of Medicine; CATME, Comprehensive Assessment of Team Member Effectiveness.

Acknowledgments

No funding or financial support was received to carry out this work and ethical approval was not required. This work has not been presented or accepted for publication anywhere else.

Disclosure

The authors report no competing interests in this work.

References

1. General Medical Council. Outcomes for graduates 2018 [online standards and outcomes guideline for medical schools]. General Medical Council; 2018. Available from: https://www.gmc-uk.org/-/media/documents/dc11326-outcomes-for-graduates-2018_pdf-75040796.pdf. Accessed September 23, 2021.

2. Akobeng AK. Principles of evidence based medicine. Arch Dis Child. 2005;90:837–840. doi:10.1136/adc.2005.071761

3. Lehane E, Leahy-Warren P, O’Riordan C, et al. Evidence-based practice education for healthcare professions: an expert view. BMJ Evid Based Med. 2019;24:103–108. doi:10.1136/bmjebm-2018-111019

4. Szajewska H. Evidence-based medicine and clinical research: both are needed, neither is perfect. Ann Nutr Metab. 2018;72(Suppl 3):13–23. doi:10.1159/000487375

5. Gibbs G, Simpson C. Conditions under which assessment supports students’ learning. Learn Teach High Educ. 2004;(1):3–31. Available from: http://eprints.glos.ac.uk/3609/1/LATHE%201.%20Conditions%20Under%20Which%20Assessment%20Supports%20Students%27%20Learning%20Gibbs_Simpson.pdf.

6. Struyven K, Dochy F, Janssens S. Students’ perceptions about evaluation and assessment in higher education: a review. Assess Eval High Educ. 2005;30(4):325–341. doi:10.1080/02602930500099102

7. Bloxham S, Boyd P. The evidence base for assessment practice in higher education. In: Bloxham S, Boyd P, editors. Developing Effective Assessment in Higher Education: A Practical Guide. Maidenhead, UK: Open University Press; 2007:15–30.

8. Sandrone S, Scott G, Anderson WJ, Musunuru K. Active learning-based STEM education for in-person and online learning. Cell. 2021;184(6):1409–1414. doi:10.1016/j.cell.2021.01.045

9. Wieman C, Gilbert S. Taking a scientific approach to science education, part I – research. Microbe. 2015;10(4):152–156.

10. Sahoo S, Mohammed CA. Fostering critical thinking and collaborative learning skills among medical students through a research protocol writing activity in the curriculum. Korean J Med Educ. 2018;30(2):109–118. doi:10.3946/kjme.2018.86

11. Scott IA, Hubbard RE, Crock C, Campbell T, Perera M. Developing critical thinking skills for delivering optimal care. Intern Med J. 2021;51(4):488–493. doi:10.1111/imj.15272

12. Royce CS, Hayes MM, Schwartzstein RM. Teaching critical thinking: a case for instruction in cognitive biases to reduce diagnostic errors and improve patient safety. Acad Med. 2019;94(2):187–194. doi:10.1097/ACM.0000000000002518

13. Freeman S, Eddy SL, McDonough M, et al. Active learning increases student performance in science, engineering, and mathematics. Proc Natl Acad Sci. 2014;111(23):8410–8415. doi:10.1073/pnas.1319030111

14. Theobald EJ, Hill MJ, Tran E, et al. Active learning narrows achievement gaps for underrepresented students in undergraduate science, technology, engineering, and math. Proc Natl Acad Sci. 2020;117(12):6476–6483. doi:10.1073/pnas.1916903117

15. Bonwell CC, Sutherland TE. The active learning continuum: choosing activities to engage students in the classroom. N Direct Teach Learn. 1996;1996(67):3–16. doi:10.1002/tl.37219966704

16. Loughry ML, Ohland MW, Moore DD. Development of a theory-based assessment of team member effectiveness. Educ Psychol Meas. 2007;67:505–524. doi:10.1177/0013164406292085

17. Ohland MW, Loughry ML, Woehr DJ, et al. The comprehensive assessment of team member effectiveness: development of a behaviorally anchored rating scale for self and peer evaluation. Acad Manag Learn Educ. 2012;11:609–630. doi:10.5465/amle.2010.0177

18. Loughry ML, Ohland MW, Woehr DJ. Assessing teamwork skills for assurance of learning using CATME team tools. J Mark Educ. 2014;36(1):5–19. doi:10.1177/0273475313499023
