
Question-Based Collaborative Learning for Constructive Curricular Alignment

Authors: Wynn-Lawrence LS, Bala L, Fletcher RJ, Wilson RK, Sam AH

Received 8 September 2020

Accepted for publication 23 November 2020

Published 5 January 2021 Volume 2020:11 Pages 1047–1053


Checked for plagiarism Yes

Review by Single anonymous peer review

Peer reviewer comments 2

Editor who approved publication: Professor Balakrishnan R Nair

Laura S Wynn-Lawrence, Laksha Bala, Rebekah J Fletcher, Rebecca K Wilson, Amir H Sam

Imperial College School of Medicine, Imperial College London, London, UK

Correspondence: Amir H Sam
Imperial College School of Medicine, London, UK
Email [email protected]

Introduction: We designed a curriculum mapping tool which enables medical students to access intended learning outcomes (ILOs) on their iPads in the workplace. Students were encouraged to use the online curriculum map in a specially planned teaching session: question-based collaborative learning (QBCL). The aim of the session was to empower medical students to constructively align their experiential learning with the learning outcomes of the undergraduate curriculum. In doing so, our session aimed to give students a greater understanding of the curriculum, improve their insight into assessment, and develop their question-writing abilities.
Methods: The QBCL pre-session preparation involved reviewing a patient with a presentation that aligned with the year-specific ILOs. During a 150-minute QBCL session, students received training from a faculty member of Imperial College School of Medicine on how to write high-quality multiple-choice questions (MCQs). They then worked collaboratively in groups to create MCQs based on their clinical encounters. Their questions were tagged to the relevant learning outcome and submitted online via the curriculum map. The student-generated MCQs were analyzed using an adjusted version of Bloom’s taxonomy. We also conducted a quantitative evaluation of the session.
Results: One hundred and sixty-three questions were submitted; 81% were tagged to ILOs consistent with the “Apply” tier of Bloom’s taxonomy. The majority of students agreed that the session was interactive (80%), thought-provoking (77%) and improved their team-working skills (70%). It gave them a greater understanding of the undergraduate curriculum (65%), improved their question-writing and insight into assessments (76%), and provided an opportunity to learn from their peers (86%). Students agreed that the session covered a variety of cases (82%) and deepened their understanding of medical conditions and presentations (87%).
Conclusion: We encouraged students to actively interact with the curriculum map. Students were able to achieve their own constructive alignment by writing assessment items based on real patients and linking them to the appropriate intended learning outcomes.

Keywords: constructive alignment, multiple choice question

A Letter to the Editor has been published for this article.

A Response to Letter has been published for this article.


Introduction

Curriculum mapping has previously been recognized as a tool to help achieve constructive alignment: the coordination of learning outcomes with teaching content and assessment tasks.1,2 We designed an online curriculum map (Sofia) which allows students to access intended learning outcomes (ILOs) and visualize connections between different parts of the curriculum. Whilst curriculum mapping has obvious benefits for faculty and programmatic structure, its utility and benefit for students have not yet been extensively explored.1,3 Student engagement has been considered a clear limitation.4 Without the active involvement of students, the apparent transparency of curriculum mapping can be lost. To implement the curriculum map as a tool for student learning, the map can be integrated within assignments or assessments to promote its active use.4

To promote student engagement with Sofia and utilize it as a teaching tool, we created a specially designed teaching session: Question-Based Collaborative Learning (QBCL). Previous studies have demonstrated that students find collaborative question writing beneficial for their learning and can produce high-quality questions, but the use of a curriculum mapping tool to guide and augment this process has not yet been considered.5,6 The aim of the session was to empower medical students to constructively align their experiential learning with the learning outcomes of the undergraduate curriculum. In doing so, our session aimed to give students a greater understanding of our undergraduate curriculum and improve their insight into assessment and their question-writing abilities. This required students to generate assessment items (multiple-choice questions) in a small-group setting, based on their clinical experiences. It may also have the added benefit of generating a question bank that is valuable to students.7


Methods

Designing an Online Curriculum Map

We compiled a database of all ILOs across the six-year course and tagged each one with the relevant specialties, domains of professional knowledge, and the General Medical Council’s ‘Outcomes for Graduates’.8 These data were then used by the software company Isotoma (York) to create a bespoke, online curriculum map, Sofia. The online curriculum map provides both educators and learners with a visual representation of ILOs and signposts where they are present within the curriculum. Students can access ILOs in multiple ways, including according to patient presentation or condition. When an objective is selected, it displays interactive links signposting if the topic has been covered elsewhere within the curriculum.

Designing QBCL

During QBCL sessions, students generated single-best-answer questions based on their patient encounters, aligned with the appropriate learning outcomes on the curriculum map. QBCL was delivered to third-year students during their first 10-week clinical placement in December 2018. Before the session, students were asked to review a participant information leaflet and sign a consent form confirming their agreement to take part in this research study and for their MCQs to be reproduced.

This faculty-led teaching session was designed to allow students to share knowledge with their peers in a flipped classroom format. Pre-session preparation involved generating questions based on genuine clinical encounters. In order to ensure case variety within the teaching session, students were allocated question topics which aligned with their placement. They were then asked to produce a clerking based on the history, examination and investigation of a patient with a relevant condition or presentation. Students were advised to present their clerking to a senior clinician before constructing a multiple choice question in preparation for the session.

Each session contained around 50 students with one facilitator and ran for a maximum of 150 minutes. During the first 30 minutes, students received training on how to construct high-quality MCQs and identify relevant learning outcomes. Students were then allocated to groups of five or six, with a mix of allocated topics within each group. They were asked to present their cases individually before selecting three or four of the best questions per table. These were then edited using a question formatting checklist. The questions were structured to include a stem, a lead-in and five answer options, one of which was marked as correct. Learners were asked to tag their questions to the relevant ILOs before submitting the final version onto the Sofia template (as per the process in Figures 1–3). Students were then asked to complete an anonymous evaluation questionnaire regarding their responses to and learning from the QBCL session.
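The structural requirements above (stem, lead-in, five options, exactly one correct answer, at least one tagged ILO) can be approximated in code. This is a hypothetical sketch, not the actual checklist used in the session, and all field names are assumptions:

```python
def check_mcq(question: dict) -> list[str]:
    """Return a list of formatting problems; an empty list means the item
    passes the structural checks. Field names are illustrative assumptions."""
    problems = []
    if not question.get("stem"):
        problems.append("missing stem")
    if not question.get("lead_in"):
        problems.append("missing lead-in")
    options = question.get("options", [])
    if len(options) != 5:
        problems.append("expected exactly 5 answer options")
    if len([o for o in options if o.get("correct")]) != 1:
        problems.append("exactly one option must be marked correct")
    if not question.get("ilo_tags"):
        problems.append("question must be tagged to at least one ILO")
    return problems

# Illustrative item (option text and ILO tag invented for demonstration)
item = {
    "stem": "A 50-year-old woman is booked for an elective laparoscopic cholecystectomy.",
    "lead_in": "What advice should she be given regarding eating and drinking prior to surgery?",
    "options": [{"text": f"Option {i}", "correct": i == 0} for i in range(5)],
    "ilo_tags": ["perioperative-care"],
}
```

Checks like these cover only the format; judging the clinical content and the ILO alignment remained a task for the group and the facilitator.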

Figure 1 Learning outcomes on Sofia for pericarditis.

Figure 2 Tagging a learning outcome on Sofia to a question.

Figure 3 Creating a question using the Sofia template.

Analyzing Students’ Questions

In order to analyze the submitted questions, we assigned different levels of clinical knowledge and skills to the established tiers of Bloom’s taxonomy [Figure 4].9 We used the tiers titled “Remember”, “Understand”, “Apply” and “Analyze” and applied these to our institution’s desired learning outcomes, which reflect different levels of clinical practice. We considered that students would not be able to show evidence of the top two tiers when answering multiple-choice questions, as these require higher levels of clinical reasoning; these tiers are therefore left blank [Figure 4].

Figure 4 Analysis of student-generated MCQs using an adjusted version of Bloom’s taxonomy.

During the QBCL session, the facilitator (a faculty member) screened the content of the questions to ensure alignment with the attached outcome. Where a question was tagged to more than one ILO, the most appropriate ILO was determined by the facilitator. Nine questions were excluded from analysis because they were attached to unrelated ILOs, and one further question because it was attached to a different year’s learning outcome.
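The screening-then-tallying step can be sketched as follows. The tier labels match the adjusted taxonomy above, but the sample counts are invented for illustration and are not the study data:

```python
from collections import Counter

def tier_percentages(questions):
    """Exclude questions flagged during screening, then report the share of
    remaining questions per Bloom tier, rounded to the nearest whole percent."""
    included = [q for q in questions if not q.get("excluded")]
    counts = Counter(q["tier"] for q in included)
    total = len(included)
    return {tier: round(100 * n / total) for tier, n in counts.items()}

# Invented illustrative sample, not the 163 questions from the study
sample = (
    [{"tier": "Apply"}] * 8
    + [{"tier": "Remember"}, {"tier": "Understand"}]
    + [{"tier": "Apply", "excluded": True}]  # e.g. tagged to an unrelated ILO
)
shares = tier_percentages(sample)  # percentages computed over the 10 included items
```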


Results

Student-Generated Questions

A total of 163 questions were submitted by students, covering a variety of specialties. The largest proportion of the student-generated questions (81%) were attached to ILOs in the “Apply” tier. A further 16% were attached to ILOs in the “Remember” and “Understand” tiers and 3% to the “Analyze” tier [Figure 4]. Examples of questions for each tier include:

1. A 53-year-old female is being prepared by anaesthetists before a total laryngopharyngectomy for oesophageal cancer. Her temperature is 37.0°C, pulse rate 82 bpm, BP 120/85 mmHg, respiratory rate 18 breaths per minute and oxygen saturation 98% breathing air. She has red wrist-bands to indicate her intolerance to codeine, which leaves her feeling nauseous. According to the WHO (World Health Organization) pain ladder, which could be an alternative analgesic to codeine?

Remember tier, as the question requires recollection of the WHO pain ladder.

2. A 50-year-old woman is booked for an elective laparoscopic cholecystectomy. What advice should she be given regarding eating and drinking prior to surgery?

Understand tier, as the question requires an understanding of being nil by mouth prior to surgery.

3. A 59-year-old male smoker presents with intermittent burning pain in both legs for several months. He describes his symptoms as arising predictably after 20 yards, and occasionally when at rest and at night. He has decreased pain and fine touch sensation below the knees, and cold shins and feet. His legs become pale when raised to 20° and redden when lowered back down. What is the most likely diagnosis?

Apply tier, as the question requires application of knowledge regarding the signs, symptoms and risk factors for vascular claudication to reach a diagnosis.

Student Evaluation Questionnaire

A total of 176 students completed the anonymous evaluation questionnaire after the QBCL session [Figure 5]. Not all students answered every question; individual items received between 173 and 176 responses. Overall, the students’ reaction to the QBCL session was positive. The majority of students agreed that the session was interactive (80%), interesting (68%), enjoyable (61%) and thought-provoking (77%). They felt well prepared for the session (77%) and found that it provided them with the opportunity to discuss cases with their clinical teams (68%). The session developed their understanding of the undergraduate curriculum (65%), improved their question-writing and insight into assessments (76%), provided an opportunity to learn from their peers (86%) and improved their team-working skills (70%). Students agreed that the session covered a variety of cases (82%) and deepened their understanding of medical conditions and presentations (87%).

Figure 5 Student Responses to the QBCL Session Evaluation Questionnaire (n = 173–176).


Discussion

Sofia was created to allow both medical students and teachers to access and interact with the curriculum. Through its use in a faculty-led session, we demonstrated student engagement with Sofia and an improvement in their understanding of the curriculum and assessments, as evidenced by the student evaluation responses. This is in keeping with existing literature promoting the use of digital curriculum mapping tools to enhance curriculum visibility amongst students.4 Allowing students to write exam-style questions and attach them to their own learning outcomes also gave them a greater understanding of how assessments are written and helped to de-mystify assessment. Furthermore, by encouraging students to create their own MCQs, we may be improving their learning and engagement in higher-level cognitive processes.10 The discussion of cases and submission of questions during the teaching session also encourages peer-to-peer review and promotes student collaboration.7 The content and quality (as determined by mapping onto Bloom’s taxonomy) of the submitted questions were considered further evidence of how well the students engaged with Sofia and constructively aligned their experiential learning with learning outcomes from the undergraduate curriculum.

Although we have concluded that the questions produced were constructively aligned with the curriculum, we recognize the limitations of this study. We have yet to test the utility of these MCQs by conducting test-item psychometric analysis following their use with other cohorts of students. If psychometric analysis reveals that these questions are of high quality (e.g. through measures of reliability and validity), then this could be a way of generating cost-effective formative assessments with the added benefit of giving students a greater understanding of the curriculum, as suggested by previous studies.7,11 Whilst the evaluative data we collected using the student questionnaire addressed level 1 (reaction) and level 2 (learning) of Kirkpatrick’s training evaluation model, this evaluation could have been enriched further by gathering qualitative feedback from students.12

Further work should examine the psychometric properties of the questions produced, and whether student-generated questions can assess the top two tiers of Bloom’s taxonomy (“Evaluate” and “Create”). This latter objective may be better achieved using very-short-answer questions (VSAQs), short-answer questions or essays. Given our previous work suggesting that VSAQs are more representative of real-life practice than MCQs, it would be interesting to establish the level of learning assessed by this form of student-generated question, by collecting both quantitative and qualitative evaluative data.13 We hope to expand this work by incorporating this collaborative activity into students’ clinical placements, with the aim of promoting student engagement with the curriculum and providing further opportunities for students and faculty to identify any learning and teaching gaps.


Conclusion

The creation of a curriculum map allows students to directly engage with the ILOs relevant to their year. The flipped-classroom teaching session was designed to proactively promote student engagement with the curriculum map and to constructively align their experiential learning with course-specific ILOs through the creation of assessment items. Students were able to produce a reasonable number of questions and attach them to learning outcomes assessing different levels of learning. Further work should establish the generalizability of the teaching session to clinical placements and the utility of these student-generated items.

Ethical Approval

This study was granted ethical approval by the Medical Education Ethics Committee at Imperial College London (reference number MEEC1819-125).


Funding

There is no funding to report.


Disclosure

The product Sofia is currently being commercialised by Imperial College London; however, the authors will not receive any financial gain from publication of this paper. Although the curriculum mapping system mentioned in the paper is commercially available from Imperial College London in partnership with their developers Isotoma Ltd, the question banking tool within this system is not commercially available, nor do the authors receive any personal financial gain from the sale of the system. The authors report no other potential conflicts of interest for this work.


References

1. Steketee C. Prudentia: a medical school’s solution to curriculum mapping and curriculum management. J Univ Teach Learn Pract. 2015;12(4):1–10.

2. Biggs J. Aligning Teaching for Constructing Learning [Internet]. Higher Education Academy; 2013.

3. Harden RM. AMEE Guide No. 21: curriculum mapping: a tool for transparent and authentic teaching and learning. Med Teach. 2001;23(2):123–137. doi:10.1080/01421590120036547

4. Wijngaards-de Meij L, Merx S. Improving curriculum alignment and achieving learning goals by making the curriculum visible. Int J Acad Dev. 2018;23(3):219–231. doi:10.1080/1360144X.2018.1462187

5. Kurtz J, Holman B, Monrad SU. Training medical students to create and collaboratively review multiple-choice questions: a comprehensive workshop. MedEdPORTAL. 2020;16:10986. doi:10.15766/mep_2374-8265.10986

6. Wilson DJ, Smith PE, Tayyaba S, Harris DA. A novel student-led approach to multiple-choice question generation and online database creation, with targeted clinician input. Teach Learn Med. 2015;27(2):182–188. doi:10.1080/10401334.2015.1011651

7. Gooi ACC, Sommerfeld CS. Medical school 2.0: how we developed a student-generated question bank using small group learning. Med Teach. 2015;37(10):892–896. doi:10.3109/0142159X.2014.970624

8. General Medical Council. Outcomes for Graduates 2018. London: GMC; 2018.

9. Bloom BS, Engelhart MD, Furst EJ, Hill WH, Krathwohl DR. Taxonomy of Educational Objectives, The Classification of Educational Goals, Handbook 1: Cognitive Domain. New York, NY: David McKay Company; 1956.

10. Tackett S, Raymond M, Desai R, et al. Crowdsourcing for assessment items to support adaptive learning. Med Teach. 2018;40(8):838–841. doi:10.1080/0142159X.2018.1490704

11. Walsh J, Harris B, Tayyaba S, Harris D, Smith P. Student-written single-best answer questions predict performance in finals. Clin Teach. 2016;13(5):352–356. doi:10.1111/tct.12445

12. Kirkpatrick JD, Kirkpatrick DL. Implementing the Four Levels: A Practical Guide for Effective Evaluation of Training Programs. San Francisco, CA: Berrett-Koehler; 2007.

13. Sam AH, Westacott R, Gurnell M, Wilson R, Meeran K, Brown C. Comparing single-best-answer and very-short-answer questions for the assessment of applied medical knowledge in 20 UK medical schools: cross-sectional study. BMJ Open. 2019;9(9):e032550. doi:10.1136/bmjopen-2019-032550
