
Use of an Adaptive e-Learning Platform as a Formative Assessment Tool in the Cardiovascular System Course Component of an MBBS Programme

Authors: Gupta S, Ojeh N, Sa B, Majumder MAA, Singh K, Adams OP

Received 15 June 2020

Accepted for publication 18 October 2020

Published 15 December 2020, Volume 2020:11, Pages 989–996

DOI https://doi.org/10.2147/AMEP.S267834

Checked for plagiarism Yes

Review by Single anonymous peer review

Peer reviewer comments 2

Editor who approved publication: Prof. Dr. Balakrishnan Nair



Subir Gupta,1 Nkemcho Ojeh,1 Bidyadhar Sa,2 Md Anwarul Azim Majumder,1 Keerti Singh,1 Oswald Peter Adams1

1Faculty of Medical Sciences, The University of the West Indies, Cave Hill Campus, Bridgetown, Barbados; 2Centre for Medical Sciences Education, Faculty of Medical Sciences, The University of the West Indies, St. Augustine Campus, St Augustine, Trinidad and Tobago

Correspondence: Md Anwarul Azim Majumder
Faculty of Medical Sciences, The University of the West Indies, Cave Hill Campus, Bridgetown, Barbados
Email [email protected]

Background: Technology-enhanced learning includes adaptive e-learning platforms: data-driven methods that use computer algorithms to provide customised learning and enhance the critical thinking of individual learners. Firecracker, an online adaptive e-learning platform and assessment software, promotes critical thinking, helps prepare students for courses and high-stakes examinations, and evaluates progress relative to co-learners. The objectives of this study were to determine the usage rates of Firecracker, examine performance on Firecracker formative quizzes, identify the correlations of Firecracker use and performance with performance on summative course assessments, and assess students’ satisfaction with Firecracker usage.
Methods: Study participants were Year-2 MBBS (Bachelor of Medicine, Bachelor of Surgery) students (n=91) of the Faculty of Medical Sciences, The University of the West Indies, Cave Hill Campus, Barbados. The Firecracker Administrator uploaded quizzes covering basic science content in the Cardiovascular System course. Access, usage, and performance on Firecracker formative quizzes were retrieved from the Firecracker dashboard. A questionnaire sought the views of study participants.
Results: Seven sets of quizzes were administered over nine weeks, with weekly student completion rates ranging from 53% to 73%. Mean quiz scores ranged from 52% to 72%. Students who completed more than four quiz sessions performed significantly better than those who completed four or fewer on the Firecracker quizzes (P<0.01), the final examination (P<0.01), and the in-course assessment plus final examination (P<0.05). Correlations between overall Firecracker performance and in-course assessment marks (P<0.05), between overall Firecracker performance and final examination marks (P<0.01), and between overall Firecracker performance and total course marks (P<0.01) were all significant. Most students (70%) were happy using Firecracker and felt it complemented coursework (78%) and prepared them for course exams (58%) (P<0.01).
Conclusion: Overall, Firecracker was perceived very positively and welcomed by the students. Students were satisfied with Firecracker as a formative assessment tool, and its use correlated with improved performance in the course examinations.

Keywords: formative assessment, technology-enhanced learning, adaptive learning, Cardiovascular System course, Firecracker, Barbados

A Letter to the Editor has been published for this article.

Background

Technology-enhanced learning has become popular and now plays a vital role in teaching, learning, and assessment in medical education.1–3 It supports and supplements teaching to deliver a learner-specific, personalised, and adaptive learning environment tailored to individual learner needs in real time.1,4 This experience is enhanced through timely feedback, different modes of delivering course materials, availability of and access to various resources, and adaptation of the pace and path of learning based on students’ preferences and performance.1,4 The most widely used e-learning platforms in medical education have non-adaptive e-learning environments (NEEs), which provide standardised training for all students.5,6 Though NEEs provide interactivity, feedback, and practice exercises, they fail to consider learners’ characteristics and to provide personalised education and training.6

On the other hand, adaptive e-learning environments (AEEs) can potentially increase learning efficacy and efficiency by building individual student profiles and using simple adaptive techniques to provide a personalised learning experience.7,8 AEEs have commonly been used in mathematics, physics, and related disciplines for learning factual knowledge and developing skills.9 Sharma et al (2017) defined adaptive learning as a “process that provides an individualised learning experience with technologies designed to determine a learner’s strengths and weaknesses.”1 Adaptive learning, also known as adaptive teaching, is a data-driven method with computer algorithms that provides customised learning to engage individual learners and enhance their learning.9

Adaptive learning implements adaptivity in two main ways: (i) designed adaptivity, where the educator designs the instruction so that the learner can master the content; and (ii) algorithmic adaptivity, which uses algorithms to determine the extent of the learner’s knowledge and the best sequence of instructions or training strategies, providing a personalised learning experience that guides the learner toward content mastery.9 AEEs analyse how individual learners interact with courseware and perform, and then predict the kinds of content and resources that meet their needs, including, where necessary, specific remedial activities.10 As a result, adaptive learning can optimise learning efficiency.11 A recent systematic review and meta-analysis demonstrated that AEEs appear to be effective in improving skills in both professionals and students.9 Adaptive learning promotes accuracy and fluency and helps guide the spacing and sequencing of learning events. Adaptive learning technology with a spacing element, such as Adaptive Response Time-based Sequencing (ARTS), has been shown to optimise learning and retention.12 Other studies have also reported improved learning with adaptive spaced education.13–15 Several adaptive e-learning platforms exist in the medical education arena, such as the Smart Sparrow Adaptive eLearning Platform, Articulate Storyline 360,16 and Firecracker (http://firecracker.lww.com/students.html).17
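
To make algorithmic adaptivity with a spacing element concrete, the following minimal Python sketch shows one way a response-time-sensitive scheduler could work: fast, correct answers push an item further into the future, while slow or incorrect answers bring it back soon. The class name, gap heuristic, and example items are invented for illustration; this is a toy model of the general idea, not Firecracker’s or ARTS’s actual algorithm.

```python
import heapq
import itertools

class AdaptiveScheduler:
    """Toy response-time-based adaptive spacing (illustrative only)."""

    def __init__(self, items):
        self._clock = 0                   # abstract trial counter
        self._tie = itertools.count()     # stable tie-breaker for the heap
        # every item starts out due immediately
        self._queue = [(0, next(self._tie), item) for item in items]
        heapq.heapify(self._queue)

    def next_item(self):
        """Return the item with the earliest due time."""
        due, _, item = heapq.heappop(self._queue)
        return item

    def record_response(self, item, correct, response_time_s):
        """Reschedule an item from the learner's response: fast, correct
        answers earn long gaps; slow or wrong answers return soon."""
        self._clock += 1
        if not correct:
            gap = 1                       # retry almost immediately
        else:
            # a faster response suggests a stronger memory -> longer gap
            gap = max(2, int(20 / max(response_time_s, 1.0)))
        heapq.heappush(self._queue, (self._clock + gap, next(self._tie), item))

# usage: drill three cardiology facts, rescheduling after each answer
sched = AdaptiveScheduler(["preload", "afterload", "Starling's law"])
item = sched.next_item()
sched.record_response(item, correct=True, response_time_s=2.5)   # long gap
item = sched.next_item()
sched.record_response(item, correct=False, response_time_s=9.0)  # returns soon
```

Production systems such as ARTS refine the gap heuristic with calibrated response-time models and mastery criteria; the point here is only the feedback loop between performance and scheduling.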

Recently, medical education across the world has experienced significant disruptive change because of the COVID-19 pandemic, and adaptive learning platforms have been rapidly and innovatively deployed by medical schools to deliver teaching remotely.18,19 Emerging technology is highly likely to play a crucial role in enhancing teaching and learning in medical schools even after the pandemic resolves.

Firecracker, an online adaptive learning platform and assessment software, helps prepare students for their courses and high-stakes exams and determines their progress relative to peers. Adapted Spaced Learning is used as a fundamental technique in Firecracker’s adaptive learning platform, and the platform also utilises several other well-established principles of learning to improve students’ academic performance and long-term retention of memory (Appendix 1).20 A recent scoping review conducted by Versteeg et al (2020)21 showed that online spaced learning had been widely used in health profession education.13,22–26 Based on the findings, the authors proposed the following comprehensive definition of spaced learning:

Spaced learning involves [specified] educational encounters that are devoted to the same [specified] material, and distributed over a [specified] number of periods separated by a [specified] interstudy interval, with a [specified] learning outcome after a [specified] retention interval.21

The objectives of this study were to: (a) determine the usage rates of Firecracker, (b) examine the performance of Firecracker formative quizzes, (c) identify the correlation between Firecracker use and performance with that of performance at summative course assessments, and (d) assess student satisfaction with Firecracker usage.

Method

Ethical Considerations

Ethical approval was obtained from the Institutional Review Board, The University of the West Indies, Faculty of Medical Sciences, Cave Hill Campus, Barbados (IRB No. 180405-B).

Context

The Faculty of Medical Sciences (FMS), Cave Hill Campus, The University of the West Indies (UWI), Barbados, offers a 5-year undergraduate Bachelor of Medicine, Bachelor of Surgery (MBBS) programme. Students are accepted into the programme either immediately after completing high school board examinations or after completing at least a degree in science. Students come from the English-speaking Caribbean countries, and most remain and work in the region. The first three years of the programme are largely preclinical, with an integrated, system-based curriculum. Assessment comprises continuous in-course assessments (40%) and final summative examinations (60%).

Study Design, Setting and Participants

This study was conducted in 2018, during the academic year 2017–2018, with second-year medical students enrolled in the MBBS degree programme at the FMS, UWI, Cave Hill Campus, Barbados. Firecracker was introduced in the 2017–2018 academic year to supplement the teaching of the Cardiovascular System course (course code: MDSC 2103), a Year-2 MBBS course that runs over 13 weeks of the first semester. Second-year students who were taking the Cardiovascular System course were enrolled in Firecracker. The course consists of lectures, seminars, case/problem-based learning, tutorials, demonstrations and laboratory practicals, use of multimedia, and the University’s e-learning course management system. Grades for the course are based on a combination of continuous assessments [practicals and multiple-choice questions (MCQs)] and a course final examination consisting of MCQs only. The course has one In-Course Examination, carrying 25% of the total marks, and one Final Examination that represents 60% of the total. The laboratory practicals in anatomy, physiology, pharmacology, and microbiology, together with seminars delivered by the students, constitute the remaining 15%. The In-Course Examination consists of 40 MCQs to be answered in 60 minutes, whereas the Final Examination consists of 90 MCQs to be answered in 2 hours.

Firecracker had a well-established question bank. Questions and quizzes were uploaded weekly for 9 weeks by the Firecracker Administrator. They were grouped into three major types: analytical, problem-solving, and knowledge-based, and covered all the major areas of the cardiovascular system. In a set of 20 questions, 4 to 6 were knowledge-based, 8 to 14 were analytical, and 3 to 6 were problem-solving; a simple sketch of how such a set might be assembled is shown below. The questions for the Cardiovascular System course were developed based on: (a) the structural basis (both micro and macro) and structure–function relationships of different levels of organisation of the human body; (b) reports of abnormal cases and parameters in patients; (c) reports of normal cases and measurements of normal variables in healthy individuals; (d) measurement of body variables under stressful conditions; (e) descriptions of cases under hypothetical or experimental conditions; (f) analysis and description of graphical presentations of data; and (g) classification and possible mechanisms of action of external agents on body functions. Firecracker generates new content every week and continuously reviews item characteristics to maintain the appropriate level of difficulty. Students received a daily assignment of review questions covering topics relevant to their studies and prioritised by Firecracker’s learning algorithm, as well as a relevant daily patient case (clinical vignette). Staff received full access to the Firecracker platform, which allowed them to track student progress and identify at-risk students.
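
As a concrete illustration of the category mix described above, the short hypothetical Python sketch below assembles a 20-question set containing 4 to 6 knowledge-based, 8 to 14 analytical, and 3 to 6 problem-solving items. The question pools, counts, and function names are invented for illustration and do not reflect Firecracker’s internal workings.

```python
import random

# hypothetical question pools; real items would come from the question bank
POOLS = {
    "knowledge": [f"K{i}" for i in range(1, 40)],
    "analytical": [f"A{i}" for i in range(1, 60)],
    "problem": [f"P{i}" for i in range(1, 30)],
}
# per-category ranges reported for a 20-question set
RANGES = {"knowledge": (4, 6), "analytical": (8, 14), "problem": (3, 6)}

def draw_quiz(total=20):
    """Pick per-category counts within the reported ranges that sum to
    `total`, then sample that many questions from each pool."""
    while True:
        counts = {c: random.randint(lo, hi) for c, (lo, hi) in RANGES.items()}
        if sum(counts.values()) == total:
            break
    quiz = []
    for category, n in counts.items():
        quiz.extend(random.sample(POOLS[category], n))
    random.shuffle(quiz)
    return quiz

print(draw_quiz())  # e.g. 5 knowledge + 11 analytical + 4 problem solving
```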

Data Collection

Access, usage, and performance on Firecracker formative quizzes were retrieved from the Firecracker dashboard. A paper-based questionnaire, administered in Week 11 of the semester, was used to seek the views of students who participated in Firecracker (Appendix 2). A printed cover letter providing a brief description of the study and a request for consent was attached to the questionnaire. Those who consented to participate signed the informed consent form and filled out the questionnaire.

Statistical Analyses

Data from the paper-based questionnaires were entered into a Microsoft Excel database, and analyses were performed using SPSS statistical software version 24. The t-test was used to determine the significance of differences in the mean scores of two groups; correlation (r) was used to explore the relationships of Firecracker quiz performance with in-course, final examination, and total course marks; and the chi-square (χ2) test was used to determine the degree of agreement of feedback among the respondents. Results were considered significant at p<0.05 and p<0.01. Further, Cohen’s d was used to determine effect sizes.
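
For readers who prefer open tooling to SPSS, the reported tests (t-test, Pearson correlation, chi-square, and Cohen’s d) can be reproduced in Python with SciPy. The sketch below runs on small hypothetical placeholder arrays, not the study’s dataset; only the chi-square counts echo the reported 70%/26.7%/3.3% satisfaction split among the 60 respondents.

```python
import numpy as np
from scipy import stats

# hypothetical marks for frequent (>4 quizzes) vs infrequent (<=4) users
frequent = np.array([72, 65, 80, 77, 69, 74, 81, 70])
infrequent = np.array([58, 63, 55, 61, 67, 52, 60])

# independent-samples t-test for the difference in mean scores
t, p_t = stats.ttest_ind(frequent, infrequent)

# Cohen's d from the pooled standard deviation
n1, n2 = len(frequent), len(infrequent)
pooled_sd = np.sqrt(((n1 - 1) * frequent.var(ddof=1) +
                     (n2 - 1) * infrequent.var(ddof=1)) / (n1 + n2 - 2))
d = (frequent.mean() - infrequent.mean()) / pooled_sd

# Pearson correlation between quiz marks and final-examination marks
quiz_marks = np.array([55, 60, 72, 48, 66, 70, 52, 63])
final_marks = np.array([58, 64, 75, 50, 61, 74, 55, 60])
r, p_r = stats.pearsonr(quiz_marks, final_marks)

# chi-square goodness-of-fit on questionnaire response counts
# (happy/neutral/unhappy = 42/16/2 of 60) against a uniform expectation
observed = np.array([42, 16, 2])
chi2, p_chi2 = stats.chisquare(observed)

print(f"t={t:.2f} (p={p_t:.3f}), d={d:.2f}, r={r:.2f} (p={p_r:.3f}), "
      f"chi2={chi2:.2f} (p={p_chi2:.4f})")
```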

Results

Firecracker Statistics

Ninety-one students of the second-year MBBS programme enrolled in Firecracker, and seven sets of quizzes were administered in the Cardiovascular System course. Overall, students received 14,089 review questions organised into 1187 topics and 19 subjects, as well as 2003 clinical vignette questions. The numbers of quiz questions received were: Week 1, 11; Weeks 2 through 5, 20 per week; Weeks 6–7, 20; and Weeks 8–9, 20, making a total of 131 questions. The percentages of students who completed the quizzes were 73%, 68%, 62%, 56%, and 58% in Weeks 1 through 5, and 58% and 56% in Weeks 6–7 and 8–9, respectively. The weekly student completion rate of the quizzes ranged from 53% to 73%. The weekly average scores of the students were 58%, 56%, 55%, 53%, and 55% in Weeks 1 through 5, and 72% and 52% in Weeks 6–7 and 8–9, respectively. The mean scores achieved on the quizzes during this period ranged from 52% to 72%. Four students were listed as “At Risk” throughout the entire course, indicating that they answered ≤35% of questions correctly. The Firecracker team held 9 weekly online meetings with the course coordinator.

Academic Performance and Firecracker Usage

Statistically significant differences were noted between those who completed four or fewer quiz sessions and those who completed more than four in performance on the Firecracker quiz assessment (P<0.01), the final examination (P<0.01), and total marks (P<0.05) (Table 1). Further, Cohen’s d effect sizes revealed that completion of more quizzes had a significant effect on student performance in all of the above categories of assessment.
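
For reference, Cohen’s d for two independent groups is the difference in means standardised by the pooled standard deviation (the standard textbook definition, not a formula specific to this study):

$$
d = \frac{\bar{x}_1 - \bar{x}_2}{s_p},
\qquad
s_p = \sqrt{\frac{(n_1 - 1)s_1^2 + (n_2 - 1)s_2^2}{n_1 + n_2 - 2}}
$$

Values around 0.2, 0.5, and 0.8 are conventionally interpreted as small, medium, and large effects, respectively.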

Table 1 Differences in Performance on Different Forms of Assessment Between Those Who Completed Four or Fewer Quizzes vs More Than Four Quizzes

A weak but significant correlation was observed between total Firecracker quiz marks and in-course assessment marks (P<0.05); moderate significant correlations were observed between total Firecracker quiz marks and final examination marks, and between total Firecracker quiz marks and total course marks (both P<0.01).

As shown in Table 2, statistically significant differences existed between students who participated and those who did not in performance on the final examination, the in-course assessment, and total course marks (P<0.01). Further, Cohen’s d effect sizes revealed that participation in Firecracker had a significant effect on student performance in all of the above categories of assessment.

Table 2 Differences in Performance on Different Forms of Assessment Between Students Who Participated vs Those Who Did Not

Students’ Feedback

Of the 91 students enrolled in Firecracker, 60 completed the evaluation questionnaire (response rate 66%), which was administered in Week 11 of Semester One. Seventy percent of the students reported being happy using Firecracker, and the rest were neutral (26.7%) or unhappy (3.3%) (p<0.001). The majority of students found Firecracker helpful in complementing coursework (78.3%) (p<0.001) and in preparing for course exams (58.4%) (p=0.008); used the flashcard questions and topic summaries provided by Firecracker; and agreed that the weekly Firecracker quizzes were well aligned with the materials assessed and the information learned in class (68.3%) (p<0.001) (Table 3).

Table 3 Students’ Feedback on the Usage of Firecracker

Discussion

The main findings of this study include the following:

  1. Students used Firecracker as a formative assessment tool (completing 62% of quizzes over the nine weeks) and expressed satisfaction with the adaptive learning technology.
  2. Firecracker usage was significantly associated with better examination performance.
  3. Students who completed more than four quiz sessions secured better grades in the Firecracker assessment, the final examination, and the in-course assessment plus final examination.
  4. Significant correlations were observed between overall Firecracker performance and final examination grades, and between overall Firecracker performance and total course grades (in-course assessment plus final examination).
  5. Students who participated in Firecracker performed better in both the final examination and the in-course assessment.

Over the past several decades, higher education has evolved significantly, driven mainly by technological advancement, innovative curricular design, and a diversity of pedagogical techniques.27,28 One of these techniques is adaptive spaced learning technology, which improves learning by attuning the level, spacing, and sequencing of learning events to each learner, allowing more efficient learning, better retention, and certification of mastery.10 Several meta-analyses have demonstrated the efficacy of AEEs in comparison to large-group teaching among high school and university students.29–31

However, despite promising evidence of the efficiency of AEEs for knowledge acquisition and the development of cognitive skills in higher education, their efficacy in improving learning outcomes in medical education has not yet been researched thoroughly. The American Medical Association has recommended AEEs as an opportunity to prepare today’s medical students for lifelong learning in a changing health care system (American Medical Association, 2019). Kellman and colleagues conducted extensive research on adaptive learning technologies in higher education, including medicine, and found remarkably promising results.10,32,33 Krasne et al (2013) used a perceptual and adaptive learning module (PALM) that utilised 261 unique images of cell injury, inflammation, neoplasia, or normal histology and showed evidence of improved recognition of histopathology patterns by medical students.33 Fontaine et al (2019) conducted the first systematic review and meta-analysis to evaluate the efficacy of adaptive e-learning in health professionals and found statistically significant improvements in learning outcomes in 12 of 17 studies.9 In a study in the USA, Kerfoot (2010)13 demonstrated that adaptive spaced education significantly enhanced students’ retention of medical knowledge and improved composite end-of-year test scores (P<0.001). Another study found that implementation of a spaced education-based app study programme in a third-year medical school surgery rotation was associated with higher scores on the standardised National Board of Medical Examiners (NBME) examination.24 Our findings are consistent with the findings of these studies.

Our findings also demonstrated that students who participated in Firecracker performed better in the in-course assessment and the final examination. Bartsch (2016)34 demonstrated that Firecracker users (who answered at least 1500 flashcards in the Firecracker programme before their exam) scored an average of 15 points higher on USMLE Step 1 than non-users (who did not have a Firecracker account or answered fewer than 1500 flashcards before their exam). The average USMLE Step 1 score of non-users (229.9±21.3) was significantly (p<0.00001) lower than that of Firecracker users (245.0±16.6). The author also found that exposure to and mastery of flashcards within the Firecracker platform correlated with significantly increased Step 1 scores. In another study, Bartsch (2016) examined the use of Firecracker with curriculum alignment to enhance student retention and summative exam performance and noted that Firecracker’s weekly formative assessments were significantly predictive of course summative exam results.35 Moreover, student completion of weekly formative quizzes correlated with significantly increased summative final exam scores. We likewise found that total Firecracker quiz grades were significantly correlated with final examination grades and with total course grades (in-course assessment plus final examination). Other studies examining formative assessments in the form of practice quizzes in medical schools have also found predictive value for performance on summative exams.36–40

Participation of medical students in Firecracker was voluntary in the present study. Rashid et al (2017) observed an association between participation in voluntary practice quizzes and better performance on summative examinations.40 Non-participation is a common finding when such exercises are voluntary. In the current study, approximately 68% (range 54% to 73%) of students participated in the weekly Firecracker quiz sessions, and 21% elected not to take part in any of them. Other studies found even lower participation rates in similar voluntary mock tests, ranging between 40% and 70%.41–44 Rashid et al (2017)40 recorded 12% non-participation in their study, much lower than in our research. Formative assessments and practice quizzes have been found to be among the least commonly used teaching and assessment modalities in medical education.45 The 2016 report of the regional accreditation body, the Caribbean Accreditation Authority for Education in Medicine and other Health Professions (CAAM-HP), highlighted the need for more formative assessments in our medicine programme. Hence, Firecracker was piloted to comply with the CAAM-HP recommendations.

To summarise, formative self-testing resources and adaptive e-learning platforms are rapidly becoming an essential component of the medical curriculum. However, the effectiveness of these strategies needs to be evaluated across a variety of medical education settings and formats. Medical educators should integrate regular formative assessments into their curricula to improve student outcomes. Student participation in formative assessments gives instructors an understanding of knowledge gaps, both for individual students and for their own instructional effectiveness. Firecracker’s adaptive system uses data collected from these formative assessments to help students quickly remediate their weak areas. Moreover, Firecracker’s adaptive flashcard system helped prepare students for their formative quizzes and final exams.

In addition to the limitation of non-participation noted above, interpreting the effect of practice quizzes on summative course performance has its own limitations, as other academic and non-academic variables may affect student performance. Our study would also have benefited from assessing student motivation and level of engagement in the course, to identify the factors associated with participation in voluntary activities such as practice quizzes.

Conclusion

Firecracker was welcomed by most of the students and perceived to be helpful, and the use of this software was associated with better academic performance. Firecracker is likely to be useful in other courses for the continuous monitoring of students’ progress. Future studies should examine why a significant number of students are reluctant to take full advantage of Firecracker or similar formative assessment tools, and how these resources can be more constructively integrated into the curriculum to promote student performance and improve academic practice.

Abbreviations

NEE, non-adaptive e-learning environments; AEE, adaptive e-learning environments; ARTS, adaptive response time-based sequencing; PALM, perceptual and adaptive learning module; NBME, National Board of Medical Examiners; CAAM-HP, Caribbean Accreditation Authority for Education in Medicine and Other Health Professions.

Data Sharing Statement

The datasets of the current study are available from the corresponding author on reasonable request.

Ethics Approval and Consent to Participate

This study was approved by the University of the West Indies-Cave Hill/Barbados Ministry of Health Research Ethics Committee/Institutional Review Board. IRB No: 180405-B.

Acknowledgments

The authors wish to thank the medical students who participated in Firecracker and provided feedback on their experience. The authors would also like to acknowledge Mr Yonas Dinkneh (Firecracker) for uploading quizzes and learning resources for the students in a timely manner.

Disclosure

Dr Md Anwarul Azim Majumder is the Editor-in-Chief of Advances in Medical Education and Practice. The authors report no other potential conflicts of interest in this work.

References

1. Sharma N, Doherty I, Dong C. Adaptive learning in medical education: the final piece of technology enhanced learning? Ulster Med J. 2017;86:198–200.

2. Zemsky R, Massy WF. Thwarted Innovation - What Happened to e-Learning and Why? A Final Report for the Weatherstation Project of the Learning Alliance at the University of Pennsylvania in Cooperation with the Thomson Corporation. Pennsylvania: The University of Pennsylvania; 2004.

3. Miyazoe T, Anderson T. Interaction equivalency in an OER, MOOCS and informal learning era. J Interactive Media Educ. 2013;2:Art 9. doi:10.5334/2013-09

4. Ben-Naim D, Marcus N, Bain M. Virtual Apparatus Framework Approach to Constructing Adaptive Tutorials. CSREA EEE; 2007:3–10.

5. Lahti M, Hätönen H, Välimäki M. Impact of e-learning on nurses’ and student nurses knowledge, skills, and satisfaction: a systematic review and meta-analysis. Int J Nurs Stud. 2014;51:136–149. doi:10.1016/j.ijnurstu.2012.12.017

6. Cook DA, Levinson AJ, Garside S, Dupras DM, Erwin PJ. Internet-based learning in the health professions. JAMA. 2008;300:1181–1196.

7. Knutov E, De Bra P, Pechenizkiy M. Ah 12 years later: a comprehensive survey of adaptive hypermedia methods and techniques. N Rev Hypermedia Multimed. 2009;15:5–38. doi:10.1080/13614560902801608

8. Akbulut Y, Cardak CS. Adaptive educational hypermedia accommodating learning styles: a content analysis of publications from 2000 to 2011. Comput Educ. 2012;58:835–842. doi:10.1016/j.compedu.2011.10.008

9. Fontaine G, Cossette S, Maheu-Cadotte MA, Mailhot T, Deschênes MF. Efficacy of adaptive e-learning for health professionals and students: a systematic review and meta-analysis. BMJ Open. 2019;9:e025252.

10. Kellman PJ. Adaptive and perceptual learning technologies in medical education and training. Mil Med. 2013;178:98–106.

11. Kellman PJ, Krasne S. Accelerating expertise: perceptual and adaptive learning technology in medical learning. Med Teach. 2018;40:797–802. doi:10.1080/0142159X.2018.1484897

12. Mettler E, Massey CM, Kellman PJ. A comparison of adaptive and fixed schedules of practice. J Exp Psychol Gen. 2016;145:897–917. doi:10.1037/xge0000170

13. Kerfoot BP. Adaptive spaced education improves learning efficiency: a randomised controlled trial. J Urol. 2010;183:678–681. doi:10.1016/j.juro.2009.10.005

14. House H, Monuteaux MC, Nagler J, Santen S. A randomized educational interventional trial of spaced education during a pediatric rotation. AEM Educ Train. 2017;1:151–157. doi:10.1002/aet2.10025

15. Phillips JL, Heneka N, Bhattarai P, Fraser C, Shaw T. Effectiveness of the spaced education pedagogy for clinicians’ continuing professional development: a systematic review. Med Educ. 2019;53:886–902. doi:10.1111/medu.13895

16. Pfeiffer CN, Jabbar A. Adaptive e-learning: emerging digital tools for teaching parasitology. Trends Parasitol. 2019;35:270–274. doi:10.1016/j.pt.2019.01.008

17. Firecracker. Available from: https://firecracker.lww.com/. Accessed April 2020.

18. Goh P, Sandars J. A vision of the use of technology in medical education after the COVID-19 pandemic. MedEdPublish. 2020;9(1):49. doi:10.15694/mep.2020.000049.1

19. Cecilio-Fernandes D, Parisi M, Santos T, Sandars J. The COVID-19 pandemic and the challenge of using technology for medical education in low and middle income countries. MedEdPublish. 2020;9(1):74. doi:10.15694/mep.2020.000074.1

20. Firecracker. Firecracker uses 20 proven principles of learning & memory. Available from: http://blog.firecracker.me/students/20-proven-principles-of-learning-memory. Accessed April 2020.

21. Versteeg M, Hendriks RA, Thomas A, Ommering BW, Steendijk P. Conceptualising spaced learning in health professions education: a scoping review. Med Educ. 2020;54:205–216. doi:10.1111/medu.14025

22. Kerfoot BP, DeWolf WC, Masser BA, Church PA, Federman DD. Spaced education improves the retention of clinical knowledge by medical students: a randomised controlled trial. Med Educ. 2007;41:23–31. doi:10.1111/j.1365-2929.2006.02644.x

23. Kerfoot BP, Fu Y, Baker H, Connelly D, Ritchey ML, Genega EM. Online spaced education generates transfer and improves long-term retention of diagnostic skills: a randomised controlled trial. J Am Coll Surg. 2010;211:331–337. doi:10.1016/j.jamcollsurg.2010.04.023

24. Smeds MR, Thrush CR, Mizell JS, Berry KS, Bentley FR. Mobile spaced education for surgery rotation improves National Board of Medical Examiners scores. J Surg Res. 2016;201:99–104. doi:10.1016/j.jss.2015.10.010

25. Matos J, Petri CR, Mukamal KJ, Vanka A. Spaced education in medical residents: an electronic intervention to improve competency and retention of medical knowledge. PLoS One. 2017;12:e0181418.

26. Tshibwabwa E, Mallin R, Fraser M, Tshibwabwa M, Sani R. An integrated interactive- spaced education radiology curriculum for preclinical students. J Clin Imaging Sci. 2017;7:22. doi:10.4103/jcis.JCIS_1_17

27. Moran J, Briscoe G, Peglow S. Current technology in advancing medical education: perspectives for learning and providing care. Acad Psychiatry. 2018;42:796–799. doi:10.1007/s40596-018-0946-y

28. Guze PA. Using technology to meet the challenges of medical education. Trans Am Clin Climatol Assoc. 2015;126:260–270.

29. Steenbergen-Hu S, Cooper H. A meta-analysis of the effectiveness of intelligent tutoring systems on K–12 students’ mathematical learning. J Educ Psychol. 2013;105:970–987. doi:10.1037/a0032447

30. Steenbergen-Hu S, Cooper H. A meta-analysis of the effectiveness of intelligent tutoring systems on college students’ academic learning. J Educ Psychol. 2014;106:331–347.

31. Kulik JA, Fletcher JD. Effectiveness of intelligent tutoring systems: a meta-analytic review. Rev Educ Res. 2015;86:42–78. doi:10.3102/0034654315581420

32. Mettler E, Massey CM, Kellman PJ. Improving adaptive learning technology through the use of response times. In: Carlson L, Hölscher C, Shipley T, eds. Proceedings of the 33rd Annual Conference of the Cognitive Science Society. Boston, MA: Cognitive Science Society; 2011:2532–2537.

33. Krasne S, Hillman JD, Kellman PJ, Drake TA. Applying perceptual and adaptive learning techniques for teaching introductory histopathology. J Pathol Inform. 2013;4:34.

34. Bartsch E. Firecracker Step 1 Performance Analysis. Alphen aan den Rijn, The Netherlands: Firecracker; 2016.

35. Bartsch E. Case Study: Curriculum Alignment with Formative Quizzing. Alphen aan den Rijn, The Netherlands: Firecracker; 2016.

36. Mitra NK, Barua A. Effect of online formative assessment on summative performance in integrated musculoskeletal system module. BMC Med Educ. 2015;15:29. doi:10.1186/s12909-015-0318-1

37. Dobson JL. The use of formative online quizzes to enhance class preparation and scores on summative exams. Adv Physiol Educ. 2008;32:297–302. doi:10.1152/advan.90162.2008

38. Krasne S, Wimmers PF, Relan A, Drake TA. Differential effects of two types of formative assessment in predicting performance of first-year 2 medical students. Adv Health Sci Educ Theory Pract. 2006;11:155–171.

39. Brar MK, Laube DW, Bett GC. Effect of quantitative feedback on student performance on the National Board Medical Examination in an obstetrics and gynecology clerkship. Am J Obstet Gynecol. 2007;197:530–535. doi:10.1016/j.ajog.2007.07.029

40. Rashid MN, Soomro AM, Abro AH, Noman SB. Medical students academic performance assessment in physiology courses using formative and summative quizzes at SMBB Medical College Karachi, Pakistan. AAP. 2017;2:10–17. doi:10.11648/j.aap.20170201.12

41. Olson BL, McDonald JL. Influence of online formative assessment upon student learning in biomedical science courses. J Dent Educ. 2004;68:656–659. doi:10.1002/j.0022-0337.2004.68.6.tb03783.x

42. Johnson GM. Optional online quizzes: college student use and relationship to achievement. Can J Learn Tech. 2006;32:61.

43. Kibble J. Use of unsupervised online quizzes as formative assessment in a medical physiology course: effects of incentives on student participation and performance. Adv Physiol Educ. 2007;31:253–260. doi:10.1152/advan.00027.2007

44. Carrillo-de-la-Peña MT, Baillès E, Caseras X, Martínez A, Ortet G, Pérez J. Formative assessment and academic achievement in pre-graduate students of health sciences. Adv Health Sci Educ Theory Pract. 2009;14:61–67. doi:10.1007/s10459-007-9086-y

45. McNulty JA, Halama J, Espiritu B. Evaluation of computer-aided instruction in the medical gross anatomy curriculum. Clin Anat. 2004;17:73–78. doi:10.1002/ca.10188
