Mastery learning: how is it helpful? An analytical review

Received 5 January 2017

Accepted for publication 9 March 2017

Published 5 April 2017 Volume 2017:8 Pages 269—275

DOI https://doi.org/10.2147/AMEP.S131638

Manjunath Siddaiah-Subramanya,1 Sabin Smith,2 James Lonie2

1Department of General Surgery, Logan Hospital, Brisbane, 2Department of General Surgery, Townsville Hospital, Townsville, QLD, Australia

Abstract: The desire to be good at one’s work grows out of aspiration, competition, and a yearning to be the best. Surgeons, in their aim to provide the best possible care to their patients, adopt this behavior to achieve high levels of expert performance through mastery learning, and surgical training attempts to prepare them optimally to lead a virtuous and productive life. Proponents of the framework reject evidence suggesting that other variables are also necessary to achieve high levels of expert performance. Here, we review various models and designs for achieving mastery, along with their pros and cons, to help us understand how mastery learning is helpful in surgical practice.

Keywords: mastery learning, deliberate practice, learning curve, self-regulated learning, surgical training

Introduction

The mastery learning model dictates that trainees must achieve a defined proficiency in a given instructional unit before proceeding to the next unit.1 This is one of the earliest views put forward by Block and Burns, where the focus is on the role of feedback in learning. The big questions are how we achieve that proficiency and why we need to achieve it. One method is deliberate practice. It involves repetitive performance of intended cognitive or psychomotor skills in a focused domain, coupled with rigorous skills assessment that provides trainees with specific, informative feedback, to enable sustained improvement in performance and so achieve mastery and competency. There are a number of modalities by which deliberate practice can be executed to achieve mastery. This is vital in the current era of surgical practice, where there are limited opportunities to learn directly on patients.

Mastery learning has been studied widely, and the advantages are clear in terms of improved performance by trainees.2,3 There are advantages and disadvantages of mastery learning, which depend on the various models and designs formulated to achieve it and the way they are delivered. The two variables shown to influence practice are 1) the way in which the task is practiced, either as the whole task, practice of parts in a sequential order, or practice of parts in a random order, also known as contextual interference,4,5 and 2) the way in which practice sessions are distributed, either as massed or regular teaching sessions, also known as distributed practice.6,7
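To make these two scheduling variables concrete, the following minimal Python sketch contrasts a blocked (sequential parts) schedule with a random-order schedule that introduces contextual interference, and a massed session with distributed sessions. The task names and repetition counts are hypothetical illustrations, not drawn from any cited curriculum.

```python
# Illustrative sketch only: hypothetical task names and repetition counts,
# not taken from the cited studies or any published curriculum.
import random

tasks = ["port placement", "dissection", "clipping", "suturing"]
repetitions = 3

# Blocked schedule: each part practiced repeatedly in a fixed, sequential order.
blocked = [task for task in tasks for _ in range(repetitions)]

# Random-order schedule (contextual interference): the same repetitions shuffled.
random_order = random.sample(blocked, k=len(blocked))

# Massed practice: every repetition delivered in a single session.
massed = {"session 1": blocked}

# Distributed practice: the same content spaced across weekly sessions.
distributed = {f"week {week + 1}": list(tasks) for week in range(repetitions)}

print(random_order)
print(distributed)
```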

Mastery learning is made effective by features that include cognitive interactivity, feedback, repetition, and an extended time period. Other features that determine the outcome of mastery learning are additional practice or pre-training, cost of instruction, and self-regulation. Simulation and practice at various workshops are effective and safe ways of achieving mastery learning.

Mastery learning is relevant to competency-based education, given the shared emphasis on defined objectives rather than defined learning time or a set number of procedures. Mastery learning also has the advantage of elaborating on the principles behind a task and defining its end points, which makes learning clearer. It provides proficiency targets, which promote deliberate practice and technical skill acquisition. Trainees who adopt this strategy tend to develop a mindset that focuses on proficiency rather than on meeting a set target number for a particular procedure. Mastery learning brings about behavioral advantages that include a positive change in attitude, in the nature of approach to a problem, in the spirit to achieve one’s best and beyond, and in the sense of rising to the challenge. It involves systematic, precise pedagogies that show supervisors concrete, measurable evidence of trainees’ learning. However, the evidence for utilizing mastery learning in non-procedural clinical areas such as history taking and clinical examination is currently lacking. Here, we aim to critically analyze the available literature on the various forms and aspects of mastery learning relevant to clinical surgery.

Simulation

In the current era of surgical training, which is based on apprenticeship, the opportunity for deliberate practice is rare. Simulation provides that opportunity, with immediate feedback on performance, which is often lacking in theatre-based teaching. Simulation is by far the most commonly used and studied instructional design for attaining mastery. The components that give value to simulation in surgical training include 1) use of an assessment with an established minimum passing standard, 2) definition of learning objectives aligned with the passing standard, 3) baseline assessment, 4) instruction that targets the learning objectives, 5) reassessment after instruction, 6) progression to the next unit only after achievement of the passing standard, and 7) continued practice if the minimum passing standard is not achieved.8
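These seven components amount to an assess–instruct–reassess loop in which progression is gated by a minimum passing standard. The minimal Python sketch below shows that control flow only; the function names and the passing score are hypothetical placeholders, not part of any published simulator or curriculum.

```python
# Minimal sketch of a mastery-learning progression loop. The names
# (assess, instruct) and the passing standard are hypothetical placeholders.
PASSING_STANDARD = 0.8  # minimum passing score, defined before training begins


def train_to_mastery(trainee, units, assess, instruct):
    """Advance through instructional units only once the passing standard is met."""
    for unit in units:
        score = assess(trainee, unit)        # baseline assessment
        while score < PASSING_STANDARD:      # continued practice if standard not met
            instruct(trainee, unit)          # instruction targeting the unit's objectives
            score = assess(trainee, unit)    # reassessment after instruction
        # passing standard achieved: progress to the next unit
    return trainee
```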

Simulation is a valuable tool for acquiring skills, and there is now growing evidence to support this. However, there is limited evidence as to which method of simulation helps the most. Although numerous simulation methods claim improved skills, not all of them are aligned with surgical curricula, which further contributes to uncertainty in choosing the right model. Skill enhancement and skill transfer to the operating theatre depend, to an extent, on the simulation model used in training and the features that the model is based on. Models available include inanimate, live animal, cadaveric, animal tissue, computer-based, box trainer, augmented reality, and universal simulators.

Simulation models that incorporate features of deliberate practice, such as goal-directed training, show better skill enhancement and opportunities for repetition with feedback, as shown in a review by Issenberg et al.9 In addition, a subsequent review demonstrated that simulator engagement in deliberate practice was associated with better performance in the operating theatre.2 Studies also suggest that a random-order practice schedule, or contextual interference, incorporated into deliberate practice could be a vital mediator for enhancing human efficiency, as it creates an unpredictable environment that necessitates elaborative cognitive processing. Expert surgeons consistently demonstrate far superior levels of performance on the simulator than novices. This expert performance has been taken in many simulations as the standard that must be attained to achieve a passing score.10 Individuals trained on simulators achieve the same proficiency as traditionally trained surgeons, but they do so earlier and are capable of a higher level of performance.11

The McGill Inanimate System for Training and Evaluation of Laparoscopic Skills (MISTELS) was developed and validated for training and evaluation of fundamental laparoscopic skills (FLS).12 Various versions of MISTELS are in use in different countries. There is evidence that FLS training improves procedural skills,13,14 but less is known about how best to integrate FLS into the curriculum. Currently, FLS certification is required by the American Board of Surgery for certification eligibility.15 In Australia, although recommended, it is not a requirement either for entry into surgical training or for final certification.

A randomized controlled trial (RCT) conducted by Sroka et al13 is one of the few studies to investigate the influence of simulation on the outcome of real cases in operating theatres. The operation selected was laparoscopic cholecystectomy, considered an operation of moderate- to high-level difficulty, and the simulation training was on a laparoscopic training box. There was a significant improvement in those who had successfully completed FLS with regard to depth perception, bimanual dexterity, and tissue handling. A similar study by Zendejas et al showed improved trainee performance, in terms of a reduction in intra- and postoperative complications and in the need for overnight stay.16 A systematic review by Strum et al14 included 10 RCTs covering various surgical procedures, including laparoscopic and endoscopic procedures. It concluded that skills learnt in FLS on simulators are transferable to real cases in the operating theatre.

Part-task training

Part-task training is a learning strategy that can be applied to all surgical procedures, and so to simulation as well. This is an approach whereby a complex task is deconstructed into smaller components for practice. Trainees gain proficiency in the individual components before progressing to more complex tasks. The advantage of this strategy in mastery learning is that a higher level of skill can be attained if participants master individual components before integrating them into the whole task, but how does it transfer to actual cases?17 The benefits are highly task-dependent, and transfer is actually quite limited.18 Furthermore, some literature suggests that complex surgical procedures, especially those that involve constant interlimb interaction and coordination, are best acquired when practiced as a whole.19,20

A study by Dubrowski et al20 compared the transfer of learned skills between trainees who learnt a procedure in parts and those who learnt it as a whole. The group who learnt the operation as a whole performed better and scored higher on global rating, checklists, and final product analysis, followed by the random practice schedule group. The study was based on a bone-plating task. Teaching in the operating theatre is most often a representation of random practice, where a trainee is routinely stopped once an error occurs. Trainees do not often get to practice the steps of an operation in a sequential manner over a number of theatre sessions. Therefore, following a random practice method in simulation would have the greatest benefit, as shown in the study, unless opportunities are available for the entire operation to be practiced. Schaverien,19 in his review, states that assumptions about how well existing motor learning theories apply to the surgical skill environment cannot be tested reliably. In addition, there exists great variability in the retention of skills when assessed after a period of time, which varied between 2 weeks and 6 months.20,21

Another approach to part or whole training is either distributed or massed practice. Individuals who practice on a distributed practice schedule outperform those who practice on a massed practice schedule, especially when tested immediately after the training period.7 This is probably due to reactive inhibition, where fatigue is detrimental to performance. Moulton et al, in their RCT, compared the performance of the two groups on an animal model.7 The distributed practice group had better retention of skills, which is one of the key indicators of achieving mastery. These findings suggest a review of the present curricula, which predominantly consist of courses taught on a massed practice schedule.

Learning curve

The learning curve is a graphic representation of the temporal relationship between a surgeon’s mastery of a specifically assigned task and the chronological number of cases performed. The learning curve concept has been used in various ways in achieving mastery. A set of proficiency criteria should be used as the training benchmark rather than a standard number of cases or the time spent on a simulator. The learning curve is a function not only of the surgeon’s understanding of a new technique, technical modifications, improvement in support staff, and perioperative care but also of the surgeon’s evolving ease with procedures and the performance of more challenging cases.

It has been increasingly recognized that numbers are poor surrogates for competency.22 The “true” learning curve, as suggested by Villani et al, consists of alternating periods of improvement and regression until mastery is achieved, unlike the traditional notion of an “idealized” learning curve in which continuous improvement occurs until a plateau is reached.23 This is because the complexity of the procedures and the patient factors are variable, although the designation of the procedure remains the same. This is particularly true for bigger cases, which in turn contain many steps that themselves independently determine the learning curve rate. It requires a surgeon to demonstrate determination, perseverance, and faith to achieve mastery and success, which is in itself a challenge. With the availability of perioperative outcome data and the setting of national benchmarks, historical subjective approaches to performance assessment, certification, and advancement are no longer acceptable, which almost makes attaining mastery a must.24,25
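As a visual aid only, the short Python sketch below contrasts an “idealized” monotonic learning curve with a “true” curve in which improvement alternates with regression; the functional forms and parameters are assumptions chosen for illustration, not data from the cited studies.

```python
# Illustrative only: synthetic curves, not data from Villani et al or Moon et al.
import numpy as np
import matplotlib.pyplot as plt

cases = np.arange(1, 101)                          # chronological case number
idealized = 1 - np.exp(-cases / 25)                # smooth improvement to a plateau
true_curve = idealized - 0.08 * np.sin(cases / 6)  # alternating improvement and regression

plt.plot(cases, idealized, label='"Idealized" learning curve')
plt.plot(cases, true_curve, label='"True" learning curve (wave pattern)')
plt.xlabel("Case number")
plt.ylabel("Proficiency (arbitrary units)")
plt.legend()
plt.show()
```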

Villani et al, in their study, investigated the learning curve of laparoscopic liver resection and its impact on mastery learning and its outcomes. An overall decrease in complications and length of stay was observed. More interestingly, periods of initial improvement were followed by periods of regression, giving the learning curve a wave pattern, which is the representation of the “true” learning curve. A number of factors contribute to this wave pattern, including the surgeon’s growing confidence and the more complex cases and patients the surgeon chooses to operate on while progressing through the learning curve – periods of complacency. This behavior pattern has been shown in other professional disciplines, where achieving a level of proficiency may cause some surgeons to relax and even become overconfident. This serves as a reminder of the inherent dangers of our work. Moon et al, in their study of upper gastrointestinal surgeons performing laparoscopic gastrectomy, reported three to four alternating phases prior to reaching the plateau.26

From these studies, it is apparent that periods of regression are unavoidable as we continue to operate on more complex, obese, and elderly patients with multiple comorbidities. Therefore, to avoid longer periods of regression, would achieving proficiency alone be enough, or should one aim for expert level? Additional resources such as coaching from experts to facilitate appropriate decision making, a multidisciplinary approach to optimize an operative candidate, improved methods of analyzing a surgeon’s performance such as CUSUM (the cumulative sum method), and augmented technical training while progressing along the learning curve will all help.
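CUSUM charts are one way of monitoring a surgeon’s outcomes case by case against an acceptable failure rate, flagging a run of poor outcomes earlier than crude complication counts would. The sketch below is a minimal illustration of a common binary-outcome CUSUM; the acceptable rate, decision limit, and outcome series are assumed values, not figures from the cited literature.

```python
# Minimal CUSUM sketch for binary surgical outcomes (0 = no complication,
# 1 = complication). Rates and the decision limit are illustrative assumptions.
ACCEPTABLE_RATE = 0.05   # assumed acceptable complication rate
DECISION_LIMIT = 2.0     # assumed threshold that triggers review


def cusum(outcomes, acceptable_rate=ACCEPTABLE_RATE):
    """Return the running CUSUM score: it rises by (1 - rate) on each
    complication, falls by rate on each clean case, and is floored at zero."""
    score, chart = 0.0, []
    for complication in outcomes:
        score += (1 - acceptable_rate) if complication else -acceptable_rate
        score = max(score, 0.0)
        chart.append(round(score, 3))
    return chart


chart = cusum([0, 0, 1, 0, 1, 1, 0, 0, 0, 1])
flagged_cases = [i for i, s in enumerate(chart, start=1) if s >= DECISION_LIMIT]
print(chart, flagged_cases)
```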

What role would overtraining, which is believed to help us achieve that expert level, play in making the learning curve smoother and in reaching expert competency? The learning curve can be divided into three phases: an initial learning curve phase, an accumulation phase, and a postlearning phase. Phase 2 is widely variable, and individuals achieve proficiency here. Phase 3 is a phase en route to reaching expert level and represents the overtraining period in simulation. The concept of overtraining was first described by Ebbinghaus.27 Overtraining beyond the passing level, defined by expert performance, may improve retention of learned skills and enhance skill transfer into the operating theatre.28,29 In the study by Seymour et al,29 trainees who had expert-level training performed better than those who achieved the passing level alone. Their operating time for laparoscopic cholecystectomy was 25% faster, and they had a significantly lower rate of burn injury to surrounding structures. Training to expert level may also have other advantages. Skills learnt on a simulator while performing a particular operation would enhance the trainee’s learning rate on the learning curve for other operations. In comparison, it has been shown that cessation of training while the trainee is on the steep part of the learning curve is associated with far less retention.30

Cost and time

The cost of instruction and the time involved in simulation are the two main deterrents to implementing the mastery learning model. A number of studies have reported on the cost of learning.31,32 Although it involves initial investment and setup costs, it certainly reduces the cost of overall training.31 Using inexpensive, cost-saving models, such as a physical pelvic model rather than a virtual reality system, or pretraining on less expensive resources, such as animal models and web-based resources, before exposure to a rather expensive virtual reality model can cut costs.33 The use of discarded but still functional theatre equipment, and donations (either monetary or material) from the medical industry, would further help in reducing the cost of training. Furthermore, a proficient, pretrained trainee would be less likely to need constant, rigorous supervision, likely to take less time to complete a unit of an operation, likely to progress to more complex operations at the intended or even a faster pace, and less likely to make mistakes that could potentially lead to litigation, thereby reducing the cost of training for hospitals and training boards. Time spent pretraining a trainee can be advantageous in a similar way and would fast-track the trainee’s progress. This also contributes to an increase in the trainee’s confidence and morale.

Self-regulation

Self-regulated learning is a vital feature that could be made a part of mastery learning. It involves metacognitive, motivational, and behavioral elements that also form an important feature of deliberate practice en route to mastery. Two studies of note that have investigated self-regulation in mastery learning have shown conflicting results. One showed that trainees who determined for themselves that they were ready to progress to the next task performed similarly to those who were required to demonstrate objective evidence prior to advancing.34 In contrast, the other demonstrated that trainees with defined proficiency targets attained better results than those without such targets.35 An important difference between these two studies is the time period and the timing of instruction, which are themselves independent predictors of learning. In the study by Brydges et al,34 the instructions and targets were all given at the beginning of the tasks on a 1-day course, and the participants in the self-regulated group could use them in any manner they liked. In the study by Gauger et al,35 on the other hand, the targets were given periodically, as and when the trainee determined that they had achieved the initial task. This allowed the trainee to focus on just the one task that was given, and learning was carried out and assessed over a 4-month period. These two features may have contributed to the divergent results of the two studies.

Mastery learning presumes that it is possible and desirable to specify all the components that encompass mastery or competency. There are advantages and disadvantages to this assumption. While it gives trainees a direction in which to learn and makes the learning measurable, it lacks the creative element that is vital in making learning enjoyable and sustainable. Surgery involves a number of well-defined procedures, which have been practiced and performed in a particular way that has proven effective. But there still exists an opportunity for every surgeon, and hence every trainee, to learn in a way that is enjoyable and may lead to making a procedure either quicker or more effective, and furthermore may lead to a new method of performing a procedure or treating a pathology.

Feedback

Feedback plays a critical role in the mastery learning model. In one study, the value of feedback was put to the test.36 Feedback can come from the simulator that the trainee works on or from the workshop instructor. Does one have a greater influence on the trainee’s learning and outcome than the other? The study showed that the performance of trainees was similar whether they received feedback from the simulator alone or from both the simulator and a human preceptor. This suggests that some trainees, or learners in some situations, can implement the next step or carry out the task completely without the assistance of a human instructor. This, furthermore, reinforces the human capacity for self-regulated learning.

There are various approaches to feedback, whether it is given by a human instructor or by a simulator. These include continuous feedback, audiovisual feedback, and minimal feedback.37–39 These studies demonstrate that continuous feedback is superior. The questions that follow are how long feedback should be given for and in what way continuous feedback should be offered. An RCT found that trainees performed worse in laparoscopic training when feedback was limited to only 10 min per session.39 This is probably because trainees can become dependent on feedback, especially continuous feedback, before proceeding further in a task, and performance then suffers when the feedback is withdrawn. This finding is common among junior trainees, and among senior trainees performing a new procedure, and can be mitigated by individualizing the instructions, as reiterated in a review by Issenberg et al.9

Teacher’s perspective

Teachers who aim for success rates of 90–100% in their trainees’ progress produce more learning than teachers who tolerate higher rates of failure.40 Almost no data or studies have been published on the role of teachers and their influence on trainees’ mastery learning, although their role is acknowledged indirectly through trainees’ progress and success. On the contrary, there exists an immense number of studies on the teacher’s role, at the school level, in the development of skills among children. Similar ethos and ideologies are applicable at the level of surgical fellowship and to the mentor–trainee relationship. Teachers using mastery learning develop more positive attitudes toward teaching, have higher expectations for students, and take greater personal responsibility for learning outcomes. This attitude from supervisors would encourage trainees to attain proficiency scores and may even motivate them to aim for expert-level scores. Furthermore, another study demonstrated that teaching was more effective when less importance was given to personality factors and greater importance was given to teaching practices and the behaviors of the students.41 This principle could and must be applied to surgical training. It is especially important given the recent events of bullying in surgical training, which have only now been acknowledged. The mastery learning strategy requires specific goals, which are more difficult for supervisors to agree on than broad goals, more so with regard to non-procedural learning. In addition, not all supervisors are comfortable with, or even knowledgeable about, teaching strategies that apply mastery learning techniques.

Surgical training

The acquisition of surgical skills is complex because it requires a greater degree of cognitive involvement. Practicing surgical skills in a systematic order, to achieve performance that mirrors the expected level in the operating theatre, results in long-term retention. Standards set by experts on simulators act as a guide and a benchmark to aim for. Simulation courses provide a way of gaining experience through deliberate practice at a frequency that may not be achievable in the operating theatre. Consideration must be given to incorporating random-order schedules into courses, which are currently dominated by massed-order practice. Greater involvement in deliberate practice, along with time and experience in the clinical environment, is the key to gaining competency and evolving into an expert.

Conclusion

The advantages of mastery learning are obvious and many, outweighing the disadvantages. The question, however, is which model of mastery learning is the best. The fact of the matter is that there may never be a model that suits all needs, even procedural ones, and so mastery learning needs to be tailored. Various forms of simulation-based models are being developed, and they have shown immense benefits in our endeavor to achieve mastery. There seems to be a predominance of models aimed at procedural skills. Currently, the evidence for deliberate practice in non-procedural skills is very limited. More research into this component of clinical competency, which includes the decision-making process, would provide much needed evidence for training cognitive skills for the intraoperative environment.

Disclosure

The authors report no conflicts of interest in this work.

References

1. Block JH, Burns RB. Mastery learning. Rev Res Educ. 1976;4:3–49.
2. McGaghie WC, Issenberg SB, Cohen ER, Barsuk JH, Wayne DB. Does simulation-based medical education with deliberate practice yield better results than traditional clinical education? A meta-analytic comparative review of the evidence. Acad Med. 2011;86(6):706–711.
3. Cook DA, Brydges R, Zendejas B, Hamstra SJ, Hatala R. Mastery learning for health professionals using technology-enhanced simulation: a systematic review and meta-analysis. Acad Med. 2013;88(8):1178–1186.
4. Battig WF. Facilitation and interference. In: Bilodeau EA, editor. Acquisition of Skill. New York, NY: Academic Press; 1966:215–244.
5. Shea JB, Morgan RL. Contextual interference effects on the acquisition, retention, and transfer of a motor skill. J Exp Psychol Hum Learn Mem. 1979;5:179–187.
6. Dempster FN. Spacing effects and their implications for theory and practice. Educ Psychol Rev. 1989;1:309–330.
7. Moulton CA, Dubrowski A, MacRae H, Graham B, Grober E, Reznick R. Teaching surgical skills: what kind of practice makes perfect? A randomized, controlled trial. Ann Surg. 2006;244(3):400–409.
8. McGaghie WC, Issenberg SB, Petrusa ER, Scalese RJ. A critical review of simulation-based medical education research: 2003–2009. Med Educ. 2010;44(1):50–63.
9. Issenberg SB, McGaghie WC, Petrusa ER, Lee Gordon D, Scalese RJ. Features and uses of high-fidelity medical simulations that lead to effective learning: a BEME systematic review. Med Teach. 2005;27(1):10–28.
10. Van Sickle KR, Ritter EM, McClusky DA III, et al. Attempted establishment of proficiency levels for laparoscopic performance on a national scale using simulation: the results from the 2004 SAGES Minimally Invasive Surgical Trainer–Virtual Reality (MIST-VR) learning centre study. Surg Endosc. 2007;21(1):5–10.
11. Grantcharov TP, Kristiansen VB, Bendix J, Bardram L, Rosenberg J, Funch-Jensen P. Randomized clinical trial of virtual reality simulation for laparoscopic skills training. Br J Surg. 2004;91(2):146–150.
12. Fraser SA, Klassen DR, Feldman LS, Ghitulescu GA, Stanbridge D, Fried GM. Evaluating laparoscopic skills: setting the pass/fail score for the MISTELS system. Surg Endosc. 2003;17(6):964–967.
13. Sroka G, Feldman LS, Vassiliou MC, Kaneva PA, Fayez R, Fried GM. Fundamentals of laparoscopic surgery simulator training to proficiency improves laparoscopic performance in the operating room: a randomized controlled trial. Am J Surg. 2010;199(1):115–120.
14. Strum LP, Windsor JA, Cosman PH, Cregan P, Hewett PJ, Maddern GJ. A systematic review of skills transfer after surgical simulation training. Ann Surg. 2008;248:166–179.
15. The American Board of Surgery. ABS to require ACLS, ATLS, and FLS for general surgery certification. 2008. Available from: http://home.absurgery.org/default.jsp?news_newreqs. Accessed December 14, 2016.
16. Zendejas B, Cook DA, Bingener J, et al. Simulation-based mastery learning improves patient outcomes in laparoscopic inguinal hernia repair: a randomized controlled trial. Ann Surg. 2011;254:502–509.
17. Proctor RW, Vu KPL. Laboratory studies of training, skill acquisition, and retention of performance. In: Ericsson KA, Charness N, Feltovich PJ, Hoffman RR, editors. The Cambridge Handbook of Expertise and Expert Performance. New York, NY: Cambridge University Press; 2006:265–286.
18. Norman G, Eva K, Brooks L, Hamstra SJ. Expertise in medicine and surgery. In: Ericsson KA, Charness N, Feltovich PJ, Hoffman RR, editors. The Cambridge Handbook of Expertise and Expert Performance. New York, NY: Cambridge University Press; 2006:339–354.
19. Schaverien MV. Development of expertise in surgical training. J Surg Educ. 2010;67:37–43.
20. Dubrowski A, Backstein D, Abughaduma R, Leidl D, Carnahan H. The influence of practice schedules in the learning of a complex bone-plating surgical task. Am J Surg. 2005;190(3):359–363.
21. Torkington J, Smith SG, Rees B, Darzi A. The role of the basic surgical skills course in the acquisition and retention of laparoscopic skill. Surg Endosc. 2001;15(10):1071–1075.
22. Bell RH Jr. Why Johnny cannot operate. Surgery. 2009;146(4):533–542.
23. Villani V, Bohnen JD, Torabi R, Sabbatino F, Chang DC, Ferrone CR. “Idealized” vs. “true” learning curves: the case of laparoscopic liver resection. HPB (Oxford). 2016;18(6):504–509.
24. Hopper AN, Jamison MH, Lewis WG. Learning curves in surgical practice. Postgrad Med J. 2007;83(986):777–779.
25. Maruthappu M, Duclos A, Lipsitz SR, Orgill D, Carty MJ. Surgical learning curves and operative efficiency: a cross-specialty observational study. BMJ Open. 2015;5(3):e006679.
26. Moon JS, Park MS, Kim JH, Jang YJ, Park SS, Mok YJ. Lessons learned from a comparative analysis of surgical outcomes of and learning curves for laparoscopy-assisted distal gastrectomy. J Gastric Cancer. 2015;15(1):29–38.
27. Ebbinghaus H. Memory: A Contribution to Experimental Psychology. New York, NY: Dover Publications; 1964.
28. Scott DJ, Ritter EM, Tesfay ST, Pimentel EA, Nagji A, Fried GM. Certification pass rate of 100% for fundamentals of laparoscopic surgery skills after proficiency-based training. Surg Endosc. 2008;22(8):1887–1893.
29. Seymour NE, Gallagher AG, Roman SA, et al. Virtual reality training improves operating room performance. Ann Surg. 2002;236(4):458–464.
30. Lathan CE, Tracey MR, Sebrechts MM, Clawson DM, Higgins GA. Using virtual environments as training simulators: measuring transfer. In: Stanney KM, editor. Handbook of Virtual Environments: Design, Implementation, and Applications. Mahwah, NJ: Lawrence Erlbaum Associates; 2002:403.
31. Rosenthal ME, Castellvi AO, Goova MT, Hollett LA, Dale J, Scott DJ. Pretraining on Southwestern stations decreases training time and cost for proficiency-based fundamentals of laparoscopic surgery training. J Am Coll Surg. 2009;209(5):626–631.
32. McDougall EM, Kolla SB, Santos RT. Preliminary study of virtual reality and model simulation for learning laparoscopic suturing skills. J Urol. 2009;182(3):1018–1025.
33. Stefanidis D, Hope WW, Korndorffer JR Jr, Markley S, Scott DJ. Initial laparoscopic basic skills training shortens the learning curve of laparoscopic suturing and is cost-effective. J Am Coll Surg. 2010;210(4):436–440.
34. Brydges R, Carnahan H, Rose D, Dubrowski A. Comparing self-guided learning and educator-guided learning formats for simulation-based clinical training. J Adv Nurs. 2010;66(8):1832–1844.
35. Gauger PG, Hauge LS, Andreatta PB, et al. Laparoscopic simulation training with proficiency targets improves practice and performance of novice surgeons. Am J Surg. 2010;199(1):72–80.
36. Snyder CW, Vandromme MJ, Tyra SL, Porterfield JR Jr, Clements RH, Hawn MT. Effects of virtual reality simulator training method and observational learning on surgical performance. World J Surg. 2011;35(2):245–252.
37. Salvendy G, Pilitsis J. The development and validation of an analytical training program for medical suturing. Hum Factors. 1980;22(2):153–170.
38. Domuracki KJ, Moule CJ, Owen H, Kostandoff G, Plummer JL. Learning on a simulator does transfer to clinical practice. Resuscitation. 2009;80(3):346–349.
39. Stefanidis D, Korndorffer JR Jr, Heniford BT, Scott DJ. Limited feedback and video tutorials optimize learning and resource utilization during laparoscopic simulator training. Surgery. 2007;142(2):202–206.
40. Brophy JE. Successful teaching strategies for the inner-city child. Phi Delta Kappan. 1982;63:527–530.
41. Guskey TR. The essential elements of mastery learning. J Classroom Interact. 1987;22:19–22.
