
Surgical simulation training in orthopedics: current insights

Authors Kalun P, Wagner N, Yan J, Nousiainen MT, Sonnadara RR

Received 21 September 2017

Accepted for publication 28 November 2017

Published 21 February 2018

Volume 2018:9, Pages 125–131

DOI https://doi.org/10.2147/AMEP.S138758





Portia Kalun,1 Natalie Wagner,1 James Yan,2 Markku T Nousiainen,3 Ranil R Sonnadara1,4

1Office of Education Science, Department of Surgery, McMaster University, Hamilton, ON, Canada; 2Division of Orthopaedic Surgery, Department of Surgery, McMaster University, Hamilton, ON, Canada; 3Division of Orthopaedic Surgery, Department of Surgery, University of Toronto, Toronto, ON, Canada; 4Department of Surgery, University of Toronto, Toronto, ON, Canada

Background: While the knowledge required of residents training in orthopedic surgery continues to increase, various factors, including reductions in work hours, have resulted in decreased clinical learning opportunities. Recent work suggests residents graduate from their training programs without sufficient exposure to key procedures. In response, simulation is increasingly being incorporated into training programs to supplement clinical learning. This paper reviews the literature to explore whether skills learned in simulation-based settings result in improved clinical performance in orthopedic surgery trainees.
Materials and methods: A scoping review of the literature was conducted to identify papers discussing simulation training in orthopedic surgery. We focused on exploring whether skills learned in simulation transferred effectively to a clinical setting. Experimental studies, systematic reviews, and narrative reviews were included.
Results: A total of 15 studies were included, with 11 review papers and four experimental studies. The review articles reported little evidence regarding the transfer of skills from simulation to the clinical setting, strong evidence that simulator models discriminate among different levels of experience, varied outcome measures among studies, and a need to define competent performance in both simulated and clinical settings. Furthermore, while three out of the four experimental studies demonstrated transfer between the simulated and clinical environments, methodological issues in study design were identified.
Conclusion: Our review identifies weak evidence as to whether skills learned in simulation transfer effectively to clinical practice for orthopedic surgery trainees. Given the increased reliance on simulation, there is an immediate need for comprehensive studies that focus on skill transfer, which will allow simulation to be incorporated effectively into orthopedic surgery training programs.

Keywords: orthopedics, simulation, postgraduate medical education, scoping review, transfer

Introduction

Recent publications suggest that surgical residents may graduate from their training programs without sufficient clinical exposure to key procedures.1,2 This is thought, in part, to be driven by duty hour restrictions, pressures to increase operating room (OR) efficiency, and the ongoing development of new techniques for trainees to learn.2,3 In their seminal paper, Reznick and MacRae3 suggested that simulation can be used as an adjunct to clinical rotations to increase exposure to different procedures and skills. Simulation is an appealing teaching tool as it provides residents with an environment in which they can learn new skills with no impact on patient care.4 Residents are able to make mistakes, receive valuable feedback, and improve performance prior to working with patients, without the time pressures that are omnipresent in traditional clinical teaching situations.4 Furthermore, the variety of simulation materials available, including synthetic models, animal models, cadavers, and virtual reality,5 allows educators to choose models best suited to teaching specific skills. Not having to rely on patients with real health issues also allows educators and learners to adjust the fidelity of models to create learning experiences that are optimized for the educational needs of the learner.4 Fidelity refers to how realistic a simulation model appears to the learner (physical fidelity) or whether the task itself elicits behavior similar to what is required in the real world (functional fidelity).17 While high-fidelity simulators may sometimes better mimic the setting in which trainees will actually perform the skill, having too much information can sometimes interfere with learning.6 Thus, the ability to adjust the level of fidelity can be a powerful educational tool.

In 2006, Reznick and MacRae3 described the evidence on the transfer of skills from simulation models to the OR. Much of the work they presented focused on laparoscopic procedures. Although laparoscopic procedures may share some similarities with procedures performed by orthopedic surgeons, such as arthroscopy, there are distinct technical differences that may limit how well evidence for the transfer of simulation-learned laparoscopic skills to the clinical setting generalizes to orthopedic surgery, especially arthroscopic surgery.7 For instance, the arthroscopic operative field tends to be much shallower than the laparoscopic field. This means that operative instruments tend to be shorter and require a different, often larger, range of motion, which introduces challenges such as increased travel in and out of the operative field and an increased reliance on anatomic landmarks compared with laparoscopy.8 These technical differences may increase the importance of depth perception and haptic feedback in arthroscopy simulation models.9

Although a few studies have examined the effect that training on arthroscopic simulators has on surgical performance in live patients, little evidence has been published examining simulation training for open orthopedic surgical procedures.10 Thus, further work is required to determine whether simulation training in orthopedic surgery actually improves performance in the clinical environment, and how much weight should be given to performance assessments conducted in simulation. This is a timely issue, given the global shift toward competency-based models of education for surgical trainees, which are making increasing use of simulation for teaching and assessment.11 The purpose of this scoping review was to identify and summarize the existing literature regarding the effectiveness of transfer of skills related to the specialty of orthopedic surgery learned through simulation.

Materials and methods

The following databases were searched: Ovid Embase (1974 to August 23, 2016); Ovid MEDLINE(R) In-Process & Other Non-Indexed Citations and Ovid MEDLINE(R) (1946 to August 23, 2016); Ovid MEDLINE(R) Epub Ahead of Print (August 23, 2016); the Cochrane Library (September 23, 2016); and PubMed (September 23, 2016). The following search terms were used: simulation, transfer, orthopaedic*, orthopedic*. Boolean terms were used to combine the search terms. Two independent reviewers (JY, NW) completed a title and abstract review, followed by a full-text screening of the articles that met the inclusion criteria. Reference lists of the included articles were manually searched for relevant studies. The reviewers met following each stage of the review and discussed any discrepancies until consensus was reached. We included articles that discussed trainees in orthopedic surgery (medical students, residents, fellows) and the transfer of skills from simulation training to a clinical environment. We excluded articles from fields outside orthopedic surgery, articles that focused on staff rather than trainee performance, and conference abstracts or reports that lacked sufficient detail.
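For illustration only, the Boolean combination described above can be reconstructed as in the sketch below; the exact search string used in the published strategy was not reported, so this reconstruction is an assumption:

```python
# Hypothetical reconstruction of the Boolean search; the exact string used
# in the published strategy was not reported.
concept_terms = ["simulation", "transfer"]         # methodological concepts
specialty_terms = ["orthopaedic*", "orthopedic*"]  # truncated spelling variants

# Join the spelling variants with OR, then intersect all concepts with AND.
query = " AND ".join(concept_terms + ["(" + " OR ".join(specialty_terms) + ")"])
print(query)  # simulation AND transfer AND (orthopaedic* OR orthopedic*)
```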

Results

Article selection

The search yielded 94 studies from Ovid, 12 from Cochrane, and 101 from PubMed, for a total of 207 studies. Of these, 80 duplicates were removed. The resulting 127 articles underwent title and abstract screening, after which 18 articles were selected for full-text review. Of these, 11 were included in the final qualitative analysis: two articles were excluded due to insufficient information (a conference abstract and a report), and the other five excluded articles did not examine transfer to a clinical setting. Four papers were added following a hand-search of the reference lists, giving a total of 15 studies (Figure 1). Of the 15 included papers, four were experimental studies and 11 were review articles.

Figure 1 PRISMA diagram.

Abbreviation: PRISMA, Preferred Reporting Items for Systematic reviews and Meta-Analyses.
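The selection flow can be traced numerically; the following sketch simply restates the counts from the text and Figure 1, with no additional data:

```python
# Selection-flow arithmetic as reported in the text (see Figure 1).
identified = 94 + 12 + 101           # Ovid + Cochrane + PubMed records
after_duplicates = identified - 80   # records screened by title and abstract
full_text = 18                       # articles selected for full-text review
included = full_text - 2 - 5         # minus insufficient-detail and no-transfer exclusions
total = included + 4                 # plus 4 papers found by hand-searching reference lists
assert (identified, after_duplicates, included, total) == (207, 127, 11, 15)
```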

Experimental study characteristics and outcomes

Four experimental studies, published from 2008 to 2016, examined the effect that training orthopedic surgery residents (postgraduate years 1–5) on arthroscopy simulators had on their ability to transfer their skills into the OR (Table 1). Two of these studies investigated knee arthroscopy,12,13 while the other two investigated shoulder arthroscopy.14,15 While the specific evaluation tools used in the four studies differed, all studies measured performance with a checklist and a global rating scale (GRS). Three of the four studies used at least one previously validated checklist or GRS.12,14,15 All four studies also used the time to complete the surgical task as an outcome measure. Additionally, two of the studies incorporated motion analyses to assess the efficiency of hand movements,22 using path length and number of hand movements as indices of performance.13,15 From these performance metrics, three of the four studies concluded that skills transferred to the performance of procedures on live patients,12,13,15 as evidenced by a significant difference between the simulation-trained and traditionally trained (control) groups on at least one outcome measure (see Table 1 for further details).

Table 1 Experimental study characteristics

Abbreviations: ASSET, Arthroscopic Surgery Skill Evaluation Tool; GRS, Global Rating Scale; OSATS, Objective Structured Assessment of Technical Skills; VR, virtual reality.

In addition to varied outcomes, the four studies differed with respect to their methodologies and performance requirements. While three studies12,14,15 provided their control groups with the same learning materials as the simulation-trained group, including a video of the procedure and a procedural checklist, one study did not.13 Furthermore, one study required trainees to receive a perfect score on visualization and probing tasks before moving on to perform the surgical procedure on a live patient,12 two studies required a minimum number of arthroscopies to be practiced in simulation based on previous literature,14,15 and one study stated that the trainee had to perform 18 arthroscopies in the simulation environment.13 Thus, only one study12 required trainees to achieve a perfect score before performing the surgical procedure on a live patient, whereas the other studies merely set a minimum number of attempts.

Finally, these studies also differed in how, or whether, they defined the fidelity of the simulation model used. Only two of the experimental studies, Cannon et al12 and Waterman et al,15 stated the fidelity of their simulation models (Table 1). The fidelity of the models used by Dunn et al14 and Howells et al13 was not provided directly, but was inferred from another experimental study15 that used the same simulator model and from a systematic review22 of simulation in arthroscopy, respectively (Table 1).

Review study characteristics

The 11 review studies were published from 2010 to 2016. Five reviews explored arthroscopy simulation broadly, including whether skills transferred into the OR.7,18–21 Three reviews evaluated the validity of simulator models in arthroscopy training.16,22,23 One reviewed arthroscopy techniques that can be taught using simulator models.24 Two reviews discussed the evidence supporting the use of different simulator models in orthopedic training and mentioned the transfer of skills to the clinical environment in their discussions.2,5

Most of these reviews7,19–21 included experimental studies that did not test whether skills transferred from a simulator model to the clinical setting, and none of these reviews included any of the four experimental studies discussed in this paper.

Outcome measures

The most common outcome measure among studies was the time to complete the surgical task.7,16,22,23 Most studies (and all studies that measured transfer) included a variety of additional outcome measures, such as path length,7 number of collisions,7,21 economy of movement,7 and number of hand movements.22 The review by Aim et al7 reported over 30 different outcome measures across 10 studies and suggested a need to standardize the outcome measures used in simulation studies.

Transfer of skills

All of the reviews identified a need for further evidence on whether skills learned in simulation transfer to improved operative performance.2,5,19,20 Many of the reviews cited the paper by Howells et al13 as the only study to date demonstrating transfer from simulation to the OR; however, the majority of the reviews were published prior to three of the four experimental studies included in the present review. Several of the reviews also criticized the Howells et al study for its poor outcome measures and control group.16,19,21 Furthermore, Hetaimish et al22 highlighted that many studies examine the transfer of skills between different types of simulator models rather than from simulator models to clinical performance.

Construct validity

Despite the lack of evidence supporting the transfer of skills from simulation to the clinical environment, most reviews found strong support for construct validity, that is, the ability of simulator models and assessment tools to discriminate between different levels of experience.2,5,7,19,20,22 It is important to note that the construct in question could be interpreted not as "clinical performance", but rather as the ability to discriminate between levels of trainees' skills on a simulator task. One review did not find strong support for construct validity (as defined above), though the authors suggested this might be due to practice effects on the simulator rather than trainee experience with arthroscopic procedures.21 Another review suggested that although there was strong support for the ability of simulation models to discriminate between novices and experts, future studies should investigate whether models can discriminate between novices and intermediate learners.23
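To make this operational meaning of construct validity concrete, a minimal sketch follows: a simulator metric shows construct validity (as defined above) if it statistically separates experience groups. The scores below are hypothetical, and the included studies used a range of designs and statistics rather than this single test:

```python
# Minimal sketch of a construct-validity check: do simulator scores
# discriminate between experience levels? All scores are hypothetical.
from scipy import stats

novice_scores = [42, 51, 47, 39, 55, 44]  # e.g., GRS totals for junior residents
expert_scores = [78, 82, 75, 88, 80, 84]  # e.g., GRS totals for fellows or staff

t_stat, p_value = stats.ttest_ind(novice_scores, expert_scores)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")  # a small p supports discrimination
```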

Competence

The review studies also commented on the need to define competent performance in arthroscopic training.7,16,19 This includes defining what constitutes competent performance on a simulator, particularly as it relates to transfer into the clinical setting,19 how many simulation sessions are needed to facilitate the transfer of those skills,7,19 and how many arthroscopic procedures must be completed in the OR for a trainee to be deemed competent for independent practice.16

Fidelity

Of the 11 reviews, six commented only on the fidelity of specific simulator models (i.e., identified each model as being either high or low fidelity).5,16,18,19,23,24 Stirling et al2 suggested that high-fidelity simulators can provide haptic feedback, potentially facilitating the transfer of skills from simulation to the OR, and that high fidelity might be more appropriate for senior surgeons. However, one paper discussed the conflicting evidence on whether high or low fidelity is more beneficial for learning surgical skills.22

Discussion

This scoping review suggests that evidence for the transfer of skills from simulation to the clinical environment remains sparse for surgical procedures related to the practice of orthopedic surgery. Importantly, none of the experimental studies reviewed focused on the transfer of skills from simulation to the clinical setting for open procedures.

Only four experimental studies measuring the transfer of arthroscopic skills acquired in simulation to performance in the OR were identified, with three concluding that transfer occurred. One experimental study showed a significant difference between simulation- and traditionally trained trainees' performance in simulation and the OR on a checklist, but not on the GRS.12 One interpretation of these data is that the simulator helped trainees learn the steps of the procedure but did not help them perform the procedure better in the clinical context, which calls into question the authors' assertion of skill transfer. Another experimental study showed a significant difference between simulation- and traditionally trained trainees on only half of the outcomes measured.15 Additionally, this study measured different aspects of performance in simulation (motion analysis) and the clinical setting (safety scores and a checklist), so it is difficult to determine whether improvements in performance in simulation were linked to, or transferred to, improved performance in the clinical setting. The variety of assessment tools and outcome measures used across the studies not only makes comparisons between them challenging, but also raises questions about the most appropriate strategy for measuring transfer of learning.

While most studies in this review used time to complete the surgical task as a measure of performance, additional outcome measures, such as task-specific checklists, GRS, and motion analysis, were used in only some papers. The four experimental studies included in this review used checklists and GRS, which are traditionally used to measure surgical skill performance in the clinical setting. Checklists measure knowledge of the steps of a procedure, whereas a GRS captures how well the task was completed.22 As such, a GRS is a useful supplement to checklists, as it can distinguish between novices and experts: both groups might know the steps yet differ substantially in performance.22 It has also been shown that as trainees become more familiar with procedures, some of the underlying neural processes become automatized, which results in poorer performance on task-specific checklists.25 Educators may therefore consider focusing on GRS to ensure patient safety and trainees' readiness to move on to the next phase of training.
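The difference in what the two instrument types capture can be sketched as follows; the items, domains, and scale below are hypothetical and do not come from any of the reviewed studies:

```python
# Illustrative scoring of the two assessment types discussed above.
# A checklist records whether each procedural step was performed;
# a GRS rates the quality of performance on ordinal domain scales.
checklist = {"identify portals": True, "insert arthroscope": True,
             "probe meniscus": False, "inspect compartments": True}
grs = {"instrument handling": 3, "flow of procedure": 2,
       "respect for tissue": 4}  # hypothetical 1-5 ratings per domain

checklist_score = sum(checklist.values())  # steps completed: 3 of 4
grs_score = sum(grs.values()) / len(grs)   # mean domain rating: 3.0
print(checklist_score, grs_score)
```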

Another weakness of the studies reviewed relates to the validity of the tools used to assess performance.22 Since validity is context specific, educators should be cautious when applying a "validated" tool in a new context without proper testing. This is of particular importance for studies investigating transfer, as the simulated and clinical settings are different contexts. Thus, studies investigating transfer should establish the validity of their outcome measures in both the simulated and clinical settings being used. Using the same validated outcome measures in both settings will allow for appropriate comparisons to establish transfer of learning.

Another tool used to measure performance is motion analysis, which can generate metrics such as number of collisions, number of hand movements, and path length to provide information on the level of skill with which a technique is performed.12 The literature suggests that motion analysis measures can accurately discriminate between trainees with different levels of arthroscopic experience13 and thus have strong construct validity as defined by the studies in this review. Nevertheless, there remains a lack of evidence as to whether motion analysis can predict transfer of learning from a simulation laboratory to the clinical setting. Furthermore, motion analysis is rarely used in the OR, making comparison of performance between simulation and the clinical setting challenging.
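As an illustration, path length, one of the motion-analysis metrics named above, is typically computed as the total distance traveled by a tracked hand or instrument; the position samples in this sketch are hypothetical:

```python
# Minimal sketch of the path-length metric: the summed Euclidean distance
# between consecutive tracked positions. The samples below are hypothetical
# 3D coordinates (e.g., in cm) of an instrument tip.
import math

positions = [(0.0, 0.0, 0.0), (1.0, 0.5, 0.0), (1.5, 1.5, 0.5), (2.0, 1.0, 1.0)]

path_length = sum(math.dist(a, b) for a, b in zip(positions, positions[1:]))
print(round(path_length, 2))  # ~3.21; a shorter path suggests greater economy of movement
```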

Clearly, more research needs to focus on measuring how skills transfer from the simulation laboratory to the clinical setting in orthopedic surgery. The most appropriate outcome measures for assessing competency in orthopedic procedures must be developed, validated, and assessed. Moreover, in light of the recent focus on competency-based medical education, the lack of understanding of what defines competence has broader implications for determining whether a trainee is able to practice independently. Moving forward, educators must create evaluation frameworks that clearly define competence, so that training programs can meaningfully compare performance in simulation with performance in the clinical environment and implement simulation training in competency-based curricula.

Lastly, another critical component of successfully incorporating simulation into surgical training is the fidelity of the learning model. While the fidelity of the simulator models in the experimental articles in this review was identified, we were unable to determine whether the transfer of skills from simulation to the OR differs between low- and high-fidelity models, due to the varying outcome measures across studies. The reviews included in this paper suggested that a difference does exist between low- and high-fidelity models for the transfer of learning,22 but did not elaborate on their reasoning. In 2012, Norman et al17 conducted a review of studies comparing low- and high-fidelity models and concluded that the benefits gained from the two are equivalent. However, this may have been because many simulators were incorrectly labeled as "high fidelity" when, in fact, the functional fidelity of the model was low.17 For example, a virtual reality simulator may place the participant in an environment that looks real; however, if participants are not required to use their hands the way they would in the OR (haptic feedback, space limitations, and so on), the simulator is high in physical fidelity but low in functional fidelity. These two subsets of fidelity should be carefully considered, so that the fidelity of simulation models can be adjusted to optimize learning based on the level of the learner and, ideally, enhance the transfer of skills to the clinical setting.

Conclusion

More work needs to be done regarding the transfer of skills from the simulation environment to the clinical setting in orthopedic surgery. Studies need to use reliable and consistent outcomes, include clear definitions of competence, and consider both physical and functional fidelity. Tools used to measure transfer of skills from the simulation laboratory to the clinical setting should be validated in the context in which they are being used, and the same aspects of performance should be measured in simulation and the clinical setting. Improving these aspects of simulation studies in orthopedic surgery will help determine whether skills can be transferred into the clinical environment, and will help training programs better assess the competence of their trainees.

Disclosure

The authors report no conflicts of interest in this work.

References

1. Bell RH Jr, Biester TW, Tabuenca A, et al. Operative experience of residents in US general surgery programs. Ann Surg. 2009;249(5):719–724.
2. Stirling ER, Lewis TL, Ferran NA. Surgical skills simulation in trauma and orthopaedic training. J Orthop Surg Res. 2014;9:126.
3. Reznick RK, MacRae H. Teaching surgical skills—changes in the wind. N Engl J Med. 2006;355(25):2664–2669.
4. Thomas GW, Johns BD, Marsh JL, Anderson DD. A review of the role of simulation in developing and assessing orthopaedic surgical skills. Iowa Orthop J. 2014;34:181–189.
5. Akhtar KS, Chen A, Standfield NJ, Gupte CM. The role of simulation in developing surgical skills. Curr Rev Musculoskelet Med. 2014;7(2):155–160.
6. Brydges R, Carnahan H, Rose D, Rose L, Dubrowski A. Coordinating progressive levels of simulation fidelity to maximize educational benefit. Acad Med. 2010;85(5):806–812.
7. Aim F, Lonjon G, Hannouche D, Nizard R. Effectiveness of virtual reality training in orthopaedic surgery. Arthroscopy. 2016;32(1):224–232.
8. Akhtar K, Sugand K, Wijendra A, et al. The transferability of generic minimally invasive surgical skills: is there crossover of core skills between laparoscopy and arthroscopy? J Surg Educ. 2016;73(2):329–338.
9. van der Meijden OA, Schijven MP. The value of haptic feedback in conventional and robot-assisted minimal invasive surgery and virtual reality training: a current review. Surg Endosc. 2009;23(6):1180–1190.
10. Burns GT, King BW, Holmes JR, Irwin TA. Evaluating internal fixation skills using surgical simulation. J Bone Joint Surg Am. 2017;99(5):e21.
11. Sonnadara R, Mui C, McQueen S, et al. Reflections on competency-based education and training for surgical residents. J Surg Educ. 2014;71(1):151–158.
12. Cannon WD, Garrett WE Jr, Hunter RE, et al. Improving residency training in arthroscopic knee surgery with use of a virtual-reality simulator: a randomized blinded study. J Bone Joint Surg Am. 2014;96(21):1798–1806.
13. Howells NR, Gill HS, Carr AJ, Price AJ, Rees JL. Transferring simulated arthroscopic skills to the operating theatre: a randomised blinded study. J Bone Joint Surg Br. 2008;90(4):494–499.
14. Dunn JC, Belmont PJ, Lanzi J, et al. Arthroscopic shoulder surgical simulation training curriculum: transfer reliability and maintenance of skill over time. J Surg Educ. 2015;72(6):1118–1123.
15. Waterman BR, Martin KD, Cameron KL, Owens BD, Belmont PJ Jr. Simulation training improves surgical proficiency and safety during diagnostic shoulder arthroscopy performed by residents. Orthopedics. 2016;39(3):e479–e485.
16. Hodgins JL, Veillette C. Arthroscopic proficiency: methods in evaluating competency. BMC Med Educ. 2013;13:61.
17. Norman G, Dore K, Grierson L. The minimal relationship between simulation fidelity and transfer of learning. Med Educ. 2012;46(7):636–647.
18. Boutefnouchet T, Laios T. Transfer of arthroscopic skills from computer simulation training to the operating theatre: a review of evidence from two randomised controlled studies. SICOT J. 2016;2:4.
19. Frank RM, Erickson B, Frank JM, et al. Utility of modern arthroscopic simulator training models. Arthroscopy. 2014;30(1):121–133.
20. Vaughan N, Dubey VN, Wainwright TW, Middleton RG. Does virtual-reality training on orthopaedic simulators improve performance in the operating room? Paper presented at: Science and Information Conference; July 28–30, 2015; London, UK.
21. Modi CS, Morris G, Mukherjee R. Computer-simulation training for knee and shoulder arthroscopic surgery. Arthroscopy. 2010;26(6):832–840.
22. Hetaimish B, Elbadawi H, Ayeni OR. Evaluating simulation in training for arthroscopic knee surgery: a systematic review of the literature. Arthroscopy. 2016;32(6):1207–1220.e1.
23. Slade Shantz JA, Leiter JR, Gottschalk T, MacDonald PB. The internal validity of arthroscopic simulators and their effectiveness in arthroscopic education. Knee Surg Sports Traumatol Arthrosc. 2014;22(1):33–40.
24. Tay C, Khajuria A, Gupte C. Simulation training: a systematic review of simulation in arthroscopy and proposal of a new competency-based training framework. Int J Surg. 2014;12(6):626–633.
25. Schmidt HG, Norman GR, Boshuizen HP. A cognitive perspective on medical expertise: theory and implications. Acad Med. 1990;65(10):611–621.
