Journal of Pain Research, Volume 16
Augmented Reality-Assisted Navigation System for Transforaminal Epidural Injection
Authors: Jun EK, Lim S, Seo J, Lee KH, Lee JH, Lee D, Koh JC
Received 9 December 2022
Accepted for publication 7 March 2023
Published 17 March 2023, Volume 2023:16, Pages 921–931
Checked for plagiarism Yes
Review by Single anonymous peer review
Peer reviewer comments 2
Editor who approved publication: Professor Krishnan Chakravarthy
Eun Kyung Jun,1,* Sunghwan Lim,2,* Joonho Seo,3 Kae Hong Lee,1 Jae Hee Lee,1 Deukhee Lee,2 Jae Chul Koh1
1Department of Anesthesiology and Pain Medicine, Korea University Anam Hospital, Seoul, Korea; 2Center for Healthcare Robotics, Artificial Intelligence and Robotics Institute, Korea Institute of Science and Technology, Seoul, Korea; 3Department of Medical Assistant Robot, Korea Institute of Machinery and Materials, Daegu, Korea
*These authors contributed equally to this work
Correspondence: Deukhee Lee, Center for Bionics, Korea Institute of Science and Technology, Hwarangno 14-gil 5, Seongbuk-gu, Seoul, 136-791, Republic of Korea, Tel +82-2-958-5633, Fax +82-2-920-2275, Email [email protected]; Jae Chul Koh, Department of Anesthesiology and Pain Medicine, Korea University Anam Hospital, 73, Goryeodae-ro, Seongbuk-gu, Seoul, 02841, Korea, Tel +82-2-920-5632, Fax +82-2-920-2275, Email [email protected]
Purpose: Multiple studies have attempted to demonstrate the benefits of augmented reality (AR)-assisted navigation systems in surgery. Lumbosacral transforaminal epidural injection is an effective treatment commonly used in patients with radiculopathy due to spinal degenerative pathologies. However, few studies have applied AR-assisted navigation systems to this procedure. This study aimed to investigate the safety and effectiveness of an AR-assisted navigation system for transforaminal epidural injection.
Patients and Methods: Using a real-time tracking system connected over a wireless network to a head-mounted display, computed tomography images of the spine and the planned path of a spinal needle to the target were visualized on a torso phantom equipped with simulated respiratory movement. From L1/L2 to L5/S1, needle insertions were performed using the AR-assisted system on the left side of the phantom, and the conventional method was performed on the right side.
Results: The procedure duration in the experimental group was approximately one-third of that in the control group, and fewer radiographs were required. The distance from the needle tips to the planned target areas showed no significant difference between the two groups (AR group 1.7 ± 2.3 mm, control group 3.2 ± 2.8 mm; P = 0.067).
Conclusion: An AR-assisted navigation system may be used to reduce the time required for spinal interventions and ensure the safety of patients and physicians in view of radiation exposure. Further studies are essential to apply AR-assisted navigation systems to spine interventions.
Keywords: interventional procedure, epidural injection, augmented reality, navigation system, radiation exposure
Introduction
Emerging augmented reality (AR) technology has led to increasing interest in applying AR-assisted navigation systems in surgery.1 When applied in surgery, three-dimensional (3D) patient-specific computed tomography (CT) images are overlaid directly on the surgical field. Surgical planning is performed beforehand, and the targets are also shown in the AR images of the patient. Combining a real-time tracking system that transmits the spatial information of objects to a head-mounted display (HMD) or device screen enables an AR-assisted navigation system to assist physicians in both surgical planning and execution. A straightforward workflow with an AR system requires quick and accurate tracking and data processing.2 With the development of equipment and technology, AR systems have been evaluated for their usefulness in various fields without perceptible technical latency.
Recent studies have attempted to demonstrate the benefits of using AR-assisted systems in various medical fields including orthognathic surgery,3 plastic surgery,4 spine surgery,1,3–5 breast surgery,6 neurosurgery,7 and training tools for a medical procedure.8 Improvements in the clinical outcomes and safety of both patients and physicians in terms of radiation exposure are purported benefits of the system.9–12
Transforaminal epidural injection is an effective treatment modality for patients with lumbar radiculopathy secondary to spinal degenerative diseases. The procedure constitutes a large proportion of all spinal interventions performed for patients with back pain.13 It involves injecting drugs into the anterior epidural space, where nerve inflammation elicits pain in numerous spinal disorders. Thus, injectates must be delivered accurately to the target to achieve effective treatment outcomes. For accuracy and safety, the procedure is usually performed under fluoroscopic guidance. However, with only two-dimensional (2D) fluoroscopic images, the procedure is challenging and fails in some patients with spinal deformities. Inaccurate needle placement is inevitable because the needle's location is assessed indirectly on X-ray images. In addition, difficult eye-hand coordination hinders accurate performance and complicates effective training.14
We assumed that planning before a procedure and directing the needle to the target area along the path identified by the AR-assisted navigation system could increase the accuracy of needle placement and reduce the overall procedure time. Therefore, it may be possible to reduce radiation exposure to both patients and physicians. However, although studies have reported on AR-assisted navigation systems in spine surgery,1,3–5 we could not find previous research on its application in procedures using small-diameter needles such as transforaminal epidural injection.
This experimental study aimed to investigate the reduction in procedure duration and radiation exposure achievable with an AR-assisted navigation system for transforaminal epidural injection.
Materials and Methods
The software and system used in this study were developed in cooperation with several institutions: Korea University College of Medicine (Seoul, Republic of Korea); Korea Institute of Science and Technology (KIST); Korea Institute of Machinery and Materials (KIMM); META SYSTEMS CO., LTD (Seongnam-si, Republic of Korea); and DIGITEK CO., LTD (Seoul, Republic of Korea). Verification was conducted at Korea University Anam Hospital. The project was conducted over 5 years through the Technology Development Program for AI-Bio-Robot-Medicine Convergence.
AR-Assisted Navigation System for Transforaminal Epidural Injection
We developed an AR-assisted navigation system for needle placement. Figure 1 shows the components of the system. The AR-assisted navigation system includes three components: 1) virtual reality (VR)-based planning and navigation software, 2) a patient and tool tracking system consisting of three optical trackers, and 3) an HMD (HoloLens 2, Microsoft, USA) and its software for AR-based navigation. The overall workflow of the AR-assisted transforaminal epidural injection using this system is described below.
A patient underwent a CT scan before the procedure, and the CT image of the patient was acquired in the DICOM format. Four skin markers made of a retroreflective sheet and thin copper sheets were attached to the back surface of the patient before the CT scan. These markers were utilized for performing patient-image registration and tracking the movement of the patient in real-time during the procedure.
We developed a VR-based planning and navigation software using C/C++ programming language, incorporating various open-source libraries including The Visualization Toolkit (VTK, version 9.1.0, https://vtk.org), Insight Toolkit (ITK, version 5.2.1, https://itk.org), Grassroots DICOM (GDCM, version 3.0.10, https://gdcm.sourceforge.net), Open Computer Vision (OpenCV, version 4.5.4, https://opencv.org), and Qt (version 6.2.2, https://www.qt.io). The software was created to manage the complete data pipeline outlined in this work, and no other software tools were utilized. The software program allowed the user to reconstruct 3D volumes from 2D DICOM series. 3D polygonal models of the lumbar spines and skin markers of the patient were then generated from the 3D volumes using the isosurface extraction function implemented in the planning software, as shown in Figure 2.
Figure 2 Virtual reality-based planning and navigation software showing a preoperative plan.
The target areas were assigned to the epidural spaces where the transforaminal epidural injection is performed. Size-modifiable cuboids were initially located at each epidural space, and the margins of the cuboids were adjusted by the physician so that the areas lay within the pedicle margin and the anterior half of the pedicle in the coronal view of a vertebra, as shown in Figure 2. The target points were then defined at the centers of the target areas and visualized during needle insertion under the guidance of the AR-assisted navigation system. An example of the final planning results, including the target areas and the polygonal models of the lumbar spine and skin markers, is shown in Figure 2.
An optical tracking system15 developed by our team was used for real-time patient and needle tracking. The tracking system comprised three optical trackers, a reference marker, a cube marker attached to the epidural needle, and four skin markers attached to the back skin of the patient. The three optical trackers were installed at three different locations on the ceiling so that all trackers were oriented toward the same procedural region from different angles. This configuration helped overcome the line-of-sight restriction of a conventional optical localizer: the positions of the patient and epidural needle could still be tracked even if the markers were occluded from one or two of the trackers.
The reference marker was manufactured to have a square shape with dimensions of 100 mm x 100 mm. A QR code, which possesses patient information, was printed on the top surface of the marker, and four retroreflective markers were placed at the boundary of the marker, as shown in Figure 3A and B. The QR code and four retroreflective markers were independently recognized by the HMD and the trackers, respectively. Because the QR code and four retroreflective markers were designed to share the same coordinate system, the relationship between the coordinate systems of the HMD and trackers can be integrated through the reference marker. Furthermore, because the patient and needle positions were always relatively measured with respect to the coordinate system of the reference marker, the positions of the trackers attached to the ceiling could be freely adjusted during the procedure to secure the best location without affecting the tracking accuracy of the system. This relative position calculation approach allowed the system to be robust against accidental movements of the trackers during the procedure.
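The relative-position approach described above, in which all poses are expressed with respect to the reference marker so the ceiling trackers can be repositioned freely, can be sketched with homogeneous transforms. The following is a minimal illustration, not the authors' software; all matrices and values are hypothetical:

```python
import numpy as np

def make_pose(R, t):
    """Build a 4x4 homogeneous transform from rotation R (3x3) and translation t (3,)."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def relative_pose(T_tracker_ref, T_tracker_tool):
    """Pose of the tool expressed in the reference-marker frame."""
    return np.linalg.inv(T_tracker_ref) @ T_tracker_tool

# Hypothetical tracker-frame measurements of the reference marker and the needle
Rz = lambda a: np.array([[np.cos(a), -np.sin(a), 0.0],
                         [np.sin(a),  np.cos(a), 0.0],
                         [0.0, 0.0, 1.0]])
T_ref  = make_pose(Rz(0.3), [100.0, 50.0, 200.0])
T_tool = make_pose(Rz(1.1), [120.0, 40.0, 180.0])
T_rel  = relative_pose(T_ref, T_tool)

# If the tracker itself is moved (rigid motion M), both measurements change,
# but the relative pose does not: inv(M@T_ref) @ (M@T_tool) cancels M.
M = make_pose(Rz(-0.7), [5.0, -3.0, 2.0])
T_rel_moved = relative_pose(M @ T_ref, M @ T_tool)
assert np.allclose(T_rel, T_rel_moved)
```

Because the tracker motion M cancels in the relative pose, repositioning a tracker during the procedure does not disturb the tracked needle and patient poses, which is the robustness property the paragraph describes.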
Figure 3 Retroreflective markers for patient and tool tracking. (A) skin markers on the patient mock-up, (B) a reference marker, and (C) a cube marker for needle tracking.
The epidural needle’s position was estimated and tracked using a cube marker attached to the needle, as shown in Figure 3A and C. The cube had 20 retroreflective markers on its five faces. Each face of the cube marker included four retroreflective markers to estimate the cube marker’s position from an extensive range of views in real-time. The reference marker’s position was measured together with the cube marker’s position to calculate the relative position of the cube marker with respect to the coordinate system of the reference marker. The relationship between the needle and the cube marker was obtained using the information from the design parameters of the cube marker and needle.
The patient’s movements were estimated and tracked in real-time using the four markers attached to the patient’s back skin. The relative positions of the skin markers with respect to the coordinate system of the reference marker could be calculated by tracking the positions of the reference marker and the four skin markers. The tracked positions of the four skin markers were then matched, via real-time point-to-point registration, to the positions of the same markers extracted from the CT image data. Finally, the registration matrix, which defines the position of the CT image data with respect to the coordinate system of the reference marker, was updated in real-time by repeating the point-to-point registration. The registration matrix enabled the navigation software to transform the target points of the epidural needles, defined in the CT image coordinate system, into the reference marker coordinate system, so that the navigation system visualized the current positions of the needle and target points in real-time.
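The point-to-point registration described above computes a rigid transform aligning the tracked skin-marker positions with the same markers extracted from the CT. A common way to solve this paired-point problem is the SVD-based Kabsch method; the sketch below is illustrative (not the authors' implementation) and recovers a known rigid motion from four hypothetical marker positions:

```python
import numpy as np

def point_to_point_registration(P, Q):
    """
    Least-squares rigid transform (R, t) mapping paired points P onto Q,
    both (N, 3) arrays (SVD/Kabsch method, no scaling).
    """
    cP, cQ = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cP).T @ (Q - cQ)               # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection solution
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cQ - R @ cP
    return R, t

# Four hypothetical skin-marker centers (mm) in the CT frame ...
rng = np.random.default_rng(0)
P = rng.uniform(-50.0, 50.0, size=(4, 3))
# ... observed in the reference-marker frame after a known rigid motion
theta = 0.4
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0, 0.0, 1.0]])
t_true = np.array([10.0, -5.0, 30.0])
Q = P @ R_true.T + t_true

R_est, t_est = point_to_point_registration(P, Q)
```

Repeating this fit on every tracking frame yields the continuously updated registration matrix the paragraph refers to; four non-collinear markers are the practical minimum for a stable 3D fit.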
The relative positions of the epidural needle and target points with respect to the coordinate system of the reference marker were transmitted from the navigation software to the HMD device through the wireless network. We built a Unity® (version 20.3.30f, http://unity.com)-based program that enabled the HMD device to recognize the position of the QR code in real-time during navigation and to visualize the 3D models of the lumbar spine, target points, and epidural needle on the back skin of the patient. The locations of the lumbar spine, target points, and epidural needle were updated in real-time using the position information transmitted from the navigation software. Under the guidance of this AR-based information, the physician inserted the epidural needle toward the target point.
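The paper does not specify the wire format used between the navigation software and the HMD. Purely as an illustration of what streaming a pose update over a wireless network can look like, the sketch below serializes a 4x4 pose matrix into a fixed-size datagram payload; the format, function names, and use of UDP are assumptions, not the authors' protocol:

```python
import socket
import struct
import numpy as np

POSE_FMT = "<16d"  # 4x4 homogeneous pose matrix, row-major, little-endian doubles

def pack_pose(T):
    """Serialize a 4x4 pose matrix into a 128-byte payload."""
    return struct.pack(POSE_FMT, *np.asarray(T, dtype=float).ravel())

def unpack_pose(data):
    """Restore a 4x4 pose matrix from a received payload."""
    return np.array(struct.unpack(POSE_FMT, data)).reshape(4, 4)

def send_pose(sock, T, addr):
    """Send one pose update; addr is the HMD's (host, port), a placeholder here."""
    sock.sendto(pack_pose(T), addr)

# Round-trip check without a network
T = np.arange(16, dtype=float).reshape(4, 4)
assert np.array_equal(unpack_pose(pack_pose(T)), T)
```

Each update is 128 bytes (16 little-endian doubles), small enough to stream per tracking frame with low latency, which matches the real-time update behavior described above.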
A Patient Model
For the patient model, a torso phantom was produced by Dr. Koh, one of the corresponding authors, by three-dimensional printing of a bone framework reconstructed from human CT and MRI imaging data.16 The size of the phantom was close to that of a real patient, since the structure of the spine was replicated from the imaging data. A respiratory simulator (Figure 4A) was developed to create an environment similar to an actual human procedure. A 100–400 mL bag (Figure 4B) was placed under the lower thoracic vertebrae of the phantom. The phantom moved up and down as air filled and emptied the bag, mimicking a patient’s breathing movements.
Figure 4 Respiratory simulator. (A) A torso phantom with a respiratory simulator and (B) a 100–400 mL bag to mimic a patient’s breathing movements.
An anesthesiologist with five years of experience in pain medicine conducted the experimental procedure. The investigator had no experience using the AR system, so a user manual was provided beforehand. A 25-gauge spinal anesthesia needle (UNISIS CORP., Japan) was used. In the control group, needles were inserted from the L1/L2 neural foramen to the L5/S1 neural foramen on the right side of the phantom. Five needles were inserted using the conventional transforaminal epidural injection method, as shown in Figure 5A. In the AR group, five other needles were inserted on the left side of the phantom using the AR-assisted navigation system (Figure 5B). In both groups, fluoroscopic images were obtained to identify the location of the needle (Figure 5C). The final position of the needle tip was the anterior epidural space, confirmed on anteroposterior (AP) and lateral fluoroscopic views. As shown in Figure 5D, the reference marker, the lumbar spine 3D model, target points, and the epidural needle were continuously tracked in the AR group.
The procedure time was divided into T1 and T2. T1 was defined as the time from the first fluoroscopic image to the insertion of the needle into the skin. T2 was defined as the time from needle insertion to reaching the target area and confirming the final placement of the needle. T1, T2, and the number of radiographs taken were recorded for both groups.
The secondary outcome was the targeting error to evaluate the accuracy of the procedure. The targeting error was defined as the shortest distance between the final point of the needle tip and the target area designated during preoperative planning.
After the procedure, a CT image of the phantom was taken with the needles left in place at each target. The 3D polygonal models of the needles were then extracted from the 3D CT volume data, and the tip positions of the needles were determined by selecting the apices of the polygonal models. A rigid-body transformation matrix between the coordinate systems of the postoperative and preoperative CT images was calculated using the center positions of the four skin markers extracted from each volume. Using the transformation matrix, the 3D polygonal models and tip positions of the needles were transformed into the preoperative planning space and visualized together with the target areas, as shown in Figure 6. Targeting errors were estimated by measuring the shortest distance between the needle tip and the corresponding target area. The average targeting errors were calculated for the AR-guided and control groups.
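Since the target areas were planned as cuboids, the targeting-error measurement above amounts to mapping each needle tip into the planning space and taking its shortest distance to a box (zero when the tip lies inside the target). A minimal sketch under that assumption, with hypothetical coordinates in mm and an axis-aligned box for simplicity:

```python
import numpy as np

def transform_point(R, t, p):
    """Map a postoperative-CT point into the preoperative planning space."""
    return R @ np.asarray(p, dtype=float) + t

def point_to_cuboid_distance(p, box_min, box_max):
    """Shortest distance from point p to an axis-aligned cuboid; 0 if p is inside."""
    p = np.asarray(p, dtype=float)
    closest = np.clip(p, box_min, box_max)  # nearest point on/inside the box
    return float(np.linalg.norm(p - closest))

# Hypothetical planned target area and needle-tip positions (mm)
box_min = np.array([-5.0, -5.0, -5.0])
box_max = np.array([ 5.0,  5.0,  5.0])
tip_inside  = np.array([1.0, 2.0, -3.0])   # inside the target -> error 0
tip_outside = np.array([8.0, 0.0,  0.0])   # 3 mm beyond the +x face
```

Averaging this distance over the five needles of each group gives per-group targeting errors of the kind reported in Table 2.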
Figure 6 Targeting error analysis. The targeting error was defined as the shortest distance between the final point of the needle tip (yellow spheres) and the target area designated in the preoperative planning.
Data were summarized as mean ± SD. Statistical analysis was performed using the SPSS statistical software (version 18.0; SPSS Inc., Chicago, Illinois, USA). Variables did not follow normal distributions according to the Shapiro–Wilk test. Thus, the Mann–Whitney test was used to compare the outcomes of the two groups. Statistical significance was set at a p-value < 0.05.
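The analysis pipeline above (Shapiro–Wilk normality check, then a Mann–Whitney U comparison at p < 0.05) can be sketched with SciPy. The targeting-error values below are illustrative placeholders, not the study's raw measurements:

```python
import numpy as np
from scipy import stats

# Illustrative per-needle targeting errors (mm) -- NOT the study's data
ar_group      = np.array([0.5, 0.9, 1.2, 2.1, 3.8])
control_group = np.array([1.8, 2.5, 3.1, 4.0, 4.6])

# Shapiro-Wilk normality check for each group
_, p_norm_ar = stats.shapiro(ar_group)
_, p_norm_ctrl = stats.shapiro(control_group)

# Non-parametric two-sided comparison of the two groups
u_stat, p_value = stats.mannwhitneyu(ar_group, control_group,
                                     alternative="two-sided")
significant = p_value < 0.05
```

With only five needles per group, a non-parametric test such as Mann–Whitney is a reasonable default even when the Shapiro–Wilk test does not reject normality.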
Results
Table 1 lists the durations of the procedures and the number of fluoroscopic images taken during each procedure. The mean time required to select a target point, recorded as T1, was 5.9 ± 3.8 s in the AR group and 44.7 ± 36.4 s in the control group. The duration of T2 was 46.8 ± 13.5 s in the AR group and 128.3 ± 66.7 s in the control group. Overall, the mean duration of the procedure in the AR-guided group was approximately one-third of that in the control group. In addition, fewer radiographs were required for needle placement in the AR group than in the control group; in the AR group, only one radiograph was required to determine the target point. The targeting errors were 1.70 ± 2.32 mm in the AR group and 3.20 ± 2.77 mm in the control group, with no statistically significant difference between the groups (p = 0.067) (Table 2).
Table 1 Time Duration and Number of X-Rays Required for the Needle Insertion in the AR-Guided Group and the Control Group
Table 2 Targeting Error in Distance
Discussion
In this study, we developed an AR-assisted navigation system for transforaminal epidural injection. To our knowledge, this is the first study to investigate the efficacy of an AR-assisted navigation system in an interventional procedure other than surgery. The results suggest that an AR-assisted navigation system can reduce the time required for, and improve the quality of, spinal interventions. Furthermore, the system is expected to contribute to the safety of patients and medical staff with regard to radiation exposure.
The duration of the procedure and the number of radiographs required were significantly reduced in the AR-guided group. AR-assisted navigation enables the physician to determine the needle insertion point with minimal radiation exposure, and the needle position does not require constant fluoroscopic monitoring while it is advanced to the target epidural space. Consequently, the overall radiation exposure to physicians can be reduced. All needle tips were placed within an acceptable targeting area.
Several studies have demonstrated shorter procedure durations and significantly less radiation exposure using AR-assisted systems. In particular, Edström et al and Peh et al reported minimal or no radiation exposure to staff during the procedure. In terms of procedural accuracy, outcomes were comparable between the AR-assisted and freehand groups.17,18 The present study’s results are consistent with those of published research.
We previously reported on tool insertion using an AR-assisted system.19 However, compared with that study, which used large, rigid 14-gauge needles, the present study was more challenging because it used easily bent 25-gauge needles. Even though the bent portion of the needle could not be continuously observed during the procedure, this study demonstrated that procedure duration and radiation exposure could still be reduced by guiding only the needle entry position and direction with the system.
Spinal intervention requires an accurate understanding of 3D spine anatomy. For this reason, the procedure is performed under imaging guidance,20 with C-arm fluoroscopy-guided interventions being the most widely performed. Accurate needle placement, delivery of injectates to the correct target location, and the absence of intravascular uptake must all be verified by C-arm fluoroscopy. Thus, radiation exposure during the procedure is unavoidable.21
Radiation exposure is associated with leukemia, solid cancer, cataract, and tissue damage in the brain and breast tissue.22–27 In addition, concerns are being raised about cardiovascular, cerebrovascular, and dermal damage.28 The incidence of cataract and cancer in workers with occupational radiation exposure is significant and relatively higher.21,29
To reduce radiation exposure to medical staff during the procedure, the main principles to follow are “time”, “distance”, and “shielding”:21 the irradiation time should be minimized, and the procedure should be performed with the minimum number of fluoroscopic images. Through AR guidance, radiation exposure during the procedure can be significantly reduced, protecting both the physician and the patient.
Another limitation of current image-guided intervention is poor eye-hand coordination. During a fluoroscopy-guided procedure, the physician cannot see the needle tip in real-time as the needle advances; physicians therefore depend on the images shown on the fluoroscopy monitor to assess the position of the tool and manipulate it accordingly. This is an important factor influencing the quality of performance and the duration of the procedure. With the AR-assisted navigation system, the patient’s anatomy was evaluated and the procedure was planned based on 3D CT. The system increases the accuracy of the procedure, even in patients with anatomical variations that can be missed on 2D fluoroscopic images. Thorough planning and assessment reduce the number of needle manipulations and misdirections of the needle, and might lead to less damage to the surrounding structures. In addition, with the HMD-based AR, needle manipulation is performed in the same field of view as the AR anatomical image while looking directly at the patient’s back. The planned target point was directly visualized, making intuitive needle insertion possible. These factors could ease the difficulty of the procedure and hence improve the quality of performance.
Not only the physician but also the patients would benefit from reduced procedure time and radiation exposure. During the procedure, patients are instructed to maintain a prone position, which may be difficult in patients with severe spinal diseases. When AR-assisted navigation is applied in real medical environments, patient satisfaction with the procedure is expected to improve.
The AR-assisted navigation system is also beneficial for trainees who need more experience with the procedure. Although trainees must master the relevant anatomy to perform the procedure, an intuitive procedure becomes possible when the patient’s CT images and anatomical structures are visible directly on the patient. The learning curve for the procedure should therefore be shorter, and errors should be reduced when performing the procedure on a patient, benefiting both the trainee and the patient. Before performing a procedure on a patient, sufficient practice with the AR system on the phantom will be necessary. The manual for the AR system is simple enough for trainees to begin practicing immediately.
An optical-see-through HMD has a clear advantage in hand-eye coordination during navigation-assisted needle insertion.30 The optical-see-through HMD provides an intuitive navigation view augmented on the needle insertion site directly, while the monitor-based method provides navigational information on the remote monitor, causing poor awareness of the needle insertion field. Despite this obvious advantage, a small field of view and AR visualization obscuring the needle tip are the limitations of the optical-see-through HMD-based approach.
A line-of-sight problem is a well-known drawback of optical 3D localizers.31 Occluding optical markers on the patient or the needle may yield a critical situation during the procedure. We propose the use of multiple optical trackers attached to the ceiling so that all the trackers look at the insertion field from different angles. We confirmed that this configuration of optical trackers is effective in expanding the field of view of the trackers. The multifaced cube marker is also an essential element, enabling a multitracker-based tracking system.
The AR system is a recent technology and could be too expensive for many centers. However, the underlying technologies and devices are constantly improving and becoming simpler and easier to reproduce, so the cost of applying AR to the procedure will become less burdensome. Artificial intelligence (AI) is a field of great interest, and a growing body of research on its emerging applications in medicine has been published. A recent review suggested that AI plays an important role in every step from diagnosis to postoperative care in neurosurgery.32 Through AI, resources could be allocated efficiently, saving time and enhancing the performance of healthcare workers in treating patients. Hence, overall medical costs could be reduced while maintaining high-quality healthcare. As AI technology becomes widespread and well adapted to hospital systems, applying AR systems in medical procedures and operations will become affordable in many centers.
This study had several limitations. First, the procedures were performed by a single physician, and the ability to adapt to a particular system may differ among physicians; a study including physicians with different experience levels may demonstrate its usefulness more clearly. Second, the simulation was performed on a phantom. Because the softness of tissues and muscles differs between phantoms and humans, redirecting the needle beneath the skin would not have matched the conditions in a real patient. In addition, transforaminal epidural injection is usually performed in adults with degenerative changes; since each patient has a different anatomical structure and procedural difficulty varies with degenerative changes, these discrepancies could influence procedural outcomes. Applying the AR system to only one model is another limitation of this study. Third, because each method was performed on only one side of the spine model, side-dependent differences might not have been reflected. However, the physician who performed the procedure reported no difference in time or accuracy depending on the side, so we believe this did not significantly affect our results.
Conclusion
The AR-assisted navigation system could improve the quality of spine interventions by reducing the time requirement and radiation exposure. Our study can serve as a starting point for future research. Large-scale multicenter studies are needed to compare outcomes obtained by physicians with different skills and experience. Ultimately, the benefits of AR-assisted procedures must be demonstrated in real clinical settings. Therefore, further studies in patients are essential to better understand the effectiveness and safety of AR-assisted navigation systems for epidural interventions, and safety issues regarding the AR system should be addressed before clinical adoption.
Acknowledgments
We would like to thank Editage for the English language editing.
Funding
This study was financially supported by the Ministry of Trade Industry & Energy (MOTIE, Korea), Ministry of Science & ICT (MSIT, Korea), and Ministry of Health & Welfare (MOHW, Korea) under the Technology Development Program for AI-Bio-Robot-Medicine Convergence (20001655).
Disclosure
The authors report no conflicts of interest in this work.
References
1. Ghaednia H, Fourman MS, Lans A, et al. Augmented and virtual reality in spine surgery, current applications and future potentials. Spine J. 2021;21(10):1617–1625. doi:10.1016/j.spinee.2021.03.018
2. Casari FA, Navab N, Hruby LA, et al. Augmented reality in orthopedic surgery is emerging from proof of concept towards clinical studies: a literature review explaining the technology and current state of the art. Curr Rev Musculoskelet Med. 2021;14(2):192–203. doi:10.1007/s12178-021-09699-3
3. Ayoub A, Pulijala Y. The application of virtual reality and augmented reality in oral & maxillofacial surgery. BMC Oral Health. 2019;19(1):238. doi:10.1186/s12903-019-0937-8
4. Kim Y, Kim H, Kim YO. Virtual reality and augmented reality in plastic surgery: a review. Arch Plast Surg. 2017;44(3):179–187. doi:10.5999/aps.2017.44.3.179
5. Liu Y, Lee MG, Kim JS. Spine surgery assisted by augmented reality: where have we been? Yonsei Med J. 2022;63(4):305–316. doi:10.3349/ymj.2022.63.4.305
6. Gouveia PF, Costa J, Morgado P, et al. Breast cancer surgery with augmented reality. Breast. 2021;56:14–17. doi:10.1016/j.breast.2021.01.004
7. Mofatteh M, Mashayekhi MS, Arfaie S, et al. Augmented and virtual reality usage in awake craniotomy: a systematic review. Neurosurg Rev. 2022;46(1):19. doi:10.1007/s10143-022-01929-7
8. da Silva D, Costa CB, da Silva NA, Ventura I, Leite FP, Lopes DS. Augmenting the training space of an epidural needle insertion simulator with HoloLens. Comput Methods Biomech Biomed Eng Imaging Vis. 2022;10(3):260–265. doi:10.1080/21681163.2021.2012833
9. Edström E, Burström G, Nachabe R, Gerdhem P, Elmi Terander A. A novel augmented-reality-based surgical navigation system for spine surgery in a hybrid operating room: design, workflow, and clinical applications. Oper Neurosurg. 2020;18(5):496–502. doi:10.1093/ons/opz236
10. Edström E, Burström G, Omar A, et al. Augmented reality surgical navigation in spine surgery to minimize staff radiation exposure. Spine. 2020;45(1):E45–E53. doi:10.1097/brs.0000000000003197
11. Elmi-Terander A, Burström G, Nachabé R, et al. Augmented reality navigation with intraoperative 3D imaging vs fluoroscopy-assisted free-hand surgery for spine fixation surgery: a matched-control study comparing accuracy. Sci Rep. 2020;10(1):707. doi:10.1038/s41598-020-57693-5
12. Elmi-Terander A, Burström G, Nachabe R, et al. Pedicle screw placement using augmented reality surgical navigation with intraoperative 3D imaging: a first in-human prospective cohort study. Spine. 2019;44(7):517–525. doi:10.1097/brs.0000000000002876
13. Manchikanti L, Knezevic NN, Navani A, et al. Epidural interventions in the management of chronic spinal pain: American Society of Interventional Pain Physicians (ASIPP) comprehensive evidence-based guidelines. Pain Physician. 2021;24(S1):S27–S28.
14. Wentink B. Eye-hand coordination in laparoscopy - an overview of experiments and supporting aids. Minim Invasive Ther Allied Technol. 2001;10(3):155–162. doi:10.1080/136457001753192277
15. Baek J, Noh G, Seo J. Development and performance evaluation of wireless optical position tracking system. J Inst Control Robot Syst. 2019;25(12):1065–1070. doi:10.5302/J.ICROS.2019.19.0190
16. Koh JC, Jang YK, Seong H, Lee KH, Jun S, Choi JB. Creation of a three-dimensional printed spine model for training in pain procedures. J Int Med Res. 2021;49(11):3000605211053281. doi:10.1177/03000605211053281
17. Peh S, Chatterjea A, Pfarr J, et al. Accuracy of augmented reality surgical navigation for minimally invasive pedicle screw insertion in the thoracic and lumbar spine with a new tracking device. Spine J. 2020;20(4):629–637. doi:10.1016/j.spinee.2019.12.009
18. Dennler C, Jaberg L, Spirig J, et al. Augmented reality-based navigation increases precision of pedicle screw insertion. J Orthop Surg Res. 2020;15(1):174. doi:10.1186/s13018-020-01690-x
19. Lim S, Ha J, Yoon S, et al. Augmented reality assisted surgical navigation system for epidural needle intervention. Annu Int Conf IEEE Eng Med Biol Soc. 2021;2021:4705–4708. doi:10.1109/embc46164.2021.9629804
20. Wang D. Image guidance technologies for interventional pain procedures: ultrasound, fluoroscopy, and CT. Curr Pain Headache Rep. 2018;22(1):6. doi:10.1007/s11916-018-0660-1
21. Park S, Kim M, Kim JH. Radiation safety for pain physicians: principles and recommendations. Korean J Pain. 2022;35(2):129–139. doi:10.3344/kjp.2022.35.2.129
22. Mahesh M. Fluoroscopy: patient radiation exposure issues. Radiographics. 2001;21(4):1033–1045. doi:10.1148/radiographics.21.4.g01jl271033
23. Hijikata Y, Kamitani T, Yamamoto Y, et al. Association of occupational direct radiation exposure to the hands with longitudinal melanonychia and hand eczema in spine surgeons: a survey by the society for minimally invasive spinal treatment (MIST). Eur Spine J. 2021;30(12):3702–3708. doi:10.1007/s00586-021-06973-3
24. Little MP, Cahoon EK, Kitahara CM, Simon SL, Hamada N, Linet MS. Occupational radiation exposure and excess additive risk of cataract incidence in a cohort of US radiologic technologists. Occup Environ Med. 2020;77(1):1–8. doi:10.1136/oemed-2019-105902
25. Andreassi MG, Piccaluga E, Guagliumi G, Del Greco M, Gaita F, Picano E. Occupational health risks in cardiac catheterization laboratory workers. Circ Cardiovasc Interv. 2016;9(4):e003273. doi:10.1161/circinterventions.115.003273
26. Yoshinaga S, Mabuchi K, Sigurdson AJ, Doody MM, Ron E. Cancer risks among radiologists and radiologic technologists: review of epidemiologic studies. Radiology. 2004;233(2):313–321. doi:10.1148/radiol.2332031119
27. Sont WN, Zielinski JM, Ashmore JP, et al. First analysis of cancer incidence and occupational radiation exposure based on the National Dose Registry of Canada. Am J Epidemiol. 2001;153(4):309–318. doi:10.1093/aje/153.4.309
28. Picano E, Vano E, Domenici L, Bottai M, Thierry-Chef I. Cancer and non-cancer brain and eye effects of chronic low-dose ionizing radiation exposure. BMC Cancer. 2012;12(1):157. doi:10.1186/1471-2407-12-157
29. Milacic S. Risk of occupational radiation-induced cataract in medical workers. Med Lav. 2009;100(3):178–186.
30. Heinrich F, Schwenderling L, Joeres F, Lawonn K, Hansen C. Comparison of augmented reality display techniques to support medical needle insertion. IEEE Trans Vis Comput Graph. 2020;26(12):3568–3575. doi:10.1109/tvcg.2020.3023637
31. Sherman WR, Craig AB. Chapter 6 - Presenting the virtual world. In: Sherman WR, Craig AB, editors. Understanding Virtual Reality.
32. Mofatteh M. Neurosurgery and artificial intelligence. AIMS Neurosci. 2021;8(4):477–495. doi:10.3934/Neuroscience.2021025