Advances in Medical Education and Practice, Volume 14

Utilization of Video Otoscopes for Otoscopy Skills Training of Third Year Medical Students

Authors Cavuoto Petrizzo M, Olvet DM, Samuels R, Paul A, John JT, Pawelczak M, Steiner SD

Received 22 November 2022

Accepted for publication 6 April 2023

Published 12 April 2023, Volume 2023:14, Pages 363–369

DOI https://doi.org/10.2147/AMEP.S396046


Editor who approved publication: Dr Md Anwarul Azim Majumder



Marie Cavuoto Petrizzo,1 Doreen M Olvet,2 Roya Samuels,3 Aleena Paul,4 Janice T John,5 Melissa Pawelczak,1 Shara D Steiner6

1Departments of Science Education and Pediatrics, Zucker School of Medicine, Hempstead, NY, USA; 2Department of Science Education, Zucker School of Medicine, Hempstead, NY, USA; 3Department of Pediatrics, Zucker School of Medicine, Hempstead, NY, USA; 4Departments of Pediatrics and Family and Community Medicine, New York Medical College, Valhalla, NY, USA; 5Department of Pediatrics, Albert Einstein College of Medicine, Bronx, NY, USA; 6Specialized Programs in Education, Zucker School of Medicine, Hempstead, NY, USA

Correspondence: Marie Cavuoto Petrizzo, Departments of Science Education and Pediatrics, Zucker School of Medicine, 500 Hofstra University, W227, Hempstead, NY, 11549, USA, Tel +1 516 463-7476, Fax +1 516 463-5631, Email [email protected]

Purpose: Effective teaching and assessment of otologic examinations are challenging. Current methods of teaching otoscopy using traditional otoscopes have significant limitations. We hypothesized that use of all-in-one video otoscopes provides students with an opportunity for real-time faculty feedback and re-practicing of skills, increasing self-reported confidence.
Methods: An otoscopy microskills competency checklist was provided to third-year medical students during their pediatric clerkship to self-assess otoscopy technique during patient examinations, and to clinical preceptors to assess and provide feedback during exams. Over the course of two years, we collected data from students randomly assigned to train on a video otoscope or a traditional otoscope during the clerkship. Pre- and post-clerkship surveys measured confidence in performing otoscopy microskills, making a diagnosis and documentation of findings. For those students who trained on the video otoscope, we solicited post-clerkship feedback on the experience of using a video otoscope.
Results: Pre-clerkship confidence did not differ between the groups, but the video otoscope trained group had significantly higher scores than the traditional otoscope trained group on all self-reported technical and diagnostic microskills confidence items post-clerkship. Students trained on video otoscopes showed a significant increase in confidence on all microskills items (p-values < 0.001), whereas confidence in the traditional otoscope trained group did not change over time (p-values > 0.10). Qualitative feedback from the video otoscope trained group reflected positive experiences regarding “technique/positioning” and “feedback from preceptors”.
Conclusion: Teaching otoscopy skills to pediatric clerkship medical students using a video otoscope significantly enhanced confidence compared to training on a traditional otoscope by (1) enabling preceptors and students to simultaneously visualize otoscopy findings, (2) allowing preceptors to provide real-time feedback, and (3) providing opportunity for deliberate practice of microskills. We encourage the use of video otoscopes to augment student confidence and self-efficacy when training in otoscopy.

Keywords: medical students, technology, physical diagnosis, otoscopy

Introduction

Effective otoscopy teaching and assessment are challenging. Translation of otoscopy skills from classroom to bedside is typically assumed and rarely assessed1 because current teaching methods do not allow instructors to visualize ear structures simultaneously with students and determine whether students can recognize and delineate pathology in real-time. This impedes faculty’s ability to provide high-quality feedback and ensure competence. These limitations can result in students having difficulty performing skills adequately, struggling to obtain views, or misinterpreting findings. Not surprisingly, only 5% of students completing their third year of medical school felt confident in performing otoscopy.2 Competence in otoscopy prior to residency is vital, as approximately 30% of medical students enter primary care residencies.3 The compounding effects of inferior training and a high prevalence of primary care visits for otologic concerns can result in poor outcomes, including misdiagnoses, patient morbidity, the overuse of antibiotics, and increased medical expenditures.4–6

Mastery in otoscopic technical and diagnostic skills requires hands-on experience with the otoscope device and the repetition of seeing and interpreting findings.7 However, the quantity and quality of technical and diagnostic skills practice with effective formative feedback during clinical education is variable. To address these issues, we suggest an approach that includes skills instruction using video otoscopes to enhance opportunities for faculty feedback.

Otoscopy training methods previously reported for medical students include task trainers/otoscopic simulators and smartphone otoscopes.5,6,8–17 Unlike smartphone otoscopes, all-in-one handheld video otoscopes (video otoscopes) are similar in design and feel to traditional otoscopes, allowing practice with dexterity. Unlike task trainers/otoscopic simulators, video otoscopes are used during authentic patient encounters. We suggest that training on video otoscopes will augment in-the-moment faculty feedback and real-time skills re-practicing for students (deliberate practice18), resulting in increased confidence in skills.

Materials and Methods

Students at the Zucker School of Medicine practice otoscopy during their longitudinal physical diagnosis course. Students are instructed using traditional otoscopes during a small-group physical diagnosis skills session in the first year of medical school (general otoscopy technique). A didactic session in the context of the pediatric exam during the second year of medical school includes instruction on recognizing normal and pathologic findings using stock images and on systematically documenting otoscopy findings (TM color, position, translucency). Didactics are followed by a skills session. Students apply their training during standardized patient encounters and during ambulatory clinical experiences throughout the first two years of medical school.

In the 2020–21 academic year, an additional otoscopy training experience was designed for 106 third-year medical students enrolled in their pediatric clerkship. Approximately 60% of the class was assigned to an ambulatory site for one week where Jedmed Horus+ HD Video Otoscopes™ (video otoscope) were housed. The remainder rotated through ambulatory practices housing traditional otoscopes. In the 2021–22 academic year, all 92 of our third-year students were assigned to the practice housing video otoscopes.

A pre-clerkship Qualtrics™ questionnaire was emailed to students one week prior to the clerkship. Using a 7-point Likert scale ranging from strongly disagree (1) to strongly agree (7), it measured self-confidence in positioning a patient, correctly placing the otoscope, adjusting to visualize the tympanic membrane (TM), identifying landmarks, and describing the TM appearance. Two multiple choice questions (MCQs) assessed students’ ability to diagnose published stock images of acute otitis media (AOM) and a normal TM. Finally, students were asked to systematically document findings based on a provided stock image (normal TM).

The COVID-19 pandemic necessitated hosting a virtual otoscopy orientation during week one of the clerkship. A didactic portion included a review of material from the first two years and a detailed discussion of otoscopy microskills. A microskills competency checklist (Appendix A), adapted from a published validated checklist,1 was provided to students to self-assess technique during patient examinations and to preceptors to assess students’ skills during otoscopy (Kirkpatrick level 319). All students were expected to (1) review the checklist and practice microskills when examining patients, (2) be observed and coached by faculty during otoscopic exams, and (3) re-practice any challenging microskills.

A post-clerkship Qualtrics™ questionnaire, identical to the pre-clerkship questionnaire, was emailed to students on the last day of the clerkship. Additionally, the questionnaire solicited open-response feedback from students who used video otoscopes (Kirkpatrick levels 1 and 219). Data were statistically evaluated using IBM SPSS Statistics (SPSS Inc., Chicago, Illinois, USA, Version 24.0). Descriptive statistics are presented as the mean (M) and standard deviation (SD) for each of the 7-point Likert scale survey items (confidence) and for the documentation of otoscopy findings based on a provided image. Correct responses to the diagnostic ability MCQs are presented as the number (%) of correct responses. The Wilcoxon signed-rank test (confidence) and McNemar’s test (diagnostic ability) were performed for within-subjects comparisons, and the Mann–Whitney U test (confidence) and chi-square test (diagnostic ability) were performed for between-subjects comparisons. For all tests, a p value ≤ 0.05 was considered statistically significant. Students’ open-ended feedback comments were analyzed qualitatively using thematic analysis.20
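The analyses above were run in SPSS; for readers who prefer a scripted equivalent, the following is a minimal sketch of how the same four tests could be run in Python with SciPy and statsmodels. All variable names and numeric values below are hypothetical placeholders for illustration only, not the study data.

```python
# Sketch of the four tests named above (Wilcoxon signed-rank, Mann-Whitney U,
# McNemar, chi-square), using SciPy and statsmodels rather than SPSS.
# All values are hypothetical placeholders, not data from this study.
import numpy as np
from scipy.stats import wilcoxon, mannwhitneyu, chi2_contingency
from statsmodels.stats.contingency_tables import mcnemar

# Hypothetical 7-point Likert confidence ratings for one survey item.
vot_pre  = np.array([3, 4, 2, 5, 3, 4, 2, 5])   # VOT group, pre-clerkship
vot_post = np.array([6, 6, 5, 7, 6, 6, 5, 7])   # same students, post-clerkship
tot_post = np.array([4, 3, 5, 4, 4, 3, 4, 5])   # TOT group, post-clerkship

# Within-subjects change in confidence: Wilcoxon signed-rank test (paired).
w_stat, w_p = wilcoxon(vot_pre, vot_post)

# Between-groups post-clerkship confidence: Mann-Whitney U test (independent).
u_stat, u_p = mannwhitneyu(vot_post, tot_post)

# Within-subjects change in diagnostic ability (MCQ correct/incorrect):
# McNemar's test on a 2x2 table of pre- vs post-clerkship correctness.
pre_post = np.array([[20, 3],    # pre correct:   post correct, post incorrect
                     [15, 4]])   # pre incorrect: post correct, post incorrect
mc = mcnemar(pre_post, exact=True)

# Between-groups diagnostic ability: chi-square test on correct/incorrect counts.
counts = np.array([[10, 3],      # TOT: correct, incorrect
                   [55, 15]])    # VOT: correct, incorrect
chi2, chi_p, dof, expected = chi2_contingency(counts)

print(f"Wilcoxon p={w_p:.3f}; Mann-Whitney U p={u_p:.3f}; "
      f"McNemar p={mc.pvalue:.3f}; chi-square p={chi_p:.3f}")
```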

This research was approved under Exempt Review procedures of Hofstra University’s Institutional Review Board (REF# 20200514-SOM-PET-1).

Results

Participants

Of the 106 students who participated in the pediatric clerkship July 2020–May 2021, 26 students (25%) completed the pre- and post-clerkship surveys. Thirteen were in the traditional otoscope training group (TOT) and 13 were in the video otoscope training group (VOT). Ninety-two third-year students participated in the pediatric clerkship June 2021–May 2022 and were all part of the VOT; 57 of these students (60%) completed the pre- and post-clerkship surveys. The total sample was therefore 13 TOT students and 70 VOT students.

Perceived Confidence

Table 1 shows mean confidence scores pre-and post-clerkship. Pre-clerkship confidence did not differ between the groups, but the VOT group had significantly higher scores than the TOT group on all items post-clerkship. The VOT group had a significant increase in confidence on all items (p-values<0.001) while confidence in the TOT group did not change over time (p-values>0.10).

Table 1 Self-reported confidence in technical and diagnostic skills

Diagnostic Ability and Documentation

Table 2 presents diagnostic ability pre- and post-clerkship, as evidenced by students’ ability to make a diagnosis based on otoscopy stock images. Diagnostic ability did not differ between the two groups at the pre-clerkship (p-values ≥ 0.13) or post-clerkship (p-values ≥ 0.45) time points.

Table 2 Diagnostic ability

Pre-clerkship documentation of otoscopy findings based on a provided stock image did not differ between the groups. There was a significant improvement in students’ documentation of otoscopy findings (Table 2) post-clerkship in both the TOT (Z=−2.5, p=0.01) and VOT (Z=−3.8, p<0.001) groups.

Student Feedback

VOT students’ qualitative responses were largely positive. Representative comments were thematically mapped to “technique/positioning” and “feedback from preceptors”.

Technique/Positioning (N=38)

Improvements in positioning, visualization, and identification were noted. Representative comments included:

  • “It helped me properly identify structures and helped me properly position the otoscope”.
  • “Helped me figure out how to optimally position otoscope and allowed for better visualization of structures normally seen through otoscope”.
  • “I was only really able to feel confident in what I was doing after using the otoscope trainer and making sure that I was correct”.

Feedback from Preceptors (N=23)

Students noted the utility of real-time feedback and preceptors’ ability to evaluate technique, confirm visualization, and facilitate discussion around the exam and findings. Representative comments included:

  • “Discussing what I’m seeing with the attending with feedback on my assessment was helpful”.
  • “I was able to take a picture of a ruptured membrane and show my peers, which helped the discussion about his condition and differential. I was also able to take longer to think about the diagnosis while examining the picture and not having to keep the otoscope in the patient’s ear the entire time I was looking”.
  • “Helped to better visualize what I was looking at. Allowed for additional insight into what I could see by a nearby attending or resident”.
  • “It has helped me verbalize the findings on the otoscope exam. It has also helped me compare what I saw with what the physician saw on exam”.

Negative feedback reflected low personal usage of the video otoscope (N=10) and lack of fluidity with operating it (N=5).

Discussion

We set out to determine the usefulness of video otoscopes in the training of third-year medical students. While traditional otoscope training did not significantly change technical or diagnostic skills confidence over time, video otoscope training resulted in a significant increase in confidence in technical and diagnostic skills. VOT students attested to the utility of the video otoscope in learning aspects of proper technique, receiving real-time feedback on skills, and confirming findings with preceptors. The results suggest the utility of video otoscope training to advance students’ confidence through (1) preceptors’ direct visualization and evaluation of students’ technical skills, (2) provision of corrective feedback in real-time, and (3) the opportunity for deliberate practice.

Confidence, in this regard, is unlikely to reflect a generalized personality trait but rather self-efficacy. Per Bandura, self-efficacy is one’s judgment of one’s capabilities for learning or performing a specific action.21 Competency in performance requires not only knowledge and skill but also self-efficacy.22 While there may not be a direct correlation between self-assessed confidence and observed competence for a particular skill,23 the value of confidence and self-efficacy is meaningful. Self-efficacy can impact students’ attainment of new “knowledge, attitude and skills”, inspire motivation and perseverance, and promote effective performance rather than failure.21,22,24 We believe that promoting students’ confidence and self-efficacy through video otoscopy training sets an important stage for continued honing of skills and a pathway to competence.

Our comparison of the VOT and TOT cohorts at baseline and post-clerkship demonstrated no significant difference in terms of diagnostic ability when reviewing stock images of AOM and normal otoscopic findings. This was not surprising as both groups received the same didactic instruction on recognition and diagnosis of common otoscopic findings in our curriculum. Similarly, there was no significant difference between the two cohorts in terms of their ability to systematically document otoscopy findings pre-clerkship, reflecting identical prior instruction. Both cohorts had significant improvement in their documentation post-clerkship. This is reassuring as both cohorts were required to practice describing their findings in a systematic way when using the microskills checklists during patient encounters.

Several study limitations were noted. Student placements in ambulatory practices were independent of this study. The matched response rate during the 2020–21 academic year was low (25%), resulting in a limited number of TOT respondents. Additionally, COVID-19 restrictions prevented us from scheduling in-person orientations for our students and preceptors, which may have impeded early operational comfort with the video otoscopes. We hope to introduce in-person orientations in which rehearsal with operating the video otoscope occurs. We were unable to develop an in-vivo summative assessment of diagnostic ability because it was not logistically possible to find standardized patients with acute pathologies who could be examined by an entire class of students. We chose not to use otoscopic simulators as an alternative method of summative diagnostic skills assessment, deeming them non-authentic. We suspect that the significant improvement in confidence in visualizing the TM while using a video otoscope will enable students to compare their mental models of pathology with the findings they encounter during a live examination. Another challenge was the increased time commitment required to precept students in direct observation; at times, this limited students’ ability to practice with the video otoscope. Finally, it is possible that respondents were those who felt more positively about the experience.

Conclusion

A lack of competence in otoscopy skills remains a barrier to the accurate diagnosis of otologic pathology. The quantity and quality of technical and diagnostic skills practice and feedback during clinical clerkships are variable. We suggest otoscopy teaching that includes the standard, current teaching methods plus advanced instruction with a video otoscope to improve confidence. This will augment student confidence and self-efficacy with otoscopy by allowing (1) preceptors and students to simultaneously visualize middle ear findings, (2) preceptors to provide real-time corrective feedback on technique, and (3) students the opportunity for deliberate practice of challenging microskills.

Abbreviations

VOT, video otoscope trained; TOT, traditional otoscope trained; TM, tympanic membrane; MCQ, multiple choice questions; AOM, acute otitis media.

Acknowledgments

We would like to thank Jeffrey Bird, Director of Educational Data and Analytics, and Marybeth Wright, Program Coordinator, for their invaluable contributions to this study. A grant awarded by the Academy of Medical Educators at the Donald and Barbara Zucker School of Medicine supported the purchase of the video otoscopes used in this study.

Disclosure

Dr Marie Cavuoto Petrizzo reports a grant from the Zucker School of Medicine’s Academy of Medical Educators during the conduct of the study. The authors report no other conflicts of interest in this work.

References

1. Paul CR, Gjerde CL, McIntosh G, Weber LS. Teaching the pediatric ear exam and diagnosis of acute otitis media: a teaching and assessment model in three groups. BMC Med Educ. 2017;17(1):146. doi:10.1186/s12909-017-0988-y

2. Jones WS, Johnson CL, Longacre JL. How well are we teaching otoscopy? Medical students’ perspectives. Pediatr Res. 2003;53:95A.

3. Ladha FA, Pettinato AM, Perrin AE. Medical student residency preferences and motivational factors: a longitudinal, single-institution perspective. BMC Med Educ. 2022;22(1):187. doi:10.1186/s12909-022-03244-7

4. Butler AM, Brown DS, Durkin MJ, et al. Association of inappropriate outpatient pediatric antibiotic prescriptions with adverse drug events and health care expenditures. JAMA Netw Open. 2022;5(5):e2214153. doi:10.1001/jamanetworkopen.2022.14153

5. Xu J, Campisi P, Forte V, Carrillo B, Vescan A, Brydges R. Effectiveness of discovery learning using a mobile otoscopy simulator on knowledge acquisition and retention in medical students: a randomized controlled trial. J Otolaryngol Head Neck Surg. 2018;47(1):70. doi:10.1186/s40463-018-0317-4

6. Spiro DM, Arnold DH. The concept and practice of a wait-and-see approach to acute otitis media. Curr Opin Pediatr. 2008;20(1):72–78. doi:10.1097/MOP.0b013e3282f2fa62

7. Rappaport KM, McCracken CC, Beniflah J, et al. Assessment of a smartphone otoscope device for the diagnosis and management of otitis media. Clin Pediatr. 2016;55(9):800–810. doi:10.1177/0009922815593909

8. Davies J, Djelic L, Campisi P, Forte V, Chiodo A. Otoscopy simulation training in a classroom setting: a novel approach to teaching otoscopy to medical students. Laryngoscope. 2014;124(11):2594–2597. doi:10.1002/lary.24682

9. Hakimi AA, Lalehzarian AS, Lalehzarian SP, Azhdam AM, Nedjat-Haiem S, Boodaie BD. Utility of a smartphone-enabled otoscope in the instruction of otoscopy and middle ear anatomy. Eur Arch Otorhinolaryngol. 2019;276(10):2953–2956. doi:10.1007/s00405-019-05559-6

10. Higgins Joyce A, Raman M, Beaumont JL, Heiman H, Adler M, Schmidt SM. A survey comparison of educational interventions for teaching pneumatic otoscopy to medical students. BMC Med Educ. 2019;19(1):79. doi:10.1186/s12909-019-1507-0

11. Kleinman K, Psoter KJ, Nyhan A, Solomon BS, Kim JM, Canares T. Evaluation of digital otoscopy in pediatric patients: a prospective randomized controlled clinical trial. Am J Emerg Med. 2021;46:150–155. doi:10.1016/j.ajem.2021.04.030

12. Lee DJ, Fu TS, Carrillo B, Campisi P, Forte V, Chiodo A. Evaluation of an otoscopy simulator to teach otoscopy and normative anatomy to first year medical students. Laryngoscope. 2015;125(9):2159–2162. doi:10.1002/lary.25135

13. Morris E, Kesser BW, Peirce-Cottler S, Keeley M. Development and validation of a novel ear simulator to teach pneumatic otoscopy. Simul Healthc. 2012;7(1):22–26.

14. Navaratnam AV, Halai A, Chandrasekharan D, et al. Utilisation of a smartphone-enabled video otoscope to train novices in otological examination and procedural skills. J Laryngol Otol. 2022;136(4):314–320. doi:10.1017/S0022215121004102

15. Schuster-Bruce JR, Ali A, Van M, Rogel-Salazar J, Ofo E, Shamil E. A randomised trial to assess the educational benefit of a smartphone otoscope in undergraduate medical training. Eur Arch Otorhinolaryngol. 2021;278(6):1799–1804. doi:10.1007/s00405-020-06373-1

16. Stepniak C, Wickens B, Husein M, et al. Blinded randomized controlled study of a web-based otoscopy simulator in undergraduate medical education. Laryngoscope. 2017;127(6):1306–1311. doi:10.1002/lary.26246

17. Wu V, Beyea JA. Evaluation of a web-based module and an otoscopy simulator in teaching ear disease. Otolaryngol Head Neck Surg. 2017;156(2):272–277. doi:10.1177/0194599816677697

18. Ericsson KA. Deliberate practice and the acquisition and maintenance of expert performance in medicine and related domains. Acad Med. 2004;79(10 Suppl):S70–S81. doi:10.1097/00001888-200410001-00022

19. Kirkpatrick D, Kirkpatrick J. Evaluating Training Programs: The Four Levels. 3 ed. San Francisco, CA: Berrett-Koehler; 2006.

20. Braun V, Clarke V. Using thematic analysis in psychology. Qual Res Psychol. 2006;3(2):77–101. doi:10.1191/1478088706qp063oa

21. Bandura A. Self-efficacy: toward a unifying theory of behavioral change. Psychol Rev. 1977;84(2):191–215. doi:10.1037/0033-295X.84.2.191

22. Bandura A. Self-Efficacy: The Exercise of Control. New York: W.H. Freeman; 1997.

23. Barnsley L, Lyon PM, Ralston SJ, et al. Clinical skills in junior medical officers: a comparison of self-reported confidence and observed competence. Med Educ. 2004;38(4):358–367. doi:10.1046/j.1365-2923.2004.01773.x

24. Klassen RM, Klassen JRL. Self-efficacy beliefs of medical students: a critical review. Perspect Med Educ. 2018;7(2):76–82. doi:10.1007/S40037-018-0411-3
