
Inter-Rater Reliability of EyeSpy Mobile for Pediatric Visual Acuity Assessments by Parent Volunteers


Received 19 September 2023

Accepted for publication 14 December 2023

Published 24 January 2024. Volume 2024:18, Pages 235–245

DOI https://doi.org/10.2147/OPTH.S440439




Elyssa Rosenthal,1 James O’Neil,1 Briggs Hoyt,2 Matthew Howard3

1Department of Ophthalmology, Phoenix Children’s, Phoenix, AZ, USA; 2Department of Ophthalmology, Loyola University Medical Center, Maywood, IL, USA; 3Cleveland Clinic Neurology Residency Program, Cleveland Clinic, Cleveland, OH, USA

Correspondence: Elyssa Rosenthal, Department of Ophthalmology, Phoenix Children’s, 1919 E Thomas Road, Phoenix, AZ, 85016, USA, Tel +1 602 933-1000, Email [email protected]

Purpose: To assess the inter-rater test reliability of the EyeSpy Mobile visual acuity smartphone algorithm when administered to children by eye professionals and parent volunteers.
Patients and Methods: Visual acuity test-retest results were analyzed for 106 children assigned to one of three different screenings: (1) an eye technician and pediatric ophthalmologist using their typical visual acuity testing method on an M&S computer; (2) an eye technician and pediatric ophthalmologist using EyeSpy Mobile; (3) an eye technician and parent volunteer using EyeSpy Mobile.
Results: All three phases demonstrated strong agreement between the two testers, with mean test-retest equivalency within 0.05 logMAR (2.5 letters, 90% CI). Whether testing with their typical technique on an M&S computer or with EyeSpy Mobile, eye professionals obtained mean test-retest results that were statistically closer than those of parent volunteers by 1 letter, with equivalency within 0.03 logMAR (1.5 letters, 90% CI). Conversely, the number of retests within 2 vision lines was statistically greater when EyeSpy Mobile was used by parents than when eye professionals used their customary technique on the M&S computer.
Conclusion: EyeSpy Mobile provides clinically useful visual acuity test-retest results even when used by first-time parent volunteers. Adaptive visual acuity algorithms have the potential to improve reliability, lessen training requirements, and expand the number of vision screening volunteers in community settings.

Keywords: EyeSpy Mobile, M&S computer, visual acuity, vision screening, adaptive algorithm

Introduction

Measurements of visual acuity, the ability of the visual system to discern spatial resolution, have inherent test-to-test variability, even in the absence of true visual changes. Testing modifications designed to minimize test-to-test variability may improve testing accuracy. Eye chart design has evolved to improve standardization of visual acuity testing through the selection of letters with similar recognition difficulty, the display of equal numbers of letters per line, uniform letter spacing, and other logMAR design features.1,2 Even when utilizing eye charts with optimal design features, visual acuity testing techniques may still vary greatly between examiners. These variations include whether to test with a child or adult chart, which letter size to begin testing at, the sequence of letter presentation, the total number of letters presented, when to conclude testing, and how to calculate the final visual acuity assigned.3–5
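For reference, the logMAR values cited throughout this article follow directly from the Snellen fraction; the conversion below uses standard definitions and is not specific to any particular chart or device:

```latex
\text{logMAR} = \log_{10}(\text{MAR}), \qquad
\text{MAR} = \frac{\text{Snellen denominator}}{\text{Snellen numerator}}
```

For example, Snellen 20/40 gives MAR = 40/20 = 2, so logMAR = log10(2) ≈ 0.30. On a five-letter-per-line logMAR chart, each line spans 0.1 logMAR and each letter is scored as 0.02 logMAR, which is why 0.05 logMAR corresponds to 2.5 letters and 0.2 logMAR to 2 vision lines.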

Visual acuity testing protocols help improve standardization by providing a uniform structure for test presentation. Examples of testing protocols commonly used in research include the Early Treatment Diabetic Retinopathy Study (ETDRS) protocol for adults and the Amblyopia Treatment Study (ATS) protocol for children.2,6 Both of these visual acuity protocols can be complex to administer manually; therefore, computerized versions (e-ETDRS and e-ATS) have been developed to help administer tests and record responses more accurately.7–9

Computerized versions of the ETDRS and ATS protocols employ adaptive algorithms, meaning that letter (or other optotype) sequencing is determined by prior correct or incorrect responses, standardizing the test. To date, these computerized adaptive visual acuity testing protocols have been largely limited to research settings, most likely because of their overall complexity, including administration time, limited platform availability, and technology cost. The overwhelming majority of visual acuity tests performed in the real world (doctors' offices, schools, and various other venues) continue to be administered at the examiner's discretion, without the benefit of a standardized and validated testing protocol.
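To make response-driven sequencing concrete, the sketch below implements a generic one-up/one-down staircase in Python. It illustrates the adaptive principle only; the function names, step size, and stopping rule are hypothetical and do not represent the EyeSpy, e-ETDRS, or e-ATS algorithms.

```python
import random

def adaptive_staircase(responds_correctly, start=0.7, step=0.1,
                       floor=-0.1, ceiling=1.0, max_reversals=4):
    """Generic one-up/one-down staircase over logMAR letter sizes.

    `responds_correctly(level)` returns True when the subject reads a
    letter of the given logMAR size correctly. Illustrative sketch only;
    all parameters here are hypothetical.
    """
    level, reversals, last_direction = start, [], None
    for _ in range(100):                        # hard cap on presentations
        if len(reversals) >= max_reversals:
            break
        correct = responds_correctly(level)
        direction = -1 if correct else +1       # smaller letters after a correct answer
        if last_direction is not None and direction != last_direction:
            reversals.append(level)             # a reversal brackets the threshold
        last_direction = direction
        level = min(max(level + direction * step, floor), ceiling)
    # Estimate acuity as the mean of the reversal levels
    return sum(reversals) / len(reversals) if reversals else level

# Simulated subject whose true threshold is 0.3 logMAR, with occasional lucky guesses
estimate = adaptive_staircase(lambda lvl: lvl >= 0.3 or random.random() < 0.1)
```

Published protocols add refinements such as screening phases, reinforcement trials, and letter-by-letter scoring, but all share the property that the next optotype presented depends on the prior responses.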

Oftentimes, visual acuity screening outside of doctor’s offices is performed by personnel with varying levels of testing experience or even non-healthcare volunteers.10–13 Increasing the number of visual acuity assessments performed using validated, standardized protocols in real-world settings could profoundly improve the accuracy of vision testing in a wide variety of medical and public health venues.14,15 Standardized algorithms compatible with widely-available electronic devices would be especially valuable if such procedures were fast, easy to perform, and did not require extensive administrator training or vision testing expertise to achieve reliable results.

EyeSpy Mobile employs an adaptive visual acuity testing algorithm designed specifically for use on mobile devices to facilitate vision testing in a variety of testing locations, including outside of the eye professional’s office. The algorithm has been previously validated in children against the “gold standard” e-ETDRS protocol when administered by an eye professional on the same M&S computer system.16 Results demonstrated the EyeSpy Mobile algorithm to be non-inferior to e-ETDRS, and 40% faster to implement. The present study aims to evaluate the inter-rater test reliability in children using EyeSpy Mobile vision testing software on a smartphone when administered by different examiners, including eye professionals and parents with no previous vision testing experience. We predicted that the EyeSpy Mobile program when administered on a touchscreen device would provide consistent and accurate metrics of visual acuity, regardless of examiner training or experience.

Materials and Methods

This study was performed at Phoenix Children’s Hospital Department of Pediatric Ophthalmology. New patients aged five or older who were able to cooperate with vision testing were invited to participate regardless of pre-existing ocular conditions or developmental status. Written informed consent was obtained from parents prior to testing. All study procedures were conducted according to the tenets of the Declaration of Helsinki and approved by the Phoenix Children’s Hospital Institutional Review Board.

Testing modalities used in this study included the M&S computer system (2016 Smart System® PC Plus, M&S® Technologies, Inc.) and the EyeSpy Mobile application on a smartphone or iPad mini (Apple Inc.). These testing systems have been described elsewhere.16,17 There were three different visual acuity test-retest study phases: briefly, Phase 1 assessed eye professionals' customary office technique on the M&S computer, Phase 2 assessed results from eye professionals using the EyeSpy Mobile smartphone application, and Phase 3 assessed results from first-time parent volunteers using EyeSpy Mobile. Individual children were assigned to testing in only one of the three phases to avoid fatigue from conducting multiple visual acuity tests during the same visit. All testing phases are described in detail below.

Testing distances for all phases and across both platforms were set to 7.0–7.5 feet based on exam room dimensions; both the M&S system and EyeSpy Mobile were specifically calibrated for these distances. In all study phases, the order in which the two examiners tested was assigned by the subject's even or odd day of birth. Testing for a particular patient was always performed in the same examination lane, at the same distance and lighting, controlling for all room conditions other than the testing platform used and the examiner conducting the test. M&S computer testing did not conform to a specific protocol but was performed by the particular technician or doctor per their usual office techniques. EyeSpy Mobile uses an adaptive algorithm to standardize test presentation.

The smartphone or tablet device to be used for testing with EyeSpy Mobile was not specified in advance of the study; the EyeSpy testing algorithm is compatible with all modern touchscreen devices when properly calibrated. For time efficiency, the application was not downloaded onto the individual devices owned by parent volunteers. Overall, five different devices were used throughout the study across offices and phases, including Android and Apple smartphones as well as an iPad mini. EyeSpy Mobile testing was performed at 100% screen contrast. As described previously, the EyeSpy Mobile application presents three letters arranged top to bottom with crowding bars.16 As the patient reads the letters, the examiner registers a correct response by tapping the screen once next to the letter and an incorrect response by tapping twice; a third tap clears the response. If the child did not provide a response, an incorrect response was registered.
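The per-letter tap handling described above can be modeled as a simple three-state cycle. The snippet below is a hypothetical sketch of that logic, not the application's actual source code:

```python
# Taps next to a letter cycle through: unmarked -> correct (1 tap) ->
# incorrect (2 taps) -> cleared again (3rd tap), per the description above.
def letter_response(tap_count):
    """Map the cumulative tap count for one letter to its recorded response."""
    return (None, "correct", "incorrect")[tap_count % 3]

def score_letter(tap_count):
    """A letter left unmarked (no response from the child) scores as incorrect."""
    return letter_response(tap_count) or "incorrect"
```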

Phase 1 of study testing was performed using only the M&S computer system. Vision was assessed individually by two separate, trained examiners: two pediatric ophthalmologist investigators with over 45 years of combined experience and two ophthalmic technicians, each with several years of experience working with the same physician. Each professional conducted visual acuity testing per their customary practice, with no specified protocol other than that the right eye was always tested first. Neither the physician nor the technician was able to see the other's results.

Phase 2 of study testing was performed solely using the EyeSpy Mobile application. Visual acuity was measured twice, once by a single physician (ER) and once by the eye technician. The same smartphone was used for all patients under the same testing conditions. The EyeSpy Mobile application algorithm guided visual acuity testing, first for the right eye and then the left, and determined the results. Prior results were not displayed, shared, or retained on the phone in a manner that could bias retesting by the second examiner.

Phase 3 of study testing was also performed using EyeSpy Mobile; this time, testing was performed once by the ophthalmic technician and once by the child's parent. Unlike Phase 2, conducted by eye professionals, one of four different devices was used for retesting with parent volunteers; retesting was never performed on the same device, to more closely represent the use of multiple devices that would occur in community screenings. Brief (~15–30 seconds), informal verbal instructions on how to use the mobile application were given to parent volunteers by the technician or doctor. Parents testing second were permitted to observe the technician's testing but were not able to see any of its results. The parent was given a device already calibrated for the correct distance and was asked to enter the child's age. They were then instructed to have the child occlude or patch the non-tested eye. Parent and patient positioning were confirmed by the technician to ensure the proper testing distance. Testing then commenced with the right eye, always followed by the left eye. Results determined by the application were recorded in the electronic health record.

Statistical Analyses

Statistical analysis was performed using SAS Version 9.4 (SAS Institute, Cary, NC). Patient demographics were summarized descriptively by phase and compared using Kruskal–Wallis tests for continuous variables and Chi-squared or Fisher's exact tests for categorical variables. For each phase, the logMAR measurements of all eyes were summarized by tester using mean, standard deviation, median, IQR, and range. EyeSpy Mobile has a logMAR design and reports results directly in logMAR; M&S chart results were converted to logMAR equivalents for analysis. To compare the measurements of the two testers in each phase, pairwise differences were calculated, and testers were compared using Wilcoxon signed rank tests. The magnitude of tester differences across phases was compared using t-tests with Satterthwaite corrections for unequal variances.
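These analyses were run in SAS; for readers who want to reproduce the approach with open-source tools, the sketch below applies the same tests to hypothetical paired logMAR scores in Python (scipy's equal_var=False corresponds to SAS's Satterthwaite-style correction):

```python
import numpy as np
from scipy import stats

# Hypothetical paired logMAR scores for one phase (one value per eye)
tester1 = np.array([0.10, 0.20, 0.30, 0.10, 0.00, 0.20])
tester2 = np.array([0.10, 0.30, 0.30, 0.20, 0.00, 0.10])

# Within-phase tester comparison: Wilcoxon signed rank test on paired scores
w_stat, w_p = stats.wilcoxon(tester1, tester2)

# Between-phase comparison of the magnitude of tester differences:
# Welch t-test (equal_var=False applies the unequal-variance correction)
diffs_phase1 = np.abs(tester1 - tester2)
diffs_phase3 = np.abs(np.array([0.00, 0.10, 0.20, 0.10, 0.30, 0.20]) -
                      np.array([0.10, 0.00, 0.10, 0.20, 0.10, 0.00]))
t_stat, t_p = stats.ttest_ind(diffs_phase1, diffs_phase3, equal_var=False)
```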

To further assess agreement within each phase, Pearson correlation, Cronbach's alpha, and intraclass correlation were calculated with 95% confidence intervals, along with the percentage of measurements agreeing within 0.1 and 0.2 logMAR. Bland-Altman plots were created to assess agreement between testers for each phase. Equivalence testing was done using a two one-sided tests (TOST) procedure at two pre-specified limits of agreement, 0.03 logMAR and 0.05 logMAR.
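The TOST procedure declares equivalence when both one-sided nulls (mean difference at or below −margin, and at or above +margin) are rejected; at α = 0.05 this is the same as the 90% confidence interval of the mean difference lying entirely within ±margin, which is how the 90% CI results in this article should be read. A minimal paired-TOST sketch in Python, using hypothetical data:

```python
import numpy as np
from scipy import stats

def tost_paired(x, y, margin):
    """Two one-sided tests (TOST) for paired equivalence within +/- margin.

    Equivalence is declared when the larger of the two one-sided p-values
    falls below alpha (0.05 here, matching a 90% CI within the margins).
    """
    d = np.asarray(x) - np.asarray(y)
    n = len(d)
    se = d.std(ddof=1) / np.sqrt(n)
    p_lower = stats.t.sf((d.mean() + margin) / se, n - 1)   # H0: mean diff <= -margin
    p_upper = stats.t.cdf((d.mean() - margin) / se, n - 1)  # H0: mean diff >= +margin
    return max(p_lower, p_upper)

# Hypothetical paired logMAR scores from two testers
tester1 = np.array([0.10, 0.20, 0.30, 0.10, 0.00, 0.20, 0.40, 0.10])
tester2 = np.array([0.10, 0.30, 0.30, 0.20, 0.00, 0.10, 0.40, 0.20])
for margin in (0.03, 0.05):  # the study's two pre-specified limits
    print(margin, tost_paired(tester1, tester2, margin))
```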

Results

A total of 106 patients were included in this study, for a total of 212 eyes measured. Overall, 72% of the subjects were between 5–10 years old. Phase 1 included 52 total patients, with an age range of 5–15 years and a mean age of 8.8 years. Phase 2 included 11 patients aged 5–14 years, with a mean age of 9.9 years. Phase 3 had 43 patients aged 5–17 years, with a mean age of 9.0 years. Of the total 106 patients, 52 were male and 54 were female. Two patients in Phase 3 wore eyeglasses (Table 1).

Table 1 Demographics of Patients by Study Phase

To maximize study power, statistical analyses included all eyes tested (106 patients, 212 eyes). In Phase 1, the mean logMAR score given by doctors was 0.17 and by technicians was 0.18. The pairwise difference between testers was not statistically significant (p = 0.2514). In Phase 2, both doctors and technicians gave a mean logMAR measurement of 0.15 (p > 0.999). In Phase 3, the technicians gave a mean logMAR measurement of 0.17 and parents gave a mean logMAR measurement of 0.14, a statistically significant difference (p = 0.0002). The mean logMAR tester difference in Phase 3 was significantly larger than in Phase 1 (p = 0.0010) and Phase 2 (p = 0.0419) (Table 2).

Table 2 Summary of LogMAR Measurements by Phase

Under equivalence testing with right and left eye results combined, visual acuity test-retest values by two different examiners were equivalent in all three phases of the study; these values were within 0.05 logMAR, or 2.5 letters, regardless of previous training or testing experience of the examiner. In Phases 1 and 2, tester results were also equivalent, within a margin of 0.03 logMAR or 1.5 letters (Table 3 and Figure 1).

Table 3 Equivalence Testing by Phase Within Margin of 0.03 and 0.05 logMAR

Figure 1 Equivalence testing within 0.03 and 0.05 logMAR (demonstrated as means and 90% confidence intervals).

In all three phases, there was strong agreement between the two testers (Figures 2–4). The highest Pearson test-retest correlation occurred in Phase 2, when eye professionals performed testing with the EyeSpy Mobile application using the same smartphone (0.962; 95% CI = 0.909, 0.984). This was significantly higher than the Pearson correlation in Phase 1, when eye professionals tested on the same M&S system (0.861; 95% CI = 0.801, 0.904), as evidenced by non-overlapping confidence intervals (Table 4).

Table 4 Measures of Agreement Between the LogMAR Scores in Each Phase

Figure 2 Bland-Altman plot of differences in Phase 1.

Figure 3 Bland-Altman plot of differences in Phase 2.

Figure 4 Bland-Altman plot of differences in Phase 3.

The percentage of retests using EyeSpy Mobile that fell within 1 or 2 vision lines was similar whether testing was performed by two different eye professionals on the same device or by an eye technician and a parent using different devices. Two eye professionals using the EyeSpy Mobile app on the same smartphone (Phase 2) demonstrated 91% of retests within 0.10 logMAR (1 vision line) and 100% of retests within 0.20 logMAR (2 vision lines). Results were similar when a parent and an eye technician used EyeSpy Mobile on two different devices (Phase 3), with 86% of retests within 0.10 logMAR and 99% within 0.20 logMAR.

Overall, use of EyeSpy Mobile resulted in a significantly greater percentage of visual acuity test-retest results within 0.20 logMAR than eye professionals achieved using the M&S computer, even when testing was performed by parents (Phase 1 vs Phase 3: p = 0.0421; Phase 1 vs Phases 2 and 3: p = 0.0172) (Table 5).

Table 5 Agreement Within One and Two Lines

Finally, to evaluate for potential bias from including both eyes of the same subject in analyses, equivalence testing was repeated for left eyes alone and right eyes alone. Left eyes were equivalent within 0.05 logMAR (2.5 letters) for all three phases of the study. Right eyes were equivalent within 0.05 logMAR (2.5 letters) when professionals used the M&S computer in Phase 1, within 0.03 logMAR (1.5 letters) when professionals used EyeSpy Mobile in Phase 2, and within 0.07 logMAR (3.5 letters) when parents used EyeSpy Mobile in Phase 3.

Discussion

Visual acuity measurements may vary in the absence of true visual changes, even when measured by professional, experienced testers. Testing reliability is of particular concern when visual acuity is measured by numerous or inexperienced testers, as may occur during community screenings outside of the eye professional's office. This study demonstrated that visual acuity testing with the EyeSpy Mobile smartphone algorithm produced clinically useful mean inter-rater reliability results within 2.5 letters (90% CI) when testing was conducted in children by parent volunteers using the application for the very first time.

For context regarding the clinical relevance of these findings, the EyeSpy smartphone visual acuity inter-rater reliability results of parents were compared to those of eye professionals using their typical vision testing techniques on an M&S computer in the eye clinic and using the EyeSpy smartphone algorithm. It is notable that the best inter-rater reliability achieved in the study occurred when eye professionals used the EyeSpy Mobile application. Although experienced eye technician and pediatric ophthalmologist inter-rater reliability was within 1.5 letters (90% CI) with both the EyeSpy smartphone application and the M&S office computer, the Pearson test-retest correlation was statistically better when eye professionals used the standardized EyeSpy smartphone algorithm. In addition, the number of retests within 2 vision lines was significantly better overall in the study phases using the EyeSpy Mobile algorithm than with typical eye professional clinic testing performed without a standardized protocol, including when novice parent volunteers performed the testing.

EyeSpy Mobile inter-rater reliability results obtained by both eye professionals and parent volunteers compared favorably to previously published visual acuity test-retest reliability studies performed by eye professionals in children using ETDRS charts and standardized, best-practice protocols.1,6,7,17–20 The consensus of these studies suggests that baseline visual acuity needs to change by more than 1 to 2 vision lines (0.1 to 0.2 logMAR) to indicate a true and statistically significant change in vision. In addition, the inter-rater results with EyeSpy Mobile on a smartphone platform compared favorably to test-retest results from a previous validation study of the EyeSpy algorithm administered twice on the same M&S computer by the same, experienced optometrist.16 Our findings suggest that neither using the algorithm on a mobile device for testing nor using a different professional examiner for retesting adversely impacted the repeatability of results.

Unlike in the eye professional's office, vision screenings in schools or community settings are often conducted by volunteers or staff with limited formal vision education and training. Numerous studies have demonstrated the potential for accurate visual acuity assessments using mobile devices when testing is performed by well-trained or experienced test administrators.5,21,22 However, few visual acuity applications are available with scientifically validated testing protocols to facilitate standardization of testing by non-professionals, and there is currently limited use of adaptive visual acuity testing algorithms in community settings.23–25 Vision screenings in schools, primary care settings, and community venues are typically performed to determine whether a referral to an eye professional for a comprehensive evaluation is necessary. National nonprofit vision organizations and state health departments often require training and certification to perform school vision screenings, which is associated with significant time commitments and costs.26 Use of widely available technology that incorporates vision testing algorithms to simplify and standardize visual acuity assessments could improve the reliability of vision screenings and increase accessibility by reducing training requirements, thereby expanding the potential pool of volunteer examiners. Although the mean visual acuity inter-rater 90% confidence interval for all eyes (right and left combined) was one letter worse when parents rather than eye professionals tested with EyeSpy, this statistical difference is unlikely to have meaningful real-world clinical importance. Interestingly, most of the testing variation when administered by parents occurred with the right eye (always tested first), which may suggest a small learning effect when using the application for the very first time with minimal training. A preliminary practice test to gain familiarity with the application, or a short instructional video, may be helpful for first-time users.

This study with EyeSpy Mobile supports the potential for adaptive visual acuity testing algorithms on standard smartphones to be used reliably by parent volunteers without formalized or extensive vision training or instruction. Notably, however, there are limitations to this study and its interpretations. The ages of children enrolled in this study varied from 5 to 17 years old to match the typical age range for which childhood visual acuity screenings are performed outside of the eye professional's office. Younger children are more commonly assessed by pediatricians and at schools using photoscreening or autorefraction devices rather than visual acuity testing.

Smartphone screen size also limited the upper range of visual acuity tested to 20/100 at the selected test distance. However, the visual acuity range studied is adequate for community vision screenings conducted to determine whether a child requires referral to an eye specialist. Test administration time is a critical factor in achieving widespread adoption of adaptive algorithms outside of clinical research venues. Total instruction time to parents followed by visual acuity testing times for both the right and left eyes was fast enough to measure each eye twice with EyeSpy during a routinely scheduled new patient office examination slot; however, exact testing times were not recorded. Recording test times would be valuable for any future evaluation of standardized visual acuity testing algorithms. EyeSpy Mobile does contain a critical-line pass/fail option, not evaluated in this study, to expedite testing for vision screenings outside of the eye professional's office. Finally, although the sample size in Phase 2 (eye technician and pediatric ophthalmologist using the EyeSpy Mobile app) was smaller than in the other study phases, eye professional results using the application were not the focus of the study but were included for comparison purposes. The sample size was adequate to determine that mean inter-rater reliability with EyeSpy was non-inferior to the conventional testing methods used in the pediatric ophthalmology clinic, with a statistically significant better Pearson correlation and a greater number of overall retests within 2 vision lines when the EyeSpy standardized algorithm was used.

Conclusion

First-time parent volunteers were able to successfully use the EyeSpy Mobile algorithm with minimal training, and achieved clinically useful inter-rater visual acuity results similar to those of experienced testers in the eye professional’s office, and consistent with published results by researchers using best practices. Our findings support the program’s potential for use in community vision screenings by nonmedical personnel.

Finally, assessing visual acuity outside the controlled environment of an eye professional's office may be less reproducible. Even when employing a standardized algorithm, factors such as measures to prevent peeking, maintaining the child's attention, how much time to allow for a response, whether to require the child to respond by guessing, whether to give the child a second chance to correct mistakes, and familiarity with the testing process (eg, glare on the screen from tilting the device) can influence results.27 It has even been suggested that the personal relationship between parent "volunteers" and their child may influence how children respond to vision testing.21 Additional evaluations of EyeSpy Mobile in real-world venues such as schools, community screenings, and pediatricians' offices would be useful to establish the benefit of this standardized, electronic algorithm on mobile devices as another tool for the advancement of detection, monitoring, and treatment of visual disorders.

Acknowledgments

We would like to acknowledge the support of Malin Joseph with the Phoenix Children’s Biostatistics Core in performing all statistical analyses and creating all tables and figures. Additionally, we would like to acknowledge Victoria Bernaud with the Phoenix Children’s Scientific Writing Core for assistance with manuscript writing, editing, and formatting.

Disclosure

JWO developed and has a commercial interest in EyeSpy Mobile. He did not participate in any visual acuity testing using the application (Phases 2 and 3). JWO reports 50% ownership of Cloudscape LLC, outside the submitted work; in addition, Dr James O'Neil has a patent (US 17/240,382) pending to James O'Neil and R. Tirendi. The authors report no other conflicts of interest in this work.

References

1. Kaiser PK. Prospective evaluation of visual acuity assessment: a comparison of Snellen versus ETDRS charts in clinical practice (An AOS Thesis). Trans Am Ophthalmol Soc. 2009;107:311–324.

2. Ferris FL, Bailey I. Standardizing the measurement of visual acuity for clinical research studies; guidelines from the eye care technology forum. Ophthalmology. 1996;103(1):181–182. doi:10.1016/s0161-6420(96)30742-2

3. Anstice NS, Thompson B. The measurement of visual acuity in children: an evidence-based update. Clin Exp Optom. 2014;97(1):3–11. doi:10.1111/cxo.12086

4. Holladay JT. Proper method for calculating average visual acuity. J Refract Surg. 1997;13(4):388–391. doi:10.3928/1081-597X-19970701-16

5. Claessens J, Geuvers J, Imhof S, Wisse R. Digital tools for the self-assessment of visual acuity: a systematic review. Ophthalmol Ther. 2021;10(4):731–732. doi:10.1007/s40123-021-00360-3

6. Holmes JM, Beck RW, Repka MX, Leske DA, Kraker RT, Blair RC. The amblyopia treatment study visual acuity testing protocol. Arch Ophthalmol. 2001;119(9):1345–1353. doi:10.1001/archopht.119.9.1345

7. Moke PS, Turpin AH, Beck RW, Holmes JM, Repka MX, Birch EE. Computerized method of visual acuity testing: adaptation of the amblyopia treatment study visual acuity testing protocol. Am J Ophthalmol. 2001;132(6):903–909. doi:10.1016/s0002-9394(01)01256-9

8. Beck RW, Moke PS, Turpin AH, et al. A computerized method of visual acuity testing: adaptation of the early treatment of diabetic retinopathy study testing protocol. Am J Ophthalmol. 2003;135(2):194–205. doi:10.1016/s0002-9394(02)01825-1

9. Kulp MT, Dobson V, Peskin E, Quinn G, Schmidt P; Vision in Preschoolers Study Group. The electronic visual acuity tester: testability in preschool children. Optom Vis Sci. 2004;81(4):238–244. doi:10.1097/00006324-200404000-00009

10. Sabri K, Easterbrook B, Khosla N, Davis C, Farrokhyar F. Paediatric vision screening by non-healthcare volunteers: evidence based practices. BMC Med Educ. 2019;19(1):65. doi:10.1186/s12909-019-1498-x

11. Marmamula S, Khanna RC, Pehere NK, et al. Agreement and diagnostic accuracy of vision screening in preschool children between vision technicians and spot vision screener. Clin Exp Optom. 2018;101(4):553–559. doi:10.1111/cxo.12559

12. Kulp MT; Vision in Preschoolers Study Group. Findings from the vision in preschoolers (VIP) study. Optom Vis Sci. 2009;86(6):619–623. doi:10.1097/OPX.0b013e3181a59bf5

13. Vision in Preschoolers Study Group. Preschool vision screening tests administered by nurse screeners compared with lay screeners in the vision in preschoolers study. Invest Ophthalmol Vis Sci. 2005;46(8):2639–2648. doi:10.1167/iovs.05-0141

14. Steren BJ, Young B, Chow J. Visual acuity testing for telehealth using mobile applications. JAMA Ophthalmol. 2021;139(3):344–347. doi:10.1001/jamaophthalmol.2020.6177

15. Baker CW, Josic K, Maguire MG, et al. Comparison of Snellen visual acuity measurements in retinal clinical practice to electronic ETDRS protocol visual acuity assessment. Ophthalmology. 2023;130(5):533–541. doi:10.1016/j.ophtha.2022.12.008

16. Vasudevan B, Baker J, Miller C, Feis A. Analysis of the reliability and repeatability of distance visual acuity measurement with EyeSpy 20/20. Clin Ophthalmol. 2022;16:1099–1108. doi:10.2147/OPTH.S352164

17. McClenaghan N, Kimura A, Stark LR. An evaluation of the M&S technologies smart system II for visual acuity measurement in young visually-normal adults. Optom Vis Sci. 2007;84(3):218–223. doi:10.1097/OPX.0b013e3180339f30

18. Cotter SA, Chu RH, Chandler DL, et al. Reliability of the electronic early treatment diabetic retinopathy study testing protocol in children 7 to <13 years old. Am J Ophthalmol. 2003;136(4):655–661. doi:10.1016/s0002-9394(03)00388-x

19. Manny RE, Hussein M, Gwiazda J, Marsh-Tootle W; COMET Study Group. Repeatability of ETDRS visual acuity in children. Invest Ophthalmol Vis Sci. 2003;44(8):3294–3300. doi:10.1167/iovs.02-1199

20. Laidlaw D, Tailor V, Shah N, Atamian S, Harcourt C. Validation of a computerised logMAR visual acuity measurement system (COMPlog): comparison with ETDRS and the electronic ETDRS testing algorithm in adults and amblyopic children. Br J Ophthalmol. 2007;93(2):241–244. doi:10.1136/bjo.2007.121715

21. Suo L, Ke X, Zhang D, et al. Use of mobile apps for visual acuity assessment: systematic review and meta-analysis. JMIR mHealth uHealth. 2022;10(2):e26275. doi:10.2196/26275

22. O’Neill S, McAndrew DJ. The validity of visual acuity assessment using mobile technology devices in the primary care setting. Aust Fam Physician. 2016;45(4):212–215.

23. Bastawrous A, Rono HK, Livingstone IAT, et al. Development and validation of a smartphone-based visual acuity test (Peek Acuity) for clinical practice and community-based fieldwork. JAMA Ophthalmol. 2015;133(8):930–937. doi:10.1001/jamaophthalmol.2015.1468

24. Trivedi RH, Wilson ME, Peterseim MM, Cole KB, Teed RG. A pilot study evaluating the use of EyeSpy video game software to perform vision screening in school-aged children. J Am Assoc Pediatr Ophthalmol Strabismus. 2010;14(4):311–316. doi:10.1016/j.jaapos.2010.03.008

25. Yamada T, Hatt SR, Leske DA, et al. A new computer-based pediatric vision-screening test. J Am Assoc Pediatr Ophthalmol Strabismus. 2015;19(2):157–162. doi:10.1016/j.jaapos.2015.01.011

26. Prevent Blindness. Massachusetts children’s vision screening training course. Available from: https://childrensvision.preventblindness.org/. Accessed February 23, 2023.

27. Black J, Jacobs R, Chen L, Tan E, Tran A, Thompson B. An assessment of the iPad as a testing platform for distance visual acuity in adults. BMJ Open. 2013;3(6):e002730. doi:10.1136/bmjopen-2013-002730
