A Smart Mobile Application to Monitor Visual Function in Diabetic Retinopathy and Age-Related Macular Degeneration: The CLEAR Study





Purpose


The purpose of this study was to determine whether a mobile application, the Checkup Vision Assessment System, could reliably monitor visual acuity (VA) and metamorphopsia remotely, compared with standard reference tests administered in the clinic. The COVID-19 pandemic has created an even greater need for remote monitoring, and mobile tools enhance the ability to care for patients virtually by enabling remote assessment of VA and Amsler grid findings.


Design


Prospective, multicenter reliability analysis.


Methods


Participants: Patients (N = 108) with near corrected VA better than 20/200 and a diagnosis of age-related macular degeneration or diabetic retinopathy, or healthy participants without retinal disease (best-corrected visual acuity [BCVA] of 20/32 or better). Intervention: Participants were tested with Checkup and with the reference VA and Amsler tests, with the order of testing (Checkup or reference first) randomized. Patients monitored their vision using Checkup at least twice a week at home between office visits. Main outcome measures were near corrected VA and Amsler grid test results.


Results


Agreement was strong between Checkup and the reference tests for VA (r = 0.86) and the Amsler grid (sensitivity, 93%; specificity, 92%). Home versus clinic testing showed excellent agreement (r = 0.96). Patients reported successful home use, with no serious adverse events or discontinuations, and rated the usability of Checkup as excellent.


Conclusions


There was good agreement between Checkup and in-clinic test results for VA and Amsler grid. The low variance of Checkup testing, agreement between in-clinic and home results, and excellent usability support Checkup as a reliable method for monitoring retinal pathology in clinic and home settings.


Neovascular age-related macular degeneration (AMD) and diabetic retinopathy (DR) are the leading causes of preventable vision loss among adults in the United States. In both AMD and DR, long-term patient outcomes are significantly affected by early intervention and treatment, before irreversible vision loss occurs. Disease progression is largely irreversible, and for patients with vision-threatening diseases, access to appropriate and timely treatment is essential for avoiding poor health outcomes. Ophthalmologists measure visual acuity (VA) in the clinical setting to diagnose disease, gauge treatment effectiveness, and project patient prognosis. However, significant barriers to in-office assessment may exist, such as distance, difficulty traveling, and health concerns.


Remote patient monitoring may help facilitate access to specialized care when office visits are not feasible or practical, and the widespread use of mobile devices, together with interest in cost-saving health care solutions, has aided the rise of remote health care. The pandemic caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), the virus responsible for COVID-19, accelerated the need for remote health care solutions to reduce the risk of infection and adhere to the various stay-at-home orders in the United States. The forced cancellation of many clinic visits and elective procedures also increased demand for telemedicine: telehealth claims were 5100% higher in March 2020 than the monthly averages in 2019.


However, solutions for monitoring patients remotely need to be rigorously validated and user-friendly, and many such applications have not been validated to ensure accuracy. A recent study of VA testing applications on mobile devices revealed poor functionality and accuracy across all tested applications, with none suitable for telemedicine. Nevertheless, mobile device-based applications for monitoring patients in the home setting may hold great value in the management of retinal diseases. Therefore, this study validated the Checkup Vision Assessment System (Checkup; Verana Health, San Francisco, California, USA), a mobile application designed to measure VA and perform Amsler grid testing at home in patients with AMD and DR. The objectives of this study were to evaluate the performance of the Checkup visual test device, to compare its results with those of standard in-office procedures for assessing near corrected VA (NCVA) and metamorphopsia (Amsler grid), and to document patient usability of the Checkup tests at home.


METHODS


Study Design


The CLEAR study (Correlation of Paxos Checkup Mobile App to Standard in Office Visual Assessment; NCT02871817) was a 2-month, prospective, multicenter, single-arm observational study that assessed agreement between the Checkup Vision Assessment System (Verana Health) and 2 standard reference tests (the Lebensohn Near Card for VA and the Amsler grid for metamorphopsia), as well as test-retest repeatability in the clinic and in the home setting. Participants provided voluntary written consent and Health Insurance Portability and Accountability Act (HIPAA) authorization and agreed to comply with study assessments for the full duration of the study. Institutional Review Board/Ethics Committee approval was obtained. The trial adhered to HIPAA requirements and the tenets of the Declaration of Helsinki and was conducted in accordance with the International Conference on Harmonization Guidance on Good Clinical Practice.


Inclusion and Exclusion Criteria


Adult participants, 18 years of age and older, with best-corrected near visual acuity (BCVA) 20/200 or better were eligible for the trial. Participants were required to provide consent and comply with study assessments for the duration of the study. For the subgroup of patients with normal vision, eligibility requirements included BCVA 20/32 or better in each eye and no concurrent systemic illness affecting the retina or vision. For the subgroups with AMD or DR, patients had to have a diagnosis of either AMD or DR in the study eye.


Participants were excluded if they had dementia or other neurological or psychological limitations that would prevent regular self-testing of visual function; other comorbid ocular pathology affecting vision (with the exception of cataract, pseudophakia, refractive error, and/or presbyopia); inability to successfully undergo training and certify the ability to self-test with Checkup; or inability to return for follow-up. Screen failures were not tracked in the database.


Study Assessments


The study design, including assessments, is summarized in Table 1. Following provision of informed consent, patients were given written instructions and were trained and certified in the use of the Checkup application. Patients were required to have a compatible device (iOS phone; Apple, Cupertino, California, USA) and to complete the following tasks for certification: demonstrate dexterity with the phone by selecting and tapping the appropriate buttons, and successfully log in to the account. Patients were asked to complete 1 NCVA test and 1 Amsler grid test while under observation to demonstrate comprehension of the testing procedure and the ability to repeat tests reliably in a non-proctored setting.



TABLE 1
Schedule of Study Assessments

Form                                                         Screening/Baseline   Visit 1 (1 Month)   Visit 2 (2 Months)
Informed consent                                             X
Demographic information                                      X
Medical history                                              X
Randomization (to confirm order of testing)                  X
Checkup device, app training/retraining                      X                    X
Checkup (smartphone app) corrected near visual acuity a      X                    X                   X
Checkup (smartphone app) Amsler grid a                       X                    X                   X
Lebensohn near vision card corrected near visual acuity a    X                    X                   X
Amsler grid, paper version a                                 X                    X                   X
ETDRS BCVA (sites with ETDRS capability)                     X                    X                   X
Checkup ease of use questionnaire                                                                     X
Eye disease severity grading                                                                          X

BCVA = best-corrected visual acuity; ETDRS = Early Treatment Diabetic Retinopathy Study; NCVA = near corrected visual acuity.

a At baseline and at visits 1 and 2, the Checkup Amsler grid, Checkup VA, Lebensohn NCVA, and paper Amsler grid tests were each administered twice to each subject.



Patients were evaluated at baseline and at 2 monthly office visits. At the baseline visit, patients were randomized for the duration of the study to testing with either Checkup first or the reference assessments first; 52% of participants were randomized to Checkup first and 48% to reference first. Randomization of the order of study procedures (Checkup first or reference first) was performed at a 1:1 ratio via the Electronic Data Capture (EDC) system (IBM Clinical Development, Morrisville, North Carolina, USA; formerly eClinicalOS).
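
For illustration only, the sketch below shows one way a 1:1 allocation of test order could be generated. The study itself used the IBM Clinical Development EDC system; this hypothetical snippet is not part of that system and simply demonstrates the balanced-allocation idea.

```python
import random

def allocate_test_order(n_participants, seed=2016):
    """Illustrative 1:1 allocation of the order of testing
    ('Checkup first' vs 'Reference first') for n participants."""
    # Build a balanced list of arms, then shuffle it reproducibly.
    arms = ["Checkup first", "Reference first"] * ((n_participants + 1) // 2)
    rng = random.Random(seed)
    rng.shuffle(arms)
    return arms[:n_participants]

print(allocate_test_order(10))
```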


At each office visit, 2 Checkup and 2 reference NCVA assessments were performed, in addition to 2 Checkup and 2 reference Amsler grid assessments (8 assessments in total). The Checkup and reference tests were administered in an alternating manner to each enrolled subject, with the first test chosen at random and at least 10 minutes between individual tests. For time efficiency, the Amsler grid evaluations were completed between the NCVA assessments. Patients wore corrective lenses if applicable. For sites with Early Treatment Diabetic Retinopathy Study capability, 1 BCVA measurement was also collected.


Between office visits, patients used Checkup for VA and Amsler testing at least twice per week at home. At visit 1, patients reported on home use of the Checkup application. Following the Checkup and reference assessments, patients were retrained on the use of the Checkup application. At the final visit, a patient usability survey was conducted to evaluate patient satisfaction and ease of use. Eye disease severity was assessed if grading was available.


Statistical Analysis


NCVA assessed using Checkup was compared to the NCVA reference method (Lebensohn Near Card) in replicated measurements at each of 2 in-office visits. Test results for the Checkup and Lebensohn assessments were expressed in logMAR units for analysis. The average test-retest correlation of the reference method across the 2 office visits defined the performance goal to be met by the correlation between Checkup and the reference method across the 2 visits. The correlation for both the test-retest paired values of the reference method and for the paired reference-Checkup values was estimated using the Lin concordance correlation coefficient.
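
As a minimal sketch of the concordance measure named above, the snippet below computes Lin's concordance correlation coefficient for two paired series of logMAR measurements using its standard formula. The numeric values are hypothetical and are not study data.

```python
import numpy as np

def lin_ccc(x, y):
    """Lin's concordance correlation coefficient for two paired measurement
    series (eg, Checkup vs reference logMAR values)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    mean_x, mean_y = x.mean(), y.mean()
    var_x, var_y = x.var(), y.var()                 # population (1/n) variances
    cov_xy = ((x - mean_x) * (y - mean_y)).mean()   # population covariance
    return 2 * cov_xy / (var_x + var_y + (mean_x - mean_y) ** 2)

# Hypothetical paired logMAR readings for illustration (not study data)
checkup = [0.10, 0.20, 0.30, 0.48, 0.60]
reference = [0.12, 0.18, 0.30, 0.50, 0.54]
print(f"Lin concordance correlation coefficient = {lin_ccc(checkup, reference):.3f}")
```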


The Amsler grid assessment (outcome: normal or abnormal) using Checkup was compared to the reference Amsler grid assessment in replicated measurements at each of 2 in-office visits. Agreement for both the test-retest paired values of the reference method and the paired reference-Checkup values (ie, paired values of normal-normal, normal-abnormal, abnormal-normal, and abnormal-abnormal) was evaluated using the McNemar test for discordant pairs. The Bland-Altman method was used as the primary analysis to evaluate agreement between Checkup and the predicate. Bland-Altman analyses are conventionally used to compare 2 clinical measurements, each of which can carry measurement error, and are generally considered better assessments of overall agreement than simple correlation. A Bland-Altman plot is a scatterplot with the difference between the 2 measurements for each sample on the vertical axis and the average of the 2 measurements on the horizontal axis; it is identical to a Tukey mean-difference plot. The 95% limits of agreement (LOA) associated with the Bland-Altman estimate of the mean difference are given by the equation:


limits of agreement = mean difference ± 1.96 × [SD of difference].
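
As a worked illustration of the limits-of-agreement formula above, the sketch below computes the Bland-Altman mean difference and 95% LOA for two paired measurement series; the data shown are hypothetical, not study results. A scatterplot of the differences against the means of each pair would complete the Bland-Altman plot described above.

```python
import numpy as np

def bland_altman_limits(method_a, method_b):
    """Mean difference and 95% limits of agreement between two paired
    clinical measurements (eg, Checkup vs reference logMAR values)."""
    a = np.asarray(method_a, dtype=float)
    b = np.asarray(method_b, dtype=float)
    diff = a - b
    mean_diff = diff.mean()
    sd_diff = diff.std(ddof=1)                    # sample SD of the differences
    lower = mean_diff - 1.96 * sd_diff
    upper = mean_diff + 1.96 * sd_diff
    return mean_diff, (lower, upper)

# Hypothetical paired logMAR readings for illustration (not study data)
checkup = [0.10, 0.20, 0.30, 0.48, 0.60]
reference = [0.12, 0.18, 0.30, 0.50, 0.54]
bias, (lo, hi) = bland_altman_limits(checkup, reference)
print(f"mean difference = {bias:+.3f} logMAR; 95% LOA = [{lo:+.3f}, {hi:+.3f}]")
```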
