To investigate factors that may influence visual acuity (VA) outcomes in a clinic-based cohort of age-related macular degeneration (AMD) patients undergoing anti–vascular endothelial growth factor (anti-VEGF) treatment for choroidal neovascularization.
Prospective interventional case series.
Patients with subfoveal choroidal neovascularization (CNV) secondary to AMD were prospectively recruited. A detailed questionnaire was given to patients at the time of enrollment to collect information on demographics, history of visual symptoms, visual acuity (VA), and treatment scheduling. Delay from symptoms to treatment (“treatment delay”) was measured in weeks and analyzed in tertiles. Information pertaining to treatment outcomes was collected over a 6-month period.
One hundred eighty-five eyes of 185 patients were recruited into the study. Longer delay from first symptoms suggestive of CNV to first injection was a significant predictor (P = .015) of poorer treatment outcome, when controlling for age, sex, and baseline VA. Patients with a delay in treatment of 21 weeks or more, compared to a delay of 7 weeks or less, had an odds ratio of 2.62 (95% CI 1.20, 5.68) for worsening vision after treatment.
Patients experiencing a longer delay between their first symptoms of CNV and their first anti-VEGF treatment have a significantly lower chance of improved vision at 6 months following anti-VEGF therapy. It is critical that this information reach those at risk of vision loss from AMD, so that prompt treatment can be instituted to maximize the benefits of anti-VEGF treatment.
Ranibizumab (Lucentis; Genentech Inc, South San Francisco, California, USA) and bevacizumab (Avastin; Genentech Inc) are anti–vascular endothelial growth factor (anti-VEGF) drugs that have improved the treatment of neovascular age-related macular degeneration (AMD). However, between 10% and 15% of patients treated in the pivotal MARINA (Minimally Classic/Occult Trial of the Anti-VEGF Antibody Ranibizumab in the Treatment of Neovascular AMD) and ANCHOR (Anti-VEGF Antibody for the Treatment of Predominantly Classic Choroidal Neovascularization in AMD) randomized controlled trials did not experience the visual acuity (VA) benefits enjoyed by most participants but instead continued to lose vision despite treatment. Most of the acuity increase occurred in the first 3 months of treatment; however, in those patients who responded poorly, vision failed to improve from the outset, with continued losses noted during the course of the treatment regimen. To date, there is no satisfactory explanation as to why some patients with neovascular AMD respond poorly to treatment.
Ranibizumab and bevacizumab are effective for the treatment of subfoveal choroidal neovascularization (CNV), and it seems intuitive that the sooner the neovascular process is arrested, the less damage would be inflicted on the retina and the sooner the anatomic integrity of this structure would be restored, resulting in a better outcome. Data from the MARINA and ANCHOR studies do not show any detrimental effect on outcome of delay to treatment from the date of first angiographic diagnosis of CNV. This delay is often minimal, as once the patient has a confirmed lesion on angiography the decision to treat is usually swift, with little delay in its implementation. While some delay can occur because of bureaucratic processing of authority to prescribe expensive anti-VEGF treatment, or in cases of stable occult lesions where the decision to treat is often deferred until signs of progression appear, the main delay is often from the time of the first symptom suggestive of CNV (metamorphopsia, central blur, central scotoma) to presentation to the treating ophthalmologist. However, this time interval is imprecise, is often not elicited in any detail from patients, and, as such, is never presented in the results of clinical trials. Despite this, a number of attempts have been made to document this period retrospectively. Retrospective evaluation can be difficult, however, since the onset of symptoms often cannot readily be determined without detailed, specific questioning of the patient by a trained clinician. Additionally, the timing of symptom awareness is likely to differ depending on the visual acuity in the other eye. Yet it is this period, from first onset of symptoms to eventual treatment, that is likely to be quite variable and in some cases prolonged.
Thus, although this interval is less accurately determined than the date of angiographic diagnosis, we thought it crucial to investigate it to gain a better understanding of the influences on treatment outcome, particularly as this interval would be amenable to modification. We hypothesized that this time interval varies significantly among patients and, if shortened, is likely to have a profound effect on treatment outcome. We undertook a study to prospectively determine influences on treatment outcome with anti-VEGF drugs in AMD, and report here on the influence of delay from first symptoms suggestive of CNV on outcome, as well as that of delay from angiographically confirmed CNV on VA outcome.
This study was clinic-based, with individual consultants planning their treatment schedules. We chose to examine the 6-month VA outcomes because several pivotal randomized controlled trials make clear that in the vast majority of cases the visual acuity increase occurred in the first 3 months of treatment, and in those patients who responded poorly, vision failed to improve from the outset, with continued losses noted early during treatment. Investigating VA outcomes at a later time point introduces the opportunity for other variables, such as injection timing, injection number, and missed appointments, to play a role that is minimized at the shorter review time. Thus we felt it more useful to investigate an early time point to address the specific issue of delay to first treatment.
Study Design and Eligibility
Study patients were recruited consecutively between August 1, 2007 and December 31, 2008 from the Medical Retina Clinic at the Royal Victorian Eye and Ear Hospital and the private rooms of the Vision Retinal Institute Eastern, in Melbourne. All patients were over the age of 50 years and were diagnosed with subfoveal CNV (subfoveal lesions included those juxtafoveal lesions that could not be safely treated with either thermal laser or photodynamic therapy, without involvement of the center of the fovea) secondary to AMD. Classic or occult leakage was included as long as the area of leakage extended under the fovea. Patients with retinal angiomatous proliferation were classified as having occult lesions, as were patients with pigment epithelial detachment. Such patients were only recruited if there was a component of occult choroidal neovascularization suggested by a notch or nonuniform filling. Lesions with significant sero-sanguinous detachments were excluded on the basis of indocyanine green angiography if the diagnosis of idiopathic polypoidal choroidal vasculopathy was suspected.
The main exclusion criteria were: 1) diagnosis of CNV secondary to other eye conditions; 2) laser photocoagulation or photodynamic therapy prior to anti-VEGF injections; and 3) non-white ancestry.
Data Collection and Follow-up
This study was designed to investigate all parameters that might influence treatment outcomes following anti-VEGF treatment. A questionnaire/information sheet was employed to collect demographic information and medical history, as well as history of visual symptoms, VA, and treatment scheduling information. Great care was taken to establish the time of the first symptom suggestive of CNV, such as when the patient first described metamorphopsia, new-onset central blur, or central scotoma. This information was collected by a retinal clinician to minimize incorrect interpretation of patient symptoms. Treatment delay from the time of the “first symptom” was measured in weeks. Time from angiographic diagnosis to first treatment was also noted.
Best-corrected visual acuity (BCVA) was recorded for all patients using an Early Treatment of Diabetic Retinopathy Study (ETDRS) chart at each appointment.
All patients underwent a dilated fundus examination including fundus autofluorescence, with fluorescein angiography and optical coherence tomography (OCT) images being taken prior to enrollment in the study confirming their neovascular status. Angiograms were accessed via IMAGEnet 2000 for analysis (Topcon Corporation, Tokyo, Japan).
At each review appointment, BCVA, fundus examination, and imaging with OCT were repeated. All treatment decisions were based solely on the discretion of the treating retinal ophthalmologist. No specific retreatment strategies were implemented; however, 63% of patients received their first 3 injections at 4- to 6-week intervals and all consultants treated to obtain a retina without fluid on OCT before extending the interval between appointments.
Review of all angiograms was undertaken to classify lesion types and size. Lesion size was measured in comparison to optic disc area using the middle phase frame of the fluorescein angiogram. Color fundus photographs and angiographic images were reviewed to determine the presence of atrophy or fibrosis. Atrophy was defined as the presence of window defect in the early phase of angiogram that did not increase in size or intensity in the later phases. Fibrosis was documented if >25% of the neovascular complex showed staining rather than leakage beyond the borders of the lesion in the late phase.
The same data were gathered at each review appointment over a 6-month follow-up period. This period was selected based on data collected from the ANCHOR and MARINA studies showing that most patients undergoing treatment declare whether they are going to respond or not respond in the first 3 to 6 months after commencement of treatment.
The primary outcome measurement of this case series was VA outcome following anti-VEGF treatment for neovascular AMD. To ensure sufficient numbers in the 3 defined outcome groups, a change of 0.2 logMAR units (2 lines) was used. Improvement in visual acuity was defined as a decrease of 0.2 logMAR units or more at 6 months after first treatment. Stable vision was defined as visual acuity within ±0.2 logMAR units of baseline over the same period. Decreased acuity was defined as an increase of 0.2 logMAR units or more at 6 months.
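For concreteness, the 0.2 logMAR threshold above can be expressed as a small classification routine. This is an illustrative sketch, not part of the study's analysis; treating a change of exactly 0.2 units as a category change is an assumption.

```python
def classify_va_outcome(baseline_logmar: float, logmar_6mo: float) -> str:
    """Classify a 6-month VA outcome by the study's 0.2 logMAR (2-line) threshold.

    Lower logMAR means better acuity, so a decrease in logMAR is a gain.
    Counting a change of exactly 0.2 units as improved/decreased is an assumption.
    """
    # Round to avoid floating-point artifacts (e.g. 0.7 - 0.5 = 0.19999...).
    change = round(logmar_6mo - baseline_logmar, 3)
    if change <= -0.2:
        return "improved"   # gained 2 lines or more
    if change >= 0.2:
        return "decreased"  # lost 2 lines or more
    return "stable"         # within +/-0.2 logMAR of baseline
```

For example, an eye going from 0.70 to 0.40 logMAR would be classified as improved.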
Individuals were categorized for subgroup analysis based on the following characteristics: 1) age (≤74, 75–84, ≥85 years); 2) sex (male or female); 3) smoking status (nonsmoker, past/current smoker); 4) treatment delay between first suggestive CNV symptoms and first treatment; 5) treatment delay between angiographically confirmed CNV and first treatment; 6) CNV lesion type (predominantly classic, non–predominantly classic); 7) lesion size (≤2 disc areas [DA], >2 DA); 8) injection number (≤3, >3); and 9) baseline VA. All of these subgroups were defined before study commencement.
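The symptom-to-treatment delay tertiles used throughout the analysis can be sketched as a simple binning function. The boundary handling at exactly 7 and 21 weeks is an assumption, since the tertiles are reported as <7, 7–21, and >21 weeks.

```python
def symptom_delay_tertile(weeks: float) -> str:
    """Bin delay from first CNV symptoms to first injection into the study tertiles.

    Boundaries follow the reported tertiles (<7, 7-21, >21 weeks); assigning
    exactly 7 and 21 weeks to the middle tertile is an assumption.
    """
    if weeks < 7:
        return "lowest"
    if weeks <= 21:
        return "middle"
    return "highest"
```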
Data were reported as medians and interquartile ranges (IQR) or means ± standard deviations (SD) for continuous variables, and as proportions for categorical variables. The χ2 test was used to analyze categorical data. Differences in continuous variables among treatment outcome groups were evaluated by the Kruskal-Wallis rank test for skewed data and analysis of variance (ANOVA) for normally distributed data. Ordinal logistic regression analyses were undertaken to investigate the association between these subgroups and treatment outcomes. The validity of the parallel lines assumption of the model was assessed. All potential explanatory variables (as either continuous or discrete measures) were examined by univariate analysis to determine whether they should be included in the final model. The complete and the reduced models, along with the logit link, were used to generate the candidate models: the complete model contained all the explanatory variables, while the reduced model included a subset of the predetermined explanatory variables. All factors with a univariate significance level of P < .10, together with age and sex, were included in the multivariable ordinal logistic regression model. Finally, the best model was chosen among all candidate models based on the model fit statistics, the accuracy of the classification results, the validity of the model assumption, and the principle of parsimony. Statistical analysis was performed using Stata Statistical Software Release 10.0 (Stata Corporation, College Station, Texas, USA). Odds ratios (OR) with 95% confidence intervals (CI) were determined to estimate relative risk. A 2-tailed P value of <.05 was considered statistically significant.
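The study's odds ratios and 95% CIs come from Stata's ordinal logistic regression; for a single 2 × 2 comparison, the same kind of estimate can be illustrated with the standard Woolf (log-scale) method. The sketch below is generic, not a reproduction of the study's model.

```python
import math

def odds_ratio_woolf(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Odds ratio and Woolf (log-scale) confidence interval for a 2x2 table:

        group 1: a events, b non-events
        group 2: c events, d non-events

    z = 1.96 gives a 95% CI. Assumes all four cells are nonzero.
    """
    or_point = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_point) - z * se_log_or)
    upper = math.exp(math.log(or_point) + z * se_log_or)
    return or_point, lower, upper
```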
A total of 198 eyes of 192 patients were recruited into the study. Seven eyes of 7 patients failed to complete the follow-up period and were excluded. Of the 6 patients who had bilateral treatment, only the worse-seeing eye at 6 months was analyzed, as we were particularly interested in determinants of poor outcomes. A total study population of 185 eyes of 185 patients was therefore analyzed. Of these treated eyes, 75 received either bevacizumab or ranibizumab during the course of the treatment period and 110 received ranibizumab alone. The visual outcomes at 6 months were similar in the 2 groups, although a higher proportion of patients improved vision at 6 months in the ranibizumab group (n = 37, 34% vs n = 15, 20%). Counter to this, however, a larger number of eyes in the ranibizumab-only group lost vision at 6 months (n = 18, 16% vs n = 5, 6%). These differences were not statistically significant (P = .40), and hence the entire cohort was analyzed as a whole.
Table 1 summarizes the baseline characteristics of our study population. In this table the responses of the eyes treated with bevacizumab or ranibizumab were analyzed together and classified according to improvement in VA (n = 53, 29%), stable VA (n = 109, 59%), or decreased VA (n = 23, 12%).
| Characteristics | Improved VA a (>2 line gain) | Stable VA a (<2 line gain or loss) | Decreased VA a (>2 line loss) | P Value |
|---|---|---|---|---|
| Age group (years) | | | | |
| ≤74 | 16 (42) | 17 (45) | 5 (13) | .27 |
| 75–84 | 28 (26) | 67 (63) | 12 (11) | |
| ≥85 | 9 (23) | 25 (63) | 6 (14) | |
| Male | 27 (37) | 31 (42) | 15 (21) | .001 |
| Female | 26 (23) | 78 (70) | 8 (7) | |
| No smoking | 19 (26) | 46 (63) | 8 (11) | .62 |
| Past/current smoker | 31 (31) | 55 (56) | 13 (13) | |
| Time delay: symptoms to treatment (weeks) | | | | |
| Lowest tertile (<7) | 21 (38) | 28 (51) | 6 (11) | .34 |
| Middle tertile (7–21) | 16 (30) | 30 (56) | 8 (14) | |
| Highest tertile (>21) | 11 (20) | 36 (67) | 7 (13) | |
| Time delay: diagnosis of CNV to treatment (weeks) | | | | |
| Lowest tertile (<1) | 24 (29) | 48 (57) | 12 (14) | .42 |
| Middle tertile (1–3) | 17 (34) | 26 (52) | 7 (14) | |
| Highest tertile (>3) | 11 (22) | 33 (66) | 6 (12) | |
| Baseline logMAR VA b | | | | |
| Median (interquartile range) | 0.70 (0.52) | 0.60 (0.40) | 0.48 (0.22) | .002 |
| CNV lesion type | | | | |
| Predominantly classic | 13 (28) | 27 (57) | 7 (15) | .86 |
| Non–predominantly classic b | 39 (29) | 81 (60) | 16 (11) | |
| CNV lesion size | | | | |
| ≤2 DA | 35 (31) | 62 (55) | 15 (14) | .37 |
| >2 DA | 13 (21) | 41 (67) | 7 (12) | |
| Number of treatments | | | | |
| ≤3 | 23 (26) | 54 (61) | 11 (13) | .84 |
| >3 | 28 (30) | 54 (58) | 11 (12) | |
The majority of study participants were in the 75-to-84-year age group; 58% of patients were past or present smokers. The most common treatment delay between the first symptoms suggestive of CNV (visual distortion, blur, scotoma) and the first anti-VEGF injection fell within the shortest tertile, that is, less than 7 weeks (n = 55, 34%). Similarly, the shortest tertile of 1 week or less was the commonest delay between CNV diagnosis and first treatment (n = 84, 48.3%). Most patients had non–predominantly classic lesion types (n = 136, 74%) and lesions of 2 DA or smaller (n = 112, 66%).
The percentage of study eyes that improved generally decreased as treatment delay from first symptoms was prolonged across the 3 VA categories of treatment response. Among those with a delay to treatment of 7 weeks or less, 38% of eyes improved vision at 6 months. This compared to improvement in only 20% of eyes where the delay was 21 weeks or more.
A similar percentage of eyes had a significant loss in vision during the study irrespective of the time delay between initial symptoms and treatment: 11% with <7 weeks delay, 14% with 7 to 21 weeks delay, and 13% with over 21 weeks. Looking specifically at the eyes that improved by 2 or more lines compared to those that did not, eyes treated within 7 weeks (n = 21, 38%) were much more likely to have improved vision compared to eyes treated >21 weeks (n = 11, 20%) after initial symptoms (χ2 = 4.17, P = .04). Table 2 summarizes the characteristics of these eyes with respect to baseline clinical and angiographic features.
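This χ2 comparison can be reproduced from the Table 1 counts with the standard 2 × 2 shortcut formula (Pearson, no continuity correction); the snippet below is an illustrative check, not the authors' code.

```python
def chi2_2x2(a: int, b: int, c: int, d: int) -> float:
    """Pearson chi-square statistic (no continuity correction) for the
    2x2 table [[a, b], [c, d]], via the shortcut formula
    chi2 = N * (a*d - b*c)^2 / ((a+b) * (c+d) * (a+c) * (b+d))."""
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

# Improved vs not improved: 21 of 55 eyes treated within 7 weeks, against
# 11 of 54 eyes treated after more than 21 weeks (counts from Table 1).
stat = chi2_2x2(21, 55 - 21, 11, 54 - 11)  # ~4.17, matching the reported value
```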
[Table 2: Delay to Presentation by Clinical/Angiographic Appearance a — Fibrosis, Atrophy, Chronic PED, Subretinal Hemorrhage, PC, NC]

a Fibrosis: >25% staining of lesion on angiography; Atrophy: window defect in early phase of angiogram with loss of retinal pigment epithelium clinically; Chronic PED: chronic pigment epithelial detachment; Subretinal hemorrhage: large subfoveal hemorrhage with significant masking of details on angiography; PC: predominantly classic choroidal neovascular complex with no features of fibrosis or atrophy; NC: non–predominantly classic choroidal neovascular complex with no features of fibrosis or atrophy.
Most eyes that lost vision had features of atrophy or fibrosis at baseline, with a smaller number having significant subretinal blood or chronic pigment epithelial detachment prior to treatment. No difference was noted in these characteristics with respect to extent of delay in presentation, although lesions with more atrophy appeared to be more prevalent in patients with longer delays in presentation and those with significant subfoveal hemorrhage tended to present earlier.
In eyes with good vision at baseline (<0.30 logMAR), we found a trend for eyes treated within 7 weeks of initial symptoms to have a greater chance of maintaining or improving vision compared to eyes with delays of >21 weeks (90% vs 67%, χ2 = 1.48, P = .48). Where baseline vision was poor (logMAR >1.00), there was a significantly greater chance of vision improvement if treatment was prompt (80% vs 25%, χ2 = 6.71, P = .05).
Of the other baseline characteristics, men were more likely to improve with treatment than women but were also more likely to lose vision (P = .001). Baseline vision was associated with treatment outcome at 6 months. Those patients with better initial vision were more likely to deteriorate despite treatment, compared to those with poorer starting vision (P = .002). There was no correlation, however, between baseline vision and length of delay in presentation from first symptoms (r = 0.14, P = .08). Lesion size had no effect on treatment outcome (P = .37).
Considering the delay to presentation with regard to the visual acuity in the fellow eye, there were 102 eyes where the fellow eye had an acuity of <0.30 logMAR and 61 where the acuity was >0.30 logMAR. Of these 2 groups, the median delay to presentation was 12 weeks (range 0–107) in eyes with good vision in the fellow eye, compared to 10 weeks (range 0–89) in those eyes with impaired vision in the fellow eye (P = .73).
The primary aim of univariate analysis was to assess whether any particular clinical characteristics affected VA outcomes with anti-VEGF treatment. As no covariate violated the parallel lines assumption of ordinal logistic regression analysis, we used this statistical methodology to compare the variables within each subgroup and their relationship with treatment outcomes.
The following characteristics were found to have a significant impact on “adverse” visual outcome (increased risk of deterioration in vision or reduced chance of visual improvement) at 6 months. Patients with a longer delay between their first visual symptoms of CNV (visual distortion, blur, scotoma) and their first anti-VEGF treatment were more likely to have poor treatment outcomes (OR 1.98; 95% CI 1.00, 4.08; P = .05). Better baseline VA was strongly associated with a greater loss of vision compared to poorer vision at baseline (OR 0.30; 95% CI 0.15, 0.61; P = .001).
Age, sex, treatment delay from diagnosis of CNV (as seen on fluorescein angiography) to first treatment, lesion size, and lesion type had no influence on visual outcome (P > .10).
The finding of better baseline vision and longer delay to treatment from symptoms being associated with a greater chance of adverse outcome was also seen when eyes were characterized with respect to lesion type, delay to presentation, visual acuity in the fellow eye, and type of anti-VEGF agent used in treatment.
In terms of lesion type, we found that better baseline vision was a significant predictor of adverse vision outcome in both non–predominantly classic lesions (OR 0.23; 95% CI 0.08, 0.58; P = .002) and predominantly classic lesions (OR 0.47; 95% CI 0.20, 0.97; P = .06). Similarly, a delay of 21 or more weeks in presentation was seen to influence outcome in both lesion types, with the risk of adverse outcome being OR 2.1; 95% CI 0.97, 5.20; P = .06 in non–predominantly classic lesions and OR 2.06; 95% CI 0.93, 4.61; P = .07, in predominantly classic lesions.
To assess whether visual acuity in the fellow eye impacts on the finding that increasing delay to presentation affects visual outcome (as the estimation of delay might be more robust in the second eye if the first eye has poor vision), we analyzed eyes in 2 groups: those with “good” vision in the fellow eye (logMAR <0.30) and those with “bad” vision (logMAR >0.30).
Of the eyes with good vision in the fellow eye, we found that increasing age (OR 8.06; 95% CI 1.84, 35.30; P = .006), better baseline vision (OR 0.30; 95% CI 0.12, 0.76; P = .01), and increasing delay to presentation (OR 5.46; 95% CI 1.91, 15.7; P = .002) were likely to affect outcome adversely. In eyes with pre-existing bad vision in the fellow eye, similar trends were seen for increasing age (OR 1.62; 95% CI 0.30, 9.69; P = .61), sex (OR 0.37; 95% CI 0.10, 1.35; P = .13), increasing delay to presentation (OR 1.49; 95% CI 0.38, 5.87; P = .56), and better baseline vision (OR 0.20; 95% CI 0.02, 1.92; P = .16), although none were statistically significant.
With respect to the type of anti-VEGF agent used during treatment, we found that in both the combined bevacizumab and ranibizumab group and the ranibizumab-alone group, baseline vision was a significant determinant of visual outcome (P = .004 and P = .02, respectively), with better baseline vision less likely to improve (or more likely to deteriorate). A delay of 21 weeks or more was associated with an increased likelihood of adverse visual outcome in both groups, although this was only significant in the ranibizumab-only group (OR 3.20; 95% CI 1.20, 8.50; P = .02), with a trend seen in the combined group (OR 1.95; 95% CI 0.50, 8.50; P = .30).
We also considered the time delay between angiographic diagnosis of CNV and treatment. In our study this delay was uniformly short (median time to treatment from diagnosis was 2 weeks; interquartile range 1–4 weeks), and when it was divided into <1 week, 1 to 3 weeks, and >3 weeks, no significant impact on adverse outcome was seen (OR 0.82; 95% CI 0.40, 1.65; P = .57 for 1–3 weeks and OR 1.04; 95% CI 0.53, 2.06; P = .90 for >3 weeks delay).
All clinical characteristics with a significance level of P < .10 in the univariate analysis were incorporated into a multivariate analysis (Table 3). Age and sex were also included, to be consistent with the existing literature. Longer treatment delay from first symptoms of CNV was a highly significant predictor (P = .015) of adverse outcome when adjusting for age, sex, and baseline VA, with an OR of 2.62 (95% CI 1.20, 5.68). Thus, a treatment delay of greater than 21 weeks, compared to 7 weeks or less, resulted in a 2.6-fold increase in the likelihood of an adverse response despite treatment.