Glaucomatous Influence on Visual Function
In addition to the previously discussed visual field changes in glaucoma (see Chapter 5), other visual function tests may have abnormal results early in glaucoma. Some of these tests may one day prove useful in detecting the presence and progression of glaucoma and in judging the efficacy of glaucoma therapy.
BRIGHTNESS SENSITIVITY
Patients with glaucomatous optic atrophy have decreased light sensitivity when dark adapted, which correlates with the degree of nerve damage (1), and dark adaptation, tested with chromatic stimuli, has been reported to be abnormal in patients with ocular hypertension (2). Some studies have provided little evidence for photoreceptor abnormalities in glaucoma (3,4), but others have suggested that the photoreceptors may be involved in glaucomatous damage (5,6). Light sensitivity can also be evaluated with a brightness ratio test, in which the patient judges the difference in brightness perceived by the two eyes, and it has been suggested that tests of this type may be useful in glaucoma screening (7,8). In preliminary studies, patients with open-angle glaucoma had abnormal responses on dichoptic testing, in which one half of a test object is presented to one eye and the other half to the fellow eye, to help determine the location of a defect in the visual pathway (9).
COLOR VISION
Reduced sensitivity to colors has been described in patients with ocular hypertension, tilted discs, and various forms of glaucoma, and may precede any detectable loss of peripheral or central vision by standard acuity or visual field testing (10). Compared with achromatic sensitivity, color sensitivity was found to be more affected in glaucoma (11). Most studies agree that the color vision deficit is associated primarily with blue-sensitive pathways (12–25). This is consistent with the observation that blue signals are detected by the short-wavelength cones and then processed by the blue–yellow bistratified ganglion cells, which are distinct from the midget ganglion cells (26,27). These cells project their axons to the interlaminar koniocellular layers of the lateral geniculate nucleus (28). Blue cones contribute little to the sensation of brightness or to visual acuity, which may explain why standard visual acuity tests, perimetry, or contrast sensitivity studies might miss an associated visual deficit. This color vision dysfunction is strongly related to elevated intraocular pressure (IOP) levels (22,23), suggesting that the damage is pressure induced. Selective loss of red–green sensitivity has been observed in some patients with glaucoma (29). However, the chromatic visual evoked potential (VEP), which uses red–green flicker, was found to be altered in nonglaucomatous optic neuropathies but not in glaucoma (30).
It is unclear whether the loss of color vision and the visual field changes associated with nerve fiber bundle loss share the same mechanism. Ocular hypertensive eyes with yellow–blue and blue–green defects were found to have diffuse early changes in visual field sensitivity (17) and an increased risk of glaucomatous visual field loss, compared with similar eyes that did not have these color vision disturbances (14). The same color abnormalities in patients with early glaucoma correlated significantly with diffuse retinal nerve fiber loss (24). However, no significant correlation between color vision scores and visual field performance was found among patients with ocular hypertension when age correction was applied to the color variable (31), and another study revealed no clear association between early glaucomatous cupping and color vision anomalies (18). Specificity is limited by the fact that the tritan deficit is also the one most frequently seen with age-related changes. When study populations were matched for age and lens density, however, color vision loss in glaucoma was still attributable in part to the disease process (21).
In most reported studies, the color vision testing was performed with the Farnsworth–Munsell 100-hue test, dichotomous (D-15) tests, or variants of these, all of which are laborious and of questionable precision. One study has shown that halogen lighting is preferable for the Farnsworth–Munsell 100-hue test and confirmed the presence of a blue–yellow pathway deficiency in glaucoma (32). Another study has shown that although the error scores on the Farnsworth–Munsell 100-hue test were elevated in glaucomatous eyes, the test did not always discriminate well and seemed to lack high diagnostic value (33).
Various tests have been devised to overcome limitations of the Farnsworth–Munsell test, including computer-driven monitors that present flickering color contrasts or peripheral color contrasts, an automatic anomaloscope, a color contrast sensitivity test in which the target and surround have the same luminance but different chromaticity, and tests implemented on a personal computer (34–38). Even with the most sensitive and precise systems, however, glaucoma is not always detected, suggesting that some patients with glaucoma have true preservation of color vision (37,39,40).
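The equal-luminance ("isoluminant") design mentioned above can be illustrated with a short sketch: a colored target is scaled so that its computed luminance matches a neutral gray surround, leaving chromaticity as the only cue. This is a minimal sketch, assuming Rec. 709 luminance weights and arbitrary example colors; the cited tests relied on calibrated displays rather than nominal RGB values.

```python
import numpy as np

# Approximate relative luminance weights for linear RGB primaries
# (Rec. 709 coefficients; an actual test would use display calibration).
LUMA_WEIGHTS = np.array([0.2126, 0.7152, 0.0722])

def luminance(rgb):
    """Relative luminance of a linear-RGB color."""
    return float(np.dot(LUMA_WEIGHTS, rgb))

def match_luminance(rgb, target_luma):
    """Scale a color so its computed luminance equals target_luma."""
    return rgb * (target_luma / luminance(rgb))

# Neutral gray surround and a bluish target scaled to the same luminance,
# so only chromaticity (not brightness) distinguishes target from surround.
surround = np.array([0.3, 0.3, 0.3])
target = match_luminance(np.array([0.3, 0.3, 0.8]), luminance(surround))

print(f"surround luminance: {luminance(surround):.4f}")
print(f"target luminance:   {luminance(target):.4f}")  # matches surround
```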
As discussed in Chapter 5, short-wavelength automated perimetry (SWAP), which projects a blue target on a yellow background, has been shown to detect glaucoma damage earlier than conventional white-on-white perimetry (41–44). SWAP has also been found to be more sensitive to progression of visual field loss and to progression of glaucomatous disc cupping (45,46).
CONTRAST SENSITIVITY
Subtle loss of both central and peripheral vision can be demonstrated in some patients with glaucoma, before visual field changes are detectable with standard techniques, by measuring the amount of contrast required for a patient to discriminate between adjacent visual stimuli (47–52). In some studies, the contrast sensitivity impairment correlates with visual field damage (48–50,53), especially damage to the central visual field and optic nerve head (50,54). The yield of detecting glaucoma may be increased by measuring peripheral contrast sensitivity, 20 to 25 degrees eccentrically (55,56). Tests to measure contrast sensitivity may use spatial or temporal strategies. Although spatial contrast sensitivity may be a useful adjunct, caution has been advised in interpreting the results without considering additional clinical data (52). The overlap with other causes of reduced spatial contrast sensitivity, including age, creates high false-negative and false-positive rates (50,51,57,58). Spatial contrast sensitivity has been shown to decrease in persons with healthy eyes after 50 years of age, a decline that appears to be independent of the crystalline lens (59,60). Although spatial summation properties differ between M- and P-mediated pathways, the spatial summation properties associated with these pathways are similar in control subjects and in patients with glaucoma (61). In a study comparing the decrease in contrast sensitivity between normal aging and glaucoma, aging decreased the low-spatial-frequency-sensitive components of both the M and P pathways, and glaucoma produced a further reduction of sensitivity that did not seem to be selective for M or P functions and that the investigators presumed was mediated by cells with larger receptive fields (62). For reference, frequency-doubling technology (FDT) measures the contrast threshold for low-spatial-frequency, high-temporal-frequency sinusoidal luminance-profile bars (63).
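The FDT-type stimulus lends itself to a compact illustration: the sketch below builds one frame of a counterphase-flickering sinusoidal grating (a low-spatial-frequency sine carrier whose contrast reverses at a high temporal rate) and recovers its Michelson contrast. The parameter values, 0.25 cycles/degree and 25 Hz, are in the range typically described for FDT but are assumptions here, as is the arbitrary mean luminance.

```python
import numpy as np

def counterphase_grating(x_deg, t_s, contrast,
                         spatial_freq=0.25,   # cycles/degree (low; assumed)
                         temporal_freq=25.0,  # Hz (high; assumed)
                         mean_lum=100.0):     # cd/m^2 (arbitrary mean)
    """Luminance samples of a counterphase-flickering sinusoidal grating:
    a spatial sine carrier modulated sinusoidally in time."""
    return mean_lum * (1.0 + contrast
                       * np.sin(2 * np.pi * spatial_freq * x_deg)
                       * np.cos(2 * np.pi * temporal_freq * t_s))

# Evaluate one spatial period (4 degrees at 0.25 cycles/degree) at t = 0
# and recover the Michelson contrast, (Lmax - Lmin) / (Lmax + Lmin).
x = np.linspace(0.0, 4.0, 1000)
frame = counterphase_grating(x, 0.0, contrast=0.2)
michelson = (frame.max() - frame.min()) / (frame.max() + frame.min())
print(f"Michelson contrast: {michelson:.3f}")  # ~0.200
```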
Among this group of psychophysical tests, sine-wave gratings of parallel light and dark bands (Arden gratings), in which the patient must detect the striped pattern at various levels of contrast and spatial frequency, have been evaluated extensively (47). The original Arden gratings were limited by the subjectivity of the required responses (64,65). A modification, in which the patient must indicate the orientation of the gratings, has been reported to minimize this limitation (65). The testing methods include computer-controlled video displays and photographically reproduced grating patterns, both of which have given good approximations of the spatial contrast sensitivity function (66). One of these tests uses sine-wave gratings of low spatial frequency and laser interference fringes to increase sensitivity to peripheral defects (67–69).
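Contrast thresholds in such forced-choice orientation tasks are commonly estimated with adaptive staircase procedures. The sketch below runs a minimal 2-down/1-up staircase against a simulated observer, converging near the roughly 71%-correct contrast; the observer model, step size, and reversal-averaging rule are illustrative assumptions, not the procedure of any study cited here.

```python
import random

def simulated_observer(contrast, threshold=0.05):
    """Toy observer for a two-alternative orientation judgment: guesses at
    very low contrast, nearly perfect well above threshold (assumption)."""
    p_correct = 0.5 + 0.5 / (1 + (threshold / max(contrast, 1e-6)) ** 3)
    return random.random() < p_correct

def staircase(start=0.5, step=0.7, trials=60):
    """2-down/1-up staircase on contrast: two correct responses lower the
    contrast, one error raises it. Returns the mean of late reversals."""
    contrast, correct_run, reversals, going_down = start, 0, [], None
    for _ in range(trials):
        if simulated_observer(contrast):
            correct_run += 1
            if correct_run == 2:            # two correct -> lower contrast
                correct_run = 0
                if going_down is False:     # direction change = reversal
                    reversals.append(contrast)
                going_down = True
                contrast *= step
        else:                               # one error -> raise contrast
            correct_run = 0
            if going_down is True:
                reversals.append(contrast)
            going_down = False
            contrast /= step
    tail = reversals[-6:] if len(reversals) >= 6 else reversals
    return sum(tail) / len(tail)

print(f"estimated contrast threshold ~= {staircase():.3f}")
```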
These techniques, including those that use sinusoidal grating targets, are difficult and time consuming to perform. An effort to minimize these limitations led to the development of high-pass resolution perimetry (discussed in Chapter 5).
Temporal contrast sensitivity, in which the patient must detect a visual stimulus flickering at various frequencies, provides another measure of contrast sensitivity and appears to be more useful than spatial contrast sensitivity in patients with glaucoma. The stimulus may be presented as a homogeneous flickering field (flicker fusion frequency) or as a counterphase flickering grating of low spatial frequency (spatiotemporal contrast sensitivity) (59,70,71). Patients with glaucoma may have reduced function with either method, although the latter appears to be the more sensitive test (71,72). Spatiotemporal contrast sensitivity was also found to be more useful in detecting glaucoma than spatial contrast sensitivity testing of the central retina, although, again, its usefulness is limited to patients younger than 50 years (59). Other studies have found age to be a less significant factor in sensitivity loss, although one study suggested that cardiovascular disease may be associated with foveal dysfunction (73–75). There is also a question as to whether temporal contrast sensitivity loss among patients with ocular hypertension represents early glaucomatous damage or a transient effect of raised IOP; one study suggested that either mechanism may be found within subsets of this population (76). Reducing the IOP in patients with glaucoma may improve contrast sensitivity at high spatial frequencies (18 cycles/degree) (77).
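In standard psychophysical notation (not a formula from the studies cited above), the two presentation modes differ in whether the modulation of mean luminance $L_0$ at contrast $C$ is purely temporal or spatiotemporal:

$$L_{\text{flicker}}(t) = L_0\bigl[1 + C\cos(2\pi f_t t)\bigr], \qquad L_{\text{grating}}(x,t) = L_0\bigl[1 + C\sin(2\pi f_s x)\cos(2\pi f_t t)\bigr],$$

where $f_t$ is the temporal frequency and $f_s$ the (low) spatial frequency; the flicker fusion frequency is the highest $f_t$ at which the modulation remains visible at a given contrast.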
Several techniques have been evaluated to improve the usefulness of contrast sensitivity testing. One study suggested that the determination of a ratio between spatial contrast sensitivity and flicker sensitivity measures visual pathology more precisely than the absolute value of either test does (78). Another test of temporal contrast sensitivity, in which the patient must discriminate two rapidly successive pulses of light from a single pulse, is reported to be highly sensitive and specific in distinguishing glaucomatous eyes from healthy ones (79). Another test, the whole-field scotopic retinal sensitivity test, uses a flashlight-sized device in which the patient views a white light in the entire visual field and is asked to detect alternating illuminated and dark fields at 1-second intervals (80). This test may be useful as a screening tool (80,81), although one study found too much overlap between healthy persons and individuals with ocular hypertension (82).
Another attempt to use a temporal contrast, or flickering, target has been called temporal modulation perimetry or flicker perimetry (83–85). In healthy eyes there is an age-related loss of temporal modulation sensitivity (83). Temporal modulation perimetry appears to be less affected by visual acuity or retinal image degradation than either light-sense or resolution perimetry, and it is more sensitive than light-sense perimetry to increasing IOP (84–86).
Different target shapes and patterns, which the patient must distinguish, are also reported to be of particular value in detecting optic nerve disease (87). In one study with pattern discrimination perimetry, long-term and short-term fluctuations were clinically significant but did not prevent adequate separation between normal and abnormal measurements (88). Visual function in glaucomatous eyes, as measured by contrast sensitivity, has been shown to improve after β-blocker therapy (89).
ELECTROPHYSIOLOGIC STUDIES
Most measures of visual fields and other visual functions depend on the patient's subjective response. A significant amount of work is also being done on alternative, objective methods of evaluating the visual field. The pattern electroretinogram, the photopic negative response of the electroretinogram, and the multifocal VEP (mfVEP) appear to have the most potential to detect early glaucomatous damage that may not be detected by standard automated perimetry (90–96). Of the currently available electrophysiologic tests, the mfVEP is the only one that can provide topographic information about visual field defects. The relation between electrophysiologic tests and the underlying damage to ganglion cells is still not completely understood, but it has been suggested that the signal in the mfVEP response may be linearly related to ganglion cell loss (93). Patients with glaucoma were also found in one study to have increased baseline values with electro-oculography (97), but a subsequent study did not confirm that finding (98).
Electroretinograms
Electroretinograms (ERGs) evoked by reversing checkerboard or grating patterns, referred to as pattern ERGs (PERGs), are sensitive to retinal ganglion cell and optic nerve dysfunction and have reduced amplitudes in patients with glaucoma (92,99–106). PERG may detect early damage to ganglion cells (91), which may explain why reduced PERG amplitudes appear in the early stages of glaucoma and in some eyes with ocular hypertension, especially those at elevated risk for glaucoma (101,105–110). These findings suggest that PERG may be useful in discriminating between those patients with ocular hypertension who will develop visual field loss and those who will not.
Studies differ on whether PERG correlates with IOP and disc topography, with one study showing no correlation and others showing an association with IOP control, computed optic nerve head analysis, or retinal nerve fiber layer thickness (108,111–113). The PERG has been shown to correlate with visual field indices (114), and visual field defects are associated with PERG reduction in the corresponding hemisphere (115). However, no precise correlation was found with color vision deficits (116). Decreased amplitude and increased peak latency were found to correlate with increasing age (104), paralleling the estimated normal loss of ganglion cells. Indeed, reduction in PERG was directly related to histologically defined optic nerve damage in a monkey model (117). PERG in combination with SWAP was shown in one study to improve the power to predict progression of visual field loss (118).
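The amplitude and latency measures discussed above are conventionally read off the transient PERG waveform, whose principal components are termed N35, P50, and N95. The sketch below quantifies a synthetic waveform in that fashion; the waveform shape, sampling rate, and search windows are illustrative assumptions, not a clinical protocol.

```python
import numpy as np

fs = 1000.0                                  # sampling rate, Hz (assumption)
t = np.arange(0, 0.25, 1 / fs)               # 250-ms epoch

# Synthetic transient PERG: a positive P50-like peak and a negative
# N95-like trough (Gaussian bumps; real waveforms are averages over
# many pattern reversals).
perg = (2.0 * np.exp(-((t - 0.050) / 0.010) ** 2)
        - 3.0 * np.exp(-((t - 0.095) / 0.020) ** 2))

def peak_in_window(signal, t, lo, hi, polarity=+1):
    """Return (latency_s, value) of the extremum within [lo, hi] seconds."""
    mask = (t >= lo) & (t <= hi)
    idx = np.argmax(polarity * signal[mask])
    return t[mask][idx], signal[mask][idx]

p50_lat, p50_val = peak_in_window(perg, t, 0.035, 0.065, polarity=+1)
n95_lat, n95_val = peak_in_window(perg, t, 0.070, 0.130, polarity=-1)

# The P50-to-N95 amplitude is a common PERG measure of ganglion cell function.
print(f"P50 latency {p50_lat * 1000:.0f} ms, "
      f"P50-N95 amplitude {p50_val - n95_val:.2f} uV")
```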
The ERG evoked by a flash of light (flash ERG) is affected more by the outer retinal elements and is not typically abnormal in glaucoma. Acute IOP elevation in cats, however, caused a reduction in both the pattern and flash ERG that was proportional to the reduction in perfusion pressure regardless of the absolute IOP, suggesting a vascular mechanism from which the ganglion cells are less likely to recover (119). Patients with glaucoma in one study had reduced ERG amplitudes in response to a flickering stimulus (flicker ERG) (106). One study suggested that the flash and pattern ERG changes in glaucoma cannot be attributed simply to optic atrophy, pointing to additional outer retinal damage in glaucoma (120).
Multifocal ERG (mfERG) (Fig. 6.1) permits simultaneous recording of multiple spatially localized ERGs (121,122). It consists of the same components as a standard ERG (123). Preliminary studies suggest that it correlates with glaucomatous damage and may be able to detect abnormalities before automated achromatic visual fields can (124–129).
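The spatial localization in multifocal recording comes from stimulating each retinal patch with its own pseudorandom binary (m-sequence) flicker and recovering each patch's contribution from the single recorded signal by cross-correlation. The sketch below illustrates the idea on simulated data; the pseudorandom sequences (standing in for true maximal-length sequences), the toy response kernels, and the "damaged" patch are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def m_like_sequence(n):
    """Stand-in for a binary m-sequence: +/-1 pseudorandom flicker.
    (A real mfERG uses a true maximal-length sequence; this is a sketch.)"""
    return rng.choice([-1.0, 1.0], size=n)

n_patches, n_steps, kernel_len = 4, 4096, 8
stimuli = np.array([m_like_sequence(n_steps) for _ in range(n_patches)])

# Toy local first-order kernels: patch 3 is "damaged" (attenuated response).
kernels = np.array([np.exp(-np.arange(kernel_len)) * amp
                    for amp in (1.0, 1.0, 1.0, 0.2)])

# Single-channel recording = sum of each patch's filtered stimulus + noise.
recording = sum(np.convolve(stimuli[p], kernels[p])[:n_steps]
                for p in range(n_patches))
recording += 0.1 * rng.standard_normal(n_steps)

# Recover each patch's kernel by cross-correlating the recording with that
# patch's own sequence; independent sequences make the others average out.
for p in range(n_patches):
    est = np.array([np.dot(recording[k:], stimuli[p][:n_steps - k])
                    for k in range(kernel_len)]) / n_steps
    print(f"patch {p}: estimated response amplitude ~ {est[0]:.2f}")
```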