
Preserving polarization maintaining photons for enhanced contrast imaging of the retina

Open Access

Abstract

The purpose of this study is to demonstrate the feasibility of using polarization maintaining photons for enhanced contrast imaging of the retina. Orthogonal-polarization control has been frequently used in conventional fundus imaging systems to minimize reflection artifacts. However, the orthogonal-polarization configuration also rejects the directly reflected photons, which preserve the polarization condition of the incident light, from the superficial layer of the fundus, i.e., the retina, and thus reduces the contrast of retinal imaging. We report here a portable fundus camera which can simultaneously perform orthogonal-polarization control to reject back-reflected light from the ophthalmic lens and parallel-polarization control to preserve the backscattered light from the retina, which partially maintains the polarization state of the incoming light. This portable device utilizes miniaturized indirect ophthalmoscopy illumination to achieve non-mydriatic imaging, with a snapshot field of view of 101° eye-angle (67° visual-angle). Comparative analysis of retinal images acquired with a traditional orthogonal-polarization fundus camera from both normal and diseased eyes was conducted to validate the usefulness of the proposed design. The parallel-polarization control for enhanced contrast in high dynamic range imaging has also been validated.

© 2023 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

Ocular diseases continue to pose a significant threat to eye health worldwide. It is projected that around 560 million people will be affected by three major retinal diseases, namely diabetic retinopathy (DR), glaucoma, and age-related macular degeneration (AMD), by the year 2045 [1–3]. Early detection of these diseases requires comprehensive observation of both the peripheral and central parts of the fundus, which can be achieved by using modern scanning laser ophthalmoscopy (SLO) devices [4,5]. Despite providing ultra-wide field images, commercially available SLO devices can be expensive, and their bulky dimensions make them challenging to deploy in remote areas. By providing depth-resolved imaging of the retina, ultra-widefield optical coherence tomography (OCT) has also been explored for clinical management of eye diseases [6]. However, sophisticated ultra-wide field OCT systems are also expensive and bulky, limiting their applications for clinical management and telemedicine. The majority of people worldwide who are blind or have moderate to severe visual impairment reside in low- and middle-income countries [7]. A significant portion of this impairment could be prevented by screening programs that utilize fundus imaging. Therefore, developing cost-effective, portable wide-field fundus imaging devices is imperative as we implement screening programs for DR, AMD, glaucoma, and other potentially blinding eye conditions.

Several commercial portable fundus cameras are already being used by clinicians. Despite being portable and affordable, the field of view (FOV) of these devices is typically around 45°-67.5° eye-angle (30°-45° visual angle) [8], making it difficult to access the peripheral retina. To facilitate visualization of the peripheral retina, miniaturized indirect illumination based portable fundus cameras have been proposed [9–11]. The FOV of these nonmydriatic devices is around 101° eye-angle (67° visual angle) and can be extended up to 190° eye-angle (134° visual angle) with the aid of fixation targets. Both traditional and miniaturized indirect illumination based portable fundus cameras use the pupillary path for illumination and imaging. The ophthalmic lens used in these systems reflects a significant amount of light back into the system, creating reflection artifacts in the image. To remove these artifacts, orthogonally oriented polarizers are placed in the illumination and imaging paths [10]. In addition to removing the back-reflected light from the ophthalmic lens, this configuration removes the reflected and scattered light from the retina which partially maintains the polarization state of the illumination. Therefore, the retinal image is dominated by multiply scattered photons. Van de Kraats et al. [12] calculated that about 90% of the light reflected from the human fundus in the mid-spectral range (480 nm-580 nm) originates from the photoreceptor outer segment (OS) layer, and a significant portion of that light partially maintains its polarization state. While the specific values vary between publications, there is no doubt that a large portion of the light in the green and blue region of the spectrum reflected from the retina preserves the polarization state of the input, and that this fraction decreases with increasing wavelength [13,14]. Crucially, the green and blue spectrum of light is needed to clearly visualize the retinal vasculature and the biomarkers of most retinal diseases [15]. Therefore, eliminating polarization maintaining photons with orthogonal polarizers disproportionately reduces the information content in the green channel, ultimately reducing the image contrast and the useful information regarding critical retinal disease biomarkers.

In this article, we present a portable fundus camera with orthogonal-polarization control for rejecting the back-reflected light from the ophthalmic lens, and parallel-polarization detection that passes the specularly reflected or backscattered light from the retina, which fully or partially maintains the original polarization state, through to the sensor to provide high-contrast fundus images. Miniaturized indirect illumination is implemented to achieve a FOV of 101° eye-angle (67° visual angle). The resultant images show inner retinal pathology and biomarkers more clearly than images taken with a conventional fundus camera, which should aid clinicians in detecting retinal diseases more efficiently.

Traditionally, the visual angle has been the unit of FOV measurement for fundus cameras. Recently, widefield fundus imagers, such as Optomap SLO (Optos Inc., Marlborough, MA, USA), ICON (Neolight, Pleasanton, CA, USA), and Retcam (Natus Medical Systems, Pleasanton, CA, USA) use eye angle as the unit of FOV. To avoid unnecessary confusion, both eye and visual angle are presented every time the FOV of the system is discussed [16].

2. Materials and methods

2.1 Experimental setup

Figure 1(a) illustrates the optical diagram of the proposed system. The light source (LS) and the camera lens (CL) (8 mm f/2.5 micro video lens, 58-203, Edmund Optics Inc., Barrington, NJ) are placed in a common plane perpendicular to the optical axis, which we call the CL-LS plane. The detailed layout of this plane is illustrated in Fig. 1(b). The CL-LS plane is conjugated to the pupil plane of the eye with a magnification of 0.25X by the ophthalmic lens (OL) (f = 25 mm) (Volk 40D, Volk, Mentor, OH, USA). Since the diameter of the pupil is ∼4 mm under room light conditions, the camera lens and the light source must be situated inside a 16 mm diameter circle in the CL-LS plane; otherwise, the illumination would be blocked by the iris and no light would reach the retina. Figure 1(b) shows that the light source and the camera lens are well within this limit. The light source is composed of two separate LEDs, an 810 nm NIR LED (M850LP1, Thorlabs Inc., Newton, NJ, USA) for guidance during initial alignment and a visible light LED with a center wavelength of 565 nm (M565D2, Thorlabs Inc., Newton, NJ, USA) for color fundus imaging. The visible light LED has a relatively wide bandwidth (full width at half maximum of 104 nm) stretching from the blue to the red end of the spectrum, which has been shown to comprehensively reveal the retinal vasculature and the biomarkers of most common retinal diseases [15]. A buffer of about 5 mm between the camera lens and the light source ensures that light reflected from the corneal surface does not enter the imaging path [17]. Linear polarizers LP1 and LP2 (LPVISE050-A, Thorlabs Inc., Newton, NJ, USA) are placed in front of the LS and CL, respectively, to reject the back-reflected light from the OL. A quarter waveplate (QW) (90-940, Edmund Optics Inc., Barrington, NJ) is placed in front of the OL with its axis at 45° to the transmission axes of LP1 and LP2 to convert the linearly polarized input light into circularly polarized light. Light returning from the retina forms an image at the retina conjugate plane (RCP), which is relayed to the camera sensor (CS) (FL3-U3-120S3C-C, FLIR Integrated Imaging Solutions Inc., Richmond, Canada) by the CL. To minimize the effect of eye movement and to select the region of imaging, a dimly lit external fixation LED was used (not shown in the optical diagram). An imaging session with the functional prototype is shown in Fig. 1(c). The FOV of the system was calculated following the steps described in the ISO standard “Ophthalmic Instruments—Fundus Cameras” (10940:2009) [18].
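
For illustration, the short Python sketch below works through the pupil-conjugation constraint described above. It is only an arithmetic check of the 0.25X magnification and the ∼4 mm pupil quoted in the text; the example radial positions of the components are hypothetical and not taken from the actual layout in Fig. 1(b).

```python
# Illustrative geometry check for the CL-LS plane (pupil diameter and magnification
# from the text; the component positions tested below are hypothetical examples).
pupil_diameter_mm = 4.0      # pupil diameter under dim room light, per the text
magnification = 0.25         # CL-LS plane to pupil plane magnification

# A point at radius r in the CL-LS plane images to r * 0.25 at the pupil, so every
# component must lie inside a circle of diameter 4 mm / 0.25 = 16 mm.
allowed_diameter_mm = pupil_diameter_mm / magnification
print(f"Allowed CL-LS footprint: {allowed_diameter_mm:.0f} mm diameter circle")

def images_inside_pupil(radial_position_mm: float) -> bool:
    """Return True if a feature at this radial position maps inside the pupil."""
    return radial_position_mm * magnification <= pupil_diameter_mm / 2

# Hypothetical radial positions of the camera-lens edge and the LED center:
for name, r in [("camera lens edge", 2.5), ("LED center", 7.5)]:
    print(name, "->", images_inside_pupil(r))
```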

Fig. 1. (a) The optical diagram of the system. (b) Detailed layout of the CL-LS plane. (c) Photographic illustration of the imaging operation. (d) Schematic illustration of polarization states of illumination and imaging light paths in the system with orthogonal-polarization control. (e) Schematic illustration of polarization states of illumination and imaging light paths in the system with a QW between the eye and OL to preserve polarization maintaining photons for fundus imaging. CS: camera sensor; CL: camera lens; LS: light source; LP1-LP2: linear polarizers; OL: ophthalmic lens; RCP: retina conjugate plane; QW: quarter waveplate.

The performance of the optical system is illustrated in Fig. 2. The spot diagrams from the 0° to 33.5° visual fields are plotted, verifying the FOV to be 67° visual angle (101° eye angle). The root-mean-square (RMS) spot radius is 1.78 µm at the 0° visual field and gradually increases to 14.44 µm at the 33.5° visual field. From the modulation transfer function (MTF) plot in Fig. 2(b), it is observed that spatial frequencies up to ∼730 cycles/mm can be resolved at all field angles. An MTF of over 0.3 is considered clearly recognizable [19]. For our design, the spatial frequencies at which the MTF remains above 0.3 are 44, 71, 190, and 300 cycles/mm at the 33.5°, 20°, 10°, and 0° visual fields, respectively. According to the requirement specified in the ISO 10940:2009(E) standard, for a fundus camera with a FOV greater than 30°, the minimum resolving power should be 60, 40, and 25 cycles/mm for the center, middle, and peripheral fields, respectively. The performance of the system therefore comfortably meets the requirements specified by the ISO standard.

Fig. 2. (a) The spot diagram for different field angles. (b) The modulus of the optical transfer function versus spatial frequency for different field angles.

2.2 Preservation of the polarization maintaining photons reflected from the retina

The human retina consists of multiple functional layers and the reflected light from the retina is a complex combination of the specularly reflected, diffusely reflected, and multiply scattered light coming from these layers. Out of these different layers, the photoreceptor OS layer is regarded as the primary layer responsible for specular or directional reflection which maintains its polarization state [12,20,21]. As the light goes deeper than the OS layer and into the retinal pigment epithelium (RPE) and the choroid, it becomes depolarized because of dense scatterers with variable sizes present in those layers.

Figures 1(d) and 1(e) illustrate the principle of preserving the polarization maintaining light reflected from the retina in our setup. Figure 1(d) shows a conventional setup operating in the orthogonal-polarization configuration. The light from the light source (LS) passes through a linear polarizer (LP1) and becomes linearly polarized (blue arrows showing the direction of polarization). A portion of this light reflects from the ophthalmic lens (OL) towards the camera lens (marked by red arrows) and is blocked by the linear polarizer LP2. The light that passes through the ophthalmic lens enters the eye, and the polarization maintaining photons (marked as brown arrows) along with depolarized photons (marked as green arrows) return into the imaging system. The polarization maintaining photons are blocked by the orthogonal polarizer (LP2), and the depolarized photons with polarization components along the LP2 axis primarily contribute to the image formed at the image sensor. It should also be noted that, because of the birefringence of the retina, some photons change their polarization direction slightly and contribute to the image formation along with the aforementioned multiply scattered photons. Although the artifacts from the ophthalmic lens back-reflection are rejected, there is a significant loss of information because the polarization maintaining light is also rejected.

Figure 1(e) illustrates our method of preserving the polarization maintaining photons coming from the retina. A quarter waveplate is placed between the ophthalmic lens and the lens of the eye, with its axis fixed at 45° relative to the polarizers' axes. Linearly polarized input light passes through the quarter waveplate and becomes circularly polarized before entering the eye. Inside the eye, the circularly polarized light passes through the nearly transparent inner retina, and the first significant reflector it encounters is the photoreceptor OS layer [12]. The reflection from the OS is specular or directional in nature [20,21]; therefore, the portion of the circularly polarized input light reflected by it is helically flipped [22]. The remainder of the light penetrates into the posterior layers and is depolarized and reflected by the choroid and the sclera. The helically flipped photons (brown curved arrows) and the depolarized photons (green arrow) return into the imaging system. After passing through the waveplate on the return path, the helically flipped photons become linearly polarized again, now with their polarization axis aligned with the axis of LP2. Therefore, these photons, along with the depolarized photons whose polarization components are parallel to LP2, pass through to the camera sensor and form the image of the retina. The proposed system still works as an orthogonal-polarization setup for blocking the back-reflected light from the OL (marked by red arrows). However, for convenience, we refer to it in the subsequent sections as parallel-polarization control for retinal imaging, since the reflected light from the retina reaches a polarization state parallel to the polarizer in front of the camera.
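
The polarization bookkeeping above can be checked with a few lines of Jones calculus. The sketch below is a minimal, idealized model (normal incidence, lossless elements, and a fixed lab frame in which a specular reflection leaves the Jones vector components unchanged); the matrix conventions and function names are ours and are not taken from the paper. It confirms that the lens back-reflection is extinguished by LP2 while the polarization maintaining retinal return passes through.

```python
import numpy as np

def linear_polarizer(theta):
    """Jones matrix of an ideal linear polarizer with its transmission axis at angle theta."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c * c, c * s], [c * s, s * s]], dtype=complex)

def quarter_waveplate(theta):
    """Jones matrix of a quarter waveplate with its fast axis at angle theta (global phase dropped)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c * c + 1j * s * s, (1 - 1j) * s * c],
                     [(1 - 1j) * s * c, s * s + 1j * c * c]])

LP1 = linear_polarizer(0)              # illumination polarizer (0 deg)
LP2 = linear_polarizer(np.pi / 2)      # imaging polarizer (90 deg, orthogonal to LP1)
QW = quarter_waveplate(np.pi / 4)      # quarter waveplate at 45 deg

E_in = LP1 @ np.array([1, 0], dtype=complex)   # light leaving LP1

# Back-reflection from the ophthalmic lens: it never passes the QW, so it reaches LP2
# still polarized along LP1 and is rejected (intensity ~ 0).
I_lens_reflex = np.linalg.norm(LP2 @ E_in) ** 2

# Polarization-maintaining (specular) retinal return: out through the QW, reflected,
# back through the QW. In this fixed-frame convention the reflection is the identity;
# the double pass through the 45-deg QW then rotates the linear polarization by 90 deg,
# onto the LP2 axis.
E_retina = QW @ (QW @ E_in)
I_retina = np.linalg.norm(LP2 @ E_retina) ** 2

print(f"Lens back-reflection through LP2:  {I_lens_reflex:.3f}")  # ~0 -> rejected
print(f"Polarization-maintaining return:   {I_retina:.3f}")       # ~1 -> preserved
```

Running the sketch prints approximately 0.000 for the lens reflex and 1.000 for the polarization maintaining retinal return, matching the behavior sketched in Fig. 1(e); depolarized light from the deeper layers (not modeled here) would transmit roughly half of its intensity through LP2 in either configuration.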

Images taken with conventional orthogonal-polarization control should be dominated by depolarized light from the outer retinal layers and, consequently, should be red dominant. Green light is mostly reflected from the inner retinal layers, which maintain polarization. Green light that penetrates into the outer retinal layers is absorbed and contributes very little to the image [13]. Images taken with the parallel configuration still preserve the depolarized light reflected from the outer retinal layers, like the orthogonal configuration, but they also preserve the reflected light from the inner retina, providing valuable information about disease biomarkers.

2.3 Human subjects and imaging protocol

This study was approved by the Institutional Review Board of the University of Illinois Chicago (UIC) and followed the ethical standards stated in the Declaration of Helsinki. The subjects with retinal diseases and the controls were recruited from the UIC Department of Ophthalmology & Visual Sciences clinic. All participants provided informed consent before the experiment. Before imaging, the room light was dimmed to ensure a pupillary diameter of at least 4 mm. The subjects were asked to keep looking at the fixation target throughout the imaging session with the eye that was not being imaged. Once the eye was fixated, NIR guidance was used to align the system with the eye without stimulating the pupillary reflex, and once the system was well aligned, the images were taken with visible light. The exposure time of each acquisition was 50 ms, which is much shorter than the pupillary reflex time (150 ms). After taking images with our proposed design, we removed the waveplate from the prototype and followed the same procedure to take comparative images. For one of those subjects, we also removed all the polarizers from the system and took an image for comparison (Fig. 3). A LabVIEW interface was developed to control the imaging procedure. For high dynamic range (HDR) fundus imaging, three sequential images were captured, with the illumination power doubled at each acquisition. The exposure time was set to 35 ms for each image so that the acquisition could be completed within the pupillary reflex time. The HDR image was generated from these three low dynamic range images by a MATLAB program.
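
The actual merge was performed by a MATLAB program whose details are not given in the paper; the Python sketch below shows one common way such a three-exposure bracket can be fused (scale each frame back to a common radiance using the known illumination ratios, then weight down clipped and very dark pixels). The function and variable names are ours, and the simulated frames are placeholders.

```python
import numpy as np

def merge_hdr(frames, power_ratios):
    """Fuse exposure-bracketed frames (float images in [0, 1]) acquired at the given
    relative illumination powers (e.g., [1, 2, 4]) into a single HDR-like image."""
    acc = np.zeros_like(frames[0], dtype=np.float64)
    weight_sum = np.zeros_like(acc)
    for img, ratio in zip(frames, power_ratios):
        img = img.astype(np.float64)
        # Hat-shaped weight: trust mid-tone pixels, discount near-black and saturated ones.
        w = 1.0 - np.abs(2.0 * img - 1.0)
        acc += w * img / ratio          # scale back to the 1x-illumination radiance
        weight_sum += w
    hdr = acc / np.maximum(weight_sum, 1e-6)
    return hdr / hdr.max()              # normalize for display

# Usage with three simulated grayscale frames (replace with the real acquisitions):
rng = np.random.default_rng(0)
scene = rng.random((64, 64)) * 0.4                      # dim "1x" scene
frames = [np.clip(scene * r, 0, 1) for r in (1, 2, 4)]  # 1x, 2x, 4x illumination
hdr = merge_hdr(frames, power_ratios=[1, 2, 4])
```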

2.4 Light safety

To evaluate the safety of the prototype, we calculated the radiant exposure on the retina for the visible light and the permitted duration of continuous operation for the NIR guidance illumination. The safety limits are provided by the ISO standard “Ophthalmic Instruments—Fundus Cameras” (10940:2009) [18], which allows a maximum radiant exposure of 10 J/cm2 on the retina. The optical power at the pupil plane was 5 mW for the visible light and 0.4 mW for the NIR light. Using the photochemical hazard weighting function provided in the standard, the weighted irradiance of the visible light was calculated and multiplied by the exposure time to obtain the radiant exposure on the retina. According to our calculation, the radiant exposure was 2.4 × 10−6 J/cm2, which is deemed safe for operation. Similarly, the weighted irradiance of the NIR light was calculated to be 0.06 mW/cm2. We divided the maximum permitted radiant exposure of 10 J/cm2 by this weighted irradiance to determine the permitted duration of continuous operation. The NIR guidance can thus be used continuously for 46.3 hours.
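
The arithmetic behind these figures is reproduced in the short sketch below. Only the quoted results (10 J/cm2 limit, 0.06 mW/cm2 weighted NIR irradiance, 2.4 × 10−6 J/cm2 visible radiant exposure) come from the text; the 50 ms exposure used to back out the implied visible weighted irradiance is an assumption based on Section 2.3.

```python
# Illustrative arithmetic behind the light-safety numbers in Section 2.4.
MAX_RADIANT_EXPOSURE = 10.0                 # J/cm^2, ISO 10940 limit quoted in the text

# NIR guidance: weighted irradiance quoted as 0.06 mW/cm^2.
nir_weighted_irradiance = 0.06e-3           # W/cm^2
permitted_seconds = MAX_RADIANT_EXPOSURE / nir_weighted_irradiance
print(f"Permitted continuous NIR guidance: {permitted_seconds / 3600:.1f} h")  # ~46.3 h

# Visible flash: the quoted radiant exposure is 2.4e-6 J/cm^2. Working backwards with an
# assumed 50 ms exposure gives the implied weighted irradiance on the retina.
visible_radiant_exposure = 2.4e-6           # J/cm^2
exposure_time = 0.05                        # s (single-frame exposure, assumed from Sec. 2.3)
implied_weighted_irradiance = visible_radiant_exposure / exposure_time
print(f"Implied weighted visible irradiance: {implied_weighted_irradiance * 1e3:.3f} mW/cm^2")
print(f"Safety margin below the limit: {MAX_RADIANT_EXPOSURE / visible_radiant_exposure:.1e}x")
```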

3. Results

Figure 3 illustrates fundus images of a healthy subject taken with no polarization control (Fig. 3(a)), orthogonal-polarization control (Fig. 3(b)), and the proposed parallel-polarization control (Fig. 3(c)). The image taken without polarization control (Fig. 3(a)) shows artifacts due to the ophthalmic lens back-reflection.

Fig. 3. (a) A color fundus image (a1) and separate red (a2), green (a3) and blue (a4) channels of the color image taken by the system without polarization control. (b) A color fundus image (b1) and separate red (b2), green (b3) and blue (b4) channels of the color image taken by the system with orthogonal-polarization control. (c) A color fundus image (c1) and separate red (c2), green (c3) and blue (c4) channels of the color image taken by the system with the proposed parallel-polarization control.

The images taken with orthogonal-polarization control (Fig. 3(b)) and parallel-polarization control (Fig. 3(c)) are free from the reflection artifacts. It is evident from Fig. 3(b) that the images with orthogonal-polarization control have poor vessel contrast in the color image, and the brightness of the green channel is low. Only the major retinal blood vessels can be seen in the red channel image in Fig. 3(b2). In comparison, the images in Fig. 3(c) show enhanced vessel contrast in the color image and higher brightness in the green channel, providing a clearer picture of the retina. The red channel image in Fig. 3(c2) shows a similar level of choroidal detail, along with more retinal blood vessel detail, compared to Fig. 3(b2). Red light penetrates deeper into the RPE and the choroid and becomes depolarized. The depolarized light contributes equally to each polarization state [23]; thus, the choroidal appearance is similar in both images. On the other hand, our prototype also preserves the polarization maintaining red light reflected from the retina, which is evidenced by the improved retinal vessel contrast in the red channel image. Without polarization control, no light reflected from the retina is rejected. Nevertheless, as a major portion of the green light reflected from the retina maintains its polarization state, the vessel contrast is similar in the green channel images in Fig. 3(a3) and Fig. 3(c3). Overall, the images taken without polarization control look very similar to the images taken by our prototype, except for a slightly brighter choroid in the red channel image.

We calculated the mean intensity of the red, green, and blue channel images from twenty-two different eyes with orthogonal-polarization control and our demonstrated parallel-polarization control. Then the green and blue channel mean intensities were normalized by the mean intensity of the red channel. The results are shown in the box plot in Fig. 4.
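
The sketch below shows how this normalization can be computed per image. The function and variable names are ours, and the background-masking step is an assumption (the paper does not describe how the dark region outside the circular fundus image was handled).

```python
import numpy as np

def normalized_channel_means(rgb, background_threshold=10):
    """rgb: HxWx3 uint8 fundus image. Returns (green/red, blue/red) mean-intensity ratios."""
    rgb = rgb.astype(np.float64)
    # Restrict the statistics to the illuminated fundus region (assumed simple threshold).
    mask = rgb.sum(axis=2) > background_threshold
    r, g, b = (rgb[..., c][mask].mean() for c in range(3))
    return g / r, b / r

# Example over a set of image pairs (paths and the image reader are placeholders):
# ratios_parallel   = [normalized_channel_means(load_image(p)) for p in parallel_paths]
# ratios_orthogonal = [normalized_channel_means(load_image(p)) for p in orthogonal_paths]
```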

Fig. 4. The normalized mean intensities of green and blue channels with orthogonal-polarization and parallel polarization controls.

In Fig. 4, the normalized mean intensity of the green channel is much lower for images taken with orthogonal-polarization control. Since a larger fraction of the green light, reflected predominantly from the superficial layer of the retina, maintains its polarization, orthogonal-polarization control rejects green light more severely than red light. Therefore, the ratio of the green channel intensity to the red channel intensity (which we call the normalized green channel intensity) is low. With parallel-polarization control, the mean intensities of the red and green channel images are similar, so the normalized mean intensity of the green channel is close to 1. The increase in the normalized mean intensity is due to the acceptance of polarization maintaining photons in our prototype. Blue light is strongly absorbed by the ocular lens and the retinal photopigments, and mostly the optic nerve head and the regions close to it can be seen in the blue channel image. As the optic nerve head also depolarizes incoming light, the normalized mean intensity of the blue channel remained similar with orthogonal- and parallel-polarization controls.

To quantitatively evaluate the image contrast enhancement, the sharpness and noise of the images taken with both configurations were analyzed, and the results are presented in Table 1. The sharpness and noise were estimated by the gradient method [24] and the fast noise variance estimation method [25,26], respectively. The sharpness and noise scores vary significantly with the pigmentation level of the subject. To ensure a fair comparison, for each image pair, the sharpness and noise scores of the parallel-polarization control image were divided by the corresponding scores measured from the orthogonal-polarization control image. We refer to these as the normalized sharpness index and the normalized noise index, respectively.
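
The sketch below illustrates the two metrics and the normalization. The noise estimate follows Immerkaer's fast method [26] (with simplified border handling); the sharpness measure is a standard mean-gradient-magnitude estimate, which may differ in detail from the referenced MATLAB routine [24]. Function names are ours.

```python
import numpy as np
from scipy.ndimage import convolve

def gradient_sharpness(img):
    """Mean gradient magnitude of a grayscale image (higher = sharper)."""
    img = img.astype(np.float64)
    gy, gx = np.gradient(img)
    return np.mean(np.sqrt(gx ** 2 + gy ** 2))

def immerkaer_noise(img):
    """Immerkaer's fast noise variance estimate; returns the noise standard deviation."""
    img = img.astype(np.float64)
    h, w = img.shape
    laplacian_mask = np.array([[1, -2, 1], [-2, 4, -2], [1, -2, 1]], dtype=np.float64)
    response = convolve(img, laplacian_mask, mode="reflect")
    return np.sqrt(np.pi / 2) / (6.0 * (w - 2) * (h - 2)) * np.sum(np.abs(response))

def normalized_indices(parallel_img, orthogonal_img):
    """Normalized sharpness and noise indices as defined in the text (parallel / orthogonal)."""
    sharp_idx = gradient_sharpness(parallel_img) / gradient_sharpness(orthogonal_img)
    noise_idx = immerkaer_noise(parallel_img) / immerkaer_noise(orthogonal_img)
    return sharp_idx, noise_idx
```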

Table 1. Image Sharpness and Noise Measurement

From Table 1, it is observed that the sharpness of the grayscale and green channel images is significantly better for the system with parallel-polarization control, while the noise is reduced. The sharpness and noise performance for the red and blue channels also improved, although not significantly. However, the claim of increased structural detail cannot be validated with these metrics alone and is demonstrated in the subsequent sections.

Figure 5 illustrates the effectiveness of parallel-polarization control for detecting biomarkers related to retinal diseases. Figure 5(a) shows an image of the retina of a DR patient taken with parallel-polarization control, and Fig. 5(b) shows the image of the same patient taken with orthogonal-polarization control. Visual inspection shows that the image taken with parallel-polarization control contains more green channel information and thus provides more detailed information about the retina. To further investigate this, one rectangular region in the peripheral retina (marked by the blue rectangles), one rectangular region in the central retina (marked by the yellow rectangles), and a region along a straight line in the fovea (marked by the red lines) were selected from both images. Figure 5(d1) and Fig. 5(e1) show the enlarged images of the regions marked in blue, and Fig. 5(f1) and Fig. 5(g1) show the enlarged images of the regions marked in yellow. As the selected regions in the two images differ in brightness and tone, the corresponding hot colormaps of those regions are shown in Fig. 5(d2)-(g2); the colormaps suppress those factors and primarily show the difference in contrast. The comparative line profiles of the regions along the red lines are illustrated in Fig. 5(c).
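
For readers who wish to reproduce this kind of region comparison, the sketch below crops the same region from both images, renders each crop with the 'hot' colormap, and plots a comparative line profile. It is a generic reconstruction of the visualization, not the authors' analysis script; the coordinates in the usage line are placeholders.

```python
import numpy as np
import matplotlib.pyplot as plt

def compare_region(img_parallel, img_orthogonal, row_slice, col_slice, profile_row):
    """Show hot-colormap crops of both images plus a comparative line profile."""
    crops = [img_parallel[row_slice, col_slice], img_orthogonal[row_slice, col_slice]]
    labels = ["parallel", "orthogonal"]
    fig, axes = plt.subplots(1, 3, figsize=(10, 3))
    for ax, crop, label in zip(axes[:2], crops, labels):
        ax.imshow(crop, cmap="hot")   # per-crop autoscaling suppresses brightness/tone offsets
        ax.set_title(label)
        ax.axis("off")
    for crop, label in zip(crops, labels):
        axes[2].plot(crop[profile_row, :], label=label)   # comparative line profile
    axes[2].legend()
    axes[2].set_xlabel("pixel")
    axes[2].set_ylabel("intensity")
    plt.tight_layout()
    plt.show()

# Example usage with hypothetical coordinates on the green channels of an image pair:
# compare_region(green_parallel, green_orthogonal, slice(200, 260), slice(300, 420), 30)
```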

Fig. 5. (a) Image of the retina of a DR patient taken with parallel-polarization control. (b) Image of the same retina taken with orthogonal-polarization control. (c) The comparative line profiles of the regions along the red line in Fig. 5(a) and Fig. 5(b). (d1) Enlarged image of the region marked with blue rectangle in Fig. 5(a). (d2) Hot colormap of the region marked with blue rectangle in Fig. 5(a). (e1) Enlarged image of the region marked with blue rectangle in Fig. 5(b). (e2) Hot colormap of the region marked with blue rectangle in Fig. 5(b). (f1) Enlarged image of the region marked with yellow rectangle in Fig. 5(a). (f2) Hot colormap of the region marked with yellow rectangle in Fig. 5(a). (g1) Enlarged image of the region marked with yellow rectangle in Fig. 5(b). (g2) Hot colormap of the region marked with yellow rectangle in Fig. 5(b).

As is visible from the enlarged images in Fig. 5(d1) and Fig. 5(e1) and the hot colormaps in Fig. 5(d2) and Fig. 5(e2), the image taken with parallel-polarization control shows significantly more exudates (marked with black arrows) with better image contrast. Similar results can be seen in Fig. 5(f2) and Fig. 5(g2), where a larger number of microaneurysms (marked with green arrows) can be detected in the image taken with parallel-polarization control. The comparative line profiles of the foveal hard exudates (Fig. 5(c)) show that the parallel-polarization image depicts four distinct peaks corresponding to four exudates, whereas the peaks are merged in the orthogonal-polarization image. This further demonstrates the contrast enhancement achieved by the proposed method.

HDR imaging of the retina, coupled with orthogonal-polarization control, has been demonstrated to enhance the image contrast and information content [10]. To explore HDR imaging with the proposed parallel-polarization control, we followed the HDR fundus imaging procedure (outlined in Section 2.3) with both parallel- and orthogonal-polarization controls. Figure 6(a) and Fig. 6(b) illustrate two HDR images, taken with parallel- and orthogonal-polarization controls respectively, of a patient diagnosed with ocular histoplasmosis syndrome.

Fig. 6. (a) HDR fundus image of a patient diagnosed with ocular histoplasmosis syndrome taken with parallel-polarization control. (b) HDR fundus image of the same eye taken with orthogonal-polarization control. (c) Hot colormaps of grayscale (c1), green channel (c2), and red channel (c3) of the region marked in yellow in Fig. 6(a). (d) Hot colormaps of grayscale (d1), green channel (d2), and red channel (d3) of the region marked in yellow in Fig. 6(b).

From visual inspection, it is apparent that the parallel-polarization control preserves the histo spots (white dots) which are scattered throughout the retina. With orthogonal-polarization control, only a few of those histo spots can barely be identified. Furthermore, we selected a region (marked with yellow squares) containing a few histo spots not far from the optic nerve head in both images. Figure 6(c) and Fig. 6(d) show the hot colormaps of the grayscale, green channel, and red channel images of the selected region taken with parallel- and orthogonal-polarization controls to illustrate the contrast enhancement. The blue channel is excluded from this analysis since its information content is poor in both images for the reasons mentioned earlier. The histo spots are resolved significantly more clearly (marked with green circles) in the images with parallel-polarization control. The green channel of the parallel-polarization image shows the best contrast, as illustrated by the hot colormap. The grayscale and red channel of the parallel-polarization image also show better contrast than their counterparts.

Figure 7 illustrates comparative images taken with a commercial portable fundus camera (Pictor Plus, Volk, Mentor, OH, USA), the orthogonal-polarization system, and our proposed parallel-polarization system. Three regions of the retina were selected for observation (yellow, black, and blue squares in Fig. 7). Figure 7(d1) shows clear nerve fiber structures of the retina, which are not visible in Fig. 7(d2) and Fig. 7(d3). Similarly, the microaneurysms present in the regions highlighted with the black and blue squares have visibly better contrast in the parallel-polarization image (Fig. 7(e1) and Fig. 7(f1)). The orthogonal-polarization image also shows the aneurysms, but with compromised contrast (Fig. 7(e2) and Fig. 7(f2)). With the commercial device, the aneurysms are barely recognizable (Fig. 7(e3) and Fig. 7(f3)). This comparison demonstrates that preserving polarization maintaining photons conserves critical retinal information that is otherwise lost in orthogonal-polarization based commercial portable fundus cameras.

Fig. 7. (a) Image of the retina of a DR patient taken with parallel-polarization control. (b) Image of the same eye taken with orthogonal-polarization control. (c) Image of the same eye taken with Volk Pictor Plus. (d1) Enlarged image of the region marked with yellow rectangle in Fig. 7(a). (d2) Enlarged image of the region marked with yellow rectangle in Fig. 7(b). (d3) Enlarged image of the region marked with yellow rectangle in Fig. 7(c). (e1) Enlarged image of the region marked with black rectangle in Fig. 7(a). (e2) Enlarged image of the region marked with black rectangle in Fig. 7(b). (e3) Enlarged image of the region marked with black rectangle in Fig. 7(c). (f1) Enlarged image of the region marked with blue rectangle in Fig. 7(a). (f2) Enlarged image of the region marked with blue rectangle in Fig. 7(b). (f3) Enlarged image of the region marked with blue rectangle in Fig. 7(c).

4. Discussion

In summary, we have demonstrated a fundus camera that rejects the polarization maintaining back-reflected light from the ophthalmic lens with orthogonal-polarization control yet accepts the polarization maintaining light reflected from the retina with parallel-polarization control. With the use of miniaturized indirect illumination, a wide FOV of 101° eye-angle (67° visual-angle) was achieved, in addition to superior image contrast compared to conventional portable fundus systems with orthogonal-polarization control. Adopting the HDR imaging protocol can further increase the image contrast of the retina and give clinicians a comprehensive tool to detect and diagnose retinal diseases.

The retina is a layered medium of different cell types superimposed on each other, with different scattering coefficients and refractive indices in the individual layers [27]. When polarized light passes through the retina, the retinal layers begin to depolarize the light. However, the depolarization effect is slight for the inner retina anterior to the RPE-BM complex. The light reflected and backscattered from the RPE and choroid is almost completely depolarized [28]. This applies to light of all wavelengths. Yet, for shorter wavelengths, namely green and blue light, mostly the directly reflected and singly scattered photons return to the imaging system. The multiply scattered photons that pass within the retina and reach the deeper layers are strongly absorbed by the melanin and choroidal pigments [13]. As the depolarized light at shorter wavelengths is efficiently absorbed inside the retina, a significant portion of the short wavelength light returning to the imaging system at least partially maintains the original polarization state. Since the absorption is weaker at longer wavelengths, the fraction of red light returning to the imaging path that maintains its input polarization is lower [13].

The light reflected or scattered back from the retina is a superposition of several subretinal components [12,29]. Yet, multiple studies have found the existence of a particular layer that reflects light almost specularly. Weale [14] hypothesized Bruch's membrane to be the reflecting layer, whereas Millodot and O'Leary [30] proposed it to be somewhere near the OS of the photoreceptors. Van de Kraats et al. [12] pinpointed the OS layer of the photoreceptors as the specularly reflecting layer, with the layers anterior to it responsible for the diffusely reflected light, which also partially maintains the polarization. The layers posterior to the OS layer, primarily the RPE-choroid complex and the sclera, are responsible for the depolarized light [12]. As retinal disease biomarkers such as microaneurysms, cotton wool spots, and exudates reside in the retinal layers anterior to the OS layer [31–33], preserving the polarization maintaining light from the retina would significantly improve the detection of these biomarkers. By introducing the quarter waveplate between the ophthalmic lens and the eye lens, the polarization maintaining photons from the OS layer are preserved in addition to the depolarized light from the RPE-choroid complex and the sclera, thus covering all major reflectors inside the eye.

Polarized light has been used to selectively enhance superficial layer imaging of turbid tissues [34]. Polarization-preserving surface reflection from the air-tissue interface is one of the artifacts encountered in this technique. Methods such as placing an optically flat plate with index-matching fluid on the tissue surface and polarization gating with elliptically polarized light have been used to reduce this surface reflection [34–36]. However, for retinal imaging, the surface reflection from the inner limiting membrane (ILM)-vitreous humor interface is insignificant compared to the reflection from the OS, choroid, and sclera [20,21]. This can be attributed to the small refractive index difference between the ILM and the vitreous humor [37] and the transparent nature of the inner retina. Nevertheless, a significant amount of surface reflection can enter the imaging system from the air-cornea interface and the lens-aqueous humor interface. Para-axial illumination combined with a proper buffer between the illumination and imaging windows helps prevent stray light from these surfaces from entering the system. An interesting property of circularly polarized light is that it can penetrate significantly deeper into scattering tissue without losing its polarization than linearly polarized light when the scattering particles are larger than the wavelength [23], which is the case for retinal imaging. As the illumination light becomes circularly polarized after passing through the waveplate in our system, the depolarization effect in the retina should be even less significant than for linearly polarized light. Moreover, given the variable orientations of the neural axons and dendrites in the retina, the depolarization effect can vary locally because of the angle between the polarization direction of a linearly polarized illumination and local structures such as retinal nerve fibers. With circularly polarized illumination and detection, the effect of the orientational dependence of neural axons and dendrites can be minimized [38], thus increasing the image homogeneity of polarization-controlled fundus photography.

Orthogonal-polarization imaging is a popular technique for tissue imaging, as it facilitates imaging of deeper structures by partially rejecting reflected light from the superficial layers. However, for retinal disease detection, both the superficial retinal layer information and the information from deeper layers are important. We have demonstrated that the proposed system significantly enhances the structural information of the retina (Fig. 5-Fig. 7). Due to the highly scattering nature of the choroid and the sclera, the light reaching these layers is depolarized, and it can be assumed that the intensity of light in each polarization state is equal. Therefore, regardless of the orientation of the polarizer in front of the camera, the choroidal information is similar in nature, meaning that the structural information of the choroid imaged by the parallel-polarization and orthogonal-polarization systems is comparable. With parallel-polarization control, the enhanced retinal layer information might overlay the choroidal information in the color image, making it appear less clear. However, the choroidal information can be recovered simply by separating the color channels of the image, as the longer wavelengths predominantly carry the choroidal information.

Trans-pars-planar and trans-palpebral illumination in fundus imaging can provide wide FOV retinal images in a cost-effective manner [39–43]. The oblique illumination also means that the image is primarily constructed by light scattered from the retina, as the reflected light does not reach the imaging path. Therefore, this oblique illumination path could provide valuable complementary information if combined with the proposed system. However, the illumination light must pass through complex skin and scleral tissues before it reaches the retina, so the imaging efficiency is directly affected by the illumination location, light wavelength, and subject pigmentation [44]. Further research on the polarization properties of these oblique illumination techniques is required before they can be used for polarization-controlled fundus photography.

5. Conclusion

We have demonstrated a portable, non-mydriatic, widefield fundus camera that simultaneously employs orthogonal-polarization control to reject back-reflected light from the ophthalmic lens and parallel-polarization control to preserve polarization maintaining light reflected from the retina. A FOV of 101° eye-angle (67° visual angle) is achieved with the help of miniaturized indirect illumination. Comparative imaging with a conventional orthogonal-polarization setup revealed the superior image quality of the proposed design. This portable, high-contrast fundus camera can provide clinicians in rural and underserved areas an opportunity to acquire high-quality, wide-FOV fundus images in a cost-effective manner, which will be essential for telemedicine programs in ophthalmology.

Funding

Richard and Loan Hill Department of Biomedical Engineering, University of Illinois Chicago; Research to Prevent Blindness; National Eye Institute (P30 EY001792, R01 EY023522, R01 EY029673, R01 EY030101, R01 EY030842, R44 EY028786).

Acknowledgment

The authors thank Dhara Shah for recruiting the patients from the UIC eye clinic.

Disclosures

The authors have patent applications related to wide field and high contrast fundus photography.

Data availability

Data may be obtained from the authors upon reasonable request.

References

1. K. Allison, D. Patel, and O. Alabi, “Epidemiology of Glaucoma: The Past, Present, and Predictions for the Future,” Cureus 12, e11686 (2020). [CrossRef]  

2. Z. L. Teo, Y.-C. Tham, M. Yu, M. L. Chee, T. H. Rim, N. Cheung, M. M. Bikbov, Y. X. Wang, Y. Tang, Y. Lu, I. Y. Wong, D. S. W. Ting, G. S. W. Tan, J. B. Jonas, C. Sabanayagam, T. Y. Wong, and C.-Y. Cheng, “Global Prevalence of Diabetic Retinopathy and Projection of Burden through 2045: Systematic Review and Meta-analysis,” Ophthalmology 128(11), 1580–1591 (2021). [CrossRef]  

3. W. L. Wong, X. Su, X. Li, C. M. Cheung, R. Klein, C. Y. Cheng, and T. Y. Wong, “Global prevalence of age-related macular degeneration and disease burden projection for 2020 and 2040: a systematic review and meta-analysis,” The Lancet. Global health 2(2), e106 (2014). [CrossRef]  

4. N. Quinn, L. Csincsik, E. Flynn, C. A. Curcio, S. Kiss, S. R. Sadda, R. Hogg, T. Peto, and I. Lengyel, “The clinical relevance of visualising the peripheral retina,” Prog. Retinal Eye Res. 68, 83–109 (2019). [CrossRef]  

5. A. Nagiel, R. A. Lalane, S. R. Sadda, and S. D. Schwartz, “ULTRA-WIDEFIELD FUNDUS IMAGING: A Review of Clinical Applications and Future Trends,” Retina 36(4), 660–678 (2016). [CrossRef]  

6. Q. Zhang, K. A. Rezaei, S. S. Saraf, Z. Chu, F. Wang, and R. K. Wang, “Ultra-wide optical coherence tomography angiography in diabetic retinopathy,” Quant. Imaging Med. Surg. 8(8), 743–753 (2018). [CrossRef]  

7. P. Ackland, S. Resnikoff, and R. Bourne, “World blindness and visual impairment: despite many successes, the problem is growing,” Community eye health 30, 71–73 (2017).

8. N. Panwar, P. Huang, J. Lee, P. A. Keane, T. S. Chuan, A. Richhariya, S. Teoh, T. H. Lim, and R. Agrawal, “Fundus Photography in the 21st Century–A Review of Recent Technological Advances and Their Implications for Worldwide Healthcare,” Telemed J E Health 22(3), 198–208 (2016). [CrossRef]  

9. D. Toslak, C. Liu, M. N. Alam, and X. Yao, “Near-infrared light-guided miniaturized indirect ophthalmoscopy for nonmydriatic wide-field fundus photography,” Opt. Lett. 43(11), 2551–2554 (2018). [CrossRef]  

10. A. Rossi, M. Rahimi, D. Le, T. Son, M. J. Heiferman, R. V. P. Chan, and X. Yao, “Portable widefield fundus camera with high dynamic range imaging capability,” Biomed. Opt. Express 14(2), 906–917 (2023). [CrossRef]  

11. A. Rossi, T. Son, M. Rahimi, and X. Yao, Miniaturized indirect illumination with light polarization and power controls for high dynamic range fundus photography (SPIE, 2023).

12. J. A. N. van de Kraats, T. T. J. M. Berendschot, and D. van Norren, “The Pathways of Light Measured in Fundus Reflectometry,” Vision Res. 36(15), 2229–2247 (1996). [CrossRef]  

13. G. J. Van Blokland and D. Van Norren, “Intensity and polarization of light scattered at small angles from the human fovea,” Vision Res. 26(3), 485–494 (1986). [CrossRef]  

14. R. A. Weale, “Polarized light and the human fundus oculi,” The Journal of physiology 186(1), 175–186 (1966). [CrossRef]  

15. T. Alterini, F. Diaz-Douton, F. J. Burgos-Fernandez, L. Gonzalez, C. Mateo, and M. Vilaseca, “Fast visible and extended near-infrared multispectral fundus camera,” J. Biomed. Opt. 24(09), 1–7 (2019). [CrossRef]  

16. X. Yao, D. Toslak, T. Son, and J. Ma, “Understanding the relationship between visual-angle and eye-angle for reliable determination of the field-of-view in ultra-wide field fundus photography,” Biomed. Opt. Express 12(10), 6651–6659 (2021). [CrossRef]  

17. E. DeHoog and J. Schwiegerling, “Optimal parameters for retinal illumination and imaging in fundus cameras,” Appl. Opt. 47(36), 6769–6777 (2008). [CrossRef]  

18. International Organization for Standardization, “Ophthalmic instruments—fundamental requirements and test methods—part 2: light hazard protection,” (ISO, Geneva, Switzerland, 2007).

19. X. Song, Z. Chen, Q. He, J. Wei, and L. Song, Optimization method to eliminate the influence of the conical acoustic lens on the transmission of laser beam using ZEMAX (SPIE, 2020).

20. J. van de Kraats and D. van Norren, “Directional and nondirectional spectral reflection from the human fovea,” J. Biomed. Opt. 13(2), 024010 (2008). [CrossRef]  

21. T. T. Berendschot, J. van de Kraats, M. J. Kanis, and D. van Norren, “Directional model analysis of the spectral reflection from the fovea and para-fovea,” J. Biomed. Opt. 15(6), 065005 (2010). [CrossRef]  

22. F. S. Crawford, “Waves, Berkeley physics course,” (1968).

23. S. P. Morgan and M. E. Ridgway, “Polarization properties of light backscattered from a two layer scattering medium,” Opt. Express 7(12), 395–402 (2000). [CrossRef]  

24. T. Birdal, “Sharpness Estimation From Image Gradients,” Mathworks, 2023, https://www.mathworks.com/matlabcentral/fileexchange/32397-sharpness-estimation-from-image-gradients.

25. T. Birdal, “Fast Noise Estimation in Images,” Mathworks, 2023, https://www.mathworks.com/matlabcentral/fileexchange/36941-fast-noise-estimation-in-images.

26. J. Immerkaer, “Fast Noise Variance Estimation,” Computer Vision and Image Understanding 64(2), 300–302 (1996). [CrossRef]  

27. J. D. Rogers, “Measurement of optical scattering properties of retinal layers,” Invest. Ophthalmol. Visual Sci. 60, 171 (2019).

28. M. Pircher, E. Götzinger, R. Leitgeb, H. Sattmann, O. Findl, and C. K. Hitzenberger, “Imaging of polarization properties of human retina in vivo with phase resolved transversal PS-OCT,” Opt. Express 12(24), 5940–5951 (2004). [CrossRef]  

29. F. C. Delori and K. P. Pflibsen, “Spectral reflectance of the human ocular fundus,” Appl. Opt. 28(6), 1061–1077 (1989). [CrossRef]  

30. M. Millodot and D. O’Leary, “The discrepancy between retinoscopic and subjective measurements: Effect of age,” American journal of optometry and physiological optics 55(5), 309–370 (1978). [CrossRef]  

31. S. Davoudi, E. Papavasileiou, R. Roohipoor, H. Cho, S. Kudrimoti, H. Hancock, S. Hoadley, C. Andreoli, D. Husain, M. James, A. Penman, C. J. Chen, and L. Sobrin, “Optical coherence tomography characteristics of macular edema and hard exudates and their association with lipid serum levels in type 2 diabetes,” Retina 36(9), 1622–1629 (2016). [CrossRef]  

32. A. Couturier, V. Mane, S. Bonnin, A. Erginay, P. Massin, A. Gaudric, and R. Tadayoni, “Capillary plexus anomalies in diabetic retinopathy on optical coherence tomography angiography,” Retina 35(11), 2384–2391 (2015). [CrossRef]  

33. A. Ioannides, N. D. Georgakarakos, I. Elaroud, and P. Andreou, “Isolated cotton-wool spots of unknown etiology: management and sequential spectral domain optical coherence tomography documentation,” Clinical ophthalmology (Auckland, N.Z.) 5, 1431–1433 (2011). [CrossRef]  

34. S. L. Jacques, J. R. Roman, and K. Lee, “Imaging superficial tissues with polarized light,” Lasers Surg. Med. 26(2), 119–129 (2000). [CrossRef]  

35. S. P. Morgan and I. M. Stockford, “Surface-reflection elimination in polarization imaging of superficial tissue,” Opt. Lett. 28(2), 114–116 (2003). [CrossRef]  

36. S. Sridhar and A. Da Silva, “Enhanced contrast and depth resolution in polarization imaging using elliptically polarized light,” J. Biomed. Opt 21(7), 071107 (2016). [CrossRef]  

37. Á. B. Peña, S. Ketelhut, P. Heiduschka, G. Nettels-Hackert, J. Schnekenburger, and B. Kemper, “Refractive index properties of the retina accessed by multi-wavelength digital holographic microscopy,” in Three-Dimensional and Multidimensional Microscopy: Image Acquisition and Processing XXVI (SPIE2019), pp. 44–49.

38. R. W. Lu, Q. X. Zhang, and X. C. Yao, “Circular polarization intrinsic optical signal recording of stimulus-evoked neural activity,” Opt. Lett. 36(10), 1866–1868 (2011). [CrossRef]  

39. D. Toslak, T. Son, M. K. Erol, H. Kim, T.-H. Kim, R. V. P. Chan, and X. Yao, “Portable ultra-widefield fundus camera for multispectral imaging of the retina and choroid,” Biomed. Opt. Express 11(11), 6281–6292 (2020). [CrossRef]  

40. T. Son, J. Ma, D. Toslak, A. Rossi, H. Kim, R. V. P. Chan, and X. Yao, “Light color efficiency-balanced trans-palpebral illumination for widefield fundus photography of the retina and choroid,” Sci. Rep. 12(1), 13850 (2022). [CrossRef]  

41. B. Wang, D. Toslak, M. N. Alam, R. V. P. Chan, and X. Yao, “Contact-free trans-pars-planar illumination enables snapshot fundus camera for nonmydriatic wide field photography,” Sci. Rep. 8(1), 8768 (2018). [CrossRef]  

42. D. Toslak, F. Chau, M. Erol, C. Liu, R. Chan, T. Son, and X. Yao, “Trans-pars-planar illumination enables a 200° ultra-wide field pediatric fundus camera for easy examination of the retina,” Biomed. Opt. Express 11(1), 68 (2020). [CrossRef]  

43. M. Rahimi, A. Rossi, T. Son, and X. Yao, “High dynamic range fundus camera with trans-palpebral illumination,” Invest. Ophthalmol. Vis. Sci. 64, 5016 (2023).

44. X. Yao, M. Rahimi, A. Rossi, T. Son, D. Toslak, D. Le, M. Abtahi, R. Chan, and M. Heiferman, “Evaluating spatial dependency of spectral efficiency in trans-palpebral illumination for widefield fundus photography,” Biomed. Opt. Express 14(11), 5629–5641 (2023). [CrossRef]  
