Optica Publishing Group

Development of a new scleral contact lens with encapsulated photodetectors for eye tracking

Open Access

Abstract

Most eye trackers nowadays are video-based, which allows for a relatively simple and non-invasive approach but also imposes several constraints in terms of necessary computing power and conditions of use (e.g., lighting, spectacles, etc.). We introduce a new eye tracker using a scleral lens equipped with photodiodes and an eyewear with active illumination. The direction of gaze is obtained from the weighted average of the photocurrents (centroid) and communicated through an optical link. After discussing the optimum photodiode configuration (number, layout) and the associated lighting (collimated, Lambertian), we present prototypes demonstrating the potential for high performance (0.11° accuracy when placed on an artificial eye) and wireless optical communication.

© 2020 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

The history of eye trackers can be traced back to the end of the 19th century [1,2]. Since that period, a relatively wide range of eye tracking methods has been proposed. Electro-oculography, for instance, is based on recording the small potential variations on the skin around the eyes caused by the rotation of the eyes themselves (the eye acts as a dipole with a permanent voltage between the cornea and ocular fundus) [3]; it is a technically simple method but one that suffers from poor accuracy (2° [4]). Other approaches include the use of lenses with mirrors [5] or search coils, a method first presented by Robinson in 1963 [6] and further studied by Collewijn et al. in 1974 [7]. The scleral search coil eye tracker estimates gaze direction from the voltage induced in the coil by an external magnetic field. Traditionally, this approach offers high accuracy (0.1°), but a wire from the scleral lens to an external measurement unit is required, and the generation of the magnetic field remains an issue, leading to bulky systems or complex processing [4,8]. More recently, the need to include eye trackers in virtual reality headsets has led to new designs, tracking the limbus [9], using sensing LEDs [10], or using a PSD to detect the spatial position of a laser beam reflected by the cornea [11].

Nowadays, however, the very large majority of commercial eye trackers are based on imaging. Various approaches exist, such as the dual Purkinje image eye tracker, which achieves an accuracy of 0.2° but requires a bite bar to strictly immobilize the head [4]. The most common method is video-based [12]. In this approach, one or multiple cameras take images of the eyes, which are often illuminated by infrared light sources. Gaze direction is deduced from the analysis of pupil and corneal reflection positions [13]. This approach owes its success to its non-invasive nature and to constant progress in sensors, computing power and image processing. Such eye trackers are now common tools in numerous fields, from cognitive science and human-computer interfaces to marketing, gaming, neurological diagnosis [14] and visual rehabilitation [15].

Eye trackers based on image processing suffer, however, from several limitations [16,17]. First of all, various imaging conditions (dark iris, lighting conditions, spectacles, etc.) can significantly decrease performance. Since gaze detection requires determining pupil and corneal reflection positions from images, poor raw data quality inevitably leads to poor feature extraction. Secondly, the necessary use of powerful computers and fast cameras makes them difficult to use in demanding environments (e.g., in a surgical theatre where the surgeon wears magnifiers, in constrained environments, etc.). In addition, recording and analyzing megabytes of data to extract only the direction of gaze may be considered suboptimal. Moreover, despite good technological progress (e.g., the Tobii glasses use several IR light sources and two cameras per eye), the error of head-mounted trackers remains above 0.5° (for instance 0.82° for the Pupil Labs glasses [18]). Eye tracking, even video-based, is still an open research topic. Complex algorithms are being developed to correct the errors introduced by the simplest approaches, using for instance convolutional neural networks [19]. Despite this, the reported accuracy remains between 0.5° and 1°, especially for mobile eye trackers [13,16,17].

In this context, the aim of this study was to revisit the scleral lens approach. Such an eye tracker could provide better accuracy [4] and would bring the measurement back to the heart of the problem (using sensors whose response varies directly with eye movements). The purpose was also to design a simple scleral lens eye tracker that could achieve a 0.2° accuracy over a 30° field of view (which corresponds to the angular space in which the user will move his eyes rather than his head) and that could be easily integrated into a head-mounted device (e.g., a VR or AR headset). The basic principle of this new eye tracking device is the following (Fig. 1). Photodetectors (PTDs) are encapsulated into the scleral lens. These photodetectors are illuminated by infrared LEDs placed in front of the eye, for instance on a glasses frame. Depending on the direction of gaze, the amount of light falling on each PTD varies. As such, the direction of gaze can be calculated from the centroid of the PTD currents. A scleral lens prototype fitted with photodiodes was fabricated and mounted on an eye mock-up to validate the method. The idea was first investigated in [20], which however did not explore the different possibilities offered by varying PTD position, number and illumination. In this study, we investigated which PTD and lighting configurations are necessary to achieve the desired performance, as presented in Section 2. In Section 3, we present experimental results obtained on an artificial eye and compare them to the results of Section 2. The second part of Section 3 presents a method and experimental results for transmitting these centroid calculations to a processor on the glasses frame using wireless infrared communications. These results, together with practical considerations on the scleral lens (need for an ASIC, antennas, batteries, clinical validation, etc.), are discussed in Section 4. Section 5 concludes the paper.


Fig. 1. Proposed system: IR sources mounted on the eyewear frame illuminate photodetectors encapsulated into a scleral lens. (a) Eye tracking data is computed on the lens thanks to an ASIC and wirelessly transmitted to the eyewear with an optic link (IrDA type). (b) View from the top, (c) from side.


2. Scleral lens design

2.1 Photodetectors placement

The available space on the scleral lens is limited; therefore the number and placement of the PTDs must be carefully chosen to minimize the occupied area while maximizing gaze-tracking accuracy. In our system, the PTDs are located on a conical flexible substrate encapsulated into a standard scleral lens (16 mm in diameter). The substrate carrying the PTDs has an inner diameter of 6 mm to avoid obstructing vision and an external diameter of 12 mm. The PTDs are located on an 8 mm diameter circle.

Previous works on quadrant detectors (QPs) [21,22] show the influence of illumination and sensor position on accuracy and response dynamics. In order to find the optimum number and position of photodetectors on the scleral lens, we modelled the eye as a sphere of radius r = 13.1 mm and chose its centre of rotation as the origin of an orthonormal coordinate system. In this coordinate system, the gaze direction can be described by an angle θ around the z-axis and an angle φ around the x-axis (Fig. 1).

The spherical coordinates of the photodetectors are thus Θ + θ around the z-axis and Φ + φ around the x-axis, where Θ and Φ are the initial angular coordinates when θ = φ = 0°. All PTDs are placed at the same distance from the rotation center, so for a given photodiode the generated photocurrent is directly proportional to the received light irradiance and given by Eq. (1):

$${I_i} = {E_i}S{R_\lambda }$$
where Ii denotes the photocurrent of the ith PTD, Ei (in W/m2) the received irradiance, and S and Rλ are respectively the photodiode surface area and sensitivity at a given wavelength λ.

A centroid computation requires at least 3 PTDs to distinguish between the two directions θ and φ. On the other hand, due to the limited space, no more than 8 PTDs can be placed on the lens (assuming off-the-shelf components of 1 mm2, which corresponds to the smallest off-the-shelf PTDs). Considering these elements, four possible configurations were selected; they are presented in Fig. 2.


Fig. 2. PTDs number and position on the scleral lens. The configurations will be referred to as follows: (a) flexible substrate dimension, (b) 3 PTDs, (c) 4 PTDs square, (d) 4 PTDs diamond and (e) 8 PTDs.


2.2 Scleral lens illumination

The photocurrents delivered by the PTDs depend on the received irradiance, Eq. (1), and hence on the light emission profile of the infrared source. In this study, we considered two usual models for light sources: collimated and Lambertian.

For a collimated source (Fig. 3(a)), light rays are parallel and the received irradiance Ei varies only with PTD#i inclination, following Eq. (2):

$${E_i} = {E_0}\cos (\Theta + \theta )$$
where E0 is the maximum irradiance received at the cornea apex when Θ + θ = 0°.


Fig. 3. Top view of the system for the 2 sources type. (a) Collimated source and (b) Lambertian source.


For a Lambertian source (Fig. 3(b)), the radiance is angularly uniform and the radiant intensity I(α) in W/sr follows Lambert's cosine law, Eq. (3):

$$I(\alpha ) = {I_0}\cos (\alpha )$$
where I0 is the radiant intensity in the direction normal to the source and α the angle with respect to that normal. Moreover, the irradiance Ei received by PTD#i decreases with the distance D to the source according to the inverse-square law, Eq. (4), and varies with the inclination of PTD#i in the same way as in Eq. (2).
$${E_i} = I(\alpha )/{D^2}$$
Combining Eqs. (2), (3) and (4), irradiance Ei received by PTD#i illuminated by a Lambertian source is given by Eq. (5). D, I0 and α can be deduced from geometrical configuration and are given Eqs. (6), (7) and (8) respectively.
$${E_i} = \frac{{{I_0}}}{{{D^2}}}\cos (\alpha )\cos (\alpha + \Theta + \theta )$$
$${D^2} = {[d + r(1 - \cos (\Theta + \theta ))]^2} + {[r\sin (\Theta + \theta )]^2}$$
$${I_0} = {E_0}{d^2}$$
$$\alpha = {\tan ^{ - 1}}\left( {\frac{{r\sin (\Theta + \theta )}}{{d + r(1 - \cos (\Theta + \theta ))}}} \right)$$
where d is the distance between the cornea apex and the source, r is the eye radius and E0 is the maximum irradiance (at the cornea apex). Due to the symmetry of the problem, similar expressions can be obtained for the angle φ (not presented here for the sake of clarity). For the same reason, we only consider the angle θ in what follows.
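As a sketch, the two irradiance models above translate into a few lines of Python (a minimal sketch of our own; the function names and the default values d = 13 mm and r = 13.1 mm are taken from the text, E0 = 1 is an arbitrary normalization):

```python
import math

def irradiance_collimated(E0, Theta, theta):
    """Eq. (2): irradiance on PTD#i under a collimated source (angles in deg)."""
    return E0 * math.cos(math.radians(Theta + theta))

def irradiance_lambertian(E0, Theta, theta, d=13e-3, r=13.1e-3):
    """Eqs. (5)-(8): irradiance on PTD#i under a Lambertian source.
    d = cornea-apex-to-source distance, r = eye radius (metres)."""
    t = math.radians(Theta + theta)
    ax = d + r * (1.0 - math.cos(t))      # axial source-to-PTD distance
    lat = r * math.sin(t)                 # lateral offset of the PTD
    D2 = ax**2 + lat**2                   # Eq. (6)
    I0 = E0 * d**2                        # Eq. (7)
    alpha = math.atan2(lat, ax)           # Eq. (8)
    return (I0 / D2) * math.cos(alpha) * math.cos(alpha + t)   # Eq. (5)
```

Both models reduce to E0 at the cornea apex (Θ + θ = 0°), which provides a quick sanity check.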

2.3 Simulation results

The illumination models (cf. Eqs. (1) and (5) from the previous sections) allowed us to determine the photocurrent Ii for each photodiode. From these values, the gaze direction with respect to θ can be calculated using a centroid calculation, similar to what is achieved with quadrant detectors [20,21]. The associated analytical expressions for the 3 PTDs, 4 PTDs square, 4 PTDs diamond and 8 PTDs configurations are given respectively by Eqs. (9), (10), (11) and (12).

$${\theta _{\textrm{computed,3}}} = {K_3}\left[ {\frac{{{\Theta _2}{I_1} - {\Theta _3}({I_2} + {I_3})}}{{{I_1} + {I_2} + {I_3}}}} \right]$$
$${\theta _{\textrm{computed,4s}}} = {K_{4s}}\left[ {{\Theta _1}\frac{{{I_1} - {I_4} + {I_2} - {I_3}}}{{{I_1} + {I_2} + {I_3} + {I_4}}}} \right]$$
$${\theta _{\textrm{computed,4d}}} = {K_{4d}}\left[ {{\Theta _2}\frac{{{I_1} - {I_4}}}{{{I_1} + {I_4}}}} \right]$$
$${\theta _{\textrm{computed,8}}} = {K_8}\left[ {\frac{{{\Theta _1}({I_1} - {I_4} + {I_2} - {I_3}) + {\Theta _2}({I_8} - {I_5})}}{{{I_1} + {I_2} + {I_3} + {I_4} + {I_8} + {I_5}}}} \right]$$
where Ii is the photocurrent delivered by PTD#i, given by Eq. (1), and K is a slope factor taking the illumination profile into account. In these expressions, the term between brackets corresponds to the classic centroid formula for n points (here currents) in a 2D plane. However, since the PTDs are not placed in a common 2D plane, the photocurrent variation is not linearly proportional to position. We are then in a case similar to that of a quadrant detector onto which a non-uniform spot is projected [21,22]. The slope factor K is thus introduced to account for this illumination profile, so that the constant KΘ converts a barycentric value into an angle in degrees. K is calculated by taking two points (–15° and +15°) and requiring that θcomputed be a linear function equal to the actual θ. The difference between the computed θcomputed and the actual θ gives the accuracy.
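The centroid of Eq. (10) and the two-point determination of K can be sketched numerically, here for the 4 PTDs square layout under a collimated source and a pure θ rotation (a sketch of our own; the unit values E0 = S = Rλ = 1 are arbitrary normalizations):

```python
import math

THETA1 = 12.75  # initial angular coordinate of the PTDs (deg)

def currents_collimated(theta):
    """Eq. (1) + Eq. (2) for a pure-theta rotation: PTDs 1 and 2 sit at
    +THETA1, PTDs 3 and 4 at -THETA1 (E0 = S = R_lambda = 1)."""
    i_plus = math.cos(math.radians(THETA1 + theta))
    i_minus = math.cos(math.radians(-THETA1 + theta))
    return (i_plus, i_plus, i_minus, i_minus)

def raw_centroid_4s(currents):
    """Bracketed term of Eq. (10), before the slope factor K."""
    i1, i2, i3, i4 = currents
    return THETA1 * (i1 - i4 + i2 - i3) / (i1 + i2 + i3 + i4)

# Two-point calibration of K: force theta_computed = theta at -15 and +15 deg.
raw_m = raw_centroid_4s(currents_collimated(-15.0))
raw_p = raw_centroid_4s(currents_collimated(+15.0))
K = 30.0 / (raw_p - raw_m)

def theta_computed(theta):
    """Eq. (10): calibrated centroid estimate of the gaze angle (deg)."""
    return K * raw_centroid_4s(currents_collimated(theta))
```

With this model, the estimate is exact at 0° and ±15° by construction, and the residual error in between is on the order of 0.1°, the same magnitude as the collimated-source accuracy discussed below.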

As the centroid calculation is differential, the impact of the component φ on the photocurrent values is cancelled in the θcomputed calculation by the subtraction. Thus, for an eye orientation with both an angle θ and an angle φ, the computed direction θcomputed depends only on the angle θ. The study can therefore be carried out considering pure θ orientations only.

To simulate the accuracy of our prototype in each of the given configurations, the photocurrents Ii (Eqs. (1) and (5)) and the resulting gaze direction (Eqs. (9) to (12)) were computed for several eye orientations (θ from –15° to +15° with a step of 0.5°). The error between the computed direction and the actual eye orientation gives the accuracy and is presented in Fig. 4.


Fig. 4. Simulation results for collimated source (solid line) and Lambertian source (dashed line) for (a) 3 PTDs, (b) 4 PTDs square, (c) 4 PTDs diamond and (d) 8 PTDs configuration. On the x-axis is the actual angular orientation θ of the eye. On the y-axis is the error between the computed angle θcomputed and the actual θ. This difference gives the accuracy of the calculation method.


According to Fig. 4, there is a significant difference in accuracy between the 3 PTDs configuration and the others. Its asymmetry leads to an offset in both the collimated and Lambertian cases. Moreover, with a Lambertian source the error reaches 2.32°, is not bounded and is not symmetrical with respect to 0°. For these reasons, the 3 PTDs configuration is not considered in the rest of the study.

According to the simulations, collimated lighting yields a maximum error of 0.13° for the 4 PTDs configurations. With a Lambertian source, the maximum error depends on the PTD configuration and is respectively 0.69°, 0.88° and 0.70° for configurations (b), (c) and (d) of Fig. 4, thus roughly 5 times larger than with a collimated source. For comparison, current camera-based eye trackers report a best-case accuracy of around 0.5° and a mean accuracy of 1° [16,17].

Due to the cosine dependence of the irradiance Ei, Eq. (2) and Eq. (5), the error fluctuates as a function of the actual angle θ. The PTDs do not lie in a plane but on a sphere, so their position relative to the source depends on the cosine of the eye's angular rotation. For a collimated source, combining Eq. (1), Eq. (2) and Eqs. (10), (11) or (12) leads to a computed angle θcomputed that depends on tan(θ), as explained in [20]. In the case of a Lambertian source, the received irradiance also depends on the inverse square of the distance to the source, Eq. (4). Because of the short distance between the scleral lens and the eyewear, this phenomenon has a significant impact on the irradiance value and explains the difference in fluctuation between the two light sources. The cosine dependency and the inverse-square law lead to the non-linearity shown in Fig. 4 and to an inherent fluctuating error. Because this error is predictable, a digital method based on a look-up table (LUT) can reduce it, as presented in [20]. Applied to the Lambertian case, this limits the error to 0.1°.
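The LUT correction can be illustrated with a short sketch. Here the deterministic model error is taken as the tan(θ) dependence of the collimated case, a simplification of our own (the actual LUT of [20] is built from the full illumination model):

```python
import bisect
import math

def model(theta):
    """Computed angle for the collimated case: proportional to tan(theta),
    two-point calibrated so that it is exact at +/-15 deg."""
    return 15.0 * math.tan(math.radians(theta)) / math.tan(math.radians(15.0))

GRID = [0.5 * i - 15.0 for i in range(61)]   # actual angles, 0.5 deg step
LUT = [model(g) for g in GRID]               # expected raw computed angles

def lut_correct(raw):
    """Map a raw computed angle back to the actual angle by linear
    interpolation between the two nearest LUT entries."""
    j = bisect.bisect_left(LUT, raw)
    if j <= 0:
        return GRID[0]
    if j >= len(LUT):
        return GRID[-1]
    frac = (raw - LUT[j - 1]) / (LUT[j] - LUT[j - 1])
    return GRID[j - 1] + 0.5 * frac
```

On this simplified model, the residual error after correction falls well below 0.1° over the ±15° range, the order of improvement reported in [20].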

Although the 8 PTDs layout brings redundancy and therefore robustness, this configuration occupies the largest surface. Moreover, the centroid computation of Eq. (12) requires more calculation steps than the other solutions. Based on the circuit design presented in [20], the additional circuitry necessary to perform the calculation of Eq. (12) instead of Eq. (10) leads to an approximately 50% increase in power consumption, a significant impact given the limited energy available on the lens. Since 8 PTDs do not provide a gain in accuracy, the 4 PTDs configurations are preferred.

In addition, using 4 PTDs rather than 8 allows using larger photodiodes, which provides further advantages. The active surface of a photodiode may correspond to less than half of the component itself, so choosing components that are too small may reduce the signal-to-noise ratio. Very small photodiodes are also difficult to position symmetrically on the circuit, which degrades the eye tracker's performance. Finally, between the square and diamond configurations, the square one offers the advantage of better accuracy under Lambertian illumination. Consequently, since perfectly collimated lighting is difficult to obtain in practice, we prefer the 4 PTDs square configuration to limit errors due to a pseudo-collimated source.

2.4 Robustness and calibration

Before first use, the system needs to be calibrated to take into account the initial positioning of the lens and eyewear and the associated errors. These errors fall into two categories: offset and gain. The first is mainly due to scleral lens misplacement and to the difference between the optical axis and the visual axis [16]. Indeed, the optical axis is defined by the center of curvature of the cornea and the pupil center, while the visual axis is defined by the center of corneal curvature and the fovea. Since the system measures the optical axis while the axis of interest is the visual one, a calibration is required to cancel the difference. One calibration point at 0° is enough to record and cancel this error for all computed angles. A gain error is an incorrect adjustment of the slope factor K, Eqs. (10) to (12), to the actual system and illumination. This can be corrected separately for θ and φ with two calibration points each, e.g. –15° and +15°. A simple slope calculation then gives a corrective gain adapting the value of K to the actual situation. In summary, the calibration could be achieved with a five-point procedure similar to the n-point procedures used by video-based eye trackers. First, the user is asked to look at a marker placed in front of him to record the offset. Then the user is asked to look at two known angles (e.g. –15° and +15°) in the horizontal plane, and two others in the vertical plane. The measurements are recorded and the slope calculation gives the adjusted K for each direction. Following calibration, the subject is asked again to fixate the marker; in case of poor correspondence between the calibrated gaze point and the marker location, the calibration procedure is repeated.
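The five-point procedure above can be sketched as follows, with a hypothetical `measure` interface of our own that returns the uncalibrated (θ, φ) estimate while the user fixates a known target:

```python
def five_point_calibration(measure):
    """Offset from one fixation at (0, 0); gains from two fixations per
    axis at -15 and +15 deg. Returns a function applying the correction."""
    off_t, off_p = measure(0.0, 0.0)        # offset point (marker at 0 deg)
    t_m, _ = measure(-15.0, 0.0)            # horizontal calibration points
    t_p, _ = measure(+15.0, 0.0)
    _, p_m = measure(0.0, -15.0)            # vertical calibration points
    _, p_p = measure(0.0, +15.0)
    gain_t = 30.0 / (t_p - t_m)             # corrective gain, theta
    gain_p = 30.0 / (p_p - p_m)             # corrective gain, phi
    return lambda t_raw, p_raw: (gain_t * (t_raw - off_t),
                                 gain_p * (p_raw - off_p))
```

For instance, a system whose raw θ output has a gain of 0.8 and a 1.2° offset is mapped back to the true angles by the returned correction function.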

Once calibrated, three major causes can degrade gaze-tracking accuracy: ambient light, scleral lens displacement and eyewear displacement. With respect to the first, the centroid computation reduces the impact of unwanted lighting: the differential nature of the method cancels common-mode noise. This is the case for large light variations such as the ambient lighting of a room or the proximity of a strong source (e.g. sunlight from a window). Moreover, the PTDs can be chosen to have a sensitivity peak at the infrared wavelength emitted by the eyewear and to be less sensitive to other light sources. The usual constraints of avoiding PTD saturation and obtaining enough signal still apply, however.

Regarding eyewear slippage, this issue is common to video-based eye trackers. The solution is usually to strap the eyewear firmly to the head to ensure that the device does not move during use. In addition, one of the main advantages of collimated light is to reduce the influence of misplacement on accuracy, since the lighting received by the lens does not vary with distance and shifts.

With regards to lens slippage, the use of a scleral lens, i.e. a lens that extends over the sclera, below the eyelids, ensures that the lens will be very stable on the eye (e.g. a 100 micron shift over an hour [23], which would only induce an error of a few arcminutes on the gaze calculation).

3. Experimental results

3.1 Eye tracking scleral lens prototype

To validate our simulation results, we wished to repeat the same calculations as in Section 2.3 but with photocurrent values obtained in real conditions. A prototype in the 4 PTDs square configuration (Fig. 2(c)) was thus manufactured. The PTDs are infrared silicon photodiodes (Silonex SLCD-61N8) with a maximum absorption at λ = 930 nm and an area of S = 2.7 mm2 for a thickness of 0.4 mm. Amorphous-silicon or organic photodiodes might have offered a thinner solution and helped avoid wire-bonding difficulties, but their electronic performance, especially efficiency, is lower and such photodiodes are not readily available.

Preliminary characterization shows a sensitivity of Rλ = 0.55 A/W at λ = 940 nm. The photodiodes are bonded on a 0.13 mm thick flexible substrate according to the dimensions shown in Fig. 2(a). A scleral lens of 16 mm diameter encapsulates the bonded PTDs. The prototype is mounted on an eye-sized ball rotating over ±16° for the angle θ. The initial angles of the PTDs are Θ = Φ = 12.75°.

For Lambertian lighting, the prototype (Fig. 5(a)) is illuminated by 6 infrared LEDs (Vishay VSMY12940, λ = 940 nm) placed at 13 mm (the standard distance between the apex of the cornea and glasses); we checked that this arrangement provides the desired light distribution. The lighting system could be integrated into a spectacle frame (Fig. 5(b)) or a visor. For collimated lighting, we used a single LED placed at a distance and collimated using a pinhole, a spatial filter and a lens. In practice, such a system could be replaced by a waveguide similar to those used in augmented reality displays so as not to block the line of sight [24,25].


Fig. 5. (a) Left panel: instrumented scleral lens prototype with 4 photodiodes. The presented setup is for experiments with a Lambertian light source (generated by the 6 IR LEDs). (b) Right panel: spectacle frame with 4 LEDs producing a Lambertian illumination.


Ocular safety standards require that the light (at λ = 950 nm) received by the eye does not exceed a maximum permissible exposure (MPE) equal to MPEretina = 7 mW/mm2 for the retina and MPEcornea = 0.1 mW/mm2 for the cornea [26,27]. The light emission in our system was thus chosen so that the maximum irradiance at the eye surface was below 0.1 mW/mm2 in both lighting conditions (around 20 µW/mm2, measured with an optical power meter).
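As a quick numerical check of the safety margin, using the values quoted above:

```python
# Safety margin of the measured irradiance against the corneal MPE.
MPE_CORNEA = 0.1      # mW/mm^2, corneal limit near 950 nm [26,27]
E_MEASURED = 0.020    # mW/mm^2 (~20 uW/mm^2 measured at the eye surface)

margin = MPE_CORNEA / E_MEASURED   # factor below the corneal limit
assert E_MEASURED < MPE_CORNEA
print(f"irradiance is {margin:.0f}x below the corneal MPE")
```

The chosen emission level thus sits a factor of five below the corneal limit.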

A precision Source/Measure Unit (SMU, Keysight B2900A) applies a reverse bias voltage of 1.2 V to the photodiodes and measures delivered photocurrents. To assess the experimental accuracy offered by our prototype, the direction of gaze was calculated for 17 different orientations of the lens (from –16° to +16° with a step of 2°). For each orientation, 251 measuring points were recorded which corresponds to a recording time of 2 s at a rate of 125 Hz. Measurements were taken in a closed room, i.e. where no sunlight could enter but with neon lighting on.

The main results from Fig. 6 are summarized in Table 1. In the collimated case, the experimental precision is poor (0.44°), as is the accuracy, which reaches 0.88° before LUT correction and 0.73° afterwards (cf. Table 1), much worse than the 0.1° (after LUT) expected from Section 2. These results can be explained by the fact that, in the collimated case, photocurrent variations are only caused by PTD inclinations (cf. Eq. (2)), which leads to a very low dynamic range (0.3 µA, as opposed to 12 µA with a Lambertian source). As a result, systematic errors such as misalignment, poor collimation or noise on the photocurrents have a significant impact on accuracy, limiting it to 0.8° with a wide dispersion of results.


Fig. 6. Raw experimental error on gaze direction (dotted line) and after LUT correction (solid line) compared to theoretical error (dashed line) for a collimated source (a) and a Lambertian source (b). On the x-axis is the actual angular orientation θ of the eye. On the y-axis is the error between the computed angle θcomputed and the actual θ. The curve is the average of the 251 data points, providing the accuracy, and the error bar is the standard deviation, giving the precision.



Table 1. Experimental comparison between the impact of a Lambertian source and a collimated one on accuracy

By contrast, the wide photocurrent dynamic range obtained with a Lambertian source leads to a more robust system. The accuracy reaches 0.1° after LUT compensation (Fig. 6(b)), roughly 10 times better than current camera-based eye trackers, with a precision of 0.01°.

3.2 Wireless optical data transmission

The first aim of this paper was to present a scleral lens eye tracker and to investigate which PTD and lighting configurations should be chosen to achieve optimum performance in terms of accuracy and robustness. Our prototype does not include, at this stage [28], any wireless communication system, and the PTD currents are thus read through a ribbon cable and processed externally. However, for such a device to be fully functional, this data transmission should obviously be done wirelessly. An emitter placed within the lens would then send the data (i.e. the raw photocurrents of each PTD, or the result of the centroid calculation if it is performed "on board" by an ASIC encapsulated into the lens) to a receiver placed on the eyewear (so as to limit energy requirements). This data transmission could be achieved using radiofrequencies or optical signals. Since the RF case has already been demonstrated (cf. for instance the Sensimed Triggerfish [29]), since only a limited data rate is required, and in order to develop an "all-optical" device, we chose to consider a wireless optical communication. The approach is to encapsulate an infrared light emitting device in the scleral lens and use a photoreceptor on the eyewear. Digital data are transmitted by infrared pulses. This method is widely used, for example in remote controls under the Infrared Data Association (IrDA) protocol. To validate the possibility of optical data transmission from the lens to a receiver placed on the spectacles, and to assess the power supply required for our application, the following test was carried out.

A second prototype encapsulating a small infrared LED (Osram SFH 4043, λ = 940 nm) into a scleral lens (Fig. 7(a)) was built and placed on an artificial eye as in Section 3.1. The purpose of this second prototype is to demonstrate the possibility of wirelessly transmitting, through an optical link, the eye tracking data from the scleral lens to the eyewear for further analysis. The LED was powered with the lowest possible current (0.6 mA, 1.2 V) modulated with 100 µs pulses. An infrared photodiode (Vishay BPW34) was used as receiver and placed 13 mm away. The photodiode response as a function of the lens orientation is depicted in Fig. 7(b). It shows that the photodiode delivers a current pulse of at least 0.5 µA, enough to detect the LED light pulse. In addition, a pulsed signal of 100 µs with a duty cycle of 50% corresponds to a theoretical data rate of 5 kbit/s and is thus enough to transmit 20 bits of data (computed θ and φ on 10 bits each [20]) at a minimum rate of 120 Hz. The RMS power consumption is 360 µW, allowing the system to operate for at least 10 min using a micro-battery [30]. This remains a low autonomy, but it is a first step forward compared to RF transmission, with which the autonomy of the system is 3 min.
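The data-rate and power figures above follow from a few lines of arithmetic (a sketch; framing one bit per 200 µs slot at 50% duty cycle is our reading of the IrDA-style scheme):

```python
# Back-of-the-envelope check of the optical-link figures quoted above.
I_LED = 0.6e-3        # LED drive current (A)
V_LED = 1.2           # LED drive voltage (V)
PULSE = 100e-6        # pulse width (s)
DUTY = 0.5            # duty cycle

bit_rate = DUTY / PULSE            # one bit per 200 us slot -> 5000 bit/s
frame_bits = 20                    # theta and phi, 10 bits each [20]
frame_rate = 120                   # required update rate (Hz)
p_rms = I_LED * V_LED * DUTY       # average electrical power (W)

assert bit_rate >= frame_bits * frame_rate   # 5 kbit/s >= 2.4 kbit/s needed
```

At 5 kbit/s the link keeps roughly a factor-of-two margin over the 2.4 kbit/s needed for a 120 Hz update, and p_rms evaluates to 360 µW, matching the quoted consumption.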


Fig. 7. (a) Encapsulated infrared LED. (b) Received current after an emission pulse of 100 µs for different scleral lens inclinations. The current pulse amplitude in the emitter is 0.6 mA. Measurements were performed with ambient light on (neon light).


4. Discussion

In this study, we presented the principle and design of an innovative scleral contact lens eye tracker, as well as experimental results validating its ability to provide accurate results. The lens embeds photodiodes, for which we investigated the impact of size, number and configuration. The influence of their initial tilt with respect to the pupillary axis was not investigated, but in view of the limited possible positions for the photodiodes on the lens and the limited variation in corneal radius among the population, this parameter should have a limited influence. The lens also embeds a communication device, and we demonstrated the possibility of using the IrDA protocol. This also paves the way for the use of other communication methods based on an optical link, such as LED-to-LED communication [31], using LEDs as transmitters, which would allow bi-directional data transmission. The lens works with an eyewear carrying the illumination system, for which we investigated the impact of two standard configurations (Lambertian, collimated).

Compared to the scleral search coil approach, our system can be considered less invasive, since the search coil approach usually requires a wire that irritates the eyelids and sclera. By contrast, in our approach all electronic components are encapsulated in the scleral lens, so there are no external elements that risk causing irritation. In addition, the lens design will minimize slippage. In terms of accuracy, the two systems are similar, with an error between 0.1° and 0.2°.

Compared to video-based eye trackers, the main drawback of our method is clearly its invasiveness. On the other hand, the advantages are numerous, including no need for image processing (thus no need for expensive and power-hungry computers), easy integration into constrained environments (e.g. VR headsets), and the possibility of use by people who need visual aids (in Fig. 5(b), for instance, we used a spectacle frame with no glasses, but these could be added without affecting the measurement, or the scleral lens could easily be designed to include a refractive correction). Also, in terms of performance, according to our results, a scleral lens eye tracker with 4 PTDs illuminated by a Lambertian light source should achieve a 0.11° accuracy with a 0.01° precision over a field of ±15°, which is almost an order of magnitude better than that offered by the best commercial head-mounted eye trackers and probably sufficient for applications such as human-machine interfaces (HMI), where the user will not be asked to look at large eccentricities. For comparison, the Tobii Pro Glasses 2 offers an accuracy of 0.5° with a precision of 0.3°. These results were, however, only obtained on an artificial eye, giving an indication of the validity of the method and the expected accuracy, but not guaranteeing the same performance on a real eye. Wang et al. [32] suggest that, in the case of video-based eye trackers, the precision error can triple when moving from an artificial eye to a real eye. As our system operates differently, the correlation between the accuracy and precision obtained with an artificial eye and those that will be obtained with a real eye is not yet known. The performance of our eye tracker in real conditions will undoubtedly be poorer than on the artificial eye, and assessing it will be our next research objective now that we have demonstrated a proof of concept.

In terms of sampling rate, we chose 125 Hz to demonstrate the system. Although this is as high as that of high-end commercial wearable eye trackers, it is far less than that achieved by some desktop video-based eye trackers used in eye movement research. Our method is, however, not limited in terms of sampling frequency. The limitation comes from the integrated circuit (the ASIC) which carries out the calculation and the digitization. A first circuit design and simulations show that a sampling rate of 250 Hz is achievable.

In addition, we did not consider the issue of eyelid occlusion. This could be a limiting factor when extending the vertical angular range, particularly for people with small eyes or a drooping eyelid. If technological progress allows the size of the PTDs to be reduced, this issue could be addressed by placing the PTDs closer to the pupillary axis (e.g., 2.5 mm) and closer to the horizontal plane.

In view of these advantages and disadvantages, our approach could be useful for a number of applications, particularly those where an eye tracker has to be integrated into a head-mounted device (e.g., VR, AR, HMD) for a specific population (e.g., pilots or surgeons, for whom prescribing and fitting the lens would be easier). Among these applications, one of interest is vision restoration [33]. With current approaches such as optogenetic therapy or retinal prostheses, an image of the world is captured and projected, after processing, onto the small light-sensitive part of the retina. Eye trackers can be particularly useful to record the most relevant image and to improve the optical projection on the retina, but the size and inaccuracy of current devices limit their practical use.

In terms of manufacturability, progress in miniaturization and integration is still needed to fit all the required components into the lens, i.e., the ASIC (if centroid calculations are performed on the lens), the battery and harvesting system (the system should be usable for at least 45 minutes), and the communication system, but these functions have already been demonstrated individually (cf. Section 3.2 as well as [20,34,35]). It is thus only a matter of time before a fully integrated system is demonstrated.

With regard to wearing such an electronic scleral lens, the ribbon used here to read the PTD currents prevented the lens from being worn, but the dimensions of the lens embedding the 4 photodiodes were in agreement with standard commercial scleral lens dimensions. In addition, a number of electronic scleral lenses have already been demonstrated [35,36] or commercialized [29], showing that electronic functions can be placed safely onto the eye.

5. Conclusion

A novel method for eye tracking using a scleral lens fitted with photodetectors was presented. A theoretical study was first conducted to determine the lens configuration and the light source type. A prototype was then fabricated to validate the method. Measurements with an artificial eye show that an accuracy of 0.11° is achievable, a significant improvement over most mobile video-based eye trackers. In addition, we validated that wireless optical communication between the lens and the eyewear is possible. This approach is a step toward the integration of increasingly complex functions on electronic scleral lenses.

Funding

Institut Mines-Telecom Foundation.

Disclosures

The authors declare no conflicts of interest.

References

1. N. J. Wade, “Pioneers of eye movement research,” I-perception 1(2), 33–68 (2010). [CrossRef]  

2. R. Dodge and T. S. Cline, “The angle velocity of eye movements,” Psychological Review 8(2), 145–157 (1901). [CrossRef]  

3. A. Meroni, P. Pradhapan, P. van der Heijden, and N. Shahriari, “System, Device and Method for Eye Activity Monitoring,” U.S. Patent Application No. 16/387,950, (2019).

4. C. H. Morimoto and M. R. M. Mimica, “Eye gaze tracking techniques for interactive applications,” Computer Vision and Image Understanding 98(1), 4–24 (2005). [CrossRef]  

5. L. R. Young and D. Sheena, “Eye-movement measurement techniques,” Am. Psychol. 30(3), 315–330 (1975). [CrossRef]  

6. D. A. Robinson, “A Method of Measuring Eye Movement Using a Scleral Search Coil in a Magnetic Field,” IEEE Trans. Bio-Med. Electron. 10(4), 137–145 (1963). [CrossRef]  

7. H. Collewijn, F. Van der Mark, and T. C. Jansen, “Precise recording of human eye movements,” Vision Res. 15(3), 447–IN5 (1975). [CrossRef]  

8. E. Whitmire, L. Trutoiu, R. Cavin, D. Perek, B. Scally, J. Phillips, and S. Patel, “EyeContact: scleral coil eye tracking for virtual reality,” in Proceedings of the 2016 ACM International Symposium on Wearable Computers (2016), pp. 184–191.

9. T. Li, Q. Liu, and X. Zhou, “Ultra-low power gaze tracking for virtual reality,” in Proceedings of the 15th ACM Conference on Embedded Network Sensor Systems (2017), pp. 1–14.

10. K. Akşit, J. Kautz, and D. Luebke, “Gaze-Sensing LEDs for Head Mounted Displays,” arXiv preprint arXiv:2003.08499 (2020).

11. S. Sangu, T. Shimokawa, and S. Tanaka, “Ultracompact eye and pupil tracking device using VCSEL arrays and position sensitive detector,” Proc. SPIE 11310, 113101F (2020). [CrossRef]  

12. A. Al-Rahayfeh and M. Faezipour, “Eye Tracking and Head Movement Detection: A State-of-Art Survey,” IEEE J. Transl. Eng. Health Med. 1, 2100212 (2013). [CrossRef]  

13. P. Majaranta and A. Bulling, “Eye Tracking and Eye-Based Human–Computer Interaction,” Advances in Physiological Computing (Springer, 2014), pp. 39–65. [CrossRef]  

14. D. Jansson, A. Medvedev, H. Axelson, and D. Nyholm, “Stochastic Anomaly Detection in Eye-Tracking Data for Quantification of Motor Symptoms in Parkinson’s Disease,” Signal and Image Analysis for Biomedical and Life Sciences 823, 63–82 (2015). [CrossRef]  

15. J.-L. de Bougrenet, M. Lamard, J. Bolloch, and V. Nourrit, “Automatic Detection of Oculomotor Disorders,” in Proceedings of 15th Asia-Pacific Conference on Vision (APCV) (2019).

16. A. Kar and P. Corcoran, “A Review and Analysis of Eye-Gaze Estimation Systems, Algorithms and Performance Evaluation Methods in Consumer Platforms,” IEEE Access 5, 16495–16519 (2017). [CrossRef]  

17. A. J. Larrazabal, C. E. García Cena, and C. E. Martínez, “Video-oculography eye tracking towards clinical applications: A review,” Comput. Biol. Med. 108, 57–66 (2019). [CrossRef]  

18. B. V. Ehinger, K. Groß, I. Ibs, and P. König, “A new comprehensive eye-tracking test battery concurrently evaluating the Pupil Labs glasses and the EyeLink 1000,” PeerJ 7, e7086 (2019). [CrossRef]  

19. X. Zhang, Y. Sugano, M. Fritz, and A. Bulling, “MPIIGaze: Real-World Dataset and Deep Appearance-Based Gaze Estimation,” IEEE Trans. Pattern Anal. Mach. Intell. 41(1), 162–175 (2019). [CrossRef]  

20. L. Massin, C. Lahuec, V. Nourrit, F. Seguin, and J.-L. de Bougrenet, “Towards accurate camera-less eye tracker using instrumented scleral lens,” in Proceedings of 2019 17th IEEE International New Circuits and Systems Conference (NEWCAS) (2019), pp. 1–4.

21. C. Lu, Y.-S. Zhai, X.-J. Wang, Y.-Y. Guo, Y.-X. Du, and G.-S. Yang, “A novel method to improve detecting sensitivity of quadrant detector,” Optik 125(14), 3519–3523 (2014). [CrossRef]  

22. M. Chen, Y. Yang, X. Jia, and H. Gao, “Investigation of positioning algorithm and method for increasing the linear measurement range for four-quadrant detector,” Optik 124(24), 6806–6809 (2013). [CrossRef]  

23. S. J. Vincent, D. Alonso-Caneiro, and M. J. Collins, “The temporal dynamics of miniscleral contact lenses: Central corneal clearance and centration,” Contact Lens and Anterior Eye 41(2), 162–168 (2018). [CrossRef]  

24. C. M. Bigler, P.-A. Blanche, and K. Sarma, “Holographic waveguide heads-up display for longitudinal image magnification and pupil expansion,” Appl. Opt. 57(9), 2007–2013 (2018). [CrossRef]  

25. Q. Wang, D. Cheng, Q. Hou, Y. Hu, and Y. Wang, “Stray light and tolerance analysis of an ultrathin waveguide display,” Appl. Opt. 54(28), 8354–8362 (2015). [CrossRef]  

26. ICNIRP, “ICNIRP Guidelines on limits of exposure to incoherent visible and infrared radiation,” Health Phys. 105(1), 74–96 (2013). [CrossRef]  

27. V. Mazlin, P. Xiao, E. Dalimier, K. Grieve, K. Irsch, J.-A. Sahel, M. Fink, and A. Claude Boccara, “In vivo high resolution human corneal imaging using full-field optical coherence tomography,” Biomed. Opt. Express 9(2), 557–568 (2018). [CrossRef]  

28. J.-L. de Bougrenet de la Tocnaye, L. Dupont, F. Ferranti, C. Lahuec, V. Nourrit, and F. Seguin, “A wireless contact lens eye tracking system (example of a smart sensors development platform),” presented at SENSO2017: 5th international conference sensors, energy harvesting, wireless network and smart objects workshop (2017).

29. AG Sensimed, “Products: Sensimed Triggerfish,” http://www.sensimed.ch (accessed: February 2020).

30. M. Nasreldin, R. Delattre, M. Ramuz, C. Lahuec, T. Djenizian, and J.-L. de Bougrenet de la Tocnaye, “Flexible Micro-Battery for Powering Smart Scleral lens,” Sensors 19(9), 2062 (2019). [CrossRef]  

31. G. Corbellini, K. Aksit, S. Schmid, S. Mangold, and T. R. Gross, “Connecting networks of toys and smartphones with visible light communication,” IEEE Commun. Mag. 52(7), 72–78 (2014). [CrossRef]  

32. D. Wang, F. B. Mulvey, J. B. Pelz, and K. Holmqvist, “A study of artificial eyes for the measurement of precision in eye-trackers,” Behavior Research Methods 49(3), 947–959 (2017). [CrossRef]  

33. E. H. Wood, P. H. Tang, I. de la Huerta, D. Korot, S. Muscat, D. A. Palanker, and G. A. Williams, “Stem Cell Therapies, Gene-Based Therapies, Optogenetics, And Retinal Prosthetics: Current State And Implications For The Future,” Retina 39(5), 820–835 (2019). [CrossRef]  

34. J. Pandey, Y.-T. Liao, A. Lingley, R. Mirjalili, B. Parviz, and B. P. Otis, “A Fully Integrated RF-Powered Scleral lens With a Single Element Display,” IEEE Trans. Biomed. Circuits Syst. 4(6), 454–461 (2010). [CrossRef]  

35. K. Hayashi, S. Arata, S. Murakami, Y. Nishio, A. Kobayashi, and K. Niitsu, “A 6.1-nA Fully Integrated CMOS Supply Modulated OOK Transmitter in 55-nm DDC CMOS for Glasses-Free, Self-Powered, and Fuel-Cell-Embedded Continuous Glucose Monitoring Scleral lens,” IEEE Trans. Circuits Syst. II 65(10), 1360–1364 (2018). [CrossRef]  

36. Y.-T. Liao, H. Yao, A. Lingley, B. Parviz, and B. P. Otis, “A 3µW CMOS Glucose Sensor for Wireless Contact-Lens Tear Glucose Monitoring,” IEEE J. Solid-State Circuits 47(1), 335–344 (2012). [CrossRef]  

Figures (7)

Fig. 1. Proposed system: IR sources mounted on the eyewear frame illuminate photodetectors encapsulated into a scleral lens. (a) Eye tracking data are computed on the lens by an ASIC and wirelessly transmitted to the eyewear via an optical link (IrDA type). (b) View from the top, (c) from the side.
Fig. 2. Number and position of the PTDs on the scleral lens. The configurations will be referred to as follows: (a) flexible substrate dimensions, (b) 3 PTDs, (c) 4 PTDs square, (d) 4 PTDs diamond and (e) 8 PTDs.
Fig. 3. Top view of the system for the two source types: (a) collimated source and (b) Lambertian source.
Fig. 4. Simulation results for a collimated source (solid line) and a Lambertian source (dashed line) for the (a) 3 PTDs, (b) 4 PTDs square, (c) 4 PTDs diamond and (d) 8 PTDs configurations. The x-axis is the actual angular orientation θ of the eye; the y-axis is the error between the computed angle θcomputed and the actual θ. This difference gives the accuracy of the calculation method.
Fig. 5. (a) Left panel: instrumented scleral lens prototype with 4 photodiodes. The presented setup is for experiments with a Lambertian light source (generated by the 6 IR LEDs). (b) Right panel: spectacle frame with 4 LEDs producing a Lambertian illumination.
Fig. 6. Raw experimental error on gaze direction (dotted line) and after LUT correction (solid line) compared to the theoretical error (dashed line) for a collimated source (a) and a Lambertian source (b). The x-axis is the actual angular orientation θ of the eye; the y-axis is the error between the computed angle θcomputed and the actual θ. The curve is the average of the 251 data points, giving the accuracy, and the error bars are the standard deviation, giving the precision.
Fig. 7. (a) Encapsulated infrared LED. (b) Received current after an emission pulse of 100 µs for different scleral lens inclinations. The current pulse amplitude in the emitter is 0.6 mA. Measurements were performed with ambient light on (neon light).

Tables (1)

Table 1. Experimental comparison between the impact of a Lambertian source and a collimated one on accuracy

Equations (12)

$$I_i = E_i\, S\, R_\lambda$$
$$E_i = E_0 \cos(\Theta + \theta)$$
$$I(\alpha) = I_0 \cos(\alpha)$$
$$E_i = I(\alpha)/D^2$$
$$E_i = \frac{I_0}{D^2}\cos(\alpha)\cos(\alpha + \Theta + \theta)$$
$$D^2 = \left[d + r\left(1 - \cos(\Theta + \theta)\right)\right]^2 + \left[r \sin(\Theta + \theta)\right]^2$$
$$I_0 = E_0\, d^2$$
$$\alpha = \tan^{-1}\!\left(\frac{r \sin(\Theta + \theta)}{d + r\left(1 - \cos(\Theta + \theta)\right)}\right)$$
$$\theta_{\text{computed},3} = K_3 \left[\frac{\Theta_2 I_1 - \Theta_3 (I_2 + I_3)}{I_1 + I_2 + I_3}\right]$$
$$\theta_{\text{computed},4s} = K_{4s} \left[\Theta_1 \frac{I_1 - I_4 + I_2 - I_3}{I_1 + I_2 + I_3 + I_4}\right]$$
$$\theta_{\text{computed},4d} = K_{4d} \left[\Theta_2 \frac{I_1 - I_4}{I_1 + I_4}\right]$$
$$\theta_{\text{computed},8} = K_8 \left[\frac{\Theta_1 (I_1 - I_4 + I_2 - I_3) + \Theta_2 (I_8 - I_5)}{I_1 + I_2 + I_3 + I_4 + I_8 + I_5}\right]$$
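As an illustration, the Lambertian photocurrent model and the 4-PTD diamond estimator above can be sketched in a few lines of Python. This is a minimal sketch of the equations only, not the authors' implementation: all numeric constants (source distance d, lens radius r, source constant E0, PTD area S, responsivity R) are illustrative placeholders, and the gain K4d is obtained here by a simple one-point calibration.

```python
import math

def ptd_current(theta, Theta, d=0.015, r=0.0133, E0=100.0, S=0.25e-6, R=0.5):
    """Photocurrent of one PTD under a Lambertian source.

    theta : eye rotation (rad); Theta : angular position of the PTD on the lens (rad).
    d (source-to-lens distance, m), r (lens radius, m), E0, S (PTD area, m^2)
    and R (responsivity, A/W) are illustrative placeholders, not the paper's values.
    """
    psi = Theta + theta
    I0 = E0 * d**2                                            # source intensity
    A = d + r * (1 - math.cos(psi))                           # axial offset of the PTD
    B = r * math.sin(psi)                                     # lateral offset of the PTD
    D2 = A**2 + B**2                                          # squared source-to-PTD distance
    alpha = math.atan2(B, A)                                  # emission angle seen from the source
    Ei = (I0 / D2) * math.cos(alpha) * math.cos(alpha + psi)  # irradiance on the PTD
    return Ei * S * R                                         # I_i = E_i * S * R_lambda

def gaze_4ptd_diamond(I1, I4, Theta2, K4d=1.0):
    """Gaze angle from the normalized current difference (4-PTD diamond layout)."""
    return K4d * Theta2 * (I1 - I4) / (I1 + I4)
```

In practice, the gain K4d would come from a calibration step (e.g., fitting the raw estimate against a known gaze angle), and the residual nonlinearity would be corrected with a look-up table, as done for Fig. 6.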