Optica Publishing Group

Perceptual-driven approach to statically foveated head-mounted displays

Open Access

Abstract

A foveated display is a promising technique for realizing displays that offer both a large field of view (FOV) and high spatial resolution. Although several prior works have attempted to apply a foveation method to the design of a head-mounted display (HMD) system, the common method is based on a dual-resolution dynamic foveation scheme, which is inevitably complex and costly due to its requirements for multiple display sources, a 2D steering mechanism, and an eye tracker. In this paper, a new perceptual-driven approach to the design of a statically foveated HMD is proposed with the goal of offering a wide FOV across which the degradation of the perceived image resolution is nearly imperceptible or minimal within the regions of frequent eye movements. Compared to the dual-resolution, discrete, and dynamic foveation approach of the prior art, the static foveation approach not only maintains resolution continuity but also eliminates the need for a scanning mechanism, multiple display sources, and an eye tracker, and therefore minimizes hardware complexity. We present the general approach for creating a static foveation scheme, performance metrics for evaluating the perceived image quality, and the process of optimizing a foveation scheme to meet different requirements. Finally, we experimentally demonstrate and validate the proposed foveation scheme using a testbed system. Overall, we demonstrate that a statically foveated scheme is capable of offering a display with a total FOV of 160°, a constant resolution of 0.5 or 1 arcminute per pixel within the ±10° region where frequent eye movements occur, an adequate resolution of no less than 45% of the peak resolution within the parafovea region of ±30°, and a data sampling efficiency as high as 90%.

© 2021 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Conventional head-mounted displays (HMDs) adopt the well-established rectilinear sampling method of 2D display and imaging systems, in which a finite number of pixels are spread evenly across the entire field of view (FOV), and are thus subject to an inherent trade-off between FOV and spatial resolution: for a given number of available pixels, the larger the FOV, the lower the angular resolution. Consider an HMD design with a high-definition (HD) display of 1920 × 1200 pixels. The angular resolution is about 3.75 arcminutes per pixel for a design spreading the 1920 pixels evenly across a 120° FOV in the horizontal direction. With the same sampling method, achieving an angular resolution of 1 arcminute per pixel over the same FOV would require a display of 7200 × 4500 pixels and a data bandwidth about 14 times that of a typical HD device. Such high-resolution displays are not only very challenging to produce; the resulting images are also computationally challenging to process, transfer, and store. Besides these technical challenges, the rectilinear sampling scheme is very inefficient, producing a large amount of data that is redundant to the human visual system (HVS), because the visual acuity (VA) of the human eye drops drastically beyond the fovea region of the retina. For instance, a 4K display based on a rectilinear sampling scheme can only support an HMD with a 66° circular FOV at 1 arcminute angular resolution, while over 88% of the rendered information is not perceived by the HVS at any given time instant.
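The numbers quoted above can be verified with a short back-of-envelope calculation. The sketch below (in Python, for illustration; carrying the 16:10 aspect ratio over to the 1 arcminute case is an assumption) reproduces the 3.75 arcmin/pixel figure and the roughly 14-fold bandwidth increase:

```python
# Back-of-envelope check of the resolution/FOV trade-off for a
# rectilinear (uniformly sampled) HMD.

hd_pixels_h, hd_pixels_v = 1920, 1200
fov_h_deg = 120

# Angular resolution in arcminutes per pixel when pixels are spread evenly.
arcmin_per_pixel = fov_h_deg * 60 / hd_pixels_h
print(arcmin_per_pixel)  # 3.75

# Pixels needed for 1 arcmin/pixel over the same FOV (assumed 16:10 aspect).
pixels_h = fov_h_deg * 60                          # 7200
pixels_v = pixels_h * hd_pixels_v // hd_pixels_h   # 4500

# Raw data bandwidth scales with the total pixel count.
bandwidth_ratio = (pixels_h * pixels_v) / (hd_pixels_h * hd_pixels_v)
print(bandwidth_ratio)  # 14.0625
```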

To mitigate the trade-off between FOV and resolution, a foveated display, inspired by the foveation properties of the human eye, can be generally characterized as a method that identifies a user's region of interest (ROI) and allocates limited resources, such as a finite number of pixels or data processing and transmission bandwidth, differently between the ROI and the peripheral area outside it. For example, the number of pixels allocated to a display region may be a function of its distance from the center of the ROI. The ROI may be determined by means of a gaze tracker, by tracking the salient points of the scene rendered by the display, by pre-determining the ROIs of the scene, or by other mechanisms.

Many efforts have been made to explore foveation techniques in imaging and display applications, and they fall into one of three categories. The first category is experimental research to understand visual processing and perceptual artifacts, such as perceptible image blur and image motion, when viewing software-simulated foveated images [1]. The second category is an algorithmic approach in which foveation techniques are applied primarily to spatially variant image processing and video encoding [2] and variable level-of-detail graphics rendering [3–5] to achieve real-time video communication and save data processing resources. In this approach, the display or imaging sensor hardware has uniform high resolution, but the resolution of the rendered image decreases as it deviates from the attended ROI. The third category takes a hardware approach, in which imaging sensors or displays with spatially varying resolution are developed to reduce the requirements for high-resolution detectors and displays or for high-quality, complex optical systems. For example, Sandini et al. demonstrated a retina-like image sensor characterized by spatially variant resolution similar to that of the human retina and showed that 35 times fewer pixels were needed in the spatially variant resolution sensor as compared with a constant high-resolution image of 1100 × 1100 pixels [6]. Wick et al. presented designs of foveated imaging systems in which a spatial light modulator (SLM) was used to dynamically correct the optical aberrations of simple wide-FOV optics at the region of interest [7]. Hua and Liu demonstrated a dual-sensor foveated imaging system in which two separate imaging paths, one for foveal and one for peripheral vision, were integrated to capture foveated images and the high-resolution imaging path was steered by a 2D scanner according to the ROI [8].
Qin and Hua applied the dual-sensor architecture to develop multi-resolution foveated laparoscopes for minimally invasive surgery [9]. Iwamoto et al. demonstrated a bench prototype of a foveated display which dynamically scans a high-resolution inset image over a wide FOV low-resolution background display through 2D opto-mechanical scanners [10]. Rolland et al. reported the conceptual design of a high-resolution inset HMD system, in which a pair of microlens arrays optically duplicates a high-resolution inset image over a background display and a liquid crystal shutter is used to select one of the copies corresponding to the gazed ROI [11]. More recently, Tan et al. demonstrated a dual-resolution HMD design with two display panels of different optical magnifications as the image sources and a switchable Pancharatnam-Berry phase deflector for shifting the position of the foveated view [12]. Boris Greenberg reported a foveated HMD design based on a direct retinal projection method integrated with eyetracking, two dual-axis microelectro-mechanical system scanners, and two laser sources offering different scanline densities [13].

Among the prior works that attempted to apply a foveation method to the hardware design of an HMD system [10–13], the common implementation is a dynamic, discrete foveation approach in which a foveated region offering high image resolution is dynamically steered in response to the user's gaze direction while a relatively low-resolution region provides peripheral awareness. Such a dynamic foveation method typically utilizes a dual-display architecture in which two displays of different pixel resolutions, or two optical paths of different optical magnifications, render the foveated and peripheral areas, respectively. The foveated area with higher resolution typically covers a small FOV, while the peripheral area with substantially lower resolution covers a large portion of the entire FOV. In a dynamically foveated display, an eye tracker is typically required to track the line of sight of the viewer and thus determine the instantaneous ROI toward which the higher-resolution display is aimed. A dynamically foveated display also requires a scanning method to mechanically [10,13] or optically [11,12] steer and align the high-resolution foveated display with the viewer's line of sight to achieve foveated rendering. Consequently, such a display is inevitably complex, costly, bulky, and heavy: multiple displays and imaging paths are needed to render multiple resolution levels, an eye tracking device is required to track the ROI, and a 2D steering mechanism, either mechanical or optical, is required to steer the foveated region. Finally, the multi-resolution approach provides only a few discrete samples of resolution and thus a discontinuous perception of image quality as the eye moves, leading to visual artifacts.

In this paper, we explore a new perceptual-driven approach to the design of statically foveated HMDs with the goal of offering a wide FOV across which the degradation of the perceived image resolution is minimal during the course of eye movement and perceivable image artifacts and resolution discontinuities are minimized. Compared to the multi-level discrete and dynamic foveation approach in the prior art, the static foveation approach not only maintains resolution continuity but also eliminates the need for an eye tracker or scanning mechanism and therefore minimizes hardware complexity. The rest of the paper is organized as follows. Section 2 presents the perceptual-driven approach, Section 3 describes the methods for evaluating the data sampling efficiency and perceived image resolution and quality as a function of eye gaze direction, Section 4 demonstrates the process of optimizing foveation schemes to meet the requirements of different applications and hardware constraints, Section 5 presents the assessment of an optimized foveation scheme by applying the proposed performance metrics, and finally Section 6 experimentally demonstrates the validation and assessments of the static foveation scheme through a testbed.

2. Perceptual-driven design of statically foveated displays

In the human visual system (HVS), only a narrow region around the fovea offers exceptional resolution, contrast, and color sensitivities, while these properties fall off rapidly with increasing retinal eccentricity, which is defined as the angular distance of a field point from the fovea center. The object field along a viewer's line of sight (LoS) is imaged at the fovea center, and the HVS adapts its LoS to be centered on the attended ROI through eye or head movements. Due to this inherent capability of dynamic gazing, we often anticipate that a foveated display should allocate its finite number of pixels such that its angular pixel density follows the same distribution function as that of the photoreceptors on the human retina, and should dynamically steer its high pixel density region, referred to as the foveated region of the display, such that it is always centered on the fovea of the retina within the limits of eye movements. As summarized in the previous section, a few examples of dynamic foveation display schemes have been demonstrated based on a dual-display architecture accompanied by either 2D opto-mechanical or optical scanners [10–13]. Such dynamic foveation schemes, however, take a heavy toll on hardware complexity and may yield perceivable artifacts of resolution discontinuity.

Motivated by the fact that an HMD is generally attached to a user's head with a relatively fixed viewing position, we propose a perceptual-driven static foveation approach in which the characteristics of eye and head motions and the perceived visual effects are taken into account such that the degradation of the perceived quality of a statically foveated display may be imperceptible or minimal during the course of eye movements. Figure 1(a) shows a schematic illustration of a statically foveated display where the display plane is the conjugate virtual image of a display source seen by a viewer through the optics of an HMD system. Figure 1(b) plots an example of the resolution distribution of such a display as a function of field angle. For convenience, a reference coordinate system, OXYZ, is defined in the visual space as shown in Fig. 1(a). The origin O is located at the center of the entrance pupil of the right eye; the Z-axis coincides with the corresponding LoS when the eye gaze direction is parallel to the head pose direction (in other words, no eye movements are engaged in either the horizontal or vertical direction and the eye is gazing naturally straight forward); the OXY plane is perpendicular to the Z-axis, and the Y-axis points upward. For simplicity, the virtual display plane is assumed to be perpendicular to the Z-axis, and a display reference coordinate system, IX'Y'Z', is defined where the origin I is the intersection of the Z-axis of the OXYZ reference with the display plane, and the X'-, Y'-, and Z'-axes are parallel to the X-, Y-, and Z-axes, respectively. The display plane is displaced from the OXY plane by a distance L along the Z-axis, where L corresponds to the virtual display distance in an HMD system. A pixel position, P, on the virtual display plane can be uniquely defined by its corresponding field angle, θ, to the reference center I or, equivalently, the angular deviation of the pixel from the Z-axis of the OXYZ reference system.
${\theta _x}$ and ${\theta _y}$ correspond to the X-component and Y-component of the field angle $\theta $ in the horizontal and vertical directions, respectively.

Fig. 1. (a) Schematic illustration of a continuously foveated display where the angular resolution varies as a function of the field angle θ and is symmetric about the display center I; (b) example of an angular resolution distribution function along a given direction crossing the display center I as a function of the field angle; (c) example of a pixel density distribution function on a microdisplay as a function of the field angle for an eyepiece with a 1-inch focal length.


Unlike a display based on a rectilinear sampling method, the virtual pixel pitch, p, on the display plane increases as the distance of the pixel from the display center, I, increases. Along a given radial direction, as the pixel position, P, deviates away from the center I, the pixel density of the display monotonically decreases and the angular resolution degrades. Here the pixel density is defined as the number of pixels per unit distance, while the angular resolution is defined as the visual angle subtended by a single pixel to the eye.

As illustrated in Fig. 1(a), the display plane may be divided into three functional regions: a fovea region, a parafovea region, and a peripheral region. Without loss of generality, we assume all three regions are rotationally symmetric and centered on the display center I. Similar to the fovea of the retina, the fovea region of the display is statically fixed at the center of the display and offers the highest pixel resolution. It shall offer a small and uniform or nearly uniform pixel pitch to ensure high angular resolution when the eye gaze direction falls within this region. The fovea region is bounded by a critical field angle, ${\theta _{c1}}$, which is considered the visual and musculoskeletal balance point and defines a central region of frequent and comfortable eye movements. Based on the physiological characteristics of eye movements, a preferred choice for ${\theta _{c1}}$ is between 5° and 20°. For instance, Burgess-Limerick et al. reported that comfortable eye movements occur within a field angle of ±15° as a good compromise between visual and musculoskeletal needs [14]. The parafovea region of the display is immediately adjacent to the fovea region and exhibits a moderate rate of resolution degradation. The parafovea region is the annular zone bounded by two critical field angles, ${\theta _{c1}}$ and ${\theta _{c2}}$, which are considered the balance points between eye movements and head motion. Within the angular range of ${\pm} ({\theta _{c1}},{\theta _{c2}})$, eye movements are expected to become gradually less preferred than the alternative of head motion due to muscular strain and discomfort [14]. A preferred choice for ${\theta _{c2}}$ is between 20° and 40°. Cook and Stark reported that head motion instead of eye movements likely occurs when the field angle is greater than 30° [15].
The virtual pixel pitch within the angular range of ${\pm} ({\theta _{c1}},{\theta _{c2}})$ is expected to increase monotonically at a rate such that the angular resolution remains relatively high when the eye gazes within this region. The peripheral region of the display lies immediately beyond the parafovea region, at field angles greater than ${\pm} {\theta _{c2}}$. It exhibits a rapid rate of resolution degradation and mainly serves peripheral vision and the sense of immersion. Within this region, comfortable eye movements are unlikely to occur and head or body motion is preferred. In an HMD system, a head tracker can be utilized to update the scene rendering according to head motion without changing the relative position of the eye gaze within the display field. The eye gaze direction is therefore much less likely to fall within the peripheral region. More control points may be added to further subdivide the peripheral region as needed by specific applications of the proposed scheme.

The pixel distribution of a foveated display can be characterized by the angular resolution distribution function, denoted as ${F_{FD}}({{\theta_x},{\theta_y}} )$, which is defined as the reciprocal of the angular resolution of the display in minutes of arc, where ${\theta _x}$ and ${\theta _y}$ correspond to the X-component and Y-component of the field angle $\theta $ in the horizontal and vertical directions, respectively. Based on the division of the functional regions described above, the angular resolution distribution of a statically-foveated display may be expressed as

$${F_{FD}}({\theta _x},{\theta _y}) = \begin{cases} {f_1}({\theta _x},{\theta _y}), & |\theta | \le {\theta _{C1}}\\ {f_2}({\theta _x},{\theta _y}), & {\theta _{C1}} < |\theta | \le {\theta _{C2}}\\ {f_3}({\theta _x},{\theta _y}), & {\theta _{C2}} < |\theta | \le {\theta _{\max }} \end{cases},$$
where f1, f2, and f3 are the segmented functions that characterize the resolution distribution within each corresponding region, and ${\theta _{\max }}$ is the maximum field angle in a radial direction. Figure 1(b) schematically illustrates the resolution distribution function (angular pixel density), FFD, along a radial direction $\overrightarrow r $ as a function of the field angle, $\theta $. In this example, the rate of resolution degradation, characterized by the slope of the resolution distribution curve, varies with the field position. One of the distinct features of the proposed scheme compared to the prior dual-resolution foveation scheme is its ability to ensure a continuous change of resolution distribution. To ensure resolution continuity and image smoothness along the boundaries of a foveated display, the function values and derivatives at the first and second critical balance points, ${\theta _{C1}}$ and ${\theta _{C2}}$, need to be equal. In other words, the following conditions shall be satisfied:
$$\begin{array}{l} {f_1}(|{\theta _{C1}}|) = {f_2}(|{\theta _{C1}}|)\\ f_1^{\prime}(|{\theta _{C1}}|) = f_2^{\prime}(|{\theta _{C1}}|) \end{array} \;\;\;\;\textrm{and}\;\;\;\; \begin{array}{l} {f_2}(|{\theta _{C2}}|) = {f_3}(|{\theta _{C2}}|)\\ f_2^{\prime}(|{\theta _{C2}}|) = f_3^{\prime}(|{\theta _{C2}}|) \end{array}.$$
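As a concrete illustration of Eqs. (1) and (2), the sketch below builds a hypothetical three-segment distribution (a flat fovea segment, a Gaussian parafovea fall-off, and an exponential peripheral tail) whose values and first derivatives match at both balance points. The critical angles and the fall-off rate A are illustrative assumptions, not the paper's optimized values:

```python
import math

# Hypothetical three-segment resolution distribution F_FD(θ) along a radial
# direction, constructed to satisfy the continuity conditions of Eq. (2).
THETA_C1, THETA_C2, THETA_MAX = 10.0, 30.0, 80.0  # degrees (assumed)
A = 0.0015  # parafovea fall-off rate (assumed)

def f1(t):
    # Fovea: uniform peak resolution of 1 arcmin per pixel, i.e. F = 1.
    return 1.0

def f2(t):
    # Parafovea: Gaussian fall-off; equals 1 with zero slope at θ_C1,
    # so it joins f1 with matched value and derivative.
    return math.exp(-A * (t - THETA_C1) ** 2)

def f3(t):
    # Periphery: exponential tail whose rate b equals the logarithmic
    # slope of f2 at θ_C2, matching both value and derivative there.
    b = -2.0 * A * (THETA_C2 - THETA_C1)
    return f2(THETA_C2) * math.exp(b * (t - THETA_C2))

def F_FD(t):
    t = abs(t)
    if t <= THETA_C1:
        return f1(t)
    if t <= THETA_C2:
        return f2(t)
    return f3(t)

# Value continuity at both balance points (Eq. 2, function values).
for tc in (THETA_C1, THETA_C2):
    assert abs(F_FD(tc - 1e-6) - F_FD(tc + 1e-6)) < 1e-5

# Derivative continuity (Eq. 2, first derivatives), checked numerically.
for tc in (THETA_C1, THETA_C2):
    left = (F_FD(tc) - F_FD(tc - 1e-5)) / 1e-5
    right = (F_FD(tc + 1e-5) - F_FD(tc)) / 1e-5
    assert abs(left - right) < 1e-3

print(round(F_FD(30.0), 3))  # 0.549: the parafovea edge keeps ~55% of peak
```

With these particular choices the resolution at ±30° remains above 45% of the peak, in line with the design target stated in the abstract.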

The angular resolution distribution in Eq. (1) is independent of the virtual display distance to a viewer and characterizes a viewer’s visual experiences. The pixel density, characterized by the number of pixels per unit distance, is commonly used as an engineering parameter for measuring the spatial resolution of a 2D display and provides direct guidance on hardware requirements. The virtual pixel pitch, p, measured in millimeters on the virtual display plane for a given field angle $\theta$ can be obtained from the resolution distribution in Eq. (1) as

$${p_{VD}}({\theta _x},{\theta _y}) = \frac{{L \cdot \textrm{tan}(\frac{1}{{60 \cdot {F_{FD}}({\theta _x},{\theta _y})}})}}{{{{\cos }^2}\theta }}, $$
where L is the distance between the virtual display plane and the eye. The corresponding pixel density distribution, in pixels per inch (PPI), on the virtual display plane is described as
$$PP{I_{VD}}({\theta _x},{\theta _y}) = \frac{{25.4{{\cos }^2}\theta }}{{L \cdot \textrm{tan}(\frac{1}{{60 \cdot {F_{FD}}({\theta _x},{\theta _y})}})}}. $$

To guide the optical design, it is preferable to convert the PPI measurement on the virtual display to the PPI on the microdisplay panel that is optically magnified by the eyepiece. Considering an eyepiece of constant optical power with a focal length of ${f_{EP}}$, the pixel density distribution per inch on the microdisplay plane can be obtained as

$$PP{I_{MD}}({\theta _x},{\theta _y}) = \frac{{25.4{{\cos }^2}\theta }}{{{f_{EP}} \cdot \textrm{tan}(\frac{1}{{60 \cdot {F_{FD}}({\theta _x},{\theta _y})}})}}. $$

The pixel pitch at the center field is ${f_{EP}} \cdot \textrm{tan}(\frac{1}{{60 \cdot {F_{FD}}(0,0)}})$. As an example, Fig. 1(c) plots the microdisplay pixel density distribution for the resolution distribution function illustrated in Fig. 1(b), where the eyepiece focal length is assumed to be 1 inch. In practice, instead of requiring a microdisplay panel with the spatially varying pixel density distribution characterized by Eq. (5), we can adopt a microdisplay panel with a uniform pixel pitch, p0, and carefully design an eyepiece with a spatially varying optical power, ${\Phi _{EP}}$, characterized as

$${\Phi _{EP}}({\theta _x},{\theta _y}) = \frac{1}{{{f_{EP}}({\theta _x},{\theta _y})}} = \frac{{\textrm{tan}(\frac{1}{{60 \cdot {F_{FD}}({\theta _x},{\theta _y})}})}}{{{p_0}{{\cos }^2}\theta }}. $$
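Equation (5) is straightforward to evaluate numerically. The sketch below uses the Gaussian example distribution of Fig. 2 as a stand-in for ${F_{FD}}$ (an assumption, not an optimized scheme) and the 1-inch eyepiece focal length of the Fig. 1(c) example:

```python
import math

# Numerical sketch of Eq. (5): pixel density (PPI) on the microdisplay
# behind an eyepiece, for an assumed Gaussian resolution distribution.
F_EP_MM = 25.4  # eyepiece focal length: 1 inch, expressed in millimeters

def F_FD(theta_deg):
    # Illustrative distribution (the Gaussian example used later in Fig. 2).
    return math.exp(-0.001 * theta_deg ** 2)

def ppi_md(theta_deg):
    """Pixel density per inch on the microdisplay at field angle θ, Eq. (5)."""
    # One pixel subtends 1/F_FD arcmin, i.e. 1/(60*F_FD) degrees, to the eye.
    pixel_angle_rad = math.radians(1.0 / (60.0 * F_FD(theta_deg)))
    cos2 = math.cos(math.radians(theta_deg)) ** 2
    return 25.4 * cos2 / (F_EP_MM * math.tan(pixel_angle_rad))

print(round(ppi_md(0)))   # 3438 PPI at the center field (1 arcmin per pixel)
print(round(ppi_md(40)))  # far coarser sampling suffices off-axis
```

The center-field value corresponds to a pixel pitch of about 7.4 µm on the microdisplay, which sets the hardware requirement for the peak-resolution region.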

3. Performance metrics for evaluating a foveated display

In general, many forms of resolution distribution functions may be utilized to implement the proposed static-foveation method, but carefully optimized functions can lead to minimally perceivable quality degradation, greater data savings, or ease of implementation. One key aspect of optimization is to develop adequate quality metrics that can be used to evaluate the performance and artifacts of different resolution distribution functions and to establish meaningful merit functions for obtaining optimal function forms that meet the requirements of different applications. This section presents two types of proposed performance metrics, one focusing on evaluating perceived image quality variations, and the other on data sampling efficiency.

3.1. Perceived visual acuity of a foveated display

The key hypothesis of a statically-foveated display is that eye movements occur frequently within the fovea region and much less frequently in the parafovea region. Therefore, evaluating the visual quality of such a display needs to account for the dynamics of eye movements and the visual acuity (VA) characteristics of the human visual system. The goal of a perception-driven quality metric is to find an optimal resolution distribution for a statically-foveated display scheme such that the resulting display offers nearly imperceptible resolution degradation within the fovea region and only small degradation within the parafovea region when a viewer's eye gazes at the corresponding regions. To achieve this goal, we characterize the perceived visual acuity of a display, also known as the perceived resolution, as a function of its field angle and eye gaze direction by factoring in the visual acuity of a 20/20 standard observer and the resolution distribution of the display.

As illustrated in Fig. 1(a), let us consider the eye is gazing at a point G on the display plane. The gaze direction ${\phi _G}$ is the angle of eye rotation with respect to the Z-axis, and ${\phi _G}$ may be decomposed into two orthogonal components, $({\phi _{Gx}},{\phi _{Gy}})$, corresponding to the X-component and Y-component eye rotations in the horizontal and vertical directions, respectively. The relative visual acuity (VA) of the HVS, denoted as VAHVS, describes the resolution distribution as a function of the eccentricity angle of a given visual field from the eye gaze direction, and is defined as the normalized reciprocal of the angular resolution in minutes of arc and may be modeled as [16,17]

$$\textrm{V}{\textrm{A}_{HVS}}({\theta _x},{\theta _y},{\phi _{Gx}},{\phi _{Gy}}) = {e_2}/({e_2} + \sqrt {{{({\theta _x} - {\phi _{Gx}})}^2} + {{({\theta _y} - {\phi _{Gy}})}^2}} ), $$
where $\; {e_2} \cong 2.3^\circ .$ The eccentricity angles, $({e_x},{e_y})$, of the given field from the fovea center are given as ${e_x} = {\theta _x} - {\phi _{Gx}}$ and ${e_y} = {\theta _y} - {\phi _{Gy}}$ in the horizontal and vertical directions, respectively.

The perceived visual acuity of a foveated display, denoted as, $V{A_{FD}}$, for a given field angle $\theta $ varies with the eye gaze direction, ${\phi _G}$, and is determined by obtaining the smaller one between the values of the display resolution distribution and the VA curve of a 20/20 standard observer depicted by Eq. (7). Generally, it can be expressed as

$$V{A_{FD}}({\theta _x},{\theta _y},{\phi _{Gx}},{\phi _{Gy}}) = \min [{F_{FD}}({\theta _x},{\theta _y}),V{A_{HVS}}({\theta _x},{\theta _y},{\phi _{Gx}},{\phi _{Gy}})]. $$

For a conventional single-resolution display offering uniform pixel sampling across its entire FOV, the perceived visual acuity is a constant, independent of eye gaze direction. For instance, for a display that matches the fovea resolution of a 20/20 standard observer (i.e., 1 arcmin per pixel), its perceived VA for a 20/20 standard observer can be characterized as $V{A_{FD}}({\theta _x},{\theta _y},{\phi _{Gx}},{\phi _{Gy}}) \equiv 1$. For a foveated display, however, the perceived VA varies with gaze direction. Figure 2 illustrates an example of modeling the perceived VA of a foveated display in which the resolution distribution (solid red line) is a simple Gaussian function, ${\textrm{F}_{\textrm{FD}}}(\theta ) = {e^{ - 0.001 \times {\theta ^2}}}$, describing a rotationally symmetric foveation scheme with its peak centered on the field angle of 0° (i.e., the Z-axis). In this example, the eye is gazing at a 40° angle away from the center, and the VA of the HVS given by Eq. (7) is plotted as a solid black line. The perceived VA of the display under this gaze condition is obtained by applying Eq. (8) and is plotted as the yellow line with “*” markers. The yellow-shaded area under the yellow curve indicates the region where the perceived image quality is as high as the VA of the observer, and the black-shaded area illustrates the region where the perceived image quality is limited by the display resolution rather than by the VA of the observer. In this example, at a 40° eye gaze angle, the maximum perceived VA value is about 0.31, occurring at a 34° field angle. The display outperforms the observer's VA on the side of the 34° peak toward the display center and underperforms on the side toward the gaze direction.
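The Fig. 2 example can be reproduced numerically by taking the pointwise minimum of the display distribution and the acuity model of Eq. (7), per Eq. (8). The sketch below locates the peak of the perceived VA for the 40° gaze case (a 1D radial slice; grid search rather than an analytic solution):

```python
import math

# Perceived VA (Eq. 8) = min of the display resolution distribution and the
# HVS acuity model (Eq. 7), with the eye gazing 40° off the display center.
E2 = 2.3      # degrees, from Eq. (7)
PHI_G = 40.0  # gaze direction in degrees

def F_FD(theta):
    return math.exp(-0.001 * theta ** 2)  # the Gaussian example of Fig. 2

def VA_HVS(theta, phi_g=PHI_G):
    return E2 / (E2 + abs(theta - phi_g))

def VA_perceived(theta, phi_g=PHI_G):
    return min(F_FD(theta), VA_HVS(theta, phi_g))

# Locate the peak of the perceived VA across the field by grid search.
thetas = [t / 100.0 for t in range(0, 8001)]  # 0°..80° in 0.01° steps
peak_theta = max(thetas, key=VA_perceived)
print(round(peak_theta, 1), round(VA_perceived(peak_theta), 2))
```

The peak lands near the 34° field angle with a perceived VA of roughly 0.3, close to the values read off Fig. 2; it sits at the crossing of the two curves, where neither the display nor the eye is the sole limiting factor.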


Fig. 2. Illustration of the perceived resolution of a foveated display with its resolution distribution function in the form of a simple Gaussian function while the eye is gazed in the direction of 40° away from the display center.


Further analytical metrics can be computed from the perceived resolution in Eq. (8) to evaluate the perceived performance and make comparisons among different distribution functions for foveated displays. For instance, the perceived maximum resolution of a display, denoted as $V{A_{FD}}|{_{\max }} $, is defined as the maximum of the perceived resolution across the overall FOV for a given eye gaze direction, and evaluates the perceived peak performance with respect to eye motion. It is expressed as

$$V{A_{FD}}|{_{\max }} ({\phi _{Gx}},{\phi _{Gy}}) = \max [V{A_{FD}}({\theta _x},{\theta _y},{\phi _{Gx}},{\phi _{Gy}})]. $$

Clearly, an ideal foveated display with $V{A_{FD}}|{_{\max }} ({\phi _{Gx}},{\phi _{Gy}})$ equal to or approaching 1 is highly desirable within the region of active eye movements. Similarly, we can define a summative metric, denoted as $V{R_{FD}}$, to assess whether the perceived resolution of a display falls below the perceptible limit of the HVS by computing the ratio of the volume enclosed by the perceived resolution curve of the display to the volume enclosed by the VA curve of the HVS across the display FOV for different eye gaze directions. The volume ratio is defined as

$$V{R_{FD}}({\phi _{Gx}},{\phi _{Gy}}) = \frac{{\int_{ - {\theta _{\textrm{Xmax}}}}^{{\theta _{\textrm{Xmax}}}} {\int_{ - {\theta _{\textrm{Ymax}}}}^{{\theta _{\textrm{Ymax}}}} {V{A_{FD}}^2({\theta _x},{\theta _y},{\phi _{Gx}},{\phi _{Gy}})} } d{\theta _x}d{\theta _y}}}{{\int_{ - {\theta _{\textrm{Xmax}}}}^{{\theta _{\textrm{Xmax}}}} {\int_{ - {\theta _{\textrm{Ymax}}}}^{{\theta _{\textrm{Ymax}}}} {V{A_{HVS}}^2({\theta _x},{\theta _y},{\phi _{Gx}},{\phi _{Gy}})} } d{\theta _x}d{\theta _y}}},$$
where θXmax and θYmax are the maximum half FOVs of the display system in the horizontal and vertical directions, respectively. This metric effectively evaluates the ratio between the whole volumes below the yellow curve and the black curve in Fig. 2. For a given eye gaze direction, a ratio of 1 indicates that the display performs to the limit of the HVS across its entire FOV, and a ratio less than 1 indicates that the display underperforms relative to the HVS limit at some field angles. The metrics defined in Eqs. (8) through (10) characterize the perceived resolution performance of a display as a function of its field angles and gaze directions.
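The volume ratio of Eq. (10) can be evaluated by direct 2D numerical integration. The sketch below does so for the Gaussian example of Fig. 2 over an assumed ±80° × ±80° field, comparing an on-axis gaze with a 40° gaze (the field extent and grid size are illustrative choices):

```python
import math

# Numerical evaluation of the volume-ratio metric of Eq. (10) for the
# Gaussian example display, by midpoint-rule integration on a square grid.
E2 = 2.3

def F_FD(tx, ty):
    return math.exp(-0.001 * (tx ** 2 + ty ** 2))

def VA_HVS(tx, ty, gx=0.0, gy=0.0):
    return E2 / (E2 + math.hypot(tx - gx, ty - gy))

def VR(gx=0.0, gy=0.0, half_fov=80.0, n=400):
    step = 2 * half_fov / n
    num = den = 0.0
    for i in range(n):
        tx = -half_fov + (i + 0.5) * step
        for j in range(n):
            ty = -half_fov + (j + 0.5) * step
            va = min(F_FD(tx, ty), VA_HVS(tx, ty, gx, gy))
            num += va ** 2
            den += VA_HVS(tx, ty, gx, gy) ** 2
    return num / den

# On-axis gaze: below 1 because the Gaussian tail underperforms the HVS
# at the far periphery; a 40° gaze drives the ratio down much further.
print(round(VR(0.0, 0.0), 3))
print(round(VR(40.0, 0.0), 3))
```

The drop at off-axis gaze quantifies what Fig. 2 shows qualitatively: the farther the gaze strays from the foveated center, the more the display, rather than the eye, limits perception.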

3.2 Bandwidth and data sampling efficiency

One of the major objectives of a foveated display is to improve sampling efficiency by allocating finite hardware resources, such as limited display pixels or data bandwidth, differently between the attended ROI and the peripheral area outside the ROI. By adopting the analytical method described by Hua and Liu in [8], the total amount of raw data rendered by a system can be calculated by integrating its resolution distribution across all fields, expressed as

$$B = \int_{ - {\theta _{\textrm{Xmax}}}}^{{\theta _{\textrm{Xmax}}}} {\int_{ - {\theta _{\textrm{Ymax}}}}^{{\theta _{\textrm{Ymax}}}} {{F_{FD}}^2({\theta _x},{\theta _y})} } d{\theta _x}d{\theta _y}. $$

The metric in Eq. (11) measures the bandwidth requirement of a system. To evaluate and compare the relative data sampling efficiency of different foveation schemes, we consider a uniformly sampled, single-resolution display (SRD) as a reference that offers, across its entire FOV, the same resolution as the peak resolution of the foveated system, i.e. ${F_{SRD}}({\theta _x},{\theta _y}) = {F_{FD}}(0,0)$. The data sampling efficiency of a foveated display (FD) is then given as:

$${S_{F\textrm{D}}} = \frac{{{B_{SRD}} - {B_{FD}}}}{{{B_{SRD}}}}. $$
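Equations (11) and (12) can be checked numerically for the Gaussian example distribution. The sketch below integrates over an assumed ±80° × ±80° square field; the numbers are illustrative for this example, not the paper's optimized schemes:

```python
import math

# Sampling-efficiency estimate of Eq. (12) for the Gaussian example
# distribution, relative to a single-resolution display (SRD) that has the
# same 1 arcmin/pixel peak resolution everywhere.

def F_FD(tx, ty):
    return math.exp(-0.001 * (tx ** 2 + ty ** 2))

def bandwidth(F, half_fov=80.0, n=400):
    """Raw data measure B = double integral of F^2 (Eq. 11), midpoint rule."""
    step = 2 * half_fov / n
    total = 0.0
    for i in range(n):
        tx = -half_fov + (i + 0.5) * step
        for j in range(n):
            ty = -half_fov + (j + 0.5) * step
            total += F(tx, ty) ** 2
    return total * step * step

B_FD = bandwidth(F_FD)
B_SRD = bandwidth(lambda tx, ty: 1.0)  # uniform peak-resolution reference
S_FD = (B_SRD - B_FD) / B_SRD
print(round(S_FD, 3))  # about 0.94: the foveated scheme saves ~94% of the data
```

For this example the foveated scheme renders only a few percent of the raw data of the equivalent single-resolution display, consistent with the high sampling efficiencies reported later in the paper.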

Alternatively, as adopted in [8], the effective information throughput of a display may be evaluated against a reference system whose spatial resolution distribution represents the just-adequate resolvability required for its users. For instance, the visual acuity response of a standard observer with 20/20 vision can be utilized as the reference resolution distribution for evaluating the efficiency of a display system. Displays with the same or better resolvability than that of a standard observer are considered to be perceptually equivalent. Compared against the reference system providing the same FOV and same peak resolution, the information throughput, E, of a given display scheme can be defined as:

$$\textrm{E} = \frac{{\int_{ - {\theta _{\textrm{Xmax}}}}^{{\theta _{\textrm{Xmax}}}} {\int_{ - {\theta _{\textrm{Ymax}}}}^{{\theta _{\textrm{Ymax}}}} {{F_{HVS}}^2({\theta _x},{\theta _y})} } d{\theta _x}d{\theta _y}}}{{\int_{ - {\theta _{\textrm{Xmax}}}}^{{\theta _{\textrm{Xmax}}}} {\int_{ - {\theta _{\textrm{Ymax}}}}^{{\theta _{\textrm{Ymax}}}} {{F_{FD}}^2({\theta _x},{\theta _y})} } d{\theta _x}d{\theta _y}}} = \frac{{{B_{HVS}}}}{{{B_{FD}}}}. $$

Hereby it is assumed that the center field of the display is aligned with the center of the reference system. Figure 3 plots the information throughputs of an SRD and a foveated display as a function of the overall FOV on a logarithmic scale. In this example, a circular field of view is assumed. The foveated display assumes the same resolution distribution as the Gaussian distribution example used in Fig. 2, with a peak resolution of 1 arcminute at the center field, while the SRD assumes a uniform resolution of 1 arcminute across its entire FOV. Although the effective information throughputs of both the SRD and the FD monotonically decrease with increasing FOV, the ratio of the throughput of the FD scheme to that of the SRD increases rapidly. The throughput of an SRD is as low as ∼2% for a system of 60° FOV and less than 1% for a 100° FOV, which suggests that the amount of redundant information produced by a 60° FOV SRD is nearly 47 times more than that of the perceptually equivalent system. In comparison, the throughputs of an FD scheme with a Gaussian distribution are 4.4% and 4.9% for the same FOVs, respectively.
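The SRD throughput values quoted above can be reproduced approximately with a short numerical sketch. The HVS visual-acuity falloff is assumed here to follow F_HVS(θ) = 2.3/(2.3 + θ), the same functional form used for the peripheral segment later in Eq. (14); Eq. (7) itself is outside this excerpt, so this model is an assumption.

```python
import numpy as np

def throughput_srd(half_fov_deg, n=100_001):
    """Eq. (13) for a uniform single-resolution display over a circular
    FOV, referenced against a 20/20 observer whose VA falloff is assumed
    to be F_HVS(theta) = 2.3 / (2.3 + theta)."""
    r = np.linspace(0.0, half_fov_deg, n)
    f_hvs = 2.3 / (2.3 + r)
    integrand = f_hvs**2 * 2.0 * np.pi * r       # polar: F^2 * 2*pi*r
    # trapezoidal integration of B_HVS
    b_hvs = float(np.sum((integrand[1:] + integrand[:-1]) * 0.5 * np.diff(r)))
    b_srd = np.pi * half_fov_deg**2              # uniform F = 1 arcmin^-1
    return b_hvs / b_srd

print(round(throughput_srd(30.0), 3))  # ~0.02  (60 deg full FOV)
print(round(throughput_srd(50.0), 3))  # <0.01 (100 deg full FOV)
```

Under this assumed VA model, the result matches the ~2% (60° FOV) and <1% (100° FOV) figures quoted in the text.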


Fig. 3. Illustration of information throughputs of a SRD and a foveated display as a function of the overall FOV.


4. Optimization of a statically-foveated display scheme

The key to implementing the proposed method described in Sec. 2 is to optimize the choices of the critical balance points, ${\theta _{C1}}$ and ${\theta _{C2}}$, as well as the resolution distribution functions for the different functional regions defined in Eq. (1), by accounting for the statistical characteristics of eye and head motion, the perceived performance of a display as a function of eye motion and field angle characterized by Eqs. (8) through (10), and the relative data saving efficiency given by Eq. (12). Considering these factors and using the metrics and constraints described in Sec. 3, we adjusted the general forms of the segmented functions defined in Eq. (1) to obtain resolution distribution functions that strike a good balance among factors such as maximum perceived resolution, volume ratio, data saving efficiency, and overall FOV.

The optimization process starts with an initialization step, which consists of three key aspects: the initialization of the resolution distribution functions, the target display specifications, and the threshold performance metrics. Initializing the resolution distribution functions requires choosing the general form of the functions and the associated critical balance points and specifying the parametric space and range for optimization. It involves several critical considerations for a statically foveated display and will be detailed below. The target display specification provides not only the performance requirements of a target display, such as spatial resolution and field of view, but also hardware constraints, such as the total pixel count and the data bandwidth available. The threshold performance metrics specify the thresholds used to determine whether a given parametric combination yields a solution that satisfies the target display specifications. Following the initialization step, each set of parametric combinations within the specified range is evaluated by computing the performance metrics defined by Eqs. (8) through (10), and its performance is compared against the corresponding threshold values to determine whether it yields a viable solution. In the final step, the performance metrics for all the parametric combinations that yield satisfactory solutions are compared against each other, and an optimal resolution distribution is determined by selecting the parametric combination that offers both minimally perceptible VA degradation and maximal data saving efficiency within a given set of display hardware constraints and performance requirements.

The optimal solution to the resolution distribution of a statically foveated display depends on the requirements of different applications. Hereby we demonstrate the optimization process outlined above through an example we implemented. The process began with choosing the function forms for the resolution distribution function defined in Eq. (1) for a foveated display. Though the function may take many different forms, our choice is based on considerations of perceived visual effects, generalization capability of results, simplicity of the parametric space for optimization, and ease of implementation. First of all, a uniform or nearly uniform resolution is desired in the fovea region ($|\theta |\le {\theta _{C1}}$) to ensure nearly imperceptible image quality degradation when the eye gaze is fixated within this region. Secondly, a segment of a Gaussian function is chosen to describe the resolution distribution within the parafovea region (${\theta _{C1}} \le |\theta |\le {\theta _{C2}}$). Besides its simplicity, a Gaussian function provides not only an elegant approximation of the visual acuity degradation of the HVS given by Eq. (7), but also a statistical estimation of the eye motion probability within this region. More specifically, due to the increasing torque imposed on the eye muscles as the eye gaze angle increases, the probability of eye fixation at large field angles decreases. Consequently, the resolution distribution of the parafovea region is expected to decrease as the field angle increases, and a lower rate of resolution degradation is expected for fields closer to the fovea region than for fields further away. The rate of degradation with increasing field angle can be adjusted by optimizing the standard deviation of the Gaussian function. To ensure a smooth transition across the boundary of the fovea and parafovea regions, the peak value of the selected Gaussian function shall match the resolution at the threshold field angle of the fovea region. Finally, the resolution distribution for the peripheral region ($|\theta |> {\theta _{C2}}$) is modeled by the VA curve of the HVS at a large eccentricity angle, with a polynomial correction function to make the curve continuous at ${\theta _{C2}}$. In summary, in our optimization process the general form of the resolution distribution function defined in Eq. (1) is modeled with rotational symmetry as

$${F_{FD}}(\theta ) = \begin{cases}\begin{array}{lr} {F_0}&|\theta |\le {\theta_{C1}}\\ {{F_0}{e^{\frac{{ - {{(\theta - {\theta_{C1}})}^2}}}{{2{\sigma^2}}}}}}&{\theta_{C1}} < |\theta |\le {\theta_{C2}}\\ {\frac{{2.3{F_0}}}{{2.3 + (\theta - {\phi_{th}})}}\textrm{ + }\Delta (\theta )}&|\theta |> {\theta_{C2}} \end{array}\end{cases},$$
where F0 is the reciprocal of the angular resolution at the display center, σ is the standard deviation of the Gaussian function, which sets the rate of resolution degradation within the parafovea region, and ${\phi _{th}}$ is a threshold eye gaze angle that determines the eccentricity of a field angle in the peripheral region and the rate of resolution degradation. $\Delta (\theta )$ is the polynomial correction function that makes the third and second segments of the resolution distribution function continuous at ${\theta _{C2}}$, where the specific order of $\Delta (\theta )$ is determined by the values of σ, ${\theta _{C1}}$, and ${\theta _{C2}}$. It is worth noting that the third segment contributes little to the perceived performance and data sampling efficiency, so a function as simple as a third-order polynomial can usually be used for $\Delta (\theta )$.
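A minimal sketch of the three-segment distribution of Eq. (14), with Δ(θ) omitted (valid when ${\phi _{th}}$ is chosen via Eq. (16) so that the second and third segments already meet continuously). The default parameter values below are illustrative examples, not required settings.

```python
import numpy as np

def f_fd(theta, f0=1.0, theta_c1=10.0, theta_c2=30.0,
         sigma=12.89, phi_th=24.63):
    """Three-segment resolution distribution of Eq. (14).

    The polynomial correction Delta(theta) is dropped here; with phi_th
    set by Eq. (16) the peripheral segment meets the Gaussian segment
    continuously at theta_c2, so no correction is needed.
    """
    t = np.abs(np.asarray(theta, dtype=float))
    fovea = f0 * np.ones_like(t)                                   # flat core
    parafovea = f0 * np.exp(-(t - theta_c1) ** 2 / (2 * sigma ** 2))
    periphery = 2.3 * f0 / (2.3 + (t - phi_th))                    # VA-like tail
    return np.where(t <= theta_c1, fovea,
                    np.where(t <= theta_c2, parafovea, periphery))
```

Evaluating `f_fd` on a dense grid of field angles reproduces curves of the kind compared later in Fig. 4(a).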

Although the σ value in Eq. (14) explicitly defines the rate of resolution degradation within the parafovea region, it does not intuitively quantify the relative image quality between the fovea and parafovea regions. Instead, the ratio, η ($0 < \eta \le 1$), of the resolution corresponding to the parafovea critical field angle, ${\theta _{C2}}$, to the resolution corresponding to the fovea critical angle ${\theta _{C1}}$, is defined to quantify the relative image quality variation across the display. Its relationship with the standard deviation σ is expressed as:

$$\sigma = \frac{{{\theta _{C2}} - {\theta _{C1}}}}{{\sqrt { - 2\ln \eta } }}. $$

For instance, when σ equals the angular size of the parafovea region (i.e. $\sigma = {\theta _{C2}} - {\theta _{C1}}$), the resolution ratio η equals ${1 / {\sqrt e }}$ and the resolution corresponding to the field angle ${\theta _{C2}}$ is about 60% of the resolution for the field angle ${\theta _{C1}}$. Similarly, if the parafovea region is twice as wide as σ (i.e. ${\theta _{C2}} - {\theta _{C1}} = 2\sigma$), η equals ${1 / {{e^2}}}$ and the resolution for the field angle ${\theta _{C2}}$ is about 13.5% of the resolution for the field angle ${\theta _{C1}}$. The higher the ratio η, the less the quality degradation across the display, but the less data saving is expected.
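Both worked cases follow directly from inverting Eq. (15), which gives $\eta = {e^{ - {{({\theta _{C2}} - {\theta _{C1}})}^2}/(2{\sigma ^2})}}$:

```python
import math

# Eq. (15) rearranged: eta = exp(-(theta_c2 - theta_c1)^2 / (2 * sigma^2))
def eta(width_deg, sigma_deg):
    """Resolution ratio at theta_c2 for a parafovea of the given width."""
    return math.exp(-width_deg**2 / (2 * sigma_deg**2))

print(eta(20.0, 20.0))  # sigma equal to the width -> 1/sqrt(e), about 0.607
print(eta(20.0, 10.0))  # width twice sigma        -> 1/e^2,    about 0.135
```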

To ensure resolution continuity at the boundary of the parafovea and peripheral regions, the threshold gaze angle defining the resolution distribution for the peripheral region, ${\phi _{th}}$, is determined by matching the values of the corresponding resolution distribution functions at $\theta = {\theta _{C2}}$, which gives

$${\phi _{th}} = {\theta _{C2}} + 2.3(1 - \frac{1}{\eta }). $$

Based on Eq. (14), the key parameters to optimize for a statically foveated display are the two critical field angles, ${\theta _{C1}}$ and ${\theta _{C2}}$, as well as the resolution ratio η. For simplicity, we assume F0 equals 1, matching the peak VA of 1 arcminute for a 20/20 observer. During the optimization, the search ranges for ${\theta _{C1}}$ and ${\theta _{C2}}$ were set to [5°, 20°] and [20°, 40°], respectively, at an increment of one degree, while the search range for η was set to [0.1, 0.9] at an increment of 0.05. These variable configurations and increments yield a total of 5712 parametric combinations and thus 5712 different foveation schemes. For each foveation scheme, the performance metrics defined by Eqs. (8) through (12) are applied. The performance metrics defined in Sec. 3.1 depend on the eye gaze direction and full FOV of a target display. The gaze direction was varied from 0° up to 40° at a 5° increment. The full FOV, corresponding to twice the maximum field angle ${\theta _{\max }}$ measured from the display center, was varied from 0° up to 160° at an increment of 2°, or equivalently with the maximum field angle ${\theta _{\max }}$ varied from 0° up to 80° at an increment of 1°.
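The size of the search space can be verified by enumerating the parameter grid; the variable names below are illustrative:

```python
from itertools import product
import numpy as np

theta_c1_vals = range(5, 21)            # [5, 20] deg at 1 deg steps: 16 values
theta_c2_vals = range(20, 41)           # [20, 40] deg at 1 deg steps: 21 values
eta_vals = np.linspace(0.10, 0.90, 17)  # [0.1, 0.9] at 0.05 steps:  17 values

# Every (theta_C1, theta_C2, eta) triple defines one candidate foveation
# scheme; sigma and phi_th then follow from Eqs. (15) and (16).
schemes = list(product(theta_c1_vals, theta_c2_vals, eta_vals))
print(len(schemes))  # 16 * 21 * 17 = 5712
```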

Figure 4(a) plots six different resolution distribution schemes, selected from the 5712 foveation schemes, as a function of the field angle $\theta$, where η is chosen among 0.3, 0.5, and 0.7, ${\theta _{C1}}$ is fixed at 10°, ${\theta _{C2}}$ is chosen between 20° and 40°, and the maximum field angle is set at 80°. We can observe that the resolution ratio, η, between the fovea and parafovea boundaries and the choice of the ${\theta _{C2}}$ value have significant impacts not only on the rate of resolution degradation across the parafovea region but also on the resulting resolution distribution across the FOV. Consider the curves with the same ${\theta _{C2}}$ value (the lines of the same color), which correspond to the same division between the parafovea and peripheral regions. In this case, a higher η value leads to slower resolution degradation and an overall higher resolution across the FOV than a lower ratio, but potentially yields less data saving. On the other hand, consider the curves with the same η value (the lines of the same line style), which correspond to different divisions between the parafovea and peripheral regions. In this case, a higher ${\theta _{C2}}$ value implies a larger parafovea region with relatively high resolution and a narrower peripheral region. To maintain the same η value, however, a larger ${\theta _{C2}}$ value implies a larger standard deviation σ, slower resolution degradation within the parafovea region, and a higher resolution in the peripheral region, while it may yield less data saving efficiency.


Fig. 4. Comparison of six different foveation schemes with different ratio η and critical balance point ${\theta _{C2}}$: (a) The resolution distribution as function of field angle; (b) Perceived maximum resolution as function of eye rotation angle; (c) Volume ratio of perceived resolution distribution as function of eye rotation angle; and (d) Relative data sampling efficiency as function of full FOV.


By applying the VA metrics defined in Eqs. (8) through (12), we computed and compared the perceived visual acuity of the six different foveation schemes. Based on Eqs. (9) and (10), Figs. 4(b) and 4(c) plot the perceived maximum resolution and volume ratio of the six foveation schemes as a function of the eye gaze angle, varied from 0° up to 40° at an increment of 5°. Larger values of ${\theta _{C2}}$ and η lead to higher values of the perceived maximum resolution and volume ratio. For example, for a fixed ${\theta _{C2}}$ value of 40°, the perceived maximum resolution values at a 30° eye gaze angle are 0.63, 0.75, and 0.86 for η values of 0.3, 0.5, and 0.7, respectively. The volume ratio remains nearly constant at 1 for eye gaze angles up to 30° and maintains a ratio above 80% for gaze angles between 30° and 40°. In comparison, for a ${\theta _{C2}}$ value of 20°, both the perceived maximum resolution and volume ratio values are substantially lower than those with the larger ${\theta _{C2}}$ value of 40°. The effect of the η value is similar to that of ${\theta _{C2}}$. For example, for the same 30° eye gaze angle and the same ${\theta _{C2}}$ value of 40°, the perceived maximum resolution values are 0.86 and 0.61 for η values of 0.7 and 0.3, respectively.

By applying the metrics defined in Eqs. (11) and (12), we further compared the bandwidth and data sampling efficiency of the six foveation schemes for displays of different full FOVs ranging from 0° up to 160°. Figure 4(d) plots the sampling efficiency of the six foveation schemes as a function of the total FOV. As expected, when the full FOV is equal to or smaller than the corresponding $2{\theta _{C1}}$ angle, only the fovea region is utilized and thus there is no gain in data sampling efficiency over a single-resolution display. When the full FOV is greater than $2{\theta _{C1}}$, the sampling efficiency increases rapidly as the parafovea region grows, and the rate of efficiency improvement reaches its peak when the full FOV is equal to $2{\theta _{C2}}$. When the full FOV extends beyond $2{\theta _{C2}}$, the sampling efficiency continues to rise but at a reduced rate. In general, a higher η value leads to a lower data sampling efficiency and a slower increase in the efficiency improvement rate. For the same η value, a smaller ${\theta _{C2}}$ value yields a higher sampling efficiency. For example, for a display with a 120° FOV and a ${\theta _{C2}}$ of 20°, the relative data sampling efficiency is 0.92, 0.87, and 0.84 for η values of 0.3, 0.5, and 0.7, respectively. For a display with a 120° FOV and an η value of 0.5, the relative sampling efficiency is 0.87 and 0.82 for ${\theta _{C2}}$ equal to 20° and 40°, respectively. However, a higher sampling efficiency may be achieved at the cost of perceived display performance, as shown in Figs. 4(b) and (c), and a good balance between these two sets of performance metrics needs to be considered.

Choosing an optimal foveation scheme requires considering many factors, such as target display performance requirements, hardware constraints, and application demands. For the purpose of developing a proof-of-concept prototype, we aim to demonstrate the usability of a statically foveated display with a commercially available 4K monitor. In terms of the threshold performance specifications, the target display shall yield a field angle of at least 80°, an angular resolution of at least 1 arcminute per pixel in the center of the display, and a data sampling efficiency of at least 50%. When the eye gazes at a 30° angle from the display center, the perceived maximum resolution shall be better than 4 arcminutes and the volume ratio shall be no less than 0.5. Among the 5712 foveation schemes compared, 133 satisfied the threshold performance specifications. After further comparison of the 133 schemes, we chose the foveation scheme with ${\theta _{C1}} = {10^ \circ }$, ${\theta _{C2}} = {30^ \circ }$, and $\eta = 0.3$ for a target display prototype. Based on Eqs. (15) and (16), the standard deviation defining the rate of resolution degradation within the parafovea region is $\sigma = {12.89^ \circ }$, while the threshold gaze angle defining the resolution distribution for the peripheral region is ${\phi _{th}} = {24.63^ \circ }$, which makes the entire function smooth and continuous without the need for a polynomial correction function $\Delta (\theta )$. The resolution distribution for the selected foveation scheme is expressed as

$${F_{FD}}(\theta ) = \begin{cases}\begin{array}{lr} {1}&|\theta |\le {{10}^ \circ }\\ {{e^{ - \frac{1}{2}{{\left( {\frac{{\theta - 10}}{{12.89}}} \right)}^2}}}}&{{10}^ \circ } < |\theta |\le {{30}^ \circ }\\ {\frac{{2.3}}{{2.3 + (\theta - 24.63)}}}&|\theta |> \textrm{3}{0^ \circ } \end{array}\end{cases}.$$
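A quick numerical check of the selected scheme reproduces σ and ${\phi _{th}}$ from Eqs. (15) and (16) and confirms that the three segments of Eq. (17) meet continuously:

```python
import math

theta_c1, theta_c2, eta = 10.0, 30.0, 0.3   # selected scheme parameters

sigma = (theta_c2 - theta_c1) / math.sqrt(-2 * math.log(eta))  # Eq. (15)
phi_th = theta_c2 + 2.3 * (1 - 1 / eta)                        # Eq. (16)
print(round(sigma, 2), round(phi_th, 2))  # 12.89 24.63

# Continuity of Eq. (17) at both critical field angles:
para = lambda t: math.exp(-0.5 * ((t - 10.0) / 12.89) ** 2)  # parafovea
peri = lambda t: 2.3 / (2.3 + (t - 24.63))                   # periphery
print(para(10.0))                               # 1.0, matches fovea segment
print(abs(para(30.0) - peri(30.0)) < 1e-3)      # True: smooth at 30 deg
```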

5. Performance assessment of the proposed foveated display scheme

Based on the performance metrics described in Sec. 3, this section presents a thorough assessment of the proposed continuous foveation scheme defined in Eq. (17) and offers a comparison against well-known dual-resolution schemes [8]. Because we aim to apply the proposed scheme to the design of a statically foveated display, we assume the end system will not have an optical or mechanical scanning mechanism to dynamically adjust the position of the fovea region relative to the eye gaze direction. The fovea region of all the foveation schemes is assumed to be fixed at the display center, regardless of eye gaze.

As discussed in Sec. 1, a dual-resolution scheme typically divides the full field into a fovea region with a uniformly high resolution and a peripheral region with a uniform but significantly reduced resolution. To provide a meaningful comparison and insightful guidance, we chose two different dual-resolution schemes, one with a narrow ±10° fovea region and the other with a wider ±30° fovea region. The fovea region of the first dual-resolution scheme offers the same uniformly high resolution of 1 arcminute across the same region as that of the proposed continuous scheme, while its peripheral resolution is set to 0.187 to match the corresponding VA of the HVS at an eccentricity angle of 10°. The fovea region of the second dual-resolution scheme offers the same uniformly high resolution of 1 arcminute across ±30°, as wide as the combined fovea and parafovea regions of the proposed continuous scheme, while the resolution of its peripheral region is set to 0.07, matching the corresponding VA of the HVS at an eccentricity angle of 30°. Figure 5 plots the resolution distribution functions of the three different foveation schemes: the proposed three-segment continuous foveation scheme defined by Eq. (17) (red solid curve with asterisk marks), the 10° dual-resolution scheme (green dashed curve with diamond marks), and the 30° dual-resolution scheme (blue dotted curve with pentagram marks), along with the VA curve of the HVS for an eye gaze angle of 0° (black solid curve) as a reference. We expect the perceived resolution of the 10° dual-resolution scheme to suffer without dynamic foveation, while the 30° dual-resolution scheme does not necessarily need dynamic foveation. Furthermore, the optimal perceived resolution of the 10° dual-resolution scheme with dynamic foveation is expected to match the visual acuity of the HVS within the range of tracked eye movements. Because ±30° is considered the typical range of eye movements, the perceived resolution of the 30° dual-resolution scheme approximately simulates the anticipated performance of a 10° dynamically foveated dual-resolution system.


Fig. 5. Resolution distribution for three different foveation schemes: the proposed three-segment continuous foveation scheme defined by Eq. (17), a 10° dual-resolution scheme, and a 30° dual-resolution scheme.


5.1 Perceived visual acuity assessment

The perceived resolution of the three different foveation schemes shown in Fig. 5 was analyzed and compared by applying the metrics defined by Eqs. (8) through (10) in Sec. 3.1. Based on Eq. (8), we computed the perceived visual acuity of the three foveation schemes as a function of field angle for different eye gaze angles, from 0° to 40° at an interval of 5°. Figures 6(a) through 6(d) plot the results for four different eye gaze angles of 10°, 20°, 30°, and 40°, respectively. Due to rotational symmetry, only half of the display FOV, from 0° to 80°, is plotted. The perceived VA curves of the three schemes are plotted as a red solid line, green dashed line, and blue dotted line for the continuous, 10° dual-resolution, and 30° dual-resolution schemes, respectively. In each of the sub-figures, we applied different shading schemes to the areas under the perceived VA curves to represent regions where the perceived VA of one or more foveation schemes is as high as the relative VA of a 20/20 standard observer at the corresponding eye gaze direction. For instance, the red-, green-, or blue-shaded areas represent the regions where only the continuous foveation, 10° dual-resolution, or 30° dual-resolution scheme, respectively, yields a perceived VA as high as that of a 20/20 standard observer, and the gray-shaded areas represent the regions where the perceived VA of all three schemes is as high as the relative VA of a 20/20 standard observer. The yellow-shaded areas represent the regions where both the continuous foveation and the 10° dual-resolution schemes yield a VA as high as that of a standard observer, while the magenta- and cyan-shaded areas correspond to the combined regions of the continuous and 30° dual-resolution schemes and of the 10° and 30° dual-resolution schemes, respectively. The unshaded area under the HVS curve represents the region where none of the foveation schemes yields image quality as high as the VA of a 20/20 standard observer. The perceived VA values of all three foveation schemes across the entire FOV are nearly as high as the relative VA of a 20/20 standard observer for eye gaze angles less than 10°.


Fig. 6. The perceived visual acuity of three different foveation schemes shown in Fig. 5 as a function of field angles for the eye gaze angle of (a) 10°, (b) 20°, (c) 30°, and (d) 40°, respectively.


Based on the perceived VA described above, we can observe that the perceived resolution of the continuous foveation scheme can almost match the limiting resolution of the HVS across the entire FOV for eye rotation angles up to ±15°. As shown in Fig. 6(a), at a 10° eye rotation angle, the perceived VA across the entire field is better than or equivalent to the VA of a 20/20 standard observer. At a 15° eye rotation angle, the perceived VA at the fovea center of the eye is about 0.95, which is only slightly below the peak VA value of a 20/20 standard observer. While the 30° dual-resolution scheme yields similar performance, the 10° dual-resolution scheme shows a significant performance drop for field angles between 10° and 30°, for example the magenta-shaded areas in Figs. 6(a) and (b). When the eye rotates within the region of ${15^ \circ } \le |{{\phi_G}} |\le {30^ \circ }$, as shown by the examples in Figs. 6(b) and 6(c), the perceived resolution of the continuous foveation scheme degrades gradually, as expected, for field angles near the fovea center of the eye, with peak values decreasing from about 0.95 to 0.45 for 15° to 30° eye gaze angles, respectively. In comparison, the 30° dual-resolution scheme yields slightly better performance than the continuous foveation scheme for field angles less than 30°, as shown by the blue-shaded areas in Figs. 6(b) and (c), but worse performance for field angles larger than 30°, as shown by the yellow-shaded areas of the same figures. The 10° dual-resolution scheme in general yields the worst performance among the three schemes. When the eye rotates toward the peripheral region ($|{{\phi_G}} |> {30^ \circ }$), as shown in Fig. 6(d) for a 40° eye gaze angle, the perceived resolution of the continuous foveation scheme degrades further, but it is better than that of the 30° dual-resolution scheme within 70° and also generally better than that of the 10° dual-resolution scheme except for field angles beyond 50°. Therefore, we can conclude that the proposed foveation scheme yields no perceivable resolution degradation across the entire FOV when the eye rotates within the ±15° region. When the eye rotates within the parafovea region, it is subject to a moderate rate of resolution degradation but still provides adequate perceived quality even at a 30° eye gaze angle, where a peak VA value of 0.45 corresponds to 2.2 arcminutes per pixel. Overall, it provides better resolution performance than the two dual-resolution schemes.

Applying Eqs. (9) and (10), we further computed the perceived maximum resolution and volume ratio of the three foveation schemes as a function of the eye gaze angle, varied from 0° up to 40° at an increment of 5°; the results are plotted in Figs. 7(a) and 7(b), respectively. When the eye gaze angle is within ±15° of the display center, the maximally perceived resolution of the proposed continuous foveation scheme maintains a peak value of 0.95 or higher, while that of the 10° dual-resolution scheme drops to below 0.3 at an eye gaze angle of 15°. When the eye rotates within the region of ${15^ \circ } \le |{{\phi_G}} |\le {30^ \circ }$, the maximally perceived resolution of the proposed continuous foveation scheme gradually drops from 0.95 to 0.45, while that of the 10° dual-resolution scheme remains at a low 0.2 beyond the 15° eye gaze angle. The maximally perceived resolution of the 30° dual-resolution scheme generally retains its peak until the eye gaze reaches 30°, owing to its much wider fovea region, and then drops sharply beyond the 30° boundary. Furthermore, the volume ratio shown in Fig. 7(b) suggests that 95% or more of the entire display field performs to the limit of the HVS for eye gaze angles up to ±25° for the continuous foveation scheme, which indicates nearly no perceivable degradation within this range of eye motion. In contrast, only about 86% and 78% of the fields perform to the limit of the HVS at a 25° eye gaze angle for the 30° and 10° dual-resolution schemes, respectively. Although the perceived maximum resolution of the 30° dual-resolution scheme is generally higher than that of the continuous scheme for eye gaze angles less than 30°, its volume ratio is substantially lower, suggesting that more regions of the display fields have lower perceived resolution. Overall, we can conclude that the perceived resolution performance of the continuous foveation scheme, when properly optimized, is adequate without implementing dynamic foveation and degrades more gracefully than that of the traditional dual-resolution foveation schemes, especially within the area of most frequent eye movements.


Fig. 7. The comparison of (a) the perceived maximum resolution and (b) volume ratio of the three foveation schemes shown in Fig. 5 as a function of eye gaze angles.


5.2 Data sampling efficiency assessment

By applying Eqs. (11) through (13), the data bandwidth requirement and sampling efficiency of the three foveation schemes shown in Fig. 5 were computed for displays of different full FOVs ranging from 0° up to 160°. Figure 8(a) plots the relative data sampling efficiency as a function of the total FOV. As expected, the sampling efficiency of all three schemes increases as the full FOV increases. The proposed continuous foveation scheme yields a high data saving efficiency of about 81.5% and 92.8% for displays of 100° and 160° FOV, respectively. The amount of redundant raw data produced by a 100° single-resolution display is almost 5 times more than that of the continuous foveation scheme. Considering a display of 100° FOV, the sampling efficiency of the continuous scheme is about 7% lower than that of the 10° dual-resolution scheme and about 18% higher than that of the 30° dual-resolution scheme at the same FOV. As demonstrated in Sec. 5.1, however, the continuous foveation scheme shows significant advantages in terms of perceived resolution over a 10° dual-resolution scheme.


Fig. 8. (a) The comparison of data sampling efficiency as a function of the overall FOV for the three different foveation schemes shown in Fig. 5; (b) The comparison of the total pixel number required in a diagonal direction as a function of the overall FOV among the three foveation schemes shown in Fig. 5 and a single-resolution display.


To further compare the foveation schemes in terms of hardware requirements, we computed the total number of pixels required in the diagonal direction of a display as a function of the overall diagonal FOV. Four different displays are investigated, three of which are based on the foveation schemes shown in Fig. 5 and one of which is a single-resolution scheme serving as a baseline. We further assume the peak resolution of all four display schemes is 1 arcminute per pixel, matching the peak VA of the HVS. It is worth noting that the spatial resolution is defined in angular space rather than based on pixel pitch. Therefore, a uniform angular resolution does not imply a uniform pixel pitch, but rather uniform angular sampling of an angular range. When computing the angular resolution of a pixel, two effects should not be neglected, especially at large field angles. The first is the projected pitch of a physical pixel in the direction perpendicular to the corresponding field direction, which is reduced by a factor of cosθ, where θ is the field angle of the corresponding pixel. The second is the actual distance of a pixel from the viewer, which increases by a factor of 1/cosθ. Combining the two factors, the resolution distribution function projected into a uniform pixel-distribution display space needs to be multiplied by a cos²θ term. Figure 8(b) plots the required pixel count in a diagonal direction for the four display schemes. To achieve a 100° FOV in a given direction, the continuous foveation scheme requires about 2900 pixels, so a commercially available 4K display is adequate for supporting an FOV over 160°. For the same 100° FOV, 1900 pixels, 3400 pixels, and 6000 pixels are required for the 10° dual-resolution, 30° dual-resolution, and single uniform-resolution schemes, respectively.
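The pixel-count computation can be sketched as follows for the continuous scheme: the angular sampling density (60 pixels per degree at the 1 arcmin/pixel peak) is weighted by the cos²θ projection term and integrated across the FOV. The integration step size is an assumption of this sketch; the result lands near the ~2900 pixels quoted for a 100° FOV.

```python
import math

def f_fd(theta):
    """Three-segment resolution distribution of Eq. (17), peak of 1 arcmin/px."""
    t = abs(theta)
    if t <= 10.0:
        return 1.0
    if t <= 30.0:
        return math.exp(-0.5 * ((t - 10.0) / 12.89) ** 2)
    return 2.3 / (2.3 + (t - 24.63))

def diagonal_pixels(half_fov_deg, f, step=0.01):
    """Midpoint-rule integral of 60 * f(theta) * cos^2(theta) d(theta):
    the angular sampling density projected into a uniform-pixel-pitch
    display space via the two cosine factors described in the text."""
    n_steps = int(round(2 * half_fov_deg / step))
    total = 0.0
    for i in range(n_steps):
        theta = -half_fov_deg + (i + 0.5) * step
        total += 60.0 * f(theta) * math.cos(math.radians(theta)) ** 2 * step
    return total

print(round(diagonal_pixels(50.0, f_fd)))  # roughly 2900 px for a 100 deg FOV
```

Swapping in the dual-resolution profiles for `f` reproduces their pixel counts in the same way.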

6. Experimental validation

6.1 Experimental setup

To experimentally demonstrate and validate the visual effects of a statically foveated display proposed in this paper, we built a test setup with a 27” 4 K monitor with a pixel size of 155.7 µm and resolution of 3840 by 2160 pixels. Creating a testbed that can be used to validate the proposed foveation scheme confronts multi-fold challenges and considerations. First of all, to generate ground-truth images for comparison, the display setup should be capable of rendering the original target image for an angular resolution of 0.5 arcminutes per pixel to match the highest VA of the HVS. It shall also cover a large FOV so that the visual effects of peripheral regions can be tested. However, sampling a full FOV of 160° horizontally at an angular resolution of 0.5 arcminutes requires 19200 by 10800 pixels, 25 times more than the pixels available to our 4 K monitor. Secondly, to effectively create an angular resolution of 0.5 arcminutes per pixel, the monitor needs to be placed at a viewing distance of 1070 mm, at which the 27” monitor can only cover an FOV of 31.2° by 17.9°. Mosaicking at least 25 monitors of the same size to cover a full FOV of 160° becomes not only challenging but also subject to alignment errors and artifacts due to monitor bezels. Thirdly, it is further challenging to capture digital images for such wide FOV and high resolution. To overcome these challenges, we generate high-resolution target images with a horizontal FOV of 80° measured from its top-left corner pixels and with adequate pixels to achieve 0.5 arcminutes per pixel. Effectively, these target images cover a quadrant of a total field up to 160° in the horizontal direction. We then chose to fix the positions of the monitor and the camera to be used for image capture, but virtually pan the rendering viewport of the monitor across the full FOV of the target images as if a virtual pair of monitor and camera was scanned across a large display. 
More specifically, a 2K digital camera with a 50-mm focal-length lens is centered on the 27” monitor placed at a distance of 1070 mm, with the viewing direction of the camera set perpendicular to the monitor surface. The camera offers an angular resolution of about 0.25 arcminutes per pixel. The modulation transfer function (MTF) of the camera was measured with the well-known slanted-edge method using Imatest software to ensure that the camera yields adequate image contrast and resolution at the Nyquist frequency of the display, 60 cycles/degree.
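The viewing geometry above can be checked with a short calculation. The sketch below, a minimal Python example assuming a simple pinhole viewing model (all names are our own), reproduces the 1070-mm viewing distance and the 31.2° by 17.9° monitor coverage quoted above.

```python
import math

# Testbed display parameters (27" 4K monitor).
PIXEL_PITCH_MM = 0.1557          # 155.7 um pixel pitch
H_PIXELS, V_PIXELS = 3840, 2160  # 4K resolution

def viewing_distance_mm(arcmin_per_pixel):
    """Distance at which one pixel subtends the given angle."""
    theta_rad = math.radians(arcmin_per_pixel / 60.0)
    return PIXEL_PITCH_MM / math.tan(theta_rad)

def monitor_fov_deg(distance_mm):
    """Full horizontal and vertical FOV of the monitor at a given distance."""
    half_w = H_PIXELS * PIXEL_PITCH_MM / 2.0
    half_h = V_PIXELS * PIXEL_PITCH_MM / 2.0
    return (2 * math.degrees(math.atan(half_w / distance_mm)),
            2 * math.degrees(math.atan(half_h / distance_mm)))

d = viewing_distance_mm(0.5)       # ~1070 mm, matching the setup
fov_h, fov_v = monitor_fov_deg(d)  # ~31.2 deg by ~17.9 deg
```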

To render a foveated image, we start with a full-resolution target image and compute the image to be displayed by convolving the original image with a spatially varying filter derived from the resolution distribution function. The top-left corner pixel of the target images is taken as the center of the foveated display. Because the setup described above yields a peak angular resolution of 0.5 arcminutes per pixel, the resolution distribution function defined by Eq. (17) is scaled by a factor of 2, expressed as

$${F_{FD}}(\theta ) = \begin{cases} 2 & |\theta| \le 10^\circ\\ 2e^{-\frac{1}{2}{\left( \frac{\theta - 10}{12.89} \right)}^2} & 10^\circ < |\theta| \le 30^\circ\\ \frac{4.6}{2.3 + (\theta - 24.63)} & |\theta| > 30^\circ \end{cases}.$$

A complication in implementing the convolution filter is that the resolution distribution function characterized by Eq. (18) remains constant over the central ±10° and then varies with the field angle, or pixel location, outside the fovea, which requires the filter to vary with field angle to realize the different levels of blur matching the resolution distribution function. To account for this effect, the convolution filter is only applied to pixels outside the ±10° field, and it is implemented as a Gaussian-type filter with a varying standard deviation and a varying convolution area. The standard deviation of the Gaussian filter at a given field is set to approximate the corresponding angular magnification of the field, i.e., the ratio of the resolution of the center field to the resolution at the target field angle. The convolution area at a given field is set to twice the rounded value of the corresponding angular magnification so that enough pixels fall within the convolution area to render the blur accurately. Figure 9(a) demonstrates four examples of captured zoomed-in images of a resolution target corresponding to field angles of 10°, 30°, 50°, and 80°, matching the resolution distribution function described by Eq. (18). In this setup, the pixel pitch of the monitor is 155.7 µm, corresponding to a pixel density of 163 PPI. After applying the convolution filter, the effective pixel density distribution for the foveated display is plotted in Fig. 9(b).
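As a concrete sketch, the piecewise profile of Eq. (18) and the per-field blur parameters described above can be written as follows. The function names and the exact mapping from angular magnification to Gaussian sigma and kernel size are our own illustrative choices, not the authors' implementation.

```python
import numpy as np

def f_fd(theta_deg):
    """Three-segment resolution distribution of Eq. (18), in px/arcmin."""
    t = np.abs(np.asarray(theta_deg, dtype=float))
    return np.where(t <= 10.0, 2.0,
                    np.where(t <= 30.0,
                             2.0 * np.exp(-0.5 * ((t - 10.0) / 12.89) ** 2),
                             4.6 / (2.3 + (t - 24.63))))

def angular_magnification(theta_deg):
    """Ratio of center-field resolution to the local resolution; used to
    approximate the sigma of the per-field Gaussian blur filter."""
    return float(f_fd(0.0)) / f_fd(theta_deg)

def kernel_halfwidth(theta_deg):
    """Convolution area: twice the rounded angular magnification."""
    return 2 * int(np.round(angular_magnification(theta_deg)))
```

Note that the three segments are continuous: at 30°, both the Gaussian and the peripheral segment evaluate to about 0.60 px/arcmin.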

Fig. 9. (a) Examples of four captured zoomed-in images corresponding to 10°, 30°, 50°, and 80° field angles; (b) effective pixel density distribution for the foveated display setup as a function of field angles.

6.2 Objective assessment and validation

To objectively evaluate the image quality rendered by the proposed foveation approach, we generated a 4K-resolution image consisting of a mosaic of 9 by 9 modified Briggs targets. Figure 10(a) shows an image of the original 4K-resolution Briggs target pattern displayed on the 4K monitor, captured with a camera lens of 16-mm focal length. Horizontal and vertical axes were added to the captured image to indicate the field angle on the monitor, with the top-left corner pixel as the (0,0) field angle. Each modified Briggs target consists of 16 checkerboards that differ in the number of pixels per checker square and the number of checkers. Unlike the original Briggs targets, the light and dark checkers of all the checkerboards have the same contrast of 1. From the smallest to the largest checkers, the number of pixels per square increases from 1 to 10 pixels at an increment of 1 pixel for the first ten checkerboards, from 10 to 20 pixels at an increment of 2 pixels for the next five checkerboards, and is 25 pixels per checker for the largest checkerboard. The smallest checkerboard, consisting of 5 × 5 checkers with a 1-pixel square width, is located at the center of each target. Each modified Briggs target is sized to cover an FOV of 8° and 4.5° in the horizontal and vertical directions, respectively. By setting the appropriate spacing between adjacent target sub-images, the centers of the targets along the diagonal direction are precisely located at positions corresponding to field angles at multiples of 10°. In Fig. 10(a), four of the Briggs targets, centered at 10°, 30°, 50°, and 80° in the diagonal direction, are marked by red boxes. Figure 10(b) shows the zoomed-in images of these targets captured with a camera lens of 50-mm focal length; the image of the smallest checkerboard, with 1-pixel checkers, is shown as an inset on the rightmost side. These captured images demonstrate adequate resolution for resolving the original full-resolution target.
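The layout of one modified Briggs target, as described above, can be sketched in a few lines. The checker-size list and the `checkerboard` helper are our own illustrative reconstruction of the described pattern, not the authors' rendering code.

```python
import numpy as np

# Pixels per checker square for the 16 checkerboards in one modified
# Briggs target: 1-10 px in steps of 1, then 12-20 px in steps of 2,
# plus the largest board at 25 px per checker.
CHECKER_SIZES = list(range(1, 11)) + list(range(12, 21, 2)) + [25]

def checkerboard(n_checkers, px_per_checker):
    """Full-contrast (0/1) checkerboard; e.g. the smallest board is
    5x5 checkers of 1 px each."""
    idx = np.arange(n_checkers * px_per_checker) // px_per_checker
    return ((idx[:, None] + idx[None, :]) % 2).astype(float)

smallest = checkerboard(5, 1)  # the 5x5, 1-px board at the target center
```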

Fig. 10. (a) Captured image of the original 4K Briggs target pattern; (b) zoomed-in views corresponding to viewing angles of 10°, 30°, 50°, and 80°.

By applying the rendering method described in Sec. 6.1, a foveated image of the Briggs target mosaic shown in Fig. 10(a) was generated and displayed on the 4K monitor. Figure 11(a) shows the foveated image captured with a 16-mm camera lens to illustrate the overall effect. The zoomed-in images of the four marked targets, centered at the 10°, 30°, 50°, and 80° field angles, were then captured with a 50-mm camera lens oriented to emulate the corresponding eye gaze angles; the results are shown in Fig. 11(b). On each sub-image of Fig. 11(b), we objectively determined the just distinguishable checkers (JDC) from the captured zoomed-in image as the checkers at which the contrast of the light and dark checkers falls to approximately 20%. As a result, the just distinguishable checkers corresponding to the 10°, 30°, 50°, and 80° field angles are 1-pixel, 3-pixel, 12-pixel, and 18-pixel checkers, respectively. Magnified views of these JDCs are shown on the right side of each sub-image. As a comparison, using the captured zoomed-in images of the same areas with the original full-resolution 4K images, as shown in Fig. 10(b), we simulated the images perceived by a 20/20 standard observer with the eye rotated at angles of 10°, 30°, 30°, and 30°, respectively. The perceived images were simulated by convolving the corresponding images in Fig. 10(b) with the filter corresponding to the VA of the HVS defined by Eq. (7), following the same rendering method described in Sec. 6.1. The filter applied to the zoomed-in image for the 80° field angle assumes an eye gaze angle of 30°, rather than 80°, owing to the limits of frequent and comfortable eye rotation. Figure 11(c) shows the simulated perceived images corresponding to the same local areas as those in Fig. 11(b).
Similarly, we objectively determined the JDC for each of the targets from the perceived images; the resulting JDCs are 1-pixel, 1-pixel, 10-pixel, and 25-pixel checkers for the 10°, 30°, 50°, and 80° field angles, respectively. Magnified views of the corresponding JDCs are shown on the right side of each sub-image in Fig. 11(c). By comparing Figs. 11(b) and 11(c), we can conclude that the statically foveated rendering method yields image quality visually comparable to the perceived images rendered from the original full-resolution image, which is strong evidence that the quality degradation rendered by the proposed foveation method may be visually imperceptible.
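The ~20% criterion used for the JDC decisions above can be expressed as a simple Michelson-contrast check. The helpers below are a hedged sketch (our own naming, and using the patch extrema rather than per-checker means) of how such a decision could be automated.

```python
import numpy as np

def michelson_contrast(patch):
    """Michelson contrast of a captured checker patch (luminance values)."""
    lmax, lmin = float(np.max(patch)), float(np.min(patch))
    return (lmax - lmin) / (lmax + lmin) if (lmax + lmin) > 0 else 0.0

def is_distinguishable(patch, threshold=0.20):
    """JDC criterion: checkers count as resolved while the light-dark
    contrast stays at or above roughly 20%."""
    return michelson_contrast(patch) >= threshold
```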

Fig. 11. (a) Captured image of the foveated Briggs target pattern; (b) zoomed-in views corresponding to viewing angles of 10°, 30°, 50°, and 80°; (c) rendered images of the same local areas as perceived by a standard observer with the eye rotated at angles of 15°, 30°, 30°, and 30°.

Besides the objective assessment with a Briggs target mosaic, we applied the same procedure described above to a 4K-resolution image of a real scene. The captured foveated image of the entire scene is shown in Fig. 12(a), on which four sub-regions corresponding to the 10°, 30°, 50°, and 80° field angles are marked by red boxes. Figures 12(b) and 12(c) show the captured zoomed-in images of these regions with the original full-resolution image and the rendered foveated image displayed on the monitor, respectively. For the purpose of comparison, Figs. 12(d), (e), and (f) show the perceived images of the four sub-regions for a 20/20 standard observer when the eye is rotated by 0°, 15°, and 30°, respectively. These images are simulated from the captured zoomed-in full-resolution images shown in Fig. 12(b) by applying the VA filter defined by Eq. (7). It is evident that the statically foveated display in Fig. 12(c) yields image quality visually comparable or superior to the perceived images of a full-resolution display for most eye gaze angles, except that the perceived image of the 50° sub-region for the 30° eye gaze angle appears slightly sharper than the captured foveated image of the same region.

Fig. 12. (a) Captured image of a foveated scene with four marked regions of interest, corresponding to 10°, 30°, 50°, and 80° fields; (b) zoomed-in images of the four marked regions captured with the original full-resolution image displayed on the monitor; (c) zoomed-in images of the four marked regions captured with the statically foveated image displayed on the monitor; (d)-(f) perceived images of the four marked regions by a 20/20 standard observer, simulated from the captured zoomed-in full-resolution images for eye gaze angles of 0°, 15°, and 30°, respectively.

6.3 Quantitative assessment and validation

We further performed experiments to quantitatively evaluate the resulting image quality of the proposed foveation scheme. Two experiments were performed: the first to measure the resolution distribution function of the foveated display, validating the proposed foveation scheme characterized by Eq. (18), and the second to measure the perceived display resolution as a function of eye gaze angle. Both experiments utilized two sets of bar targets, one oriented vertically and one horizontally. Instead of rendering full-size foveated images, small full-resolution targets covering about 10° of field angle were utilized, from which the foveated images corresponding to selected field positions were rendered. Both experiments utilized the same camera with a 50-mm focal-length lens to capture zoomed-in views of the foveated targets rendered on the 4K monitor at various field angles. Examples of captured images corresponding to the 10°, 30°, 50°, and 80° field angles are shown in Fig. 9.

To evaluate the resolution distribution of the foveated display, we utilized the standard slanted-edge method to measure the MTFs of a test target. To minimize aliasing effects, we rendered horizontal or vertical bar targets on the monitor, rather than slanted edges, and rotated the monitor to create the slanted-edge effect. We sampled 22 fields along the diagonal direction of the display, ranging from 5° up to 80°, at an increment of 5° within the fovea and peripheral regions and of 2° within the parafovea region, to account for the different rates of resolution degradation. The foveated images corresponding to the sampled field positions were rendered from the small full-resolution bar targets by applying Gaussian-type convolution filters determined by the angular magnifications at the corresponding field positions.

At each of the sampled field positions, we displayed the corresponding foveated image of a bar target and captured a zoomed-in image with the test camera. To measure the inherent resolution distribution of the foveated display, which should be independent of the camera viewing angle, we ensured that the camera captured each image in a viewing direction perpendicular to the display and centered on the sampled field position. We repeated the process for both horizontal and vertical bars across all of the sampled fields, capturing a total of 46 images. The MTFs in the horizontal and vertical directions for each field position were analyzed from the captured slanted-edge images. These MTF measurements, which combine the sampling effects of the foveated display scheme and the test camera, need to be corrected by accounting for the MTF of the test camera, which was measured with a slanted-edge target. Furthermore, the MTF measurements were obtained in terms of the camera sensor sampling frequency. To recover the MTFs of the foveated display, we applied a sampling frequency conversion by determining the optical magnification between the camera and monitor pixels, thus converting the sampling frequencies in the sensor space to angular frequencies in the display space. Figure 13(a) plots the corrected MTF curves of the foveated display in the horizontal direction for 13 of the sampled fields. Results for the vertical direction are very similar. As expected, the MTF curves for the 5° and 10° fields remain high, reflecting the native resolution of the fovea region, while the MTF curves outside the fovea region drop as the field angle increases.
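The camera-MTF correction and the sensor-to-display frequency conversion described above can be sketched in a few lines. The function below is our own illustrative formulation; the division-based correction assumes the camera MTF stays well above zero over the band of interest.

```python
import numpy as np

def correct_display_mtf(freq_cyc_per_sensor_px, mtf_measured, mtf_camera,
                        deg_per_sensor_px):
    """Divide out the camera MTF from the measured (display x camera) MTF
    and convert sensor-space frequencies (cycles/sensor pixel) to angular
    frequencies in display space (cycles/degree)."""
    mtf_cam = np.clip(np.asarray(mtf_camera, float), 1e-6, None)
    mtf_display = np.clip(np.asarray(mtf_measured, float) / mtf_cam, 0.0, 1.0)
    freq_cyc_per_deg = np.asarray(freq_cyc_per_sensor_px, float) / deg_per_sensor_px
    return freq_cyc_per_deg, mtf_display
```

With the test camera's sampling of 0.25 arcminutes per pixel, a sensor frequency of 0.25 cycles/pixel maps to 60 cycles/degree in display space.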

Fig. 13. (a) MTF curves of the foveated display in the horizontal direction as a function of spatial frequency in cycles/degree for 13 different field angles; (b) the resolution distribution of the foveated display measured in the horizontal and vertical directions.

To recover the resolution distribution function from the corrected MTF curves, we used the MTF value of 0.89 at 60 cycles/degree, corresponding to the Nyquist frequency of the monitor and obtained from the 5° zoomed-in view, as the threshold for determining the limiting resolutions at the other field angles. The MTF value at 60 cycles/degree for the 5° and 10° fields reflects the inherent resolution limit of the monitor, and any MTF degradation below this threshold value is attributable to the foveation scheme. Figure 13(b) plots the resolution distributions of the foveated display measured in the horizontal and vertical directions, marked by * and Δ, respectively, which demonstrate excellent agreement with the theoretical foveation scheme described by Eq. (18).
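Recovering a limiting resolution from an MTF curve by thresholding can be sketched as below. The linear-interpolation helper is our own illustration, assuming a monotonically decreasing MTF curve sampled at increasing frequencies.

```python
import numpy as np

def limiting_frequency(freq_cyc_per_deg, mtf, threshold=0.89):
    """Highest angular frequency at which the MTF still meets the
    threshold, found by linear interpolation on a decreasing curve."""
    freq = np.asarray(freq_cyc_per_deg, float)
    mtf = np.asarray(mtf, float)
    if mtf[-1] >= threshold:             # curve never drops below threshold
        return float(freq[-1])
    i = int(np.argmax(mtf < threshold))  # first sub-threshold sample
    if i == 0:
        return float(freq[0])
    f0, f1 = freq[i - 1], freq[i]
    m0, m1 = mtf[i - 1], mtf[i]
    return float(f0 + (m0 - threshold) * (f1 - f0) / (m0 - m1))
```

Dividing the recovered limiting frequency by the 60 cycles/degree of the fovea region then gives the relative resolution plotted in Fig. 13(b).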

To quantitatively measure the perceived resolution of the display as a function of eye gaze angle, we adapted the procedure of the first experiment to account for two critical differences. Unlike the first experiment, which measures the inherent resolution distribution independent of viewing angle and of any observer or sensor limit, this experiment aims to measure the perceived resolution distribution as a function of eye gaze angle for a standard observer with 20/20 vision. To account for these differences, the measurement for each sampled field position needs to be repeated at different camera viewing directions with respect to a foveated target image, to mimic different gaze directions of a human eye. Furthermore, before MTF analysis, the captured images need to be convolved with a foveation filter mimicking the relative VA of a 20/20 standard observer, characterized by Eq. (7), to mimic the sampling effects of the human eye. Across the display, we sampled a total of 23 field positions, corresponding to the fields of the first experiment, and the corresponding foveated images were rendered in the same way. For each field position, we captured images at 8 different camera viewing angles, from 5° to 40° at a 5° increment. Instead of adjusting the camera orientation for each test, which could introduce errors, we adjusted the center position of the target image on the monitor to yield equivalent viewing angles from the camera's perspective. We repeated the process for both horizontal and vertical bars across all of the sampled fields and viewing directions, yielding a total of 368 images. After applying the convolution filters that simulate the sampling effects of a standard observer, the same data analysis as in the first experiment was applied to these images to obtain the perceived resolution distributions for the different viewing angles.
Figure 14 plots the perceived resolution distributions as a function of field angle for 4 different eye gaze angles, 10°, 20°, 30°, and 40°, marked by *, Δ, +, and O, respectively. For comparison, the figure also plots the theoretical resolution distribution of the foveation scheme in red dashed lines, along with the theoretical perceived resolution distributions obtained by applying Eq. (8) for the four eye gaze angles. The perceived resolution distributions measured experimentally match the theoretical results well.
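The theoretical perceived-resolution curves can be reproduced from the foveation scheme and the acuity model. The sketch below assumes a 1-D form of Eq. (7) with a Geisler-Perry-style linear eccentricity falloff (half-resolution constant e2 = 2.3) and the normalized scheme of Eq. (17); the function names are ours.

```python
import numpy as np

E2 = 2.3  # half-resolution eccentricity constant used in Eq. (7)

def va_hvs(theta_deg, gaze_deg):
    """Relative visual acuity of the HVS along one direction, per Eq. (7)."""
    return E2 / (E2 + np.abs(np.asarray(theta_deg, float) - gaze_deg))

def f_fd_norm(theta_deg):
    """Normalized three-segment foveation scheme of Eq. (17)."""
    t = np.abs(np.asarray(theta_deg, float))
    return np.where(t <= 10.0, 1.0,
                    np.where(t <= 30.0,
                             np.exp(-0.5 * ((t - 10.0) / 12.89) ** 2),
                             2.3 / (2.3 + (t - 24.63))))

def perceived_va(theta_deg, gaze_deg):
    """Perceived resolution of the foveated display, per Eq. (8): the
    display cannot show more than it renders, and the eye cannot see
    more than its acuity allows, hence the pointwise minimum."""
    return np.minimum(f_fd_norm(theta_deg), va_hvs(theta_deg, gaze_deg))
```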

Fig. 14. Quantitative assessment of the perceived resolution distributions as a function of field angles for four different viewing angles, 10°, 20°, 30°, and 40°.

7. Conclusion

A foveated display is a promising technique that can potentially address the inherent trade-off between field of view and resolution in conventional displays based on the well-established rectilinear sampling method. Although several prior works have attempted to apply a foveation method to the design of a head-mounted display system, the common method is a dynamic discrete foveation approach in which a foveated region offering high image resolution is dynamically steered in response to a user's gaze direction while a relatively low-resolution region provides peripheral awareness. In this paper, we explored a new perceptual-driven approach to the design of statically foveated HMDs with the goal of offering a wide FOV across which the degradation of the perceived image resolution may be imperceptible or minimal during the course of eye movement, and perceivable image artifacts and resolution discontinuities are minimized. In contrast to the multi-level discrete and dynamic foveation approach of the prior art, the static foveation approach not only maintains resolution continuity but also eliminates the need for an eyetracker or scanning mechanism, and therefore minimizes hardware complexity. More specifically, this paper detailed the perceptual-driven approach to optimizing the resolution distribution function of a statically foveated display, developed performance metrics to evaluate the perceived image quality and data sampling efficiency, demonstrated the process of applying the performance metrics to obtain optimal foveation schemes meeting the requirements of different applications, and experimentally demonstrated and validated the proposed foveation scheme using a bench prototype implemented with a 4K monitor. In future work, we will perform perception-based user studies to validate the perceived acceptance of this approach and iteratively optimize the foveation scheme for minimal perceptual artifacts.
In terms of implementing the proposed static foveation approach in an HMD system, it would be most desirable to use microdisplay panels offering a spatially varying pixel density matching the resolution distribution of a foveated scheme. Practically, however, given the ready availability of display panels with uniform pixel density, a statically foveated scheme can be achieved by carefully controlling the optical magnification of the viewing optics such that the virtual image viewed through the optics appears to have a spatially varying angular resolution distribution that closely matches a specific foveation scheme. We are in the process of designing and building such an HMD prototype system.
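The required magnification profile follows from Eq. (6), which relates the local optical power of the eyepiece to the foveation scheme and a uniform panel pixel pitch p0. The helper below is a sketch under that equation; the function name and the example pitch are our own illustrative choices.

```python
import math

def eyepiece_power(theta_deg, f_fd_theta, p0_mm):
    """Local eyepiece optical power (1/mm) per Eq. (6):
    Phi_EP = tan(1 / (60 * F_FD)) / (p0 * cos^2(theta)),
    where 1 / (60 * F_FD) is the angle (in degrees) that one panel
    pixel should subtend at field angle theta."""
    angle_deg = 1.0 / (60.0 * f_fd_theta)
    cos2 = math.cos(math.radians(theta_deg)) ** 2
    return math.tan(math.radians(angle_deg)) / (p0_mm * cos2)

# Example: at the display center (F_FD = 1) with a 10-um pixel pitch,
# the equivalent local focal length 1/Phi_EP comes out near 34 mm.
f_center_mm = 1.0 / eyepiece_power(0.0, 1.0, 0.01)
```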

Disclosures

Dr. Hong Hua has a disclosed financial interest in Magic Leap Inc. The terms of this arrangement have been properly disclosed to The University of Arizona and reviewed by the Institutional Review Committee in accordance with its conflict of interest policies.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.



Equations (18)

$$F_{FD}(\theta_x,\theta_y) = \begin{cases} f_1(\theta_x,\theta_y) & |\theta| \le \theta_{C1}\\ f_2(\theta_x,\theta_y) & \theta_{C1} < |\theta| \le \theta_{C2}\\ f_3(\theta_x,\theta_y) & \theta_{C2} < |\theta| \le \theta_{\max} \end{cases}, \tag{1}$$

$$f_1(|\theta_{C1}|) = f_2(|\theta_{C1}|),\quad f_1'(|\theta_{C1}|) = f_2'(|\theta_{C1}|) \quad \text{and} \quad f_2(|\theta_{C2}|) = f_3(|\theta_{C2}|),\quad f_2'(|\theta_{C2}|) = f_3'(|\theta_{C2}|). \tag{2}$$

$$p_{VD}(\theta_x,\theta_y) = \frac{L\tan\left(\tfrac{1}{60F_{FD}(\theta_x,\theta_y)}\right)}{\cos^2\theta}, \tag{3}$$

$$PPI_{VD}(\theta_x,\theta_y) = \frac{25.4\cos^2\theta}{L\tan\left(\tfrac{1}{60F_{FD}(\theta_x,\theta_y)}\right)}. \tag{4}$$

$$PPI_{MD}(\theta_x,\theta_y) = \frac{25.4\cos^2\theta}{f_{EP}\tan\left(\tfrac{1}{60F_{FD}(\theta_x,\theta_y)}\right)}. \tag{5}$$

$$\Phi_{EP}(\theta_x,\theta_y) = \frac{1}{f_{EP}(\theta_x,\theta_y)} = \frac{\tan\left(\tfrac{1}{60F_{FD}(\theta_x,\theta_y)}\right)}{p_0\cos^2\theta}. \tag{6}$$

$$VA_{HVS}(\theta_x,\theta_y,\phi_{Gx},\phi_{Gy}) = \frac{e_2}{e_2 + \sqrt{(\theta_x-\phi_{Gx})^2 + (\theta_y-\phi_{Gy})^2}}, \tag{7}$$

$$VA_{FD}(\theta_x,\theta_y,\phi_{Gx},\phi_{Gy}) = \min\left[F_{FD}(\theta_x,\theta_y),\ VA_{HVS}(\theta_x,\theta_y,\phi_{Gx},\phi_{Gy})\right]. \tag{8}$$

$$VA_{FD}|_{\max}(\phi_{Gx},\phi_{Gy}) = \max_{(\theta_x,\theta_y)}\left[VA_{FD}(\theta_x,\theta_y,\phi_{Gx},\phi_{Gy})\right]. \tag{9}$$

$$VR_{FD}(\phi_{Gx},\phi_{Gy}) = \frac{\int_{-\theta_{X\max}}^{\theta_{X\max}}\int_{-\theta_{Y\max}}^{\theta_{Y\max}} VA_{FD}^2(\theta_x,\theta_y,\phi_{Gx},\phi_{Gy})\, d\theta_x\, d\theta_y}{\int_{-\theta_{X\max}}^{\theta_{X\max}}\int_{-\theta_{Y\max}}^{\theta_{Y\max}} VA_{HVS}^2(\theta_x,\theta_y,\phi_{Gx},\phi_{Gy})\, d\theta_x\, d\theta_y}, \tag{10}$$

$$B = \int_{-\theta_{X\max}}^{\theta_{X\max}}\int_{-\theta_{Y\max}}^{\theta_{Y\max}} F_{FD}^2(\theta_x,\theta_y)\, d\theta_x\, d\theta_y. \tag{11}$$

$$S_{FD} = \frac{B_{SRD} - B_{FD}}{B_{SRD}}. \tag{12}$$

$$E = \frac{\int_{-\theta_{X\max}}^{\theta_{X\max}}\int_{-\theta_{Y\max}}^{\theta_{Y\max}} F_{HVS}^2(\theta_x,\theta_y)\, d\theta_x\, d\theta_y}{\int_{-\theta_{X\max}}^{\theta_{X\max}}\int_{-\theta_{Y\max}}^{\theta_{Y\max}} F_{FD}^2(\theta_x,\theta_y)\, d\theta_x\, d\theta_y} = \frac{B_{HVS}}{B_{FD}}. \tag{13}$$

$$F_{FD}(\theta) = \begin{cases} F_0 & |\theta| \le \theta_{C1}\\ F_0 e^{-\frac{(\theta-\theta_{C1})^2}{2\sigma^2}} & \theta_{C1} < |\theta| \le \theta_{C2}\\ \frac{2.3F_0}{2.3 + (\theta - \phi_{th})} + \Delta(\theta) & |\theta| > \theta_{C2} \end{cases}, \tag{14}$$

$$\sigma = \frac{\theta_{C2} - \theta_{C1}}{\sqrt{-2\ln\eta}}. \tag{15}$$

$$\phi_{th} = \theta_{C2} + 2.3\left(1 - \frac{1}{\eta}\right). \tag{16}$$

$$F_{FD}(\theta) = \begin{cases} 1 & |\theta| \le 10^\circ\\ e^{-\frac{1}{2}\left(\frac{\theta-10}{12.89}\right)^2} & 10^\circ < |\theta| \le 30^\circ\\ \frac{2.3}{2.3 + (\theta - 24.63)} & |\theta| > 30^\circ \end{cases}. \tag{17}$$

$$F_{FD}(\theta) = \begin{cases} 2 & |\theta| \le 10^\circ\\ 2e^{-\frac{1}{2}\left(\frac{\theta-10}{12.89}\right)^2} & 10^\circ < |\theta| \le 30^\circ\\ \frac{4.6}{2.3 + (\theta - 24.63)} & |\theta| > 30^\circ \end{cases}. \tag{18}$$