Optica Publishing Group

Compensation of color breaking in bi-focal depth-switchable integral floating augmented reality display with a geometrical phase lens

Open Access

Abstract

A bi-focal integral floating system using a geometrical phase (GP) lens can provide switchable integrated spaces with enhanced three-dimensional (3D) augmented reality (AR) depth expression. However, due to the chromatic aberration of the GP lens used for the switchable depth-floating 3D images, the floated 3D AR images in the red/green/blue (R/G/B) colors are formed at different depth locations with different magnifications, which causes color breaking. In this paper, we propose a novel technique to resolve the color-breaking problem by integrating the R/G/B elemental images with compensated depths and sizes, along with experiments demonstrating the improved results. When the color differences of the floated 3D AR images were evaluated based on CIEDE2000, the experimental results of the depth-switchable integral floating 3D AR images showed that color accuracy was greatly improved after applying a pre-compensation scheme to the R/G/B sub-images in both the concave and convex lens operation modes of the bi-focal switching GP floating lens.

© 2020 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

An optical projection system, widely used in various display applications, projects source images from the object plane to the focal plane of the projection lens. Among the various applications of optical projection systems, augmented reality (AR) displays offer a multitude of opportunities within the display industry, driven by the rapid growth of the head-mounted display (HMD) and head-up display (HUD) markets. Both devices were developed to prevent distractions and help the observer concentrate on tasks more effectively by overlaying virtual images or information of interest onto the real-world scene.

However, current state-of-the-art AR displays require further progress to reach their ideal optical features. Although HMD applications provide binocular disparity to induce a stereoscopic effect, and some of them come close to providing multiple depth planes, most autostereoscopic AR devices such as HUDs are still implemented at the level of two-dimensional (2D) image projection. Figure 1 compares an AR optical system projecting 2D AR images, three-dimensional (3D) volumetric AR (3D AR) images [1–4], and bi-focal depth-switchable 3D AR images [5]. As shown in Fig. 1(a), current AR optics for most commercial HUDs can only project 2D planar images at a single focal plane and cannot express depth information for 3D objects. Meanwhile, a 3D AR display as shown in Fig. 1(b) is expected to provide 3D AR images with volumetric and perspective information overlapped with 3D objects, as with current binocular HMDs. Nevertheless, the locations of the AR images are still restricted near the single focal plane imposed by the 3D optics. This limited depth expression remains a problem to be resolved. Alternatively, the bi-focal 3D AR display shown in Fig. 1(c) can provide more natural AR images with an improved expression of depth information. Thus, we expect that an optical device realizing a bi-focal 3D AR display may be a practical goal for optical projection systems.


Fig. 1. Comparison of augmented reality (AR) displays: (a) conventional 2D-image-based AR display, (b) 3D AR display with single focal plane, and (c) depth-enhanced 3D AR display with switchable bi-focal planes.


Recently, research on introducing geometric phase hologram (GPH) optics into AR modules has been intensively reported as an approach for reducing the volume of optic modules to produce a compact form factor [6–14]. When a thin film of a birefringent layer has half-wave retardation, the spatial wavefronts of transmitted light can be reliably modulated according to the optic-axis distribution of the birefringent layer [6]. By applying a polarization volume grating as the reflective image coupler in the waveguide structure used as the image combiner unit, the field-of-view conditions of the AR image can be effectively improved owing to the large diffraction angle of the GPH unit [7,8]. By utilizing the polarization-dependent diffractive behavior of the GPH, a stacked structure of geometric phase (GP) lenses can provide the multi-depth focusing conditions needed to improve the depth range of AR optics [9–13]. Despite the significant benefits described above, several issues with the GP lens remain to be solved for AR applications [10,11,14]. When a GP lens is used as the imaging or projection unit in an AR system covering the wideband visible range, the GP lens shows chromatic aberration: a longer wavelength of light sees a shorter focal length, since the diffractive operation of the GPH is wavelength-dependent [6,14].
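This wavelength dependence can be sketched with the standard diffractive-lens dispersion model, in which the product of focal length and wavelength is approximately constant. The design focal length below is a hypothetical value, not a parameter of our setup; the peak wavelengths are those of the display panel characterized in Section 3:

```python
# Sketch of the chromatic focal shift of a diffractive (GP) lens, assuming the
# standard diffractive-lens dispersion model f(lambda) * lambda ~ constant,
# so longer wavelengths are focused more strongly.

F_DESIGN_MM = 100.0        # hypothetical focal length at the design wavelength
LAMBDA_DESIGN_NM = 530.1   # green peak wavelength of the display panel

def gp_focal_length(lambda_nm, f_design=F_DESIGN_MM, lambda_design=LAMBDA_DESIGN_NM):
    """Focal length (mm) of a diffractive lens at wavelength lambda_nm."""
    return f_design * lambda_design / lambda_nm

f_R = gp_focal_length(632.6)   # red peak -> shortest focal length
f_G = gp_focal_length(530.1)   # design (green) wavelength -> F_DESIGN_MM
f_B = gp_focal_length(456.5)   # blue peak -> longest focal length
assert f_R < f_G < f_B
```

This ordering of the R/G/B focal lengths is what displaces the floated sub-images to different depths in the two imaging steps analyzed below.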

Resolving this problem is necessary to implement a full-color bi-focal projection system with a GP lens that provides natural 3D AR expression. Previous studies adopted multiple layers of color-dependent diffractive optical elements while also controlling the color-selective wavefront to adjust the red/green/blue (R/G/B) color images in AR systems using a GP lens [10,11]. However, the use of a compensation layer in these approaches was not suitable for a compact form factor, which is essential in HMD and HUD applications. Although a post-image-processing technique applied to the image capturing scheme also showed improved color expression, there still exists a restriction that unintended parallaxes between the R/G/B sub-images due to the separated depth planes cannot be compensated by post image processing alone [15,16]. Recently, by utilizing the opposite dispersion behaviors of the GP lens and a refractive lens, it has been successfully demonstrated that the chromatic aberration of the GP lens can be effectively compensated [17]. However, this aberration compensation is inevitably limited to a single operation mode of the GP lens, either the concave or the convex lens state, depending on the dispersion design of the additional optic layer adopted for the compensation.

In this paper, we propose a novel scheme to realize a compact time-sequential bi-focal AR display system that compensates the color dispersion of the GP lens without additional optical layers. Referring to Percival's analysis, a bi-focal AR display system is expected to be optimizable to cover a depth range of 0 to 1 D [18]. The color-breaking issue found in a depth-switchable floating full-color image was solved by controlling the size and depth of the R/G/B 3D AR sub-images independently, based on the principle of the integral floating display technique, instead of using an optical compensator. Although a bi-focal 3D integral floating system is presented as verification in this paper, the proposed scheme can also be applied to both 2D and 3D projection AR optics to enhance their color expression. In addition, we expect that the proposed technique can be adopted in an AR system providing more focal planes for improved depth expression by applying a stacked GP lens structure [12,13,19,20].

2. Principles

To address color breaking, analyzing its cause was the first and most important step. Figure 2 shows the basic principles of a time-sequential switching bi-focal integral floating system with two imaging steps using a GP lens [5]. In Fig. 2, the convex operation mode of the GP lens is depicted as an example. In the first step, a 3D image (the white die) is integrated from many elemental images at the plane of the display panel through the lens array. Each elemental image includes a different perspective of the original 3D data, and those perspectives are projected through the corresponding elemental lenses of the lens array. In this step, there is no color-breaking issue, since the focal length of the lens array, made of an optically isotropic material, has negligible wavelength dependency, as in a typical 3D integral imaging system.


Fig. 2. Conventional depth-switchable integral floating display with color-breaking issues caused by the chromatic aberration effects of the polarization-dependent-switching GP lens, where the R/G/B elemental images floated at different depth planes are depicted operating in the convex lens mode of the depth-floating GP lens, for example, between the bi-focal depth-switching states.


The second step is to project the integrated 3D image to the focal plane of the GP lens, operated in the convex mode in the example of Fig. 2. In this step, chromatic aberration arises from the wavelength-dependent diffractive behavior of the GP lens [6,21]. Thus, the different depths and magnifications of the floated R/G/B sub-images can cause color breaking, as shown in Fig. 2.

Based on the analysis above, accurately combining the floated R/G/B sub-images is essential for full-color image projection in the bi-focal temporally switching integral floating scheme to resolve the color-breaking problem. Since integral floating is a 3D display technology that realizes volumetric images at designated depths within a depth range, a pre-compensation technique without additional compensation optics can be adopted for depth-ranged volume images, unlike the AR display projecting 2D images shown in Fig. 1(a). The main idea is to integrate the R/G/B sub-images with pre-compensated depths and magnifications so that, despite the chromatic aberration of the GP lens, they are combined at the same depth plane with the same magnification after projection, as shown in Fig. 3. Thus, considering the basic principles above, integral floating can be a practical approach, since it can compensate the different depths and magnifications of the R/G/B sub-images independently.


Fig. 3. Principle of the proposed bi-focal depth-switchable integral floating AR display which applies the pre-compensation scheme for the R/G/B elemental images by solving the color-breaking problems.


To prove the proposed method, the pre-compensated depths and magnifications of the integrated R/G/B sub-images had to be calculated. An initial analysis of the color-dependent depth dislocations and magnification effects, taking into account the chromatic aberration behavior of the GP lens, is needed. The depths of the floated R/G/B sub-images (${b_R}$, ${b_G}$, and ${b_B}$, respectively) can be derived from the lens equation as shown below, assuming the GP lens selectively operates in the convex or concave mode with the focal lengths ${f_R}$, ${f_G}$, and ${f_B}$, where $a$ denotes the distance from the GP lens to the integrated 3D image as shown in Fig. 2:

$${b_R} = \frac{a f_R}{f_R - a} \tag{1}$$
$${b_G} = \frac{a f_G}{f_G - a} \tag{2}$$
$${b_B} = \frac{a f_B}{f_B - a} \tag{3}$$
Considering the above depth distortions, the pre-compensated (pre-distorted) depths (${a_{c\_R}}$, ${a_{c\_G}}$, and ${a_{c\_B}}$) of the integrated R/G/B sub-images can eliminate the color-breaking issue by matching the locations of the floated R/G/B sub-images to a common depth ${b_c} = {b_G}$, as shown in Fig. 3. The compensated depths can be derived as follows, and the results of these calculations are summarized in Table 1:
$${a_{c\_R}} = \frac{b_c f_R}{b_c + f_R} \tag{4}$$
$${a_{c\_G}} = \frac{b_c f_G}{b_c + f_G} \tag{5}$$
$${a_{c\_B}} = \frac{b_c f_B}{b_c + f_B} \tag{6}$$
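The lens-equation relations above can be sketched numerically. The focal lengths and object distance below are illustrative assumptions, not the values of Table 1; the round trip confirms that all pre-compensated depths float back to the common green depth:

```python
# Numerical sketch of the depth pre-compensation described above. Focal
# lengths and the object distance are hypothetical; the round trip verifies
# that the pre-compensated depths all float to the common depth b_c = b_G.

def floated_depth(a, f):
    """Lens equation: depth b of an image floated by a lens of focal length f."""
    return a * f / (f - a)

def compensated_depth(b_c, f):
    """Pre-compensated integration depth a_c so the sub-image floats to b_c."""
    return b_c * f / (b_c + f)

# A GP lens focuses longer wavelengths more strongly, so f_R < f_G < f_B.
f = {"R": 95.0, "G": 100.0, "B": 106.0}   # hypothetical focal lengths (mm)
a = 80.0   # distance from the GP lens to the integrated 3D image (mm)

b = {c: floated_depth(a, fc) for c, fc in f.items()}   # uncompensated depths
b_c = b["G"]                                           # target common depth

a_c = {c: compensated_depth(b_c, fc) for c, fc in f.items()}
for c in "RGB":
    # Each pre-compensated depth floats back to the common plane b_c.
    assert abs(floated_depth(a_c[c], f[c]) - b_c) < 1e-9
```

Algebraically, substituting $a_c = b_c f / (b_c + f)$ into $b = a_c f / (f - a_c)$ returns exactly $b_c$, which is what the assertion checks.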


Table 1. Experimental parameters of the proposed bi-focal depth-switchable integral floating 3D AR display.

The above analysis leads to the second step of our pre-compensation scheme, since the magnification compensation ($|{M_{c\_R}}|$, $|{M_{c\_G}}|$, and $|{M_{c\_B}}|$) for the R/G/B sub-images can be derived from the pre-distorted depths as follows:

$$|M_{c\_R}| = \left| \frac{a_{c\_R}}{a_{c\_G}} \right| \tag{7}$$
$$|M_{c\_G}| = \left| \frac{a_{c\_G}}{a_{c\_G}} \right| \tag{8}$$
$$|M_{c\_B}| = \left| \frac{a_{c\_B}}{a_{c\_G}} \right| \tag{9}$$
Using the analyses above, it is possible to resolve color breaking by controlling the depths and sizes of the floated R/G/B sub-images with an accuracy down to a single pixel pitch [22] of the display panel, as shown in Figs. 2 and 3.
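The size pre-compensation can likewise be sketched numerically, with the scaled sub-image widths quantized to whole pixels of the display panel. The focal lengths, target depth, and sub-image width below are illustrative assumptions, not the Table 1 values:

```python
# Numerical sketch of the size pre-compensation: each R/G/B sub-image is
# rescaled by |a_c / a_cG| relative to the green reference, then quantized to
# whole pixels of the display panel. All parameters here are hypothetical.

def compensated_depth(b_c, f):
    """Pre-compensated integration depth for a target floating depth b_c."""
    return b_c * f / (b_c + f)

f = {"R": 95.0, "G": 100.0, "B": 106.0}   # hypothetical focal lengths (mm)
b_c = 400.0                                # target common floating depth (mm)

a_c = {c: compensated_depth(b_c, fc) for c, fc in f.items()}
M_c = {c: abs(a_c[c] / a_c["G"]) for c in "RGB"}   # green reference: M_cG == 1

# Quantize the rescaled sub-image width to whole pixels of the display panel.
width_px = 300                             # hypothetical sub-image width
scaled_px = {c: round(width_px * M_c[c]) for c in "RGB"}
```

With green as the reference, the channel needing a shorter pre-compensated depth is shrunk and the one needing a longer depth is enlarged, so all three channels land with the same size after floating.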

3. Experimental results and discussion

To verify the proposed technique, a bi-focal depth-switchable integral floating 3D display using a GP lens with an optical combiner was set up to check whether the demonstration provided proper motion parallaxes as a 3D AR concept. For that purpose, two real objects, a pink die (the first object) and a yellow toy car (the second object), were located at 675 mm and 735 mm from the GP lens, respectively (Fig. 4). The proposed system was composed of a display panel with a pixel pitch of 31.5 µm, a lens array, an active polarization switching device (APSD), and a polarization-dependent focus-switching GP lens (Fig. 4). When the spectral properties of the display panel were characterized with a spectrometer (USB4000, Ocean Optics), the panel showed peak wavelengths of λG = 530.1 nm and λB = 456.5 nm for the green and blue colors, respectively. For red, the spectrum has dual peaks at λR1 = 615.0 nm and λR2 = 632.6 nm. The full-width-at-half-maximum spectral widths of these peaks are ΔλR1 = 5.3 nm, ΔλR2 = 6.9 nm, ΔλG = 40.9 nm, and ΔλB = 22.8 nm. The lens array consisted of 15 × 15 rectangular elemental lenses with a lens pitch of 5 mm and a focal length of 10 mm. The GP lens had a diameter of 25.4 mm and a polarization-dependent focusing function (convex and concave lens modes). To drive it, the APSD was built from stacked optics including a polarization-control liquid crystal unit and a broadband quarter waveplate. This unit switched the polarization state of the incident light between left-handed circular polarization (LHCP) and right-handed circular polarization (RHCP) in a time-sequential manner, synchronized with the display panel presenting the 3D AR images at each focal plane.
In addition, to prevent polarization distortion due to the birefringence of the lens array, a 45° linear polarizer was placed between the lens array and the APSD to precisely align the incident polarization state for the time-sequential polarization modulation of the APSD. The experimental parameters were derived from Eqs. (1) to (9) using the specifications listed in Table 1.


Fig. 4. Structure of the experimental setup and the locations of the 1st and the 2nd real objects, and the bi-focal depth-floating 3D AR images.


With the optical setup described above, the gap between the lens array and the display panel was set to 8.5 mm in order to project the 3D AR images at the depth locations of the two real objects (die and toy car) by switching the incident polarization state between LHCP and RHCP using the APSD unit. The first 3D AR image, a white die with the "3" face up, was located at the near depth position of the first object (the pink die) by the GP floating lens operating in the concave lens mode. The second 3D AR image, a white die with the "2" face up, was floated in the convex operation mode of the GP lens and projected near the second object (the yellow toy car). Unlike our previous demonstration of the bi-focal integral floating display using a GP lens [5], both 3D AR images were set to be white, a reference color containing all R/G/B channels, in order to check the chromatic aberration effects of the GP lens quantitatively after applying the proposed pre-compensation scheme for resolving the color-breaking problem.

Figure 5 compares the experimental results of the bi-focal depth-switchable integral floating 3D AR images without and with the pre-compensation scheme applied to the R/G/B elemental images. To evaluate the color-compensation effect as accurately as possible, the ambient luminance was set at a level low enough to identify the shapes of the real objects while preventing the influence of ambient light on the color properties of the captured 3D AR images. Thus, although the transmittance of the optical combiner used in the experiment is nearly 50%, the real objects in Fig. 5 can look dimmer than the AR images; this is only a consequence of the experimental conditions and will not be problematic in real applications under brighter ambient light. With this setup, the first 3D AR image was projected in the concave mode at the location of the first object (the pink die), so the first 3D AR image and the first object were expected to be captured together with the same motion parallax. The second 3D AR image was floated in the convex mode near the second object (the yellow toy car) and was likewise expected to show the same motion parallax. The experimental results confirmed that both 3D AR images, projected at different depth positions, provided the proper motion parallaxes corresponding to their respective objects (Fig. 5). The enlarged views of the 3D AR images from the leftmost and rightmost perspectives, obtained in both the concave and convex modes, show that the 3D AR images also have motion parallax of their own, demonstrating proper 3D depth projection. These results show that the bi-focal integral floating display works as designed. It was also clearly observed that the AR images obtained in the concave and convex GP lens modes with the proposed compensation scheme show significantly improved color accuracy, resolving the color-breaking problem.


Fig. 5. Comparison of experimental results of the bi-focal depth-switchable integral floating 3D AR images with/without the pre-compensation scheme for the R/G/B elemental images to resolve the color-breaking issues induced by the chromatic aberration of the depth-floating GP lens.


Due to the finite aperture of the GP lens used as the floating lens in our experiment, the viewing angles of the concave and convex modes differed from each other. Although the depth distributions of the R/G/B sub-images were pre-compensated, a small amount of mismatch, less than a single pixel, can be seen in the concave mode. This is due to the larger viewing angle of the concave mode, which is used to float AR images at the closer position. Based on the derived parameters listed in Table 1, the size ratio of the floated R/G/B sub-images without compensation is expected to be 0.94:1:1.06 (R:G:B) for the concave mode and 1.21:1:0.85 for the convex mode. Therefore, before applying the proposed pre-compensation scheme, color breaking of the floated image is more pronounced in the convex mode than in the concave mode due to the larger deviation in the size ratio, despite the smaller viewing angle, as shown in Fig. 5(b). In the results of Fig. 5, stray light was observed around the 3D AR images due to light scattering and reflection from the surfaces of the multi-layered optical components in our optical system. However, we expect that these image degradation issues can be minimized in commercial applications by optimizing the enclosure optic design and applying anti-reflection coatings to the surfaces of the optical elements.
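The sense of the uncompensated size ratios quoted above can be sketched from the lens equation. The focal lengths below are hypothetical stand-ins for the Table 1 values; the sign of the focal length distinguishes the convex and concave modes:

```python
# Sketch of how uncompensated R:G:B size ratios arise. A convex (f > 0) or
# concave (f < 0) GP lens floats the integrated image at b = a f / (f - a)
# with lateral magnification |b / a|, so the size ratio relative to green is
# |b / b_G|. Focal lengths are hypothetical, not the paper's Table 1 values.

def float_mag(a, f):
    """Absolute lateral magnification of the floated image."""
    b = a * f / (f - a)
    return abs(b / a)

a = 80.0                                            # object distance (mm)
f_convex = {"R": 95.0, "G": 100.0, "B": 106.0}      # hypothetical convex f
f_concave = {c: -fc for c, fc in f_convex.items()}  # concave mode flips sign

for mode, f in (("concave", f_concave), ("convex", f_convex)):
    m = {c: float_mag(a, f[c]) for c in "RGB"}
    ratio = {c: m[c] / m["G"] for c in "RGB"}
    # Concave: R < 1 < B (blue enlarged); convex: R > 1 > B (red enlarged),
    # matching the sense of the measured 0.94:1:1.06 and 1.21:1:0.85 ratios.
    if mode == "concave":
        assert ratio["R"] < 1.0 < ratio["B"]
    else:
        assert ratio["R"] > 1.0 > ratio["B"]
```

The sketch also reproduces why the convex-mode deviation is larger: with the object inside the focal range of the convex lens, the magnification varies much more steeply with focal length than in the virtual-image concave case.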

As the next verification, the effects of the proposed pre-compensation scheme for the R/G/B elemental image sets on resolving the color-breaking problem were quantitatively analyzed and compared. The image analysis was performed by partially cropping the projected 3D AR images at their vertical edges, where the color breaking of our white die images was most clearly observed, as shown in Fig. 5(b). For the near-floating images obtained in the concave mode of the GP lens, the right edge of the object image was cropped at a viewing angle of 23° from the right (Fig. 6). Similarly, the far-floating AR image sets obtained in the convex mode of the GP lens were cropped at 8.5° from the right. In Fig. 6, the enlarged original white images obtained without the pre-compensation scheme show severe chromatic aberration at the image edge; the problem is more pronounced in the convex operation mode of the GP lens, together with stronger color breaking. For both operation modes, applying the pre-compensation scheme to the R/G/B elemental image sets effectively resolved these problems (Fig. 6).


Fig. 6. Experimental results showing the white-colored 3D AR images before/after applying the pre-compensation scheme for the R/G/B elemental images, where the color aberration properties are analyzed by separating the white images into red, green, and blue channels, with the digital values of each color channel sampled along the horizontal half line (120 pixels).


A quantitative, comparative analysis is presented in depth by separating the white-colored images into the R/G/B channels, as shown in Fig. 6. For this analysis, the digital color values were obtained along the horizontal sampling line (120 pixels) of the R/G/B channel images (Fig. 6). An ideal image without any color breaking would have the same R, G, and B values at each pixel, since achromatic images (R = G = B) were used in the 3D AR demonstration. Before applying the pre-compensation scheme in the conventional bi-focal depth-switchable integral floating 3D display using the GP lens, the depth positions of the floated R/G/B sub-images were formed differently, according to Table 1, with the relative relationships ${b_R}$ > ${b_G}$ > ${b_B}$ (as in Fig. 2) and ${b_R}$ < ${b_G}$ < ${b_B}$ for the convex and concave operation modes of the GP lens, respectively. In addition, since the blue and red shifts are dominant in the concave and convex modes, respectively, the magnification levels required for the pre-compensation of the R/G/B sub-images were higher in the red channel in the concave mode and higher in the blue channel in the convex mode of the GP lens, as described in Table 1.

The images separated into the R/G/B color channels and the quantitative profiles of the R/G/B digital values matched the analysis based on the derived equations well. Before applying the pre-compensation scheme, a reversal of the distortion profiles of the red and blue colors with respect to the green color was also observed when changing the operation mode of the GP lens. In addition, the color breaking was observed more severely in the blue and red images obtained in the concave and convex modes, respectively, due to the different magnifications of the GP lens. For both the concave and convex operation modes of the GP lens, the quantitative profiles of the R/G/B digital values from the image sets separated into the R/G/B color channels (Fig. 6) matched each other more closely after applying the proposed pre-compensation scheme. This indicates that the proposed method resolves or improves color breaking by minimizing the differences between the R/G/B channels in the projected 3D AR images despite the inherent color aberration of the GP lens used for the time-sequential bi-focal depth switching. Note that the color breaking shown in Figs. 5 and 6 without the proposed pre-compensation scheme is serious because of the large discrepancy in the focal lengths and magnifications of the wavelength-dependent GP lens, even though the gap between the two depth planes of the AR images implemented in our experiment was not large. When a stacked GP lens structure is used for an extended depth range in AR image expression [12,13,19,20], the color-breaking problem can be expected to be much more severe. Regardless of the number of stacked GP lenses, the presented compensation scheme can be simply and effectively extended to multi-depth integral floating 3D AR optics to resolve the color-breaking issues.

To quantify the improvement of this method, the color difference was compared with the ideal state, i.e., the condition without any color breaking. Color differences were obtained using CIEDE2000 [23], a standard color difference formula that accurately reflects human color perception and is widely used in practice [24–27]. The reference data for evaluating the color difference from the ideal state was obtained by averaging the R, G, and B digital values along the horizontal sampling line for each pixel (the method explained above) and replacing all three values with their average (R = G = B = average value). This procedure was repeated to obtain ideal-state data for all four image sets captured in the concave and convex lens modes before and after pre-compensation. Three conversion steps were needed to calculate the CIEDE2000 color differences from the RGB data of each image to see the effect of the color improvement before and after pre-compensation. In the first step, all R, G, and B data sets were assumed to be sRGB and were converted into the CIEXYZ color space [24]; this step converts the device-dependent RGB values to device-independent XYZ values. Then, values in the uniform color space CIELAB [25] were obtained from the CIEXYZ values under the standard conditions. Finally, the color difference between each image data set and the corresponding ideal data set was calculated following CIEDE2000.
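The first two conversion steps of this pipeline can be sketched with the standard sRGB and CIELAB formulas; the CIEDE2000 formula itself is considerably longer and is omitted here (libraries such as colour-science or scikit-image provide implementations):

```python
# Sketch of the sRGB -> CIEXYZ (D65) -> CIELAB conversion steps used before
# computing CIEDE2000. Standard published formulas; the CIEDE2000 difference
# formula itself is omitted for brevity.

def srgb_to_linear(c):
    """Inverse sRGB companding of one channel value in [0, 1]."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

def srgb_to_xyz(r8, g8, b8):
    """8-bit sRGB to CIEXYZ under the D65 white point, Y scaled to 100."""
    r, g, b = (srgb_to_linear(v / 255.0) for v in (r8, g8, b8))
    x = 0.4124 * r + 0.3576 * g + 0.1805 * b
    y = 0.2126 * r + 0.7152 * g + 0.0722 * b
    z = 0.0193 * r + 0.1192 * g + 0.9505 * b
    return 100 * x, 100 * y, 100 * z

def xyz_to_lab(x, y, z, white=(95.047, 100.0, 108.883)):
    """CIEXYZ to CIELAB under the D65 standard illuminant."""
    def f(t):
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29
    fx, fy, fz = (f(v / w) for v, w in zip((x, y, z), white))
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)

# An achromatic pixel (R = G = B), the ideal reference used in our analysis,
# maps to a* ~ 0 and b* ~ 0 in CIELAB.
L, a_star, b_star = xyz_to_lab(*srgb_to_xyz(128, 128, 128))
assert abs(a_star) < 0.5 and abs(b_star) < 0.5
```

Because the ideal reference is achromatic, its CIELAB chroma is essentially zero, so the CIEDE2000 value at each pixel directly measures how far the measured pixel deviates from neutral.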

Figure 7 shows the CIEDE2000 evaluation results along the horizontal line for the 3D AR images obtained in the concave and convex modes of the GP lens used for the bi-focal depth-floating images. The blue lines show the color differences from the ideal data before applying our pre-compensation scheme, and the red lines show those after applying it. The results clearly demonstrate that the color difference decreases after applying the pre-compensation scheme, confirming the reduction of color breaking shown in Fig. 5. Between the two operation modes of the GP lens, the improvement in the spatial color difference is more pronounced in the convex mode used for floating the 3D AR images at the far depth plane, as expected from the discussion of the experimental results (Fig. 6). The color-dependent depth and magnification distortions of the R/G/B sub-images are more serious in the convex lens mode in our optical setup, highlighting the agreement between the experimental evaluations and the theoretical expectations.


Fig. 7. Color-breaking analysis of the bi-focal depth-floating 3D AR images (the white dice floated at the near and far depth planes), operated in the (a) concave and (b) convex lens modes of the depth-floating GP lens, based on the CIEDE2000 color difference along the horizontal sampling line compared with the no-color-breaking (R = G = B) condition as an ideal color reference.


The color difference is also an indicator of how visually different the target image is from the ideal image. Although there is no absolute scale for color differences, permissible categories have been suggested in prior studies [26,27]. As those studies show, the color difference tolerance category depends on the resolution of the image, the luminance histogram, and other factors, so it can vary with the content. Therefore, the results of this paper were analyzed using a seven-step subjective assessment metric based on the CIEDE2000 color difference, which was established using various images [26]. In that study, the perception of a color difference was rated 'Noticeable' for CIEDE2000 values between 1.5 and 3.0 and 'Appreciable' between 3.0 and 6.0. When the color difference values exceeding 3.0 (the 'Noticeable' level) were summed for each of the four image sets, the summation was 941 before and 650 after pre-compensation in the concave lens mode (the 3D AR images floated at the near depth plane). In the convex lens mode (the 3D AR images floated at the far depth plane), the summations before and after pre-compensation were 1495 and 330, respectively. With the 'Noticeable' criterion applied, the color difference levels were thus reduced to 69.1% and 22.0% of their uncompensated values for the concave and convex modes of the depth-switchable GP lens, respectively.

In the same way, when the criterion of color differences exceeding 6.0 (the 'Appreciable' level) was applied, the color difference summations for the concave operation mode were 800 before and 460 after applying the pre-compensation scheme. In the convex operation mode, the summation was a relatively high 1365 for the images obtained without pre-compensation. However, under the 'Appreciable' criterion, all CIEDE2000 values of the sampled pixels obtained after applying the pre-compensation scheme were below 6, resulting in a color difference summation of 0. Under both color difference criteria, the suggested pre-compensation method was effective at reducing color breaking. In addition, we can also confirm that the pre-compensation scheme (in terms of the reduction of the CIEDE2000 values) is more effective for the convex mode, since the images captured at smaller viewing angles have less misalignment when synthesizing the floating images from the R/G/B sub-images.
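The thresholded summation used in this evaluation reduces to a simple filter-and-sum over the per-pixel CIEDE2000 profile; the profile values below are hypothetical placeholders for the measured data of Fig. 7:

```python
# Sketch of the thresholded color-difference summation used in the evaluation.
# The per-pixel CIEDE2000 values here are hypothetical, standing in for the
# measured sampling-line profiles of Fig. 7.

def breaking_score(de2000_line, threshold):
    """Sum of the CIEDE2000 values that exceed a perception threshold."""
    return sum(de for de in de2000_line if de > threshold)

before = [0.8, 2.1, 4.5, 7.2, 9.0, 3.3]   # hypothetical uncompensated profile
after = [0.5, 1.2, 2.0, 2.8, 3.1, 1.9]    # hypothetical compensated profile

NOTICEABLE, APPRECIABLE = 3.0, 6.0        # category boundaries from Ref. [26]

noticeable_before = breaking_score(before, NOTICEABLE)   # 4.5+7.2+9.0+3.3 = 24.0
noticeable_after = breaking_score(after, NOTICEABLE)     # only 3.1 exceeds 3.0
appreciable_after = breaking_score(after, APPRECIABLE)   # no value exceeds 6 -> 0
```

As in the experimental results, a compensated profile whose values all stay below 6.0 yields a summation of exactly zero under the 'Appreciable' criterion.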

4. Conclusion

A bi-focal depth-switchable display using a GP lens is expected to be a promising approach to improve the depth expression of 3D AR images. However, the development of a commercial product is hindered by the wavelength dependency of the GP lens, which seriously degrades color expression in these systems. In this paper, we proposed a novel pre-compensation scheme for the R/G/B elemental image sets based on the integral floating method to resolve color breaking without additional compensation optics, and discussed the color difference evaluation results based on CIEDE2000. The results showed that the proposed technique provides a significant improvement in color accuracy in both the concave and convex modes of time-sequential operation when a GP lens is used as the depth-switching floating lens. Considering the analyses on the zone of comfort, we expect that this approach can help advance the development of realistic 3D AR applications with enhanced depth and color expression by providing multi-depth planes with a stacked GP lens structure.

Acknowledgment

This work was supported by Institute of Information & Communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) No.2020-0-00924: Technology development of authoring tool for 3D holographic printing contents, No.2020-0-00867: Display technology development for augmented reality devices with outdoor visibility.

Disclosures

The authors declare no conflicts of interest.

References

1. S.-W. Min, M. Hahn, J. Kim, and B. Lee, “Three-dimensional electro-floating display system using an integral imaging method,” Opt. Express 13(12), 4358–4369 (2005). [CrossRef]  

2. J. Hong, S.-W. Min, and B. Lee, “Integral floating display systems for augmented reality,” Appl. Opt. 51(18), 4201–4209 (2012). [CrossRef]  

3. Y. Takaki and Y. Yamaguchi, “Flat-panel type see-through three-dimensional display based on integral imaging,” Opt. Lett. 40(8), 1873–1876 (2015). [CrossRef]  

4. Y. Yamaguchi and Y. Takaki, “See-through integral imaging display with background occlusion capability,” Appl. Opt. 55(3), A144–A149 (2016). [CrossRef]  

5. M. Park, K.-I. Joo, H.-R. Kim, and H.-J. Choi, “An augmented-reality device with switchable integrated spaces using a bi-focal integral floating display,” IEEE Photonics J. 11(4), 1–8 (2019). [CrossRef]  

6. J. Kim, Y. Li, M. N. Miskiewicz, C. Oh, M. W. Kudenov, and M. J. Escuti, “Fabrication of ideal geometric-phase holograms with arbitrary wavefronts,” Optica 2(11), 958–964 (2015). [CrossRef]  

7. Y. Weng, D. Xu, Y. Zhang, X. Li, and S.-T. Wu, “Polarization volume grating with high efficiency and large diffraction angle,” Opt. Express 24(16), 17746–17759 (2016). [CrossRef]

8. Y.-H. Lee, K. Yin, and S.-T. Wu, “Reflective polarization volume gratings for high efficiency waveguide-coupling augmented reality displays,” Opt. Express 25(22), 27008–27014 (2017). [CrossRef]  

9. G. Yoo, K. Bang, C. Jang, D. Kim, C.-K. Lee, G. S. Lee, H.-S. Lee, and B. Lee, “Dual-focal waveguide see-through near-eye display with polarization-dependent lenses,” Opt. Lett. 44(8), 1920–1923 (2019). [CrossRef]  

10. S. Moon, C.-K. Lee, S.-W. Nam, C. Jang, G.-Y. Lee, W. Seo, G. Sung, H.-S. Lee, and B. Lee, “Augmented reality near-eye display using Pancharatnam-Berry phase lenses,” Sci. Rep. 9(1), 1–10 (2019). [CrossRef]  

11. S. Moon, S.-W. Nam, Y. Jeong, C.-K. Lee, H.-S. Lee, and B. Lee, “Compact augmented reality combiner using Pancharatnam-Berry phase lens,” IEEE Photonics Technol. Lett. 32(5), 235–238 (2020). [CrossRef]  

12. T. Zhan, Y.-H. Lee, and S.-T. Wu, “High-resolution additive light field near-eye display by switchable Pancharatnam–Berry phase lenses,” Opt. Express 26(4), 4863–4872 (2018). [CrossRef]  

13. S. Li, Y. Liu, Y. Li, S. Liu, S. Chen, and Y. Su, “Fast-response Pancharatnam-Berry phase optical elements based on polymer-stabilized liquid crystal,” Opt. Express 27(16), 22522–22531 (2019). [CrossRef]  

14. C. Yousefzadeh, A. Jamali, C. McGinty, and P. J. Bos, “Limits of Pancharatnam phase lens for 3D/VR/AR applications,” SID Symp. Digest. Tech. Pap. 50(1), 616–619 (2019). [CrossRef]  

15. A. Nikonorov, R. Skidanov, V. Fursov, M. Petrov, S. Bibikov, and Y. Yuzifovich, “Fresnel lens imaging with post-capture image processing,” IEEE Comput. Soc. Conf. Comput. Vis. Pattern Recognit. Work., 33–41 (2015).

16. M. Petrov, S. Bibikov, Y. Yuzifovich, R. Skidanov, and A. Nikonorov, “Color correction with 3D lookup tables in diffractive optical imaging systems,” Procedia Eng. 201, 73–82 (2017). [CrossRef]  

17. T. Zhan, J. Zou, J. Xiong, X. Liu, H. Chen, J. Yang, S. Liu, Y. Dong, and S.-T. Wu, “Practical Chromatic Aberration Correction in Virtual Reality Displays Enabled by Cost-Effective Ultra-Broadband Liquid Crystal Polymer Lenses,” Adv. Opt. Mater. 8(2), 1901360–5 (2020). [CrossRef]  

18. T. Shibata, J. Kim, D. M. Hoffman, and M. S. Banks, “The zone of comfort: Predicting visual discomfort with stereo displays,” J. Vision 11(7), 1 (2011). [CrossRef]

19. T. Zhan, Y.-H. Lee, G. Tan, J. Xiong, K. Yin, F. Gou, J. Zou, N. Zhang, D. Zhao, J. Yang, S. Liu, and S.-T. Wu, “Pancharatnam–Berry optical elements for head-up and near-eye displays,” J. Opt. Soc. Am. B 36(5), D52–56 (2019). [CrossRef]  

20. T. Zhan, J. Xiong, J. Zou, and S.-T. Wu, “Multifocal displays: review and prospect,” PhotoniX 1(1), 10 (2020). [CrossRef]  

21. C. Oh and M. J. Escuti, “Achromatic diffraction from polarization gratings with high efficiency,” Opt. Lett. 33(20), 2287–2289 (2008). [CrossRef]

22. S.-W. Min, J. Kim, and B. Lee, “New characteristic equation of three-dimensional integral imaging system and its applications,” Jpn. J. Appl. Phys. 44(2), L71–L74 (2005). [CrossRef]  

23. M. R. Luo, G. Cui, and B. Rigg, “The development of the CIE 2000 colour-difference formula: CIEDE2000,” Color Res. Appl. 26(5), 340–350 (2001). [CrossRef]  

24. International Electrotechnical Commission, “Multimedia systems and equipment-colour measurement and management-Part 2-1: Colour management-default RGB colour space-sRGB,” IEC 61966-2-1-16 (1999).

25. M. Mahy, L. Van Eycken, and A. Oosterlinck, “Evaluation of uniform color spaces developed after the adoption of CIELAB and CIELUV,” Color Res. Appl. 19(2), 105–121 (1994). [CrossRef]  

26. Y. Yang, J. Ming, and N. Yu, “Color image quality assessment based on CIEDE2000,” Adv. Multimedia 2012, 1–6 (2012). [CrossRef]  

27. D. Rodríguez-Esparragón, J. Marcello, C. Gonzalo-Martín, Á García-Pedrero, and F. Eugenio, “Assessment of the spectral quality of fused images using the CIEDE2000 distance,” Computing 100(11), 1175–1188 (2018). [CrossRef]  

Figures (7)

Fig. 1. Comparison of augmented reality (AR) displays: (a) conventional 2D-image-based AR display, (b) 3D AR display with single focal plane, and (c) depth-enhanced 3D AR display with switchable bi-focal planes.
Fig. 2. Conventional depth-switchable integral floating display with color-breaking issues caused by the chromatic aberration effects of the polarization-dependent-switching GP lens, where the R/G/B elemental images floated at different depth planes are depicted operating in the convex lens mode of the depth-floating GP lens, for example, between the bi-focal depth-switching states.
Fig. 3. Principle of the proposed bi-focal depth-switchable integral floating AR display which applies the pre-compensation scheme for the R/G/B elemental images by solving the color-breaking problems.
Fig. 4. Structure of the experimental setup and the locations of the 1st and the 2nd real objects, and the bi-focal depth-floating 3D AR images.
Fig. 5. Comparison of experimental results of the bi-focal depth-switchable integral floating 3D AR images with/without the pre-compensation scheme for the R/G/B elemental images to resolve the color-breaking issues induced by the chromatic aberration of the depth-floating GP lens.
Fig. 6. Experimental results showing the white-colored 3D AR images before/after applying the pre-compensation scheme for the R/G/B elemental images, where the color aberration properties are analyzed by separating the white images into red, green, and blue channels with the digital values of each color channel for the sampling lines (120 pixels) depicted by the horizontal half line.
Fig. 7. Color-breaking analysis on the bi-focal depth-floating 3D AR images (the white dice floated at the near and far depth planes), operated at the (a) concave and (b) convex lens modes of the depth-floating GP lens, based on the CIEDE2000 color difference along the horizontal sampling line compared with the no-color-breaking (R = G = B) condition as an ideal color reference.

Tables (1)

Table 1. Experimental parameters of the proposed bi-focal depth-switchable integral floating 3D AR display.

Equations (9)


\( b_R = \dfrac{a f_R}{f_R - a} \)
\( b_G = \dfrac{a f_G}{f_G - a} \)
\( b_B = \dfrac{a f_B}{f_B - a} \)
\( a_{c\_R} = \dfrac{b_c f_R}{b_c + f_R} \)
\( a_{c\_G} = \dfrac{b_c f_G}{b_c + f_G} \)
\( a_{c\_B} = \dfrac{b_c f_B}{b_c + f_B} \)
\( |M_{c\_R}| = \left| \dfrac{a_{c\_R}}{a_{c\_G}} \right| \)
\( |M_{c\_G}| = \left| \dfrac{a_{c\_G}}{a_{c\_G}} \right| \)
\( |M_{c\_B}| = \left| \dfrac{a_{c\_B}}{a_{c\_G}} \right| \)
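The relations above can be evaluated numerically to see how the per-color floating depths and pre-compensated object distances behave. The following sketch uses hypothetical focal lengths and object distance (not the experimental parameters of Table 1) and normalizes the magnifications to the green channel, as in the equations.

```python
def floating_depth(a, f):
    # b = a*f / (f - a): depth of the image floated by a lens of
    # focal length f from an object at distance a
    return a * f / (f - a)

def compensated_object_distance(b_c, f):
    # a_c = b_c*f / (b_c + f): pre-compensated object distance that
    # maps a color channel back to the common floating depth b_c
    return b_c * f / (b_c + f)

# Hypothetical per-color focal lengths (mm) and object distance (mm)
f_R, f_G, f_B = 105.0, 100.0, 95.0
a = 80.0

b_c = floating_depth(a, f_G)  # common target depth, set by the green channel
a_c = {c: compensated_object_distance(b_c, f)
       for c, f in (("R", f_R), ("G", f_G), ("B", f_B))}
# Relative magnifications, normalized to the green channel
M = {c: abs(a_c[c] / a_c["G"]) for c in a_c}
```

By construction, the green channel is its own reference, so \( |M_{c\_G}| = 1 \), and the compensated green object distance recovers the original distance \( a \).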