
Aberration analysis of a projection-type CGH display with an expanded FOV based on the HOE screen

Open Access

Abstract

This paper proposes a holographic optical element (HOE) as a see-through screen for a computer-generated hologram (CGH) projection system that displays 3D images. The proposed holographic screen consists of a linear grating and a lens phase. The linear grating redirects the information light into the observer's eye and provides the see-through function, while the lens phase magnifies the field of view (FOV) of the holographic projection system. The aberration caused by the screen is analyzed, and it can be pre-corrected in the hologram calculation algorithm. Finally, the proposed system achieves a 20.3° by 14.3° field of view with a 532 nm laser and a spatial light modulator with 6.4 µm pixels.

© 2022 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

Three-dimensional (3D) displays are an attractive and essential technology for the display market owing to the increasing demand for ultra-realistic displays. The primary purpose of 3D displays is to offer images with depth information to observers and to generate a visual perception similar to natural human vision. 3D displays without depth information are prone to causing the vergence–accommodation conflict (VAC) [1]. Among current 3D display techniques, holographic technology is a favorable method for achieving natural 3D images via wavefront reconstruction. In order to achieve a dynamic holographic display, the electric field distribution of the information light must be predicted by numerical calculation and reproduced by a spatial light modulator (SLM) [2,3]. SLMs spatially modulate the amplitude or phase distribution of the light source. However, an SLM that modulates both amplitude and phase simultaneously is difficult to realize. In most cases, a phase-only SLM is chosen because of its higher efficiency.

In order to predict the electric field, an appropriate computer-generated hologram (CGH) technique is necessary [4]. There are two fundamental algorithms: the point-based method and the convolution propagation method. The former calculates the superposition of spherical point waves on the target image. This method is simple and intuitive, but it requires a longer operation time, which depends on the number of points in the target image. The advantage of the latter is that the field of a 2D image can be calculated in a single operation. Generally, the fast Fourier transform (FFT) is employed to reduce the operation time of the convolution propagation, and the calculation speed depends on the number of pixels in the calculation space. Other algorithms, such as iterative algorithms and the angular spectrum method, have been developed on this basis for different goals. The iterative algorithm is used to improve the image quality and compensate for the degradation caused by eliminating the amplitude in a phase-only SLM system [5–7]. The angular spectrum method can be used to generate a tilted 2D plane [8–10]. On the other hand, the iterative Fourier transform algorithm (IFTA) has also been proposed to generate images located at infinity [11,12]. Generally, an additional lens phase is used to modulate the image distance. In this method, the target image is located in the Fourier domain, and the position of each point in the target image is expressed in angular units. A generic sketch of this IFTA approach is given below.
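For illustration, the following is a minimal Gerchberg-Saxton-style Fourier-domain IFTA sketch in Python. It is not the authors' implementation; the function name, parameters, and iteration count are our own assumptions, and the target amplitude is assumed to be defined in the Fourier (far-field) domain as described above.

```python
import numpy as np

def ifta_phase_hologram(target_amplitude, iterations=50, seed=0):
    """Generic Fourier-domain IFTA (Gerchberg-Saxton-style) sketch.

    target_amplitude: 2D array of the desired image amplitude in the Fourier domain.
    Returns a phase-only hologram (radians) for the SLM plane.
    """
    rng = np.random.default_rng(seed)
    # Start from the target amplitude with a random phase
    image_field = target_amplitude * np.exp(1j * rng.uniform(0, 2 * np.pi, target_amplitude.shape))
    for _ in range(iterations):
        # Back-propagate to the SLM plane and keep only the phase (phase-only constraint)
        slm_phase = np.angle(np.fft.ifft2(np.fft.ifftshift(image_field)))
        # Forward-propagate the phase-only field and re-impose the target amplitude
        image_field = np.fft.fftshift(np.fft.fft2(np.exp(1j * slm_phase)))
        image_field = target_amplitude * np.exp(1j * np.angle(image_field))
    return slm_phase
```

In practice, the returned phase would then be multiplied by an additional lens phase to set the image distance, as described above.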

The disadvantage of the CGH technique is that the active area of current devices is tiny, which makes it challenging for CGH displays to provide long eye relief. By contrast, near-eye displays based on the CGH technique have been proposed in recent years [13–16]. These works demonstrated outstanding devices with dynamic 3D near-eye display functions. In order to make similar displays more compact, related research based on holographic waveguides was also reported [17–20], in which a compact holographic waveguide was employed to achieve the holographic display function. The waveguide guides the information light into the human eye via linear gratings. A waveguide with a symmetric linear grating pair exhibits astigmatism that is independent of the image distance. By contrast, a holographic waveguide with a single linear grating causes astigmatism and anamorphic deformation, and in that case the astigmatism varies with the image distance. An astigmatism point-based algorithm and an astigmatism iteration convolution method were proposed in Refs. [17,19]. In Ref. [18], a cylindrical lens phase was employed to compensate for the astigmatism. Although near-eye CGH displays have developed well, their FOV is limited by the pixel pitch. In order to enhance the FOV, a focal lens or a holographic optical element (HOE) with a lens function is necessary [21,22]. For instance, Maimone et al. proposed a holographic-lens-based near-eye CGH display with an 80° FOV [22]. Wakunami et al. proposed a digitally designed HOE (DDHOE) fabricated by a wavefront printing technique to guide high-FOV information light into the human eye [23]. Since the eye-box of similar devices is very small, Jang et al. proposed a method to extend the eye-box: an HOE with a lens-array function and a scanning mirror were utilized to achieve exit-pupil shifting [24]. The design principle, the method of FOV estimation, and the aberration analysis method are described in this article. The aberration caused by the proposed HOE was analyzed with the commercial ray-tracing software Zemax.

In previous literature, the aberration caused by a holographic device was analyzed with Zemax, and the aberration was then corrected using an appropriate algorithm [18,19]. The required intermediate images can be obtained when the light of the desired aberration-free images inversely irradiates the holographic device; aberration-free final images can then be observed if the SLM system provides the conjugate light of the required intermediate images. In this work, a single-shot HOE with a lens function was employed to provide a CGH display device with an enlarged FOV. The proposed holographic screen consists of a linear grating and a lens phase, so the existence of astigmatism can be expected according to previous literature [20]. The CGHs in the following sections were generated via the IFTA method, and the astigmatism was corrected using an additional phase distribution of a cylindrical lens. Finally, the proposed device achieves a 20.3° by 14.3° FOV using a phase-only SLM with a 6.4 µm pixel pitch and a 532 nm laser light source.

2. Principle

The principle of the FOV magnification and the method to calculate the FOV are described in this section. The schematic working principle of the FOV enlargement using an optical element with a positive lens function is shown in Fig. 1. The SLM is arranged at a position whose distance to the lens is longer than twice the focal length, while the distance between the intermediate image and the lens is shorter than the focal length. Then the SLM's real image becomes the system's exit pupil; in other words, the imaged SLM is a hologram with reduced magnification. Finally, the observer obtains holographic images with an enlarged FOV. Generally, the FOV can be calculated as

$$\varphi = 2\sin^{-1}\left(\frac{\lambda}{2p}\right) \tag{1}$$
where $\lambda$ is the wavelength and p is the pixel pitch. In this case, the pixel pitch p used to estimate the FOV $\varphi$ should account for the magnification of the imaged SLM. For instance, the FOV becomes approximately four times as large if the magnification of the imaged SLM is 0.25.
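As a quick numerical check with the parameters used in this work ($\lambda$ = 532 nm, p = 6.4 µm), formula (1) gives the native FOV of the bare SLM, which the imaged-SLM magnification m then scales:

$$\varphi_{0} = 2\sin^{-1}\left(\frac{532\ \mathrm{nm}}{2 \times 6.4\ \mu\mathrm{m}}\right) \approx 4.76^{\circ}, \qquad \varphi(m) = 2\sin^{-1}\left(\frac{\lambda}{2mp}\right) \approx 19.1^{\circ}\ \text{for}\ m = 0.25.$$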

Fig. 1. The schematic diagram shows the principle of the FOV expanding function.

Figure 2 shows the experimental arrangement for recording the HOE. The reflection-type HOE was fabricated using the optical interference between two spherical waves. Half-wave plate (HWP) 1 and the polarizing beam splitter (PBS) were utilized to set the intensity ratio between the reference beam and the signal beam to 1:1, and HWP 2 was utilized to convert the polarization to the s-state. The spatial filter (SF) and collimating lens (CL) were employed to generate a collimated beam with high spatial coherence. The standard focal lens (FL) 1 was used to generate a divergent spherical wave as the signal beam, with a distance B = 200 mm between the material and the point source. The reference beam emitted from FL2 is a convergent spherical wave with a distance A = 600 mm. The incident angle $\Theta$ of the reference beam is 45 degrees and the signal beam is normally incident. The material is a photopolymer (C-RT20, Liti Holographics, Inc.). The diffraction efficiency of the reflection-type HOE is higher than 70% if the exposure dosage is larger than 30 mJ/cm² with a 532 nm laser beam.

Fig. 2. The optical configuration was utilized to fabricate the HOE screen.

Here, the proposed holographic screen consists of a linear grating and a lens phase. When the SLM is arranged at the focal plane of the reference beam, its real image is generated at the focal plane of the signal beam and serves as the exit pupil. The schematic diagram of this arrangement is shown in Fig. 3(a). The FOV can then be estimated by substituting the pixel size of the imaged SLM into formula (1). Considering the effect of the lens phase, the paraxial vertical magnification equals the ratio of distance B to distance A. However, the horizontal magnification must consider the contributions of the lens phase and the linear grating simultaneously. Figure 3(b) shows the schematic diagram of diffraction when a collimated beam probes the linear grating from the SLM side. Differentiating the grating diffraction formula gives the diffraction-angle variation when the incident angle rotates slightly; the variation obeys the following formula

$$\cos(\theta_d)\,\delta\theta_d = -\cos(\theta_i)\,\delta\theta_i \tag{2}$$
where $\theta_i$ is the incident angle, $\delta\theta_i$ is the incident-angle variation, $\theta_d$ is the diffraction angle, and $\delta\theta_d$ is the diffraction-angle variation. For the on-axis light, since the incident angle $\theta_i$ is $\Theta$ and the diffraction angle $\theta_d$ is zero in our case, the formula simplifies to
$$\delta\theta_d = -\cos(\Theta)\,\delta\theta_i \tag{3}$$

Fig. 3. The schematic diagram shows (a) the arrangement of the SLM and HOE; (b) diffraction angle variation of linear grating.

Note that $\delta\theta_i$ is the angular variation on the SLM side and $\delta\theta_d$ is the angular variation on the eye side.
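For instance, with the recording angle $\Theta$ = 45° used here, formula (3) gives

$$|\delta\theta_d| = \cos(45^{\circ})\,|\delta\theta_i| \approx 0.707\,|\delta\theta_i|,$$

so the eye-side angular spread is compressed by $\cos(\Theta)$ in the horizontal direction, which is the origin of the extra $\cos(\Theta)$ factor in the horizontal imaged pixel size below.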

Then the pixel size of the SLM's real image can be described as

$$\begin{cases} p_v' = p\dfrac{B}{A} \\ p_h' = p\dfrac{B}{A}\cos(\Theta) \end{cases} \tag{4}$$
where $p_v'$ is the vertical pixel size, $p_h'$ is the horizontal pixel size, and p is the original pixel size of the SLM. According to formula (4), the vertical and horizontal imaged pixel sizes are 2.1333 µm and 1.5085 µm, respectively. These values match the simulation results based on Zemax. Finally, the FOV can be calculated with the following formula
$$\begin{cases} \varphi_v = 2\sin^{-1}\left(\dfrac{\lambda A}{2pB}\right) \\ \varphi_h = 2\sin^{-1}\left(\dfrac{\lambda A}{2pB\cos(\Theta)}\right) \end{cases} \tag{5}$$
where $\varphi_v$ is the vertical FOV and $\varphi_h$ is the horizontal FOV. In this case, the estimated FOV is 14.3254° vertically and 20.3127° horizontally, i.e., the vertical FOV is enlarged about three times and the horizontal FOV about 4.24 times.
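A minimal numerical sketch of formulas (4) and (5), using the parameters stated in this paper (p = 6.4 µm, A = 600 mm, B = 200 mm, $\Theta$ = 45°, $\lambda$ = 532 nm); the variable names are ours, not from the original work.

```python
import numpy as np

# Parameters stated in the paper (SI units)
wavelength = 532e-9     # laser wavelength (m)
p = 6.4e-6              # SLM pixel pitch (m)
A = 0.600               # reference-beam focal distance (m)
B = 0.200               # signal-beam focal distance (m)
theta = np.deg2rad(45)  # incident angle of the reference beam

# Formula (4): pixel size of the imaged SLM
p_v = p * B / A                  # vertical imaged pixel
p_h = p * B / A * np.cos(theta)  # horizontal imaged pixel (extra cos term from the grating)

# Formula (5): FOV of the system
fov_v = 2 * np.degrees(np.arcsin(wavelength / (2 * p_v)))
fov_h = 2 * np.degrees(np.arcsin(wavelength / (2 * p_h)))

print(f"imaged pixel: {p_v*1e6:.4f} um (v), {p_h*1e6:.4f} um (h)")  # ~2.1333, ~1.5085
print(f"FOV: {fov_v:.4f} deg (v), {fov_h:.4f} deg (h)")             # ~14.33, ~20.31
```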

3. Aberration analysis and correction

In order to correct the aberration, the aberration caused by the HOE was analyzed with the ray-tracing software Zemax. The desired aberration-free final image was arranged as the object surface in the simulation configuration. The required intermediate image, including aberration, is then obtained when the light of the desired aberration-free final image probes the HOE inversely. As a result, the observer obtains the aberration-free final image if the CGH system experimentally reconstructs the aberrated intermediate image. The most serious aberrations in this system are the anamorphic distortion and the astigmatism caused by the linear grating. The linear grating causes astigmatism because the variations of the horizontal and vertical diffraction angles are different, and the anamorphic distortion exists because the horizontal and vertical FOV magnifications are different.

The field curvature of the required intermediate image when the final image is located 100 mm behind the HOE is shown in Fig. 4. The vertical axis in Fig. 4(a) is the horizontal field, and that in Fig. 4(b) is the vertical field. The horizontal axis represents the distance from the SLM to the required intermediate image. The red line $\Sigma_{y-z}$ represents the required intermediate image plane in the y-z plane (the sagittal image plane in Fig. 4(a) and the tangential image plane in Fig. 4(b)), and the black line $\Sigma_{x-z}$ represents the required intermediate image plane in the x-z plane (the tangential image plane in Fig. 4(a) and the sagittal image plane in Fig. 4(b)). In both Fig. 4(a) and 4(b), the sagittal plane and the tangential plane separate from each other because of astigmatism. In order to generate the intermediate image with this astigmatism, a CGH algorithm accounting for astigmatism is necessary. Here, the image plane in Fig. 4(a) is inclined by 45 degrees because of the rotation of the HOE; thus, the required intermediate images are almost parallel to the HOE.

Fig. 4. The required position of the intermediate image to provide aberration-free images at 100 mm behind the HOE corresponding to different (a) x-field; (b) y-field.

Furthermore, according to Ref. [20], the astigmatism depends on the final image's distance when the device employs only one linear grating. Therefore, the astigmatism at the zero-degree field corresponding to different image distances was analyzed in Fig. 5. The horizontal axis is the distance from the HOE to the final image; a negative value means the image is closer to the observer. The vertical axis is the distance from the SLM to the intermediate image. Here, $d_{x-z}$ and $d_{y-z}$ represent the distances from the SLM to $\Sigma_{x-z}$ and $\Sigma_{y-z}$, respectively. The figure shows that the astigmatism increases with the image distance, and there is no astigmatism when the final image is located at the HOE plane. The variation is more significant for negative image distances because the HOE includes a lens phase whose effective focal length is 150 mm. When the distance of the final image is −150 mm, $d_{y-z}$ becomes infinite.

Fig. 5. The required position of the intermediate image at the central field to provide aberration-free images is dependent on the display distance of the final image.

Figure 6 shows the grid distortion of the required intermediate image at $\Sigma_{y-z}$. The black grid shows the desired final image and the blue dots show the nodes of the grid of the required intermediate image. In other words, when the CGH system generates an intermediate image that matches the blue dots, the final image will match the black grid. The field of the final image is set to 14.3254° (vertical) and 20.3127° (horizontal), equal to the FOV estimated by formula (5). The intermediate image shrinks in the horizontal direction because the horizontal FOV magnification is larger than the vertical one. The vertical size on the positive-x side must be larger than on the negative-x side because of the rotation of the HOE. Moreover, since the anamorphic deformation is caused by the difference between the vertical and horizontal FOV magnifications, the distortion is independent of the image distance in this case.

Fig. 6. The required grid distortion of the intermediate image was obtained when the information light of an aberration-free image inversely irradiates the HOE from the human eye side.

4. Experimental results

The experimental configuration is shown in Fig. 7. The phase-only SLM was arranged at the focal plane of the reference beam in Fig. 2, so the imaged SLM is located at the focal plane of the signal beam. The camera was arranged at the position of the SLM's real image to capture the resulting image. The focal lens (FL) generates a convergent beam to probe the SLM. The DC term is focused near the boundary of the HOE and does not probe the HOE, so DC noise can be avoided. To cancel the wavefront of the reading beam, the phase-only hologram is multiplied by the conjugate phase of the reading beam. In this case, the phase of the reading beam is a convergent spherical wave with a 600 mm focal length.

Fig. 7. The experimental system was utilized to observe the provided holographic image.

In order to determine the FOV, this research employed the IFTA method to generate the CGHs. The holographic material is a photopolymer film with 2-inch (vertical) by 3-inch (horizontal) dimensions. The resulting image at the HOE plane without any aberration correction is shown in Fig. 8. Figure 8(a) shows the target image used in the calculation. The outer white square frame matches the boundary of the calculated space; therefore, the outer white frame corresponds to the device's FOV in the experimental observation. Figure 8(b) shows the resulting image. The phase distribution $H_{SLM}$ on the SLM plane is

$$H_{SLM}(X,Y) = H(X,Y)\exp\left[\frac{i\pi}{\lambda z}\left(X^2 + Y^2\right)\right]\psi(X,Y) \tag{6}$$
where $H(X,Y)$ is the phase distribution calculated by IFTA, X and Y are the pixel coordinates on the SLM, $\psi(X,Y)$ is the conjugate phase of the reading beam, and the distance from the SLM to the intermediate image is z = 600 mm. This virtual image is obtained clearly because it is astigmatism-free under this condition. The red arrows in Fig. 8(b) indicate the measured FOV. The measured horizontal FOV is 20.2399°, which is very close to the estimated value. The image height at the zero x-field is slightly blocked by the HOE's edge and the mount's shadow. However, the full vertical FOV can be calculated as 14.3286° because the measured vertical FOV of the central two subgrids is 4.7762°. This value also matches the estimate based on formula (5).
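A minimal numerical sketch of assembling formula (6) for a phase-only SLM; since all factors are pure phases, they simply add modulo 2π. The resolution, sign conventions, and the paraxial model of the reading-beam conjugate $\psi$ are our assumptions, not specifications from the paper.

```python
import numpy as np

wavelength = 532e-9          # m
pitch = 6.4e-6               # SLM pixel pitch (m)
ny, nx = 1080, 1920          # assumed SLM resolution
z = 0.600                    # SLM-to-intermediate-image distance (m) in formula (6)
f_read = 0.600               # focal length of the convergent reading beam (m)

# Pixel coordinate grids centered on the SLM
x = (np.arange(nx) - nx / 2) * pitch
y = (np.arange(ny) - ny / 2) * pitch
X, Y = np.meshgrid(x, y)

H = np.zeros((ny, nx))       # placeholder for the IFTA phase H(X, Y)

lens_phase = np.pi / (wavelength * z) * (X**2 + Y**2)  # quadratic term of formula (6)
psi = np.pi / (wavelength * f_read) * (X**2 + Y**2)    # conjugate reading-beam phase (paraxial;
                                                       # sign depends on the propagation convention)

H_slm = np.mod(H + lens_phase + psi, 2 * np.pi)        # phase-only hologram loaded on the SLM
```

In practice, H would come from the IFTA routine and H_slm would be quantized to the SLM's available phase levels.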

Fig. 8. The (a) target image and (b) the resulting image were utilized to determine the FOV.

In Fig. 8(b), the left side is the positive x-direction. Since the distortion was not pre-corrected, a distortion opposite to that in Fig. 6 appears in this result. The resulting image with distortion pre-correction is shown in Fig. 9. Figure 9(a) shows the desired final image, and Fig. 9(b) shows the pre-distorted image obtained when Fig. 9(a) was fitted to the grid distortion in Fig. 6. The resulting image at the HOE plane is shown in Fig. 9(c). This image matches the target image, which proves that the grid-distortion analysis is correct.

Fig. 9. The experimental result shows distortion was corrected successfully. (a) desired final image; (b) required target image for CGH calculation; (c) the resulting image.

Figure 10 shows the resulting images located 100 mm behind the HOE to verify the astigmatism analysis. The field curvature of the required intermediate image is shown in Fig. 4. The resulting image without astigmatism correction is shown in Fig. 10(a). The parameter z in formula (6) must be 540 mm in this condition, and the intermediate image plane is located 540 mm in front of the SLM. The vertical lines on the left side are clear because the required $d_{x-z}$ of the 10° x-field is also 540 mm. The vertical lines at negative x-fields become blurred as the separation between the intermediate image plane and the required plane $\Sigma_{x-z}$ grows. The inset in the red square is a zoomed-in image. Figure 10(b) shows the resulting image with astigmatism correction. The phase distribution $H_{SLM}$ on the SLM plane becomes

$$H_{SLM}(X,Y) = H(X,Y)\exp\left[\frac{i\pi}{\lambda}\left(\frac{X^2}{d_{x-z}} + \frac{Y^2}{d_{y-z}}\right)\right]\psi(X,Y) \tag{7}$$
where $d_{x-z}$ is 568 mm and $d_{y-z}$ is 540 mm. Then $\Sigma_{x-z}$ and $\Sigma_{y-z}$ are located 568 mm and 540 mm in front of the SLM, respectively, but the image plane is not rotated about the y-axis. The image quality was improved significantly by the astigmatism correction. However, some quality degradation of the resulting image still remains because of field curvature. Figure 10(c) shows the resulting image when the rotation of the intermediate image is considered. In order to generate the inclined intermediate image with the same method, the target image was separated into 32 parts along the x-field, and each part was calculated using the same method with the corresponding $d_{x-z}$ and $d_{y-z}$. The lines in Fig. 10(c) are clearer than in Fig. 10(b).
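The pre-correction of formula (7) replaces only the symmetric quadratic term of formula (6) with an anamorphic one. A brief sketch, reusing the grids and phases from the previous snippet and the distances quoted in the text (0.568 m and 0.540 m):

```python
d_xz = 0.568   # distance from the SLM to Sigma_{x-z} (m)
d_yz = 0.540   # distance from the SLM to Sigma_{y-z} (m)

# Anamorphic (cylindrical-lens-like) quadratic phase of formula (7)
astig_phase = np.pi / wavelength * (X**2 / d_xz + Y**2 / d_yz)

H_slm_corrected = np.mod(H + astig_phase + psi, 2 * np.pi)
```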

Fig. 10. The resulting images located 100 mm behind the HOE (a) without astigmatism correction; (b) with astigmatism correction; (c) with astigmatism correction and considering intermediate image rotation.

Furthermore, we can calculate the required phase distributions of final images at different distances. Figure 11 shows the resulting images at different distances. The phase distribution on the SLM plane is the superposition of these images' information light; a hedged sketch of such a superposition is given below. The distances from the HOE to the images were configured as −50 mm, 0 mm, and 100 mm. The resulting images show that this method works well. Finally, the proposed method can also achieve a full-color display function via holographic wavelength multiplexing. Figure 12 shows a full-color image obtained using this method. The HOE was fabricated under the same optical configuration with 457, 532, and 640 nm laser light sources. The “BGR” characters were situated 100 mm behind the HOE, and “3D” was located 50 mm in front of the HOE.
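One common way to realize such a superposition on a phase-only SLM is to sum the complex fields of the individual depth layers and keep only the phase of the sum; the sketch below (reusing ny, nx, X, Y, wavelength, and psi from the earlier snippets) illustrates this idea and is not necessarily the authors' exact procedure. The per-layer distances shown were estimated from formula (8) for final-image distances of −50 mm, 0 mm, and 100 mm; the per-layer symmetric term could be replaced by the anamorphic term of formula (7) where astigmatism correction is needed.

```python
# Placeholders for the per-layer IFTA phases (in practice, one IFTA result per depth layer)
H_front = H_plane = H_back = np.zeros((ny, nx))

# Per-layer SLM-to-intermediate-image distances (m), estimated from formula (8) (y-z values)
layers = [(H_front, 0.675), (H_plane, 0.600), (H_back, 0.540)]

total_field = np.zeros((ny, nx), dtype=complex)
for H_layer, z_layer in layers:
    quadratic = np.pi / (wavelength * z_layer) * (X**2 + Y**2)  # formula (6)-style term per layer
    total_field += np.exp(1j * (H_layer + quadratic))

# Keep only the phase of the superposed field and add the reading-beam conjugate
H_slm_multi = np.mod(np.angle(total_field) + psi, 2 * np.pi)
```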

Fig. 11. The provided 3D image was captured when the camera focused at (a) 50 mm in front of the HOE; (b) at the HOE; (c) 100 mm behind the HOE.

Fig. 12. The full-color image was captured when the camera focused at (a) 50 mm in front of the HOE; (b) 100 mm behind the HOE.

5. Discussions

This work verified that the proposed holographic screen significantly magnifies the FOV of the 3D CGH display. The aberration analysis method was also discussed and verified. In Fig. 10(c), the multilayer astigmatic intermediate image was employed to verify that a rotated astigmatic intermediate image can significantly enhance the image quality. However, the uniformity decreases because of the mismatch between the efficiencies of the layers. The target image with a few binary lines used in this work helps determine the astigmatism. In this case, the bright areas of the target image in different layers are not identical, which finally causes the brightness of the lines at different x-fields to differ. Therefore, brightness balancing between layers is necessary. In contrast, the point-based method is a solution, but more operation time is required. Rotating the intermediate image using the angular spectrum method is also a great solution; however, the random-access memory and operation-time requirements are higher than for the IFTA method.

The astigmatism analysis in this research uses the ray-tracing software Zemax. However, this means that several sets of astigmatism data at different image distances must be simulated whenever the HOE's parameters change. For the central field, the required distances $d_{x-z}$ and $d_{y-z}$ can instead be calculated by tracing the crossing of an arbitrary off-axis ray of the x-fan or y-fan with the on-axis ray. Figure 13 shows the schematic diagram of the rays on the SLM side. Assume the divergence angles of the x-fan and y-fan are the same and very small, so that the irradiated area on the HOE is a circle with radius r when the rays from the final image propagate to the HOE inversely. For the x-fan, the separation between two rays perpendicular to the propagation direction can be approximated as $r\cos(\Theta)$. If the convergence angle of the off-axis y-fan ray is $\rho$, the convergence angle of the off-axis x-fan ray can be approximated as $\rho/\cos(\Theta)$ when the off-axis angle is very small. Finally, we can expect the distance from the HOE to $\Sigma_{x-z}$ to be approximately the distance from the HOE to $\Sigma_{y-z}$ times $\cos^2(\Theta)$, where the latter can be calculated by the Gaussian lens formula.

Fig. 13. The schematic diagram shows the distance between HOE and the required intermediate image (a) $\Sigma_{y-z}$; (b) $\Sigma_{x-z}$.

Therefore, ${d_{y - z}}$ and ${d_{x - z}}$ can be approximated as

$$\begin{cases} d_{y-z} = A - 1\Big/\!\left(\dfrac{1}{f} + \dfrac{1}{L}\right) \\ d_{x-z} = A - \cos^2(\Theta)\Big/\!\left(\dfrac{1}{f} + \dfrac{1}{L}\right) \end{cases} \tag{8}$$
where f is the effective focal length of the HOE's lens phase and L is the distance from the HOE to the final image. Since the linear grating has no effect in the y-direction, $d_{y-z}$ is derived directly from the Gaussian form of the thin-lens formula. The value $d_{x-z}$ becomes inaccurate when L is close to $-f$. The estimated values and the Zemax simulation results for different image distances are shown in Fig. 14. The $d_{y-z}$ values from Zemax and from formula (8) match perfectly; the difference between the two methods is zero regardless of the image distance. For the estimated value of $d_{x-z}$, the difference between the Zemax simulation and formula (8) is shown in Fig. 15.
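A small sketch of the closed-form estimate of formula (8), evaluated at the final-image distance L = 100 mm used in Section 4 (A = 600 mm, f = 150 mm, $\Theta$ = 45°); the function and variable names are ours.

```python
import numpy as np

A = 600.0        # SLM-to-HOE distance (mm)
f = 150.0        # effective focal length of the HOE lens phase (mm)
theta = np.deg2rad(45)

def intermediate_distances(L):
    """Formula (8): approximate d_yz and d_xz for a final image at distance L (mm) from the HOE."""
    image_side = 1.0 / (1.0 / f + 1.0 / L)            # Gaussian-lens-formula term in formula (8)
    d_yz = A - image_side
    d_xz = A - np.cos(theta) ** 2 * image_side
    return d_yz, d_xz

d_yz, d_xz = intermediate_distances(100.0)
print(d_yz, d_xz)   # ~540 mm and ~570 mm; the Zemax values quoted in Section 4 are 540 mm and 568 mm
```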

Fig. 14. The comparison of the astigmatism analysis based on the commercial software Zemax and the proposed formula (8).

Fig. 15. The difference between Zemax and formula (8) is dependent on the display distance of the final images.

The difference is 0 when L = 0 and grows with the distance between the HOE and the final image. The error percentage is 2.013% when L is −85 mm; when the final image is located at infinity, the error percentage is 2.084%.

6. Conclusions

We proposed the design principle and aberration analysis method of a single-shot holographic screen used to enhance the CGH display's FOV. The holographic screen is an HOE consisting of a linear grating and a lens phase. The FOV can be estimated using a simple formula; it is magnified three times vertically and 4.2 times horizontally. Furthermore, the aberration caused by the HOE can be analyzed via the ray-tracing method. The most serious aberrations in this device are astigmatism and anamorphic deformation. The correction method was proposed and verified using a CGH with a cylindrical lens phase. Finally, an astigmatism estimation method using a simple formula is also discussed in this article. Within a specific range of image distances, the estimated values match the results based on the commercial ray-tracing software. In this work, the proposed device is a bench-top projection-type display. The values of A and B were chosen considering the numerical apertures of FL1 and FL2 in Fig. 2 and the dimensions of the photopolymer film. The same method can be utilized to design a near-eye display by reducing the lengths A and B. Furthermore, the FOV can be increased by enhancing the ratio of A to B, and enlarging the angle $\Theta$ can also increase the horizontal FOV. However, with this fabrication method, a focal lens or holographic lens (FL1) with a higher numerical aperture is necessary if a larger FOV is desired.

Funding

Ministry of Science and Technology, Taiwan (MOST 108-2221-E-018-018-MY3, MOST 111-2218-E-008-004-MBK).

Disclosures

The authors declare no conflicts of interest related to this article.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. D. M. Hoffman, A. R. Girshick, K. Akeley, and M. S. Banks, “Vergence–accommodation conflicts hinder visual performance and cause visual fatigue,” J. Vis. 8(3), 33 (2008). [CrossRef]  

2. I. Ducin, T. Shimobaba, M. Makowski, K. Kakarenko, A. Kowalczyk, J. Suszek, M. Bieda, A. Kolodziejczyk, and M. Sypek, “Holographic projection of images with step-less zoom and noise suppression by pixel separation,” Opt. Commun. 340, 131–135 (2015). [CrossRef]  

3. M. Makowski, I. Ducin, K. Kakarenko, J. Suszek, M. Sypek, and A. Kolodziejczyk, “Simple holographic projection in color,” Opt. Express 20(22), 25130–25136 (2012). [CrossRef]  

4. Y. Ogihara and Y. Sakamoto, “Fast calculation method of a CGH for a patch model using a point-based method,” Appl. Opt. 54(1), A76–A83 (2015). [CrossRef]  

5. R. G. Dorsch, A. W. Lohmann, and S. Sinzinger, “Fresnel ping-pong algorithm for two-plane computer-generated hologram display,” Appl. Opt. 33(5), 869–875 (1994). [CrossRef]  

6. T. Shimobaba, M. Makowski, T. Kakue, M. Oikawa, N. Okada, Y. Endo, R. Hirayama, and T. Ito, “Lensless zoomable holographic projection using scaled Fresnel diffraction,” Opt. Express 21(21), 25285–25290 (2013). [CrossRef]  

7. K. Masuda, Y. Saita, R. Toritani, P. Xia, K. Nitta, and O. Matoba, “Improvement of image quality of 3D display by using optimized binary phase modulation and intensity accumulation,” J. Disp. Technol. 12(5), 472–477 (2016). [CrossRef]  

8. K. Matsushima, H. Schimmel, and F. Wyrowski, “Fast calculation method for optical diffraction on tilted planes by use of the angular spectrum of plane waves,” J. Opt. Soc. Am. A 20(9), 1755–1762 (2003). [CrossRef]  

9. K. Matsushima, “Computer-generated holograms for three-dimensional surface objects with shade and texture,” Appl. Opt. 44(22), 4607–4614 (2005). [CrossRef]  

10. K. Matsushima and S. Nakahara, “Extremely high-definition full-parallax computer-generated hologram created by the polygon-based method,” Appl. Opt. 48(34), H54–H63 (2009). [CrossRef]  

11. J. Liu, A. Caley, and M. Taghizadeh, “Symmetrical iterative Fourier-transform algorithm using both phase and amplitude freedoms,” Opt. Commun. 267(2), 347–355 (2006). [CrossRef]  

12. F. Wyrowski and O. Bryngdahl, “Iterative Fourier-transform algorithm applied to computer holography,” J. Opt. Soc. Am. A 5(7), 1058–1065 (1988). [CrossRef]  

13. Z. Chen, X. Sang, Q. Lin, J. Li, X. Yu, X. Gao, B. Yan, C. Yu, W. Dou, and L. Xiao, “Acceleration for computer-generated hologram in head-mounted display with effective diffraction area recording method for eyes,” Chin. Opt. Lett. 14(8), 080901 (2016). [CrossRef]  

14. Q. Gao, J. Liu, X. Duan, T. Zhao, X. Li, and P. Liu, “Compact see-through 3D head-mounted display based on wavefront modulation with holographic grating filter,” Opt. Express 25(7), 8412–8424 (2017). [CrossRef]  

15. J. Hong, Y. Kim, S. Hong, C. Shin, and H. Kang, “Gaze contingent hologram synthesis for holographic head-mounted display,” Proc. SPIE 9771, 97710K (2016).

16. E. Murakami, Y. Oguro, and Y. Sakamoto, “Study on Compact Head-Mounted Display System Using Electro-Holography for Augmented Reality,” IEICE Trans. Electron. E100.C(11), 965–971 (2017). [CrossRef]  

17. H.-J. Yeom, H.-J. Kim, S.-B. Kim, H. Zhang, B. Li, Y.-M. Ji, S.-H. Kim, and J.-H. Park, “3D holographic head mounted display using holographic optical elements with astigmatism aberration compensation,” Opt. Express 23(25), 32025–32034 (2015). [CrossRef]  

18. W.-K. Lin, O. Matoba, B.-S. Lin, and W.-C. Su, “Astigmatism and deformation correction for a holographic head-mounted display with a wedge-shaped holographic waveguide,” Appl. Opt. 57(25), 7094–7101 (2018). [CrossRef]  

19. W.-K. Lin, O. Matoba, B.-S. Lin, and W.-C. Su, “Astigmatism correction and quality optimization of computer-generated holograms for holographic waveguide displays,” Opt. Express 28(4), 5519–5527 (2020). [CrossRef]  

20. W.-C. Su, S.-K. Zhou, B.-S. Lin, and W.-K. Lin, “Simplified aberration analysis method of holographic waveguide combiner,” Photonics 7(3), 71 (2020). [CrossRef]  

21. Z. Chen, Q. Lin, J. Li, X. Yu, X. Gao, B. Yan, K. Wang, C. Yu, and S. Xie, “A see-through holographic head-mounted display with the large viewing angle,” Opt. Commun. 384, 125–129 (2017). [CrossRef]  

22. A. Maimone, A. Georgiou, and J. S. Kollin, “Holographic near-eye displays for virtual and augmented reality,” ACM Trans. Graph. 36(4), 1–16 (2017). [CrossRef]  

23. K. Wakunami, P.-Y. Hsieh, R. Oi, T. Senoh, H. Sasaki, Y. Ichihashi, M. Okui, Y.-P. Huang, and K. Yamamoto, “Projection-type see-through holographic three-dimensional display,” Nat. Commun. 7(1), 12954 (2016). [CrossRef]  

24. C. Jang, K. Bang, G. Li, and B. Lee, “Holographic near-eye display with expanded eye-box,” ACM Trans. Graph. 37(6), 1–14 (2018). [CrossRef]  
