
Acceleration and expansion of a photorealistic computer-generated hologram using backward ray tracing and multiple off-axis wavefront recording plane methods

Open Access

Abstract

Holograms can reconstruct the light wave field of three-dimensional objects. However, computer-generated hologram (CGH) calculation is time-consuming. Here we propose a CGH generation algorithm based on backward ray tracing and multiple off-axis wavefront recording planes (MO-WRP) to generate photorealistic CGHs with a large reconstruction image. In this method, multiple WRPs are placed in parallel between the virtual object and the hologram plane. Virtual rays are emitted from the pixels of the WRPs and intersect with the object. The complex amplitude of the WRPs is then determined by an illumination model, such as the Phong reflection model. The CGH is generated by shifted angular spectrum propagation (ASP) from the WRPs to the hologram plane. Experimental results demonstrate the effectiveness of this method, and the CGH generation rate is 37.3 frames per second (1 WRP) and 9.8 frames per second (2×2 WRPs).

© 2020 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Holographic three-dimensional (3D) display technology can reproduce the complete information of a 3D scene, which makes it an important direction for future display technology. Computer-generated holograms (CGHs) based on spatial light modulators (SLMs) are the mainstream of research.

The point-based CGH algorithm has been widely used in CGH generation. In this algorithm, the three-dimensional model is discretized into multiple luminous object points, and the complex amplitude of the hologram is calculated by summing the complex amplitudes generated by each of the luminous object points. To accelerate point-based CGH, researchers have, on the one hand, proposed fast calculation algorithms such as the lookup table (LUT) [1] and the wavefront recording plane (WRP) [2]. On the other hand, high-performance hardware such as field-programmable gate arrays (FPGA) [3], coprocessors (Xeon Phi) [4], and graphics processing units (GPU) [5] has also been used to increase the computation speed through parallel computing.

The backward ray tracing CGH algorithm is an improvement of the point-based CGH algorithm. Compared with point-based CGH with an illumination model [6] and occlusion culling [7], backward ray tracing CGH is an effective way to generate holograms with photorealistic rendering effects. Ray tracing was first proposed by Turner Whitted in 1979 [8] and later applied to CGH algorithms [9]. In 2018, the wavefront recording plane (WRP) method was applied to the backward ray tracing CGH algorithm, and a CGH generation speed of 35.6 frames per second (FPS) was achieved [10].

The WRP method can significantly improve the calculation speed of ray tracing CGH. However, the reconstruction image size is limited to the hologram size because the sampling number and the sampling interval of the WRP and the CGH must be equal in the FFT-based WRP method. Using a tilted WRP is one way to scale the reconstruction image [11], but a spectrum interpolation step is needed in the diffraction calculation, which increases the computation time compared with the original parallel WRP method [12]. Using a nonuniformly sampled WRP is another way [13], but the pixel pitch of the reconstruction image is increased and its fineness is reduced.

We propose a ray tracing CGH algorithm based on multiple off-axis WRPs that achieves photorealistic reconstruction images with lighting effects. In the proposed method, the reconstruction image size is not limited to the hologram size, and a high calculation speed is achieved. The complex amplitude of the WRPs is generated by tracing rays from the WRPs to the objects while considering lighting models such as the Phong reflection model. The CGH is then calculated by an off-axis propagation of the WRPs based on the shifted ASP to expand the area of the reconstruction plane. Experimental results verify the effectiveness of the proposed method, and generation speeds of 37.3 FPS (1 WRP) and 9.8 FPS (2×2 WRPs) are achieved.

2. Ray tracing CGH algorithm based on multiple off-axis WRPs

2.1 Backward ray tracing and illumination model

In the point-based CGH algorithm, the complex amplitude $U({{x_H},{y_H},{z_H}} )$ of the hologram can be calculated by

$$U({{x_H},{y_H},{z_H}} )= \sum\limits_{j = 1}^N {\frac{{{A_j}}}{{{r_j}}}\exp \left( {i\left( {\frac{{2\pi }}{\lambda }{r_j} + {\varphi_j}} \right)} \right)}$$
$${r_j} = \sqrt {{{({{x_H} - {x_j}} )}^2} + {{({{y_H} - {y_j}} )}^2} + {{({{z_H} - {z_j}} )}^2}}$$
where N represents the number of object points, ${A_j}$ represents the amplitude of the object point, ${r_j}$ represents the distance between the object point and the hologram pixel, calculated by Eq. (2), $\lambda $ represents the light wavelength, and ${\varphi _j}$ represents the initial phase of the object point, which is randomly distributed between 0 and $\textrm{2}\mathrm{\pi }$. Variables $({{x_H},{y_H},{z_H}} )$ and $({{x_j},{y_j},{z_j}} )$ represent the coordinates of the hologram pixel and the object point, respectively.
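
The summation of Eqs. (1)-(2) maps naturally onto one GPU thread per hologram pixel. The following CUDA sketch is a minimal illustration under assumed conventions (a row-major float2 buffer, a hologram centred at z = 0, and a hypothetical ObjPoint layout); it is not the authors' implementation.

#include <cuda_runtime.h>
#include <math.h>

struct ObjPoint { float x, y, z, amp, phi0; };   // position, amplitude A_j, initial phase phi_j

__global__ void pointBasedCGH(const ObjPoint* pts, int nPts,
                              float2* holo, int width, int height,
                              float pitch, float lambda)
{
    int ix = blockIdx.x * blockDim.x + threadIdx.x;
    int iy = blockIdx.y * blockDim.y + threadIdx.y;
    if (ix >= width || iy >= height) return;

    // Hologram pixel coordinates (hologram plane assumed at z = 0 and centred)
    float xh = (ix - 0.5f * width)  * pitch;
    float yh = (iy - 0.5f * height) * pitch;
    float k  = 2.0f * 3.14159265f / lambda;

    float2 u = make_float2(0.0f, 0.0f);
    for (int j = 0; j < nPts; ++j) {
        float dx = xh - pts[j].x, dy = yh - pts[j].y, dz = -pts[j].z;
        float r  = sqrtf(dx * dx + dy * dy + dz * dz);   // Eq. (2)
        float ph = k * r + pts[j].phi0;                   // phase term of Eq. (1)
        float a  = pts[j].amp / r;                        // A_j / r_j
        u.x += a * cosf(ph);
        u.y += a * sinf(ph);
    }
    holo[iy * width + ix] = u;                            // U(x_H, y_H, z_H)
}

Such a kernel would be launched with a 2D grid covering the hologram resolution, for example a 16×16 thread block and enough blocks to cover width×height pixels.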

In the ray tracing CGH algorithm, the hologram plane is discretized into many virtual ray emitters. Massive numbers of virtual rays are emitted from these emitters toward the objects, as shown in Fig. 1. When a virtual ray intersects the object, the amplitude of the ray is calculated by a local illumination model, such as the Phong reflection model, or a global illumination model, which considers light not only from the light source but also from reflective objects. The phase of the ray is calculated from the distance between the intersection and the emitter. New rays are generated at the intersection according to the laws of reflection or refraction, and the calculation recurses on the new rays until the maximum recursion depth is reached. Finally, the complex amplitude of each discrete point on the hologram can also be calculated by Eq. (1), where j now represents the index of a ray, $N$ represents the number of rays, ${A_j}$ represents the amplitude at the intersection calculated according to the illumination model, and ${r_j}$ represents the distance between the intersection and the ray emitter.
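
The recursion described above can be sketched in CUDA as follows. This is a heavily simplified stand-in for the OptiX intersection and shading programs used in the paper: the scene is a single hypothetical sphere, the amplitude at each hit is a constant in place of the Eq. (3) shading, and refraction is omitted.

#include <cuda_runtime.h>
#include <math.h>

struct Ray { float3 org, dir; };                      // dir assumed normalized
struct Hit { int valid; float3 pos, nrm; float dist; };

__device__ float3 f3sub(float3 a, float3 b){ return make_float3(a.x-b.x, a.y-b.y, a.z-b.z); }
__device__ float  f3dot(float3 a, float3 b){ return a.x*b.x + a.y*b.y + a.z*b.z; }

// Hypothetical stand-in for the OptiX intersection program: a single sphere (centre c, radius rad).
__device__ Hit sceneIntersect(Ray r, float3 c, float rad)
{
    Hit h; h.valid = 0;
    float3 oc = f3sub(r.org, c);
    float b = f3dot(oc, r.dir);
    float disc = b * b - (f3dot(oc, oc) - rad * rad);
    if (disc < 0.0f) return h;
    float t = -b - sqrtf(disc);
    if (t <= 0.0f) return h;
    h.valid = 1; h.dist = t;
    h.pos = make_float3(r.org.x + t*r.dir.x, r.org.y + t*r.dir.y, r.org.z + t*r.dir.z);
    float3 n = f3sub(h.pos, c);
    float inv = rsqrtf(f3dot(n, n));
    h.nrm = make_float3(n.x*inv, n.y*inv, n.z*inv);
    return h;
}

// Recursive backward trace: each hit contributes (A/r)exp(i(2*pi/lambda*r + phi0)),
// then a reflected ray is spawned until maxDepth is reached (refraction omitted here).
__device__ float2 traceRay(Ray ray, float pathLen, float lambda, float phi0,
                           int depth, int maxDepth, float3 c, float rad)
{
    float2 u = make_float2(0.0f, 0.0f);
    if (depth > maxDepth) return u;
    Hit h = sceneIntersect(ray, c, rad);
    if (!h.valid) return u;

    float r  = pathLen + h.dist;            // accumulated path length
    float A  = 1.0f;                        // amplitude: Eq. (3) shading would go here
    float ph = 2.0f * 3.14159265f / lambda * r + phi0;
    u.x = (A / r) * cosf(ph);
    u.y = (A / r) * sinf(ph);

    float dn = f3dot(ray.dir, h.nrm);
    Ray refl;
    refl.org = h.pos;                       // reflected ray: d - 2(d.n)n
    refl.dir = make_float3(ray.dir.x - 2.0f*dn*h.nrm.x,
                           ray.dir.y - 2.0f*dn*h.nrm.y,
                           ray.dir.z - 2.0f*dn*h.nrm.z);
    float2 u2 = traceRay(refl, r, lambda, phi0, depth + 1, maxDepth, c, rad);
    return make_float2(u.x + u2.x, u.y + u2.y);
}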

Fig. 1. Schematic of ray tracing CGH.

Here, we used the Phong reflection model to render the lighting effects in ray tracing CGH. In the Phong illumination model, the reflected light consists of ambient, diffuse, and specular components. The ambient component describes the diffused light in the environment and is a constant. The diffuse component describes the light reflected in all directions. The specular component describes the bright highlight that appears when the surface is smooth. The illumination at a surface point illuminated by N light sources is expressed as Eq. (3) [14].

$$I = {k_a}{I_a} + \sum\limits_{j = 1}^N {{I_{lj}}{f_j}({{k_d}({{\mathbf N}\cdot {{\mathbf L}_j}} )+ {k_s}{{({{{\mathbf R}_j}\cdot {\mathbf V}} )}^m}} )}$$
where ${I_a}$ represents the intensity of the ambient light, ${I_{lj}}$ represents the intensity of the $j$-th light source, ${f_j}$ represents its distance attenuation factor, ${k_a}$, ${k_d}$, and ${k_s}$ represent the material reflectance coefficients of the ambient, diffuse, and specular components, and m represents the glossiness of the surface, which ranges from 0 to $\infty $. ${\mathbf N}$, ${\mathbf L}$, ${\mathbf R}$, and ${\mathbf V}$ are unit vectors representing the surface normal, the illumination direction and reflection direction of the light source, and the viewing direction, as shown in Fig. 2(a).
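
As a concrete illustration, Eq. (3) for a single light source can be written as a small CUDA device function. The float3 helpers and the argument layout are assumptions of this sketch; the terms follow the equation above.

#include <cuda_runtime.h>
#include <math.h>

__device__ float  dot3(float3 a, float3 b){ return a.x*b.x + a.y*b.y + a.z*b.z; }
__device__ float3 norm3(float3 a){ float s = rsqrtf(dot3(a, a)); return make_float3(a.x*s, a.y*s, a.z*s); }

// N: surface normal, L: direction to the light, V: direction to the viewer (all unit vectors).
__device__ float phongIntensity(float3 N, float3 L, float3 V,
                                float Ia, float Il, float fAtt,   // ambient/light intensity, distance factor f
                                float ka, float kd, float ks, float m)
{
    float ndl = dot3(N, L);                                       // N . L
    float3 R  = norm3(make_float3(2.0f * ndl * N.x - L.x,         // R = 2(N.L)N - L
                                  2.0f * ndl * N.y - L.y,
                                  2.0f * ndl * N.z - L.z));
    float diff = fmaxf(ndl, 0.0f);                                // diffuse term (N . L)
    float spec = powf(fmaxf(dot3(R, V), 0.0f), m);                // specular term (R . V)^m
    return ka * Ia + Il * fAtt * (kd * diff + ks * spec);         // Eq. (3), one light source
}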

Fig. 2. Phong reflection model (a) Coordinate system (b) Diffuse reflection (c) Specular reflection.

2.2 Multiple off-axis WRPs in backward ray tracing CGH

Applying the WRP method to the ray tracing CGH algorithm can effectively improve the calculation speed [10], but in the FFT-based WRP method the resolution and pixel size of the WRP must be equal to those of the CGH, which limits the size of the reconstruction image.

Here, we propose a ray tracing CGH method based on multiple off-axis WRPs, which expands the reconstruction plane while maintaining a high calculation speed. The concept of the proposed method is illustrated in Fig. 3, and it consists of three main steps:

Fig. 3. Schematic diagram of multiple off-axis WRP (a) Side view (b) Top view.

Step 1: Ray tracing

In Step 1, multiple WRPs are placed close to the object and parallel to the hologram plane. The number and positions of the WRPs depend on the size and position of the object, to ensure that the WRPs collect all the rays transmitted to the hologram plane. The resolution and pixel pitch of each WRP are the same as those of the CGH.

As shown in Fig. 4, for a given WRP, the off-axis shift is $({{x_0},{y_0}} )$, and its distances to the CGH and to the object are ${z_{\textrm{wrp}}}$ and ${z_{\textrm{obj}}}$, respectively. In the ray tracing process, a virtual ray is defined by its origin and its direction vector. Each WRP pixel is treated as a ray emitter, and the ray origin is given by the WRP pixel location.
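
The paper does not give an explicit placement rule beyond this, but a simple centred tiling conveys the idea. The following host-side sketch (with an assumed WrpDesc struct) lays an nx×ny grid of CGH-sized WRPs edge to edge over the object's lateral extent and returns their off-axis shifts $({{x_0},{y_0}} )$.

#include <vector>

struct WrpDesc { float x0, y0; };   // lateral shift of each WRP relative to the CGH axis

std::vector<WrpDesc> layoutWrps(int nx, int ny,            // WRP grid, e.g. 2 x 2
                                float objCx, float objCy,  // lateral centre of the object
                                int res_x, int res_y,      // CGH/WRP resolution
                                float pitch)               // CGH/WRP pixel pitch
{
    const float wrpW = res_x * pitch;                      // each WRP has the CGH size
    const float wrpH = res_y * pitch;
    std::vector<WrpDesc> wrps;
    for (int j = 0; j < ny; ++j)
        for (int i = 0; i < nx; ++i) {
            WrpDesc w;
            w.x0 = objCx + (i - 0.5f * (nx - 1)) * wrpW;   // tile WRPs edge to edge
            w.y0 = objCy + (j - 0.5f * (ny - 1)) * wrpH;   // around the object centre
            wrps.push_back(w);
        }
    return wrps;
}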

Fig. 4. Schematic diagram of virtual ray emitters.

The ray direction depends on the size of the WRP/CGH, the shift of the WRP, and the distance between the WRP and the CGH. Taking the x-direction as an example, in the on-axis case $({{x_0} = 0} )$, the angular range of the virtual rays is approximated by

$${\theta _x}\textrm{ = }\frac{W}{{{z_{\textrm{wrp}}}}}$$
and the virtual rays range over $[{ - {{{\theta_x}} / 2},{{{\theta_x}} / 2}} ]$, where W represents the width of the CGH and ${z_{\textrm{wrp}}}$ represents the distance between the WRP and the CGH plane. In the case of off-axis WRPs $({{x_0} \ne 0} )$, the angular range is still ${\theta _x}$, but there is an angular shift ${\theta ^{\prime}_x}$ defined approximately by
$${\theta ^{\prime}_x} = \frac{{{x_0}}}{{{z_{\textrm{wrp}}}}}$$
The directions of the virtual rays emitted by this WRP pixel then range over $[{ - {{{\theta_x}} / 2} + {{\theta^{\prime}}_x},{{{\theta_x}} / 2} + {{\theta^{\prime}}_x}} ]$. In addition, the angular range cannot exceed the diffraction angle of the SLM, defined by ${\theta _{\textrm{slm}}} = {\lambda / {{p_{\textrm{slm}}}}}$, where $\lambda $ represents the wavelength and ${p_{\textrm{slm}}}$ represents the pixel pitch of the SLM. For the y-direction parameters ${\theta _y}$ and ${\theta ^{\prime}_y}$, the calculation is the same.

The number of rays emitted by each WRP pixel depends on the angular range and the ray interval. The ray number in the x-direction is

$${N_x} = \frac{{{\theta _x}}}{{\Delta \theta }}$$
where $\Delta \theta $ represents the ray interval, with $\Delta \theta \approx {p / {{z_{\textrm{obj}}}}}$, and p represents the object sampling interval. In this paper, the sampling interval p was set to 0.1 mm, which balances computational efficiency against the angular resolution of the human eye. ${z_{\textrm{obj}}}$ represents the distance between the object and the WRP.

When the WRPs are close to the object, the angular ranges ${\theta _x}$ and ${\theta _y}$ are significantly smaller than the diffraction angle of the SLM. For the same ray interval, the number of virtual rays is therefore significantly reduced. The number of virtual rays N is calculated by

$$N = {N_x}{N_y} = \frac{{{\theta _x}{\theta _y}}}{{{{({\Delta \theta } )}^2}}}$$
where ${N_y}$ is the ray number in the y-direction.
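
A small host-side helper can collect Eqs. (4)-(7). The struct name, the units (radians and consistent length units), and the ceiling rounding are assumptions of this sketch; the clamp to the SLM diffraction angle follows the text above.

#include <cmath>
#include <algorithm>

struct RayFan {
    float thetaX, thetaY;   // angular ranges, Eq. (4)
    float shiftX, shiftY;   // off-axis angular shifts, Eq. (5)
    int   Nx, Ny, N;        // ray counts, Eqs. (6)-(7)
};

RayFan rayFan(float W, float H,           // CGH width/height
              float x0, float y0,         // WRP off-axis shift
              float z_wrp, float z_obj,   // WRP-CGH and WRP-object distances
              float p, float lambda, float p_slm)
{
    RayFan f;
    float thetaSlm = lambda / p_slm;                   // SLM diffraction angle
    f.thetaX = std::min(W / z_wrp, thetaSlm);          // Eq. (4), clamped to the SLM limit
    f.thetaY = std::min(H / z_wrp, thetaSlm);
    f.shiftX = x0 / z_wrp;                             // Eq. (5)
    f.shiftY = y0 / z_wrp;
    float dTheta = p / z_obj;                          // ray interval, p ~ 0.1 mm in the paper
    f.Nx = (int)std::ceil(f.thetaX / dTheta);          // Eq. (6)
    f.Ny = (int)std::ceil(f.thetaY / dTheta);
    f.N  = f.Nx * f.Ny;                                // Eq. (7)
    return f;
}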

Once the parameters of the WRPs and virtual rays are determined, the ray tracing process is executed, and the complex amplitude of each WRP is obtained according to Eqs. (1)-(3). The random initial phase ${\varphi _j}$ of each intersection in Eq. (1) can be obtained from a precomputed lookup table that corresponds to the position in object space [9,15].
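
A hedged sketch of such a lookup is given below: the hit position is quantised on the object sampling grid and hashed into a precomputed table of phases drawn from $[0, 2\pi)$. The hash function and table size are assumptions of this sketch; the cited works only require that the phase be fixed per object-space position so it stays coherent between rays and frames.

#include <cuda_runtime.h>
#include <math.h>

__device__ float lookupPhase(const float* phaseLUT, int lutSize,
                             float3 pos, float p /* object sampling interval */)
{
    // Quantise the intersection position onto the object sampling grid
    int qx = (int)floorf(pos.x / p);
    int qy = (int)floorf(pos.y / p);
    int qz = (int)floorf(pos.z / p);
    // Simple spatial hash into the precomputed random-phase table (assumption of this sketch)
    unsigned int h = (unsigned int)qx * 73856093u ^
                     (unsigned int)qy * 19349663u ^
                     (unsigned int)qz * 83492791u;
    return phaseLUT[h % (unsigned int)lutSize];
}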

Step 2: Off-axis propagation

In the WRP method, the complex amplitude of the hologram plane is obtained by calculating the diffraction from the WRP to the hologram plane. For the off-axis situation, a straightforward approach is to enlarge the WRP by zero-padding it until it is larger than the object, calculate the diffraction from the WRP to the CGH plane, and then crop the CGH plane to the original size. This approach severely increases the computational load and counteracts the advantages of the WRP method. Here, we use the band-limited shifted angular spectrum method [16] to calculate the off-axis diffraction.

Fig. 5. The coordinate of off-axis WRP.

The coordinate system of the off-axis WRP is shown in Fig. 5. The hologram is located in the $\hat{x}-\hat{y}$ plane and its complex amplitude is $\hat{g}({\hat{x},\hat{y},{z_0}} )$. The WRP is located in the $x-y$ plane and its complex amplitude is $g({x,y,0} )$. The coordinate origin of the $\hat{x}-\hat{y}$ plane is shifted from that of the $x-y$ plane by $({{x_0},{y_0}} )$, as in Eq. (8).

$$\begin{array}{l} \hat{x} = x - {x_0}\\ \hat{y} = y - {y_0} \end{array}$$
The complex amplitude of the CGH plane can be calculated by substituting Eq. (8) into the Rayleigh-Sommerfeld diffraction integral and rewriting it with the circular convolution theorem as Eq. (9).
$$\begin{aligned} \hat{g}({\hat{x},\hat{y},{z_0}} )&= g({\hat{x} + {x_0},\hat{y} + {y_0},{z_0}} )\\ &= \int\!\!\!\int {g({x^{\prime},y^{\prime},0} )h({\hat{x} + {x_0} - x^{\prime},\hat{y} + {y_0} - y^{\prime},{z_0}} )dx^{\prime}dy^{\prime}} \\ &= {\mathcal{F}^{ - 1}}\{{\mathcal{F}[{g({x^{\prime},y^{\prime},0} )} ]\mathcal{F}[{h({x^{\prime},y^{\prime},{z_0}} )} ]\exp [{i2\pi ({{x_0}u + {y_0}v} )} ]} \}\\ &= {\mathcal{F}^{ - 1}}\{{\mathcal{F}[{g({x^{\prime},y^{\prime},0} )} ]\exp [{i2\pi ({{x_0}u + {y_0}v + {z_0}w} )} ]} \}\\ &= {\mathcal{F}^{ - 1}}\{{\mathcal{F}[{g({x^{\prime},y^{\prime},0} )} ]\hat{H}({u,v,{z_0}} )} \}\end{aligned}$$
where ${\mathcal{F}^{ - 1}}[{\ }]$ and $\mathcal{F}[{\ }]$ represent the inverse and forward Fourier transforms, respectively, $w = {({{\lambda^{ - 2}} - {u^2} - {v^2}} )^{{1 / 2}}}$, and u and v represent the horizontal and vertical spatial frequencies, respectively. Before the Fourier transform, the source plane must be zero-padded to twice its size to avoid aliasing, and the result is cropped back to the original size after the inverse Fourier transform. It can be seen that the off-axis transfer function $\hat{H}({u,v,{z_0}} )$ is the product of the on-axis transfer function $\mathcal{F}[{h({x^{\prime},y^{\prime},{z_0}} )} ]$ and a phase shift factor $\exp [{i2\pi ({{x_0}u + {y_0}v} )} ]$.

The off-axis transfer function $\exp [{i2\pi ({{x_0}u + {y_0}v + {z_0}w} )} ]$ in Eq. (9) must satisfy the Nyquist sampling criterion

$$\begin{array}{l} \Delta {u^{ - 1}} \ge 2|{{f_u}} |\\ \Delta {v^{ - 1}} \ge 2|{{f_v}} |\end{array}$$
where $\Delta u$ and $\Delta v$ are the sampling intervals on the spatial frequency plane in the x and y directions, respectively. The signal frequencies can be calculated by
$$\begin{array}{l} {f_u} = \frac{1}{{2\pi }}\frac{{\partial \phi }}{{\partial u}} = {x_0} - \frac{u}{w}{z_0}\\ {f_v} = \frac{1}{{2\pi }}\frac{{\partial \phi }}{{\partial v}} = {y_0} - \frac{v}{w}{z_0} \end{array}$$
where $\phi = {x_0}u + {y_0}v + {z_0}w$ represents the phase of the off-axis transfer function. Substituting Eq. (11) into Eq. (10) gives
$$\begin{array}{l} \Delta {u^{ - 1}} \ge 2\left|{{x_0} - \frac{u}{w}{z_0}} \right|\\ \Delta {v^{ - 1}} \ge 2\left|{{y_0} - \frac{v}{w}{z_0}} \right|\end{array}$$
Transfer-function elements whose signal frequencies exceed the limits of Eq. (12) are set to zero. By applying Eqs. (9) and (12) to each WRP and summing the resulting complex amplitudes, the complex amplitude of the CGH plane is obtained.
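
The band-limited shifted transfer function of Eqs. (9)-(12) can be generated sample by sample on the GPU. In the following CUDA sketch each thread fills one (u, v) sample, zeroing evanescent components and samples that violate Eq. (12); the centred frequency-grid layout (which would require an FFT shift before multiplying the spectrum) is an assumption of this sketch.

#include <cuda_runtime.h>
#include <math.h>

__global__ void shiftedAsmTF(float2* H, int Nu, int Nv,   // doubled (zero-padded) grid size
                             float du, float dv,           // frequency sampling intervals
                             float x0, float y0, float z0, float lambda)
{
    int iu = blockIdx.x * blockDim.x + threadIdx.x;
    int iv = blockIdx.y * blockDim.y + threadIdx.y;
    if (iu >= Nu || iv >= Nv) return;

    float u  = (iu - Nu / 2) * du;               // centred frequency coordinates
    float v  = (iv - Nv / 2) * dv;
    float w2 = 1.0f / (lambda * lambda) - u * u - v * v;

    float2 val = make_float2(0.0f, 0.0f);
    if (w2 > 0.0f) {                             // keep propagating components only
        float w  = sqrtf(w2);
        float fu = x0 - (u / w) * z0;            // local signal frequencies, Eq. (11)
        float fv = y0 - (v / w) * z0;
        if (2.0f * fabsf(fu) <= 1.0f / du &&     // band limit of Eq. (12)
            2.0f * fabsf(fv) <= 1.0f / dv) {
            float ph = 2.0f * 3.14159265f * (x0 * u + y0 * v + z0 * w);
            val = make_float2(cosf(ph), sinf(ph));   // H^(u, v, z0)
        }
    }
    H[iv * Nu + iu] = val;
}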

Step 3: Hologram generation

After the off-axis propagation from every WRP to the CGH plane has been calculated, the complex amplitudes generated by all WRPs are summed to obtain the complex amplitude of the CGH.
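
This accumulation is a trivially parallel operation; a minimal CUDA sketch (with an assumed interleaved float2 buffer layout) is:

#include <cuda_runtime.h>

__global__ void accumulateWrp(float2* cgh, const float2* wrpField, int nPixels)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= nPixels) return;
    cgh[i].x += wrpField[i].x;   // real part
    cgh[i].y += wrpField[i].y;   // imaginary part
}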

3. Implementation

Following Steps 1 to 3 in Section 2.2, we adopted the NVIDIA OptiX ray tracing engine and NVIDIA CUDA as the programming framework to implement the proposed algorithm. The overall calculation process is shown in Fig. 6. The program consisted of C code running on the CPU (host) and CUDA C code running on the GPU (device).

Fig. 6. Flow chart of off-axis WRP ray tracing algorithm.

In the ray tracing kernel, the object model was first loaded. The pixel pitch and resolution of the WRPs were set to be the same as those of the CGH. The number of WRPs depends on the size of the object. The WRPs were arranged close to the object and parallel to the CGH plane. Virtual rays were emitted from the WRP pixels, and the number of virtual rays was calculated by Eq. (7). The OptiX ray tracing kernel computed the intersections of rays and objects in parallel and generated the complex amplitude of each ray by Eq. (1) and the Phong reflection model. The complex amplitude of each WRP pixel is the accumulation of all the rays emitted from that pixel.

In the off-axis WRP kernel, the diffraction from the WRP to the hologram plane was calculated by Eq. (9). The resolution of the WRP was doubled by zero-padding to linearize the discrete convolution. We used the CUDA fast Fourier transform library (CUFFT) to calculate the forward and inverse Fourier transforms. After the off-axis diffraction calculation, the CGH plane was cropped to the original size. When the diffraction calculation was completed, the complex amplitude generated by each WRP was accumulated in the CGH buffer located in GPU memory.
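
A host-side sketch of this kernel is shown below: the WRP field is zero-padded to twice its size, transformed with CUFFT, multiplied by the shifted transfer function, transformed back, and cropped. The transfer function H_d is assumed to be precomputed per Eqs. (9)-(12) in CUFFT's native frequency ordering, the crop window is simplified, the 1/N FFT normalisation is omitted, and a plain copy stands in for the accumulation into the CGH buffer.

#include <cufft.h>
#include <cuda_runtime.h>

__global__ void mulSpectrum(cufftComplex* G, const cufftComplex* H, int n)
{
    int i = blockIdx.x * blockDim.x + threadIdx.x;
    if (i >= n) return;
    cufftComplex g = G[i], h = H[i];
    G[i].x = g.x * h.x - g.y * h.y;   // complex multiply G * H
    G[i].y = g.x * h.y + g.y * h.x;
}

void propagateWrp(cufftComplex* cgh_d,         // CGH buffer on the GPU (W x Hh)
                  const cufftComplex* wrp_d,   // WRP field on the GPU (W x Hh)
                  const cufftComplex* H_d,     // precomputed transfer function (2W x 2Hh)
                  int W, int Hh)
{
    int W2 = 2 * W, H2 = 2 * Hh, n2 = W2 * H2;
    cufftComplex* buf_d;
    cudaMalloc(&buf_d, n2 * sizeof(cufftComplex));
    cudaMemset(buf_d, 0, n2 * sizeof(cufftComplex));
    // Zero-padding: copy the W x Hh field into the top-left corner of the 2W x 2Hh buffer
    cudaMemcpy2D(buf_d, W2 * sizeof(cufftComplex), wrp_d, W * sizeof(cufftComplex),
                 W * sizeof(cufftComplex), Hh, cudaMemcpyDeviceToDevice);

    cufftHandle plan;
    cufftPlan2d(&plan, H2, W2, CUFFT_C2C);            // rows (slow axis) first, then columns
    cufftExecC2C(plan, buf_d, buf_d, CUFFT_FORWARD);
    mulSpectrum<<<(n2 + 255) / 256, 256>>>(buf_d, H_d, n2);
    cufftExecC2C(plan, buf_d, buf_d, CUFFT_INVERSE);  // result still scaled by n2

    // Crop back to the original W x Hh size (accumulation would add into cgh_d instead)
    cudaMemcpy2D(cgh_d, W * sizeof(cufftComplex), buf_d, W2 * sizeof(cufftComplex),
                 W * sizeof(cufftComplex), Hh, cudaMemcpyDeviceToDevice);

    cufftDestroy(plan);
    cudaFree(buf_d);
}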

In the phase hologram kernel, the phase of each hologram pixel was also calculated in parallel by launching a 2D CUDA kernel function whose dimensions equal the resolution of the hologram. In the proposed program, all hologram complex amplitude data were stored in GPU memory to avoid data copies between GPU memory and system memory.
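
A minimal sketch of such a kernel is given below; the mapping of the $(-\pi, \pi]$ phase to 8-bit grey levels for the phase-only SLM is an assumption of this sketch.

#include <cuda_runtime.h>
#include <math.h>

__global__ void phaseHologram(const float2* cgh, unsigned char* phaseOut,
                              int width, int height)
{
    int ix = blockIdx.x * blockDim.x + threadIdx.x;
    int iy = blockIdx.y * blockDim.y + threadIdx.y;
    if (ix >= width || iy >= height) return;
    int i = iy * width + ix;

    float ph = atan2f(cgh[i].y, cgh[i].x);                   // phase of the complex amplitude, (-pi, pi]
    float t  = (ph + 3.14159265f) / (2.0f * 3.14159265f);    // map to [0, 1)
    phaseOut[i] = (unsigned char)(255.0f * t);               // 8-bit grey level for the SLM
}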

For dynamic holograms, the positions of the hologram and the model were updated before the CGH calculation of each frame. After each frame of the motion CGH was generated, it was transferred from GPU memory to system memory and displayed on the SLM.
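
The per-frame loop can be outlined as follows; updateTransform, computeFrame, and displayOnSlm are hypothetical wrappers (left as comments) for the steps described in this section, and only the final 8-bit phase frame is copied back to system memory.

#include <cuda_runtime.h>

void runDynamicHologram(unsigned char* phase_d, unsigned char* phase_h,
                        int width, int height, int nFrames)
{
    for (int f = 0; f < nFrames; ++f) {
        // updateTransform(f);      // move the model/hologram for this frame (assumed wrapper)
        // computeFrame(phase_d);   // Steps 1-3 plus the phase kernel, all on the GPU (assumed wrapper)
        cudaMemcpy(phase_h, phase_d, width * height, cudaMemcpyDeviceToHost);
        // displayOnSlm(phase_h);   // hand the frame to the SLM driver (assumed wrapper)
    }
}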

4. Experiment

4.1 Computational efficiency

To demonstrate the efficiency of the proposed algorithm, we implemented the point-based CGH, ray tracing CGH, and MO-WRP ray tracing CGH algorithms and compared their calculation times. All algorithms were written in CUDA C and run on a computer with a 3.6 GHz Intel Core i7-4790 CPU, 16 GB RAM, and an NVIDIA RTX 2080 Ti GPU. The software environment was Windows 10 x64, CUDA 10.1, and OptiX 6.5.

The parameters of the hologram and reconstructed image are shown in Table 1. Here we set the resolution of the CGH to 1920×1080 and the pixel pitch to 3.74 µm according to the SLM. Under this condition, the dimensions of each WRP were fixed at 7.18 mm×4.04 mm. A plane model (2500 points, 12 polygons, 12.5 mm×12.5 mm×2.5 mm), a cow model (10084 points, 5804 polygons, 12.5 mm×7.5 mm×4.1 mm), and a teapot model (41472 points, 16188 polygons, 12.5 mm×6.1 mm×7.6 mm) were adopted to compare the calculation efficiency of the three CGH algorithms, as shown in Fig. 7.

Fig. 7. The model used in CGH calculation (a) Plane (b) Cow (c) Teapot.

Table 1. Specification of CGH and reconstructed image.

In the point-based algorithm, we used a point-cloud model (PLY format) and calculated the complex amplitude on the CGH directly according to Eq. (1). In ray tracing CGH without the WRP method, each pixel on the CGH emitted 100×100 virtual rays toward the model. In the MO-WRP ray tracing CGH algorithm, 2×2 WRPs were placed between the model and the CGH, and the reconstruction plane was expanded to 14.36 mm×8.08 mm. The number of rays emitted from the WRP pixels was calculated by Eq. (7), with the sampling interval p set to 0.1 mm.

The calculation times of these algorithms running on the GPU are shown in Table 2 and Fig. 8. Unlike the point-based algorithm, the calculation time of ray tracing CGH is not significantly affected by the number of polygons. With the MO-WRP method and 2×2 WRPs, the calculation is about 25-30 times faster than the ray tracing method without WRPs.

Fig. 8. Computation time of different algorithm running on GPU.

Table 2. Computation time of different algorithm running on GPU (ms).

In practical MO-WRP ray tracing CGH calculations, the sizes and shapes of the models differ, which means the number of WRPs is not constant. To examine the relationship between calculation speed and the number of WRPs, we ran the ray tracing CGH algorithm with different numbers of WRPs and timed the calculation process. The number of WRPs was set from 1 (1×1) to 9 (3×3). Four models were used: a 20 mm teapot, a 20 mm cow, a 20 mm plane, and a 100 mm plane. The other parameters of the MO-WRP ray tracing CGH are the same as in Table 1. The calculation time of MO-WRP ray tracing CGH with different numbers of WRPs is shown in Fig. 9. The generation speed of the CGH reaches 37.3 FPS (1 WRP) and 9.8 FPS (2×2 WRPs), which means quasi-real-time CGH generation is achieved. A dynamic reconstruction of a 12.5-mm-wide teapot model (2×2 WRPs) is shown in the attached video (see Visualization 1).

Fig. 9. Computation time of MO-WRP ray tracing CGH algorithm with WRP number range from 1 to 9.

4.2 Optical reconstruction

We used the experimental setup shown in Fig. 10 to obtain photorealistic optical reconstructions of the holograms. A 638 nm red laser was expanded by a beam expander (Thorlabs GBE20-A) to illuminate the SLM (Holoeye GAEA-2). A 50:50 beam splitter (Thorlabs CCM1-BS013) was used to fold the light path, and a 4f lens system was used to filter out the zero-order light. A camera (Sony ILCE-A7) without a lens was placed at the reconstruction plane to record the images.

Fig. 10. Schematic diagram of the experimental device.

To demonstrate the advantages of the proposed method, we adopted the teapot model and set its width to 6 mm, 12.5 mm, and 20 mm in the MO-WRP ray tracing CGH, respectively. The number of WRPs was set to 3×3. Each WRP was set to 1920×1080 resolution with a 3.74 µm pixel pitch. The distance between the object and the SLM was set to 100 mm. The rendered teapot models with different widths are shown in Figs. 11(a), 11(b), and 11(c). Compared with the traditional WRP method, the MO-WRP method achieved reconstruction images (12.5 mm×6.1 mm and 20 mm×9.7 mm) that are larger than the SLM (7.18 mm×4.04 mm), as shown in Figs. 11(d), 11(e), and 11(f).

Fig. 11. Rendering images and optical reconstruction images with different object size (a)(d) 6 mm (b)(e) 12.5 mm (c)(f) 20 mm.

To demonstrate the photorealistic rendering of the proposed method, CGHs of an 8 mm diameter ball and a 14 mm square with a checkered texture were generated. Here, the reconstruction distance was set to 100 mm, and the other parameters are the same as in Table 1. The optical reconstruction images are shown in Fig. 12 and Fig. 13. In Fig. 12, the camera was placed (b) 95 mm and (c) 105 mm away from the SLM to demonstrate the 3D effects. In Fig. 13, the ball was rendered with (a) a rough surface, (b) a smooth surface, and (c) a textured surface, respectively. Figures 13(d), 13(e), and 13(f) are the reconstruction images of Figs. 13(a), 13(b), and 13(c), respectively.

Fig. 12. (a) Rendered image and reconstruction image at (b) 95 mm and (c) 105 mm.

Fig. 13. Rendering image and optical reconstruction image of different surface (a)(d) Rough surface (b)(e) Smooth surface (c)(f) Rough surface with texture.

5. Conclusion

In conclusion, an algorithm for photorealistic CGH generation based on ray tracing and multiple off-axis WRPs was demonstrated. The experimental results show that the proposed method can enlarge the reconstruction image while maintaining a high calculation speed. Moreover, through parallel computation on the GPU, the proposed method achieves quasi-real-time CGH generation.

Funding

National Key Research and Development Program of China (2016YFB0401902); Special Project for Research and Development in Key areas of Guangdong Province (2019B010926001).

Disclosures

The authors declare no conflict of interest.

References

1. D. P. Pi, J. Liu, R. F. Kang, Z. Q. Zhang, and Y. Han, “Reducing the memory usage of computer-generated hologram calculation using accurate high-compressed look-up-table method in color 3D holographic display,” Opt. Express 27(20), 28410–28422 (2019). [CrossRef]  

2. D. Arai, T. Shimobaba, T. Nishitsuji, T. Kakue, N. Masuda, and T. Ito, “An accelerated hologram calculation using the wavefront recording plane method and wavelet transform,” Opt. Commun. 393, 107–112 (2017). [CrossRef]  

3. T. Nishitsuji, Y. Yamamoto, T. Sugie, T. Akamatsu, R. Hirayama, H. Nakayama, T. Kakue, T. Shimobaba, and T. Ito, “Special-purpose computer HORN-8 for phase-type electro-holography,” Opt. Express 26(20), 26722–26733 (2018). [CrossRef]  

4. K. Murano, T. Shimobaba, A. Sugiyama, N. Takada, T. Kakue, M. Oikawa, and T. Ito, “Fast computation of computer-generated hologram using Xeon Phi coprocessor,” Comput. Phys. Commun. 185(10), 2742–2757 (2014). [CrossRef]  

5. S. Ikawa, N. Takada, H. Araki, H. Niwase, H. Sannomiya, H. Nakayama, M. Oikawa, Y. Mori, T. Kakue, T. Shimobaba, and T. Ito, “Real-time color holographic video reconstruction using multiple-graphics processing unit cluster acceleration and three spatial light modulators,” Chin. Opt. Lett. 18(1), 010901 (2020). [CrossRef]  

6. A. Symeonidou, D. Blinder, and P. Schelkens, “Colour computer-generated holography for point clouds utilizing the Phong illumination model,” Opt. Express 26(8), 10282–10298 (2018). [CrossRef]  

7. A. Symeonidou, D. Blinder, A. Munteanu, and P. Schelkens, “Computer-generated holograms by multiple wavefront recording plane method with occlusion culling,” Opt. Express 23(17), 22149–22161 (2015). [CrossRef]  

8. T. Whitted, “An improved illumination model for shaded display,” in Proceedings of the 6th annual conference on Computer graphics and interactive techniques, (1979), 14.

9. T. Ichikawa, K. Yamaguchi, and Y. Sakamoto, “Realistic expression for full-parallax computer-generated holograms with the ray-tracing method,” Appl. Opt. 52(1), A201–A209 (2013). [CrossRef]  

10. Y. Wang, X. Sang, Z. Chen, H. Li, and L. Zhao, “Real-time photorealistic computer-generated holograms based on backward ray tracing and wavefront recording planes,” Opt. Commun. 429, 12–17 (2018). [CrossRef]  

11. D. Arai, T. Shimobaba, K. Murano, Y. Endo, R. Hirayama, D. Hiyama, T. Kakue, and T. Ito, “Acceleration of computer-generated holograms using tilted wavefront recording plane method,” Opt. Express 23(2), 1740–1747 (2015). [CrossRef]  

12. K. Matsushima, H. Schimmel, and F. Wyrowski, “Fast calculation method for optical diffraction on tilted planes by use of the angular spectrum of plane waves,” J. Opt. Soc. Am. A 20(9), 1755–1762 (2003). [CrossRef]  

13. C. Chang, J. Wu, Y. Qi, C. Yuan, S. Nie, and J. Xia, “Simple calculation of a computer-generated hologram for lensless holographic 3D projection using a nonuniform sampled wavefront recording plane,” Appl. Opt. 55(28), 7988–7996 (2016). [CrossRef]  

14. B. T. Phong, “Illumination for computer generated pictures,” Commun. ACM 18(6), 311–317 (1975). [CrossRef]  

15. T. Ichikawa, T. Yoneyama, and Y. Sakamoto, “CGH calculation with the ray tracing method for the Fourier transform optical system,” Opt. Express 21(26), 32019–32031 (2013). [CrossRef]  

16. K. Matsushima, “Shifted angular spectrum method for off-axis numerical propagation,” Opt. Express 18(17), 18453–18463 (2010). [CrossRef]  

Supplementary Material (1)

Visualization 1: An optical reconstruction of a 12.5 mm wide teapot model based on the ray tracing CGH algorithm and the multiple off-axis WRP method.
