Optica Publishing Group

Efficient calculation scheme for high pixel resolution non-hogel-based computer generated hologram from light field

Open Access

Abstract

We propose a method that reduces the computation time and memory requirement in non-hogel-based hologram synthesis from light field data. The non-hogel-based technique synthesizes a coherent complex field for a three-dimensional scene from its light field. Unlike the conventional holographic stereogram, the non-hogel-based technique reconstructs a continuous parabolic wavefront for each three-dimensional object point by globally processing the light field. However, the global processing increases the computational load significantly, making it hard to synthesize holograms with high pixel resolution. The proposed technique reduces the computational burden by processing each two-dimensional angular frequency slice of the four-dimensional light field independently. A hologram tiling technique is also proposed to make the hologram synthesis process scalable. Using the hologram tiling and the angular-frequency-slice-based processing, a 25K×25K pixel resolution hologram was synthesized successfully.

© 2020 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Computer generated hologram (CGH) techniques synthesize the complex field of a three-dimensional (3D) scene [1]. The 3D scene is represented in various forms, and the complex field for the scene is synthesized by calculating and accumulating the complex field of individual primitives of the scene. The possible forms representing the 3D scene for CGH synthesis include point clouds [2–4], multiple layers [5], polygon meshes [6–8], and light field [9,10]. The light field is a collection of numerous views of the scene captured at slightly different directions. As the light field can be readily rendered for a virtual scene using computer graphics or directly captured for a real scene using a light field camera [11,12], the light field can be a universal representation of virtual and real scenes for CGH synthesis. Detailed scene appearance such as view-dependent occlusion and material reflection properties is already included in the light field, which alleviates complicated calculations in the hologram synthesis step [13,14].

One limitation of the light field representation is that it does not provide explicit depth information of individual 3D points or surfaces of the scene. Instead, the depth information is only implicitly included as a form of slight difference, or disparity, between the views. Conventional light-field-based CGH techniques, usually called holographic stereograms, divide the hologram into many non-overlapping segments called hogels in the spatial or Fourier domain [9,15]. They then calculate the complex field in each segment exclusively from the corresponding single view in the light field data. Each hologram segment simply reconstructs the corresponding view without proper depth reconstruction or a global phase relationship across multiple hologram segments. Therefore, those conventional light-field-based CGHs only reconstruct light rays with a random phase relationship, without reconstructing the ideal continuous parabolic wavefront for each 3D object point. Although advanced techniques have been reported which match the phase relationship between the non-overlapping hogels [16], encode a 3D view with the proper depth distribution to each hogel [17–19], or overlap the hogels to overcome the spatio-angular resolution tradeoff [14], all of them require explicit depth map information in addition to the original light field data.

Recently, a novel non-hogel-based CGH technique has been proposed [20]. Instead of segmenting the hologram and assigning the corresponding view to each segment, this technique globally processes the light field data, i.e. processes all the views collectively, to synthesize the entire hologram without requiring a depth map. The depth information implicitly encoded in the difference between the views is automatically reflected in the synthesized hologram, creating an ideal continuous parabolic wavefront for each object point. The continuous wavefront realizes high reconstruction resolution limited only by the numerical aperture (NA) of the hologram. The carrier wave, or equivalently the phase distribution on the surface of the reconstructed 3D objects, can also be controlled arbitrarily, making this technique versatile for various applications. It was shown that this technique can be considered as applying, to the light field data, the recovery of a complex field from its Wigner distribution function (WDF) [20].

Despite the benefits of the non-hogel-based technique, its high computational load is problematic. Unlike the conventional methods where each pixel in the hologram spatial domain or Fourier domain is only affected by the single corresponding view, the non-hogel-based technique calculates each hologram pixel value from all light field data, i.e. all image pixels in all views within the NA range. Therefore, the computational time and the memory requirement are much higher than conventional methods, which prevent generation of high pixel resolution holograms. Although an enhanced implementation using two-dimensional (2D) fast Fourier transform (FFT) over projection angle axes has been reported [20], significant further improvement is required.

In this paper, we propose an efficient calculation scheme for the non-hogel-based CGH technique. The proposed technique exploits the fact that the spatial resolution of the 3D scene can be lower than the NA limit of the hologram, which is reasonable in practical applications where holograms are used to show 3D images to users. For light field data prepared at the spatial sampling rate corresponding to the spatial resolution of the 3D scene, the proposed technique performs a 2D FFT over its projection angle axes. Each 2D angular frequency slice is then interpolated to the hologram resolution, multiplied with a complex weighting function, and finally accumulated in the hologram plane. This processing is independent for each slice, enabling concurrent parallel processing of multiple slices. A hologram tiling scheme is also proposed to make the process scalable to arbitrary resolution, not limited by the physical memory of the calculation system. Using the angular-frequency-slice-based processing and the hologram tiling, a 25K×25K pixel resolution hologram has been synthesized successfully based on the non-hogel-based CGH technique. In the following, we explain the principle of the proposed method after a brief review of the non-hogel-based CGH. Then we present the verification results of the speed enhancement and the artefact-free hologram synthesis by the proposed angular-frequency-slice-based processing and the hologram tiling. Finally, two high pixel resolution holograms synthesized using the proposed method are presented with demonstrations of the extended depth range and the realistic reproduction of the parallax.

2. Principle of the proposed technique

2.1 Brief review of non-hogel-based CGH

The non-hogel-based CGH technique synthesizes complex field from the light field data of the 3D scene. The light field data is a spatio-angular distribution of the light ray radiance. In a plane, the light field can be represented by L(tx, ty, u, v) where (tx, ty) is the spatial position of the light ray and (u, v) represents its angular direction. In this paper, we represent the angular direction of the light ray as the corresponding spatial frequency (u, v)=(sinθx/λ, sinθy/λ) where λ is the wavelength and (θx, θy) is the ray angle with respect to the z axis. The light field data can also be considered as the collection of the orthographic view images Iorthouo,vo(tx,ty) = L(tx, ty, uo, vo) at different observation directions (uo, vo) or the collection of the perspective view images Ipersptxo,tyo(u,v) = L(txo, tyo, u, v) at different observation positions (txo, tyo).

From the given light field data L(tx, ty, u, v), the non-hogel-based CGH technique synthesizes the complex field by [20]

$$H(x,y) = \int\!\!\!\int {\tilde{L}\left( {\frac{{x + {x_c}}}{2},\frac{{y + {y_c}}}{2},x - {x_c},y - {y_c}} \right)W({{x_c},{y_c}} )d{x_c}d{y_c}} ,$$
where xc and yc are integration variables representing the spatial position in the hologram plane, on the same grid as x and y. W(xc, yc) represents the complex field of the carrier wave in the hologram plane, which determines the phase distribution on the surface of the 3D objects. Note that an arbitrary carrier wave W(xc, yc) can be used, including a normal plane wave W(xc,yc) = 1, a parabolic wave W(xc, yc) = exp[j{π/(λf)}(xc2+yc2)] with a focal length f, and any random complex-valued distribution, as demonstrated in the following section. The $\tilde{L}({t_x},{t_y},{\tau _x},{\tau _y})$ is the 2D Fourier transform of L(tx, ty, u, v) along the (u, v) axes. From the relationship between the light field L(tx, ty, u, v) and the corresponding 3D objects, the term $\tilde{L}\left( {\frac{{x + {x_c}}}{2},\frac{{y + {y_c}}}{2},x - {x_c},y - {y_c}} \right) \triangleq H(x,y;{x_c},{y_c})$ in Eq. (1) can be shown to be the complex field of the 3D objects for a parabolic carrier wave emanating from a single point (xc, yc) in the hologram plane [20]. Their weighted addition given by Eq. (1) therefore represents the complex field of the 3D object with an arbitrary carrier wave determined by W(xc, yc).

Although the non-hogel-based CGH technique can generate a coherent complex field with arbitrary carrier waves, its implementation requires a large computational load. Given the target hologram resolution Nx×Ny, the light field L(tx, ty, u, v) should be prepared with the same spatial resolution Ntx×Nty=Nx×Ny. The prepared light field L(tx, ty, u, v) is first 2D Fourier transformed along the (u, v) axes at each (tx, ty) position to obtain $\tilde{L}({t_x},{t_y},{\tau _x},{\tau _y})$, which requires Ntx×Nty=Nx×Ny fast Fourier transform (FFT) operations of Nu×Nv size. The calculated $\tilde{L}({t_x},{t_y},{\tau _x},{\tau _y})$ is then processed according to Eq. (1), which requires the extraction of a different slice of $\tilde{L}({t_x},{t_y},{\tau _x},{\tau _y})$ for each (xc, yc) and their accumulation. Since (xc, yc) is the spatial position in the hologram plane defined on the same sampling grid as the hologram itself, the slice extraction and addition should be performed Nx×Ny times. Therefore, the overall computation of the non-hogel-based CGH technique requires Nx×Ny FFTs and Nx×Ny slice extractions and additions, which becomes intractable especially when the hologram resolution Nx×Ny is large.

2.2 Angular-frequency-slice-based processing

Figure 1 is the conceptual diagram of the proposed angular-frequency-slice-based processing. In the proposed method, the light field L(tx, ty, u, v) is prepared with a spatial resolution Ntx×Nty which is sufficient for the 3D scene bandwidth but can be smaller than the hologram resolution Nx×Ny. The light field is then 2D Fourier transformed along the (u, v) axes at each (tx, ty) position to obtain $\tilde{L}({t_x},{t_y},{\tau _x},{\tau _y})$ by Ntx×Nty FFT operations. Next, at each angular frequency (τx, τy), the 2D slice ${\tilde{I}_{{\tau _x},{\tau _y}}}({{t_x},{t_y}} )= \tilde{L}({t_x},{t_y},{\tau _x},{\tau _y})$ of resolution Ntx×Nty is interpolated to the hologram resolution Nx×Ny, multiplied with the complex weight function W(xc, yc), and finally accumulated on the hologram plane. For the multiplication and accumulation at each angular frequency (τx, τy), the integration variables xc and yc in Eq. (1) are changed to τx=x-xc and τy=y-yc, giving

$$H(x,y) = \int\!\!\!\int {{H_{{\tau _x},{\tau _y}}}(x,y)d{\tau _x}d{\tau _y}} ,$$
where
$${H_{{\tau _x},{\tau _y}}}(x,y) = \tilde{L}\left( {x - \frac{{{\tau_x}}}{2},y - \frac{{{\tau_y}}}{2},{\tau_x},{\tau_y}} \right)W({x - {\tau_x},y - {\tau_y}} )$$
represents contribution of the angular frequency (τx, τy) component to the final hologram. Equation (3) can be further modified by using tx=x-τx/2 and ty=y-τy/2 to
$$\begin{aligned}{H_{{\tau _x},{\tau _y}}}\left( {{t_x} + \frac{{{\tau_x}}}{2},{t_y} + \frac{{{\tau_y}}}{2}} \right) &= \tilde{L}({{t_x},{t_y},{\tau_x},{\tau_y}} )W\left( {{t_x} - \frac{{{\tau_x}}}{2},{t_y} - \frac{{{\tau_y}}}{2}} \right)\\ &= {{\tilde{I}}_{{\tau _x},{\tau _y}}}({{t_x},{t_y}} )W\left( {{t_x} - \frac{{{\tau_x}}}{2},{t_y} - \frac{{{\tau_y}}}{2}} \right). \end{aligned}$$
Equation (4) indicates that the interpolated angular frequency slice ${\tilde{I}_{{\tau _x},{\tau _y}}}({{t_x},{t_y}} )$ is multiplied by a shifted complex weight function W(tx-τx/2, ty-τy/2) to be accumulated in the hologram plane with a shift, giving Hτx,τy(tx+τx/2, ty+τy/2). Following Eq. (2), this process is repeated for every angular frequency pair (τx, τy) to complete the hologram. In actual implementation, the continuous integration of Eq. (2) over τx and τy is replaced by the numerical addition at discrete (τx, τy) pairs whose number is given by Nτx×Nτy=Nu×Nv.
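As a concrete illustration of Eqs. (2)–(4), the per-slice procedure can be sketched in NumPy. This is our own toy discretization, not the authors' MATLAB implementation: it assumes the hologram grid equals the light field spatial grid (M = 1), a periodic (circularly shifted) hologram plane, and an angular-frequency step of two hologram pixels so that the ±τ/2 shifts of Eq. (4) fall on integer pixels.

```python
import numpy as np

def synthesize_hologram(L, W):
    """Angular-frequency-slice processing of Eqs. (2)-(4) (toy sketch).

    L : array of shape (Nt, Nt, Nu, Nv), the light field L(tx, ty, u, v)
    W : (Nt, Nt) carrier-wave complex field on the hologram plane
    Assumes M = 1 and a tau step of two pixels (see lead-in).
    """
    Nt, _, Nu, Nv = L.shape
    # 2D FFT over the (u, v) axes at every (tx, ty): L~(tx, ty, taux, tauy)
    Ltilde = np.fft.fftshift(np.fft.fft2(L, axes=(2, 3)), axes=(2, 3))
    H = np.zeros((Nt, Nt), dtype=complex)
    for i in range(Nu):                    # each angular-frequency slice
        for k in range(Nv):                # is processed independently
            taux = 2 * (i - Nu // 2)       # tau_x in pixel units (step 2)
            tauy = 2 * (k - Nv // 2)
            slab = Ltilde[:, :, i, k]      # I~_{taux,tauy}(tx, ty)
            # Eq. (4): I~(tx, ty) * W(tx - taux/2, ty - tauy/2) ...
            Wsh = np.roll(W, (taux // 2, tauy // 2), axis=(0, 1))
            # ... accumulated at the shifted position (tx + taux/2, ty + tauy/2)
            H += np.roll(slab * Wsh, (taux // 2, tauy // 2), axis=(0, 1))
    return H
```

For Nu = Nv = 1 the loop degenerates to H = L(:,:,0,0)·W, which makes the bookkeeping easy to check; the two circular shifts realize the −τ/2 argument of W and the +τ/2 placement of the product in Eq. (4).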

Fig. 1. Angular frequency slice processing.

Note that once $\tilde{L}({t_x},{t_y},{\tau _x},{\tau _y})$ is obtained by taking Ntx×Nty times 2D FFTs over (u, v) axes to the given light field L(tx, ty, u, v), all following procedures including the spatial interpolation to Nx×Ny resolution, multiplication with the complex weight function W, and the accumulation in the hologram plane are performed for each angular frequency slice ${\tilde{I}_{{\tau _x},{\tau _y}}}({{t_x},{t_y}} )= \tilde{L}({t_x},{t_y},{\tau _x},{\tau _y})$, independently. Therefore, the proposed technique allows the parallel processing, which could contribute to the calculation time reduction.

Also note that the proposed technique performs the spatial interpolation along the (tx, ty) axes not on the light field L(tx, ty, u, v) itself but on $\tilde{L}({t_x},{t_y},{\tau _x},{\tau _y})$ after the 2D FFT operations over the (u, v) axes. In the original technique, the light field L(tx, ty, u, v) should be prepared in the same spatial resolution as the final hologram. If the initial light field is given with a low resolution Ntx×Nty (< Nx×Ny), it must first be interpolated to the hologram resolution Nx×Ny and then 2D Fourier transformed, which requires Nx×Ny 2D FFT operations. In the proposed method, however, the required number of 2D FFT operations is kept low, i.e. Ntx×Nty, by performing the interpolation over the (tx, ty) axes after the 2D Fourier transform over the (u, v) axes. The equivalence of performing the spatial interpolation over the (tx, ty) axes before or after the 2D Fourier transform over the (u, v) axes can be shown easily. Representing the spatial interpolation as a linear weighted sum of neighboring samples, the spatially interpolated value of the light field at (txi, tyi) can be written as

$$L({{t_{xi}},{t_{yi}},u,v} )= \sum\limits_{m,n} {{a_{m,n}}L} ({{t_{x,m}},{t_{y,n}},u,v} ),$$
where (tx,m, ty,n) ∈ {neighbors of (txi, tyi)}, and am,n is the weight which depends on the linear interpolation method. By taking the 2D Fourier transform over the (u, v) axes of both sides of Eq. (5), we obtain
$$\begin{aligned}\tilde{L}({{t_{xi}},{t_{yi}},{\tau_x},{\tau_y}} )&= \int\!\!\!\int {\sum\limits_{m,n} {{a_{m,n}}L} ({{t_{x,m}},{t_{y,n}},u,v} )} {e^{j2\pi ({u{\tau_x} + v{\tau_y}} )}}dudv\\ &= \sum\limits_{m,n} {{a_{m,n}}\tilde{L}({{t_{x,m}},{t_{y,n}},{\tau_x},{\tau_y}} )} , \end{aligned}$$
which shows that the order of the 2D Fourier transform over the (u, v) axes and the spatial interpolation over the (tx, ty) axes is interchangeable.
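This interchangeability is also easy to confirm numerically. The sketch below uses a crude 2× linear upsampler of our own devising along the (tx, ty) axes and checks that interpolating before or after the FFT over the (u, v) axes gives the same result, as in Eq. (6):

```python
import numpy as np

def upsample2(a):
    """Crude 2x linear interpolation along the first two (tx, ty) axes:
    repeat every sample, then replace each odd row/column by the average
    of its even neighbors (an illustrative linear interpolator)."""
    a = np.repeat(np.repeat(a, 2, axis=0), 2, axis=1)
    a = a.astype(complex)
    a[1:-1:2] = 0.5 * (a[0:-2:2] + a[2::2])
    a[:, 1:-1:2] = 0.5 * (a[:, 0:-2:2] + a[:, 2::2])
    return a

rng = np.random.default_rng(0)
L = rng.standard_normal((8, 8, 16, 16))      # toy light field L(tx, ty, u, v)

A = np.fft.fft2(upsample2(L), axes=(2, 3))   # interpolate, then FFT over (u, v)
B = upsample2(np.fft.fft2(L, axes=(2, 3)))   # FFT over (u, v), then interpolate
print(np.allclose(A, B))                     # the two orders agree
```

Because both the FFT and any linear interpolation are linear operators acting on disjoint axes of the 4D array, the agreement is exact up to floating-point rounding.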

Finally, it is noted that, for a given light field data, the synthesized hologram and its reconstruction depend on the spatial interpolation method used in the proposed angular-frequency-slice-based processing. The maximum effective spatial resolution of the reconstruction is, however, still given by the spatial resolution of the original light field before the spatial interpolation. An extensive analysis of the effect of the interpolation method on the wavefront reconstructed from the synthesized hologram and the search for the optimum interpolation method are left for future research. All demonstrations presented in this paper were obtained using a simple two-dimensional linear interpolation method in our implementation.

2.3 Hologram tiling

The proposed angular-frequency-slice-based processing also allows tiled processing of the hologram. Figure 2 shows the area in the hologram plane which is affected by a single tile of the light field data. Suppose that the light field tile L(tx, ty, u, v) is given for a spatial range (tx,min, ty,min) < (tx, ty) < (tx,max, ty,max) with sampling intervals Δtx, Δty and for a directional range (umin, vmin) < (u, v) < (umax, vmax) with sampling intervals Δu and Δv. Then its Fourier transform over the u, v axes $\tilde{L}({t_x},{t_y},{\tau _x},{\tau _y})$ has an angular frequency range (-1/(2Δu), -1/(2Δv)) < (τx, τy) < (1/(2Δu), 1/(2Δv)). Equation (4) indicates that this light field data contributes to the hologram H(x, y) only in a limited spatial range (tx,min- 1/(4Δu), ty,min- 1/(4Δv)) < (x, y) < (tx,max+ 1/(4Δu), ty,max+ 1/(4Δv)) in the hologram plane, as illustrated in Fig. 2.

Fig. 2. Spatial extent of the hologram corresponding to a given light field.

Since the spatial range (x, y) of the hologram affected by a light field tile is limited around the spatial range (tx, ty) of that tile, tiled processing is possible. Figure 3 shows the concept. The whole spatial range (tx, ty) is divided into non-overlapping spatial tiles in the preparation of the light field data. In the hologram synthesis, each spatial tile of the light field data is processed to produce the corresponding hologram tile around its spatial range. The produced hologram tiles are accumulated with overlapping areas, populating the whole hologram area as shown in Fig. 3. The spatial range of each light field tile can be determined considering the computational capacity of the processing unit. Since the processing of each light field tile is independent, the proposed tiled processing enables the synthesis of holograms with arbitrary pixel resolution beyond the limit of a single processing unit.
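The extent and overlap implied by Fig. 2 reduce to one line of arithmetic. Below is a small helper (hypothetical naming, our own) that returns the hologram range affected by one light field tile; with an angular sampling pitch Δu = 1/(NuΔx), the 1/(4Δu) margin equals NuΔx/4, i.e. Nu/4 hologram pixels on each side of the tile.

```python
def hologram_tile_extent(t_min, t_max, delta_u):
    """Hologram-plane range affected by a light-field tile spanning
    [t_min, t_max] along one axis (Fig. 2): the tile's spatial range
    grown by 1/(4*delta_u) on each side."""
    margin = 1.0 / (4.0 * delta_u)
    return t_min - margin, t_max + margin

# Example: Nu = 64 angular samples and a 4-um hologram pixel pitch give
# delta_u = 1/(Nu*dx); the margin is then Nu*dx/4 = Nu/4 = 16 pixels.
Nu, dx = 64, 4e-6
lo, hi = hologram_tile_extent(0.0, 1e-3, 1.0 / (Nu * dx))
print((0.0 - lo) / dx)
```

The example numbers match the ±Nu/4-pixel overlap used for the verification in section 3.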

Fig. 3. Hologram tiling process.

2.4 Light field sampling and the depth range of the hologram

One advantage of the non-hogel-based CGH technique over the conventional hogel-based techniques is that the 3D scene reconstructed by the non-hogel-based CGH has a higher spatial resolution, limited only by the NA of the hologram. This high resolution is enabled by the fact that the non-hogel-based CGH is free from the tradeoff between the spatial resolution and the angular resolution. In this paper, however, it is considered that the 3D scene can have a lower spatial resolution, i.e. lower than the NA limit of the hologram but still plausible to the human visual system. In this case, the tradeoff-free feature of the non-hogel-based CGH allows more angular samples of the light field at the same spatial resolution, producing a longer depth range of the reconstruction than the conventional hogel-based techniques.

Suppose that a hologram H(x,y) has Nx×Ny pixel resolution with pixel pitches Δx and Δy. In the proposed technique, the light field L(tx,ty,u,v) is assumed to be spatially sampled with the sampling intervals Δtx=MxΔx and Δty=MyΔy, which correspond to the spatial bandwidths of the 3D scene Bx=1/Δtx and By=1/Δty. The angular range of the light field is determined to fill the whole viewing angle, or diffraction range, of the hologram. With the hologram pixel pitches Δx and Δy, the viewing angle of the hologram reconstruction is |θx|<sin-1(λ/2Δx) and |θy|<sin-1(λ/2Δy), which gives the angular range of the light field |u|=|sinθx/λ|<1/(2Δx) and |v|=|sinθy/λ|<1/(2Δy). If the numbers of the light field samples in the angular u, v directions are Nu and Nv, then the angular sampling pitch is determined to be

$$\Delta u = \frac{1}{{{N_u}\Delta x}},\quad \Delta v = \frac{1}{{{N_v}\Delta y}}.$$
In the light field sampling, the angular sampling pitches Δu and Δv determine the maximum depth at a given spatial bandwidth. Suppose that a flat image of bandwidth Bx and By is located at a depth z in the 3D scene. Since the angular interval Δθx, Δθy corresponds to the spatial interval |z|Δθx, |z|Δθy in the z plane, the maximum depth |z| is limited by |z|Δθx≤1/Bx and |z|Δθy≤1/By, or
$$|z|\le \min \left( {\frac{1}{{\Delta {\theta_x}{B_x}}},\frac{1}{{\Delta {\theta_y}{B_y}}}} \right) = \min \left( {\frac{{{M_x}\Delta x}}{{\lambda \Delta u}},\frac{{{M_y}\Delta y}}{{\lambda \Delta v}}} \right),$$
where the relations u = sinθx/λ ≈ θx/λ, v = sinθy/λ ≈ θy/λ, Bx=1/Δtx=1/(MxΔx), and By=1/Δty=1/(MyΔy) are used. By substituting Eq. (7) into Eq. (8), the maximum depth can be represented using the numbers of angular samples Nu and Nv by
$$|z|\le \min \left( {\frac{{{N_u}{M_x}\Delta {x^2}}}{\lambda },\frac{{{N_v}{M_y}\Delta {y^2}}}{\lambda }} \right).$$
The enhanced depth range of the non-hogel-based CGH comes from the fact that its tradeoff-free feature enables larger Nu and Nv at the same spatial sampling pitch of the light field Δtx=MxΔx and Δty=MyΔy and the same hologram sampling pitch Δx and Δy.

In the conventional hogel-based techniques, Nu×Nv hologram pixels constitute a single hogel. Since each hogel acts as a single 3D pixel of the reconstruction, the physical hogel size NuΔx × NvΔy should be the same as the spatial sampling intervals of the light field Δtx=MxΔx and Δty=MyΔy. Therefore, in the conventional hogel-based techniques, the numbers of angular samples Nu and Nv are limited to Nu=Mx and Nv=My. From Eq. (9), the corresponding maximum depth of the conventional hogel-based techniques is given by

$$|z|\le \min \left( {\frac{{M_x^2\Delta {x^2}}}{\lambda },\frac{{M_y^2\Delta {y^2}}}{\lambda }} \right).$$
In contrast, in the non-hogel-based CGH, the numbers of angular samples Nu and Nv are not limited by the hogel size, i.e. by Mx and My, but can be increased arbitrarily. Therefore, by increasing Nu and Nv beyond Mx and My, the depth range can be enhanced following Eq. (9).
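The depth limits of Eqs. (9) and (10) can be evaluated in a few lines. The snippet below (our own function names) uses, as an example, parameter values that also appear in section 3: Nu = 64 angular samples, an interpolation factor M = 16, a 4 µm hologram pixel pitch, and λ = 532 nm.

```python
def depth_limit_non_hogel(n_ang, m, pitch, wavelength):
    """One-axis depth limit of Eq. (9): |z| <= Nu * Mx * dx^2 / lambda."""
    return n_ang * m * pitch**2 / wavelength

def depth_limit_hogel(m, pitch, wavelength):
    """One-axis depth limit of Eq. (10): |z| <= Mx^2 * dx^2 / lambda
    (the hogel case, where Nu is forced to equal Mx)."""
    return m**2 * pitch**2 / wavelength

z_nh = depth_limit_non_hogel(64, 16, 4e-6, 532e-9)  # about 3.1 cm
z_h = depth_limit_hogel(16, 4e-6, 532e-9)           # about 7.7 mm
print(z_nh, z_h)
```

With Nu = 64 > Mx = 16, the non-hogel depth range is Nu/Mx = 4 times that of the hogel-based case, exactly the ratio of Eq. (9) to Eq. (10).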

Note that increasing Nu and Nv requires rendering more orthographic view images with finer angular sampling pitches Δu and Δv, which increases the computational load and time in the preparation of the light field data. Once the light field data is prepared with Nu and Nv larger than Mx and My, however, the non-hogel-based CGH utilizes all the data, increasing the depth range, while the conventional hogel-based CGH techniques must down-sample it to Mx and My to fit the hogel size, limiting the depth range.

Also note that even though Eq. (9), deduced from the light field sampling theory, shows a linear increase of the maximum depth proportional to the numbers of angular samples Nu and Nv, there is an upper limit of the depth range imposed by the NA of the hologram itself. For spatial transverse spot sizes Δtx=MxΔx and Δty=MyΔy, the required NA is about λ/(2Δtx)=λ/(2MxΔx) or λ/(2Δty)=λ/(2MyΔy) [21]. With the physical size of the hologram NxΔx × NyΔy, the maximum depth corresponding to this NA is |z|≤NxΔx/(2NA) and |z|≤NyΔy/(2NA), or |z|≤min(NxMxΔx2/λ, NyMyΔy2/λ). Comparison with Eq. (9) shows that the upper limit imposed by the hologram is larger than the light field sampling limit of Eq. (9) by the ratio Nx/Nu or Ny/Nv, which is huge in most cases. Therefore, in most practical cases, the depth range of the reconstruction increases linearly with the numbers of angular samples Nu and Nv as given by Eq. (9).

3. CGH synthesis and verification

The verification results of the speed enhancement using the proposed angular-frequency-slice-based processing and the high resolution hologram synthesis using the proposed hologram tiling are presented in this section. For the speed enhancement verification, light field data consisting of Nu×Nv=40 × 40 orthographic views with Ntx×Nty=300 × 200 pixel resolution was prepared as shown in Fig. 4(a). Holograms were synthesized from this light field using 5 different values of Mx=My=M=1, 2, 4, 8, and 16, which give final hologram pixel resolutions of Nx×Ny=MNtx×MNty=300 × 200, 600 × 400, 1200 × 800, 2400 × 1600, and 4800 × 3200, respectively. In this speed verification, the hologram tiling was not applied, and each hologram was synthesized as a whole. The conventional processing scheme of the non-hogel-based CGH following Eq. (1) and the proposed angular-frequency-slice-based processing scheme were implemented in the MATLAB environment running on a PC with an Intel i7-8700K CPU, 32GB memory, and a GeForce GTX 1080 Ti GPU.

Fig. 4. Light field and synthesized holograms with different interpolation factor M. (a) Light field data used for the hologram synthesis, (b) synthesized holograms (top two rows) and their numerical reconstructions (bottom two rows).

In the conventional processing scheme, the light field L(tx, ty, u, v) of the Ntx×Nty×Nu×Nv = 300 × 200 × 40 × 40 data size was first spatially interpolated at each (u, v) to MNtx×MNty×Nu×Nv data size, then Fourier transformed along the u, v axes at each interpolated (tx, ty) to obtain $\tilde{L}({t_x},{t_y},{\tau _x},{\tau _y})$ of MNtx×MNty×Nu×Nv data size, and its various slices were finally extracted and accumulated following Eq. (1) with the complex weights W. In the proposed angular-frequency-slice-based processing scheme, the light field L(tx, ty, u, v) of the Ntx×Nty×Nu×Nv = 300 × 200 × 40 × 40 data size was first Fourier transformed along the u, v axes at each (tx, ty) to obtain $\tilde{L}({t_x},{t_y},{\tau _x},{\tau _y})$ of the same Ntx×Nty×Nu×Nv size, then each angular frequency slice ${\tilde{I}_{{\tau _x},{\tau _y}}}({{t_x},{t_y}} )= \tilde{L}({t_x},{t_y},{\tau _x},{\tau _y})$ of the Ntx×Nty resolution was interpolated to MNtx×MNty resolution, and accumulated on the hologram plane with the complex weight distribution W following Eq. (4) and Fig. 1. In both conventional and proposed processing implementations, a random complex field of MNtx×MNty resolution was used as the complex weight distribution W(xc, yc). Figure 4(b) shows the synthesized holograms and their numerical reconstructions. Note that the apparent quality enhancement observed in the numerical reconstructions of the higher pixel resolution holograms, i.e. larger M, is not due to the inherent resolution enhancement of the light field data but due to the smaller speckle grain size relative to the reconstruction window of MNtx×MNty.

Figure 5 and Table 1 show the speed measurement results. Note that ‘LF load’ in Table 1 only denotes the time to read the prepared Nu×Nv=40 × 40 orthographic view image files of Ntx×Nty=300 × 200 pixel resolution, not including their rendering time. Also, in our implementation of the conventional processing scheme, each orthographic view image of Ntx×Nty=300 × 200 pixel resolution is spatially interpolated to MNtx×MNty=(300×M)×(200×M) immediately after being read from file, to populate the corresponding slice of the interpolated light field data array. Thus the first step of the conventional processing scheme, denoted by ‘LF load & interpolation’ in Table 1, is the time to create the interpolated light field data of MNtx×MNty×Nu×Nv=(300×M)×(200×M) × 40 × 40 size in memory from the Nu×Nv image files of Ntx×Nty=300 × 200 pixel resolution on the PC disk. In contrast, the first step of the proposed angular-frequency-slice-based processing scheme, denoted by ‘LF load’, just reads the Nu×Nv image files of Ntx×Nty=300 × 200 pixel resolution from the PC disk without interpolation, loading light field data of Ntx×Nty×Nu×Nv=300 × 200 × 40 × 40 size into memory. The interpolation is conducted in the last step, after the Fourier transform. The last step of the proposed scheme, denoted by ‘Interpolation & weighted superposition’ in Table 1, includes the time to interpolate every angular frequency slice ${\tilde{I}_{{\tau _x},{\tau _y}}}({{t_x},{t_y}} )= \tilde{L}({t_x},{t_y},{\tau _x},{\tau _y})$ of the Fourier transformed light field data, multiply it with the shifted complex weight function W(tx-τx/2, ty-τy/2), and accumulate it in the hologram plane with the corresponding shift following Eq. (4) to complete the final hologram H(x,y) as illustrated in Fig. 1.

Fig. 5. Measured computation time for three implementations of the non-hogel-based CGH technique, i.e. conventional direct implementation of Eq. (1) using CPU, proposed implementation based on the angular frequency slice processing using CPU, and the proposed implementation based on the same algorithm using GPU.


Table 1. Computation time comparison.

In the conventional processing scheme, the spatial interpolation is first applied to the whole light field data, increasing the data size. The Fourier transform and the weighted accumulation are then performed on this enlarged light field data. As shown in Table 1, this makes the processing time of all steps, including the light field interpolation, the Fourier transform, and the weighted accumulation, increase with the interpolation ratio M. The memory requirement also increases with the interpolation ratio M because the whole interpolated light field data is maintained in memory. In our MATLAB implementation, which uses 16 bytes for each complex number, the minimum memory requirement is about MNtx×MNty×Nu×Nv×16=(300×M)×(200×M) × 40 × 40 × 16 bytes to store the complex Fourier transformed and interpolated light field data $\tilde{L}({t_x},{t_y},{\tau _x},{\tau _y})$ in memory, which amounts to 1.43GB, 5.72GB, 22.89GB, 91.55GB, and 366.21GB for M=1, 2, 4, 8, and 16, respectively. With the 32GB physical memory of the PC used in the implementation, the maximum interpolation factor M was limited to 4.

In the proposed angular-frequency-slice-based processing, however, only the original light field data before the spatial interpolation is maintained in memory. The spatial interpolation and the weighted accumulation are performed not on the whole light field but only on a single angular frequency slice at a time. The processed angular frequency slice can then be discarded from memory. This feature not only enhances the processing speed in the same CPU environment, as detailed in Table 1, but also lowers the memory requirement significantly. In our MATLAB implementation, the minimum memory requirement remains about Ntx×Nty×Nu×Nv×16 = 300 × 200 × 40 × 40 × 16 = 1.43GB to store the complex Fourier transformed light field data in memory, regardless of the interpolation factor M=1, 2, 4, 8, or 16. Moreover, the proposed angular-frequency-slice-based processing enables independent and concurrent processing of each angular frequency slice, making it suitable for GPU processing. Table 1 shows the significant speed enhancement of the proposed processing scheme with the GPU implementation.
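The memory figures quoted above follow directly from the array sizes. A sketch with our own function names, assuming 16 bytes per complex double as in the MATLAB implementation (GB here meaning GiB, 2^30 bytes):

```python
def mem_conventional(ntx, nty, nu, nv, m, bytes_per_complex=16):
    """Conventional scheme: the interpolated, Fourier-transformed light
    field of (m*ntx) x (m*nty) x nu x nv complex samples stays in memory."""
    return (m * ntx) * (m * nty) * nu * nv * bytes_per_complex

def mem_proposed(ntx, nty, nu, nv, m, bytes_per_complex=16):
    """Proposed scheme: only the un-interpolated transformed light field
    is kept; slices are interpolated one at a time, so the footprint
    does not depend on the interpolation factor m."""
    return ntx * nty * nu * nv * bytes_per_complex

for m in (1, 2, 4, 8, 16):
    print(m,
          round(mem_conventional(300, 200, 40, 40, m) / 2**30, 2),
          round(mem_proposed(300, 200, 40, 40, m) / 2**30, 2))
```

This reproduces the 1.43/5.72/22.89/91.55/366.21 GB progression of the conventional scheme against a constant 1.43 GB for the proposed one.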

Figures 6 and 7 show the verification results of the proposed hologram tiling for the scalable processing. For the verification, three light field data sets were used: an Ntx×Nty×Nu×Nv = 100 × 100 × 64 × 64 resolution light field of a single pixel point object as shown in Fig. 6(a), a light field of the same resolution of an 8 × 8 pixel square object as shown in Fig. 6(b), and an Ntx×Nty×Nu×Nv = 300 × 200 × 40 × 40 resolution light field of the two-socket object shown in Fig. 6(c). Each light field data set was spatially divided into 2 × 2 tiles, i.e. Lm,n(tx, ty, u, v) with m, n = 1, 2, and the corresponding hologram tiles Hm,n(x,y) were synthesized and combined as explained in section 2.3. In the hologram synthesis of all light field data, the spatial interpolation of the light field was not applied, giving Mx=My=M=1 or Δx=Δy=Δtx=Δty=4µm. As the complex weight distribution W(xc, yc), a constant distribution W(xc, yc) = 1 for the light fields of Figs. 6(a) and 6(b), and a parabolic phase distribution W(xc, yc) = exp[j{π/(λf)}(xc2+yc2)] with λ=532 nm and f=3 cm for the light field of Fig. 6(c), were used instead of the random complex field in order to verify the phase consistency more easily.

Fig. 6. Light field data used for the verification of the proposed hologram tiling algorithm. (a) Single pixel point object, (b) 8 × 8 pixel square object, and (c) two sockets at different depths.

Fig. 7. Hologram tiles generated from the corresponding light field tiles (left), the composite hologram generated by combining the hologram tiles (center), and the original hologram generated from the whole light field without the tiling process (right). Holograms are generated from (a) the light field of the single pixel point object in Fig. 6(a), (b) the light field of the 8 × 8 pixel square object in Fig. 6(b), and (c) the light field of the two-socket object in Fig. 6(c).

Figure 7 shows the synthesized holograms. In Figs. 7(a)–7(c), the left column shows the amplitude and phase of the individual hologram tiles synthesized from the corresponding light field tiles in Fig. 6. The center column shows the hologram combined from those tiles, and the right column shows the hologram synthesized directly from the whole light field without the tiled processing. Note that the hologram tiles in the left column are not simply stitched side by side but are combined with an overlap area as explained in section 2.3. From Figs. 2 and 3, the overlap area extends by ±1/4Δu and ±1/4Δv around the original light field area, which corresponds to ±Nu/4 and ±Nv/4 hologram pixels by Eq. (7). In Fig. 7, the original light field area and the overlap area are indicated in the left column by a green rectangle and orange dashed lines, respectively. The comparison between the holograms synthesized by the tiling process (center column) and those synthesized directly from the whole light field (right column) verifies that the proposed hologram tiling process reproduces the original hologram without error.
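The combination step can be sketched as follows with a hypothetical helper (names are ours). It assumes each tile array already carries its overlap margin on all sides, so overlapping pixels from neighboring tiles simply add, consistent with the summation in section 2.3:

```python
import numpy as np

def combine_tiles(tiles, tile_shape, overlap, full_shape):
    """Sum hologram tiles into the full hologram, overlap regions adding.

    tiles      : dict {(m, n): 2D complex array}; each tile covers its
                 light-field footprint plus `overlap` pixels on each side
    tile_shape : footprint of one light-field tile in hologram pixels
    overlap    : extra pixels on each side (±Nu/4, ±Nv/4 in the text for M=1)
    full_shape : shape of the composite hologram
    """
    H = np.zeros(full_shape, dtype=complex)
    oh, ow = overlap
    th, tw = tile_shape
    for (m, n), tile in tiles.items():
        x0, y0 = m * th - oh, n * tw - ow          # top-left corner incl. overlap
        # clip to the hologram boundary and accumulate (overlaps add coherently)
        xs, ys = max(x0, 0), max(y0, 0)
        xe = min(x0 + tile.shape[0], full_shape[0])
        ye = min(y0 + tile.shape[1], full_shape[1])
        H[xs:xe, ys:ye] += tile[xs - x0:xe - x0, ys - y0:ye - y0]
    return H
```

With zero overlap the routine reduces to plain stitching, which is a convenient sanity check.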

Using the proposed angular-frequency-slice-based processing and the hologram tiling, we demonstrate the synthesis of large pixel resolution holograms. Figure 8 shows the light field data used for a 25K×25K pixel resolution hologram synthesis. This synthesis and its numerical reconstructions demonstrate not only the large pixel resolution but also the enhanced depth range of the non-hogel-based CGH explained in section 2.4. The light field has a Ntx×Nty×Nu×Nv = 1563 × 1563 × 64 × 64 data size with a spatial sampling pitch of 64 µm, i.e. Δtx=Δty=64 µm. In the hologram synthesis, spatial interpolation by a factor of M=16 was performed using the proposed angular-frequency-slice-based processing scheme, resulting in a (1563 × 16)×(1563 × 16) = 25K×25K hologram pixel resolution with a hologram pixel pitch Δx=Δy=4 µm (=Δtx/M=Δty/M). The angular range of the light field was given by the viewing angle of the hologram, i.e. |θx|=|θy|=λ/2Δx=λ/2Δy=3.81° at λ=532 nm, or |u|=|v|=1/2Δx=1/2Δy=1.25 × 10⁵ m⁻¹. With Nu=Nv=64, the angular sampling pitch of the light field was Δθx=Δθy=0.12°, or Δu=Δv=3.9 × 10³ m⁻¹. The full light field of the Ntx×Nty×Nu×Nv = 1563 × 1563 × 64 × 64 data size was divided into 9 × 9 tiles of Ntx×Nty×Nu×Nv = 180 × 180 × 64 × 64 data size (note that the Ntx of the light field tiles in the rightmost column and the Nty of those in the bottom row are smaller, 1563 − 180 × 8 = 123, not 180), and the final hologram was obtained by combining the 9 × 9 hologram tiles of (180×M)×(180×M) = 2880 × 2880 resolution synthesized from the corresponding light field tiles with M=16. A random complex field was used as the complex weight distribution W(xc, yc).
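The tiling arithmetic quoted above is easy to check numerically (all values taken from the text; the variable names are ours):

```python
# Tiling arithmetic for the 25K x 25K synthesis.
Nt, tile, M = 1563, 180, 16       # spatial samples, tile size, interpolation factor
n_tiles = -(-Nt // tile)          # ceiling division -> 9 tiles per axis
last = Nt - (n_tiles - 1) * tile  # remainder tile in the last row/column
print(n_tiles, last, Nt * M)      # 9 123 25008  (25008 pixels ~ "25K")
```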

Fig. 8. Light field used for the 25K×25K resolution hologram and its extended depth range verification. (a) Single orthographic view in the light field, (b) collection of 64 × 64 orthographic views.

For the depth range comparison, a holographic stereogram was also synthesized from the same light field. The full spatial resolution of the light field, Ntx×Nty=1563 × 1563, was used without change in the holographic stereogram synthesis. The number of angular samplings Nu×Nv, however, was reduced from 64 × 64 to 16 × 16 by down-sampling in order to form hogels of 16 × 16 pixel size, giving a holographic stereogram of the same pixel resolution (1563 × 16)×(1563 × 16) = 25K×25K as the hologram synthesized by the non-hogel-based CGH. In the stereogram synthesis, the 16 × 16 pixel resolution perspective view at each spatial position (txo, tyo), i.e. Ipersp(u, v; txo, tyo) = L(txo, tyo, u, v), was multiplied with a random phase distribution of the same resolution and Fourier transformed to generate the corresponding hogel.
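The hogel-wise synthesis described above can be sketched in NumPy as follows (a simplified version under our own FFT-centering and random-phase conventions, not the authors' exact implementation):

```python
import numpy as np

def synthesize_stereogram(L, seed=0):
    """Hogel-based holographic stereogram synthesis (sketch).

    L : light field, shape (Ntx, Nty, Nu, Nv); each perspective view
        L[i, j, :, :] becomes one hogel of Nu x Nv hologram pixels.
    """
    Ntx, Nty, Nu, Nv = L.shape
    rng = np.random.default_rng(seed)
    H = np.zeros((Ntx * Nu, Nty * Nv), dtype=complex)
    for i in range(Ntx):
        for j in range(Nty):
            view = L[i, j]                                     # perspective view at (tx, ty)
            phase = np.exp(2j * np.pi * rng.random((Nu, Nv)))  # random phase mask
            # Fourier transform the phase-masked view into the hogel
            hogel = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(view * phase)))
            H[i * Nu:(i + 1) * Nu, j * Nv:(j + 1) * Nv] = hogel
    return H
```

Note how each hogel depends only on its own local perspective view, which is exactly the locality that limits the stereogram's depth range compared with the global non-hogel-based processing.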

With these parameters of the non-hogel-based CGH and the holographic stereogram, the maximum depths that can be reconstructed without aliasing are given by Eqs. (9) and (10) to be |z|=30.8 mm for the non-hogel-based CGH and |z|=7.7 mm for the holographic stereogram. Note that the non-hogel-based CGH is expected to have 4 times larger depth range than the holographic stereogram.
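The quoted depth limits follow directly from Eqs. (9) and (10); a quick numerical check with the stated parameters (variable names are ours):

```python
# Maximum alias-free depths from Eqs. (9) and (10) with the parameters in the text.
lam = 532e-9           # wavelength (m)
dx = 4e-6              # hologram pixel pitch (m)
Nu, M = 64, 16         # angular samples; spatial interpolation factor
hogel = 16             # hogel size of the holographic stereogram (pixels)
z_nonhogel = Nu * M * dx**2 / lam      # Eq. (9)
z_stereogram = hogel**2 * dx**2 / lam  # Eq. (10)
print(round(z_nonhogel * 1e3, 1), round(z_stereogram * 1e3, 1))  # 30.8 7.7 (mm)
```

The ratio of the two limits, Nu·M / hogel² = 64·16/256 = 4, reproduces the factor-of-4 depth range advantage stated above.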

For the clear demonstration of the enhanced depth range, the 3D scene that consists of 16 identical resolution target images located at different depths as shown in Fig. 8(a) was used in the light field generation. The depths of the 16 resolution targets are distributed from -4zo to +4zo spanning the depth range of the non-hogel-based CGH where zo is the maximum depth of the holographic stereogram, i.e. zo=7.7 mm. Figure 8(b) shows some orthographic views in the whole light field data.

Figure 9 shows the amplitudes of the synthesized holographic stereogram and the non-hogel-based hologram. Figure 10 and Visualizations 1 and 2 show their numerical reconstructions at various depths. For the numerical reconstructions, the synthesized 25K×25K resolution holograms were divided into overlapping tiles and the band-limited angular spectrum numerical propagation algorithm [22] was applied to each tile. The tile boundaries for the numerical reconstructions were chosen not to coincide with the tile boundaries used in the hologram synthesis. In Fig. 10, the focused reconstructions of the corresponding resolution target image at depths z=1.0zo, 2.5zo, and 4.0zo are magnified to compare the resolution. The comparison between Figs. 10(a) and 10(b) shows that the reconstructions of the non-hogel-based CGH maintain the original spatial resolution of the light field over the entire depth range |z|≤4zo=30.8 mm, while the holographic stereogram preserves the original resolution only within |z|≤zo=7.7 mm, as expected. This confirms the extended depth range advantage of the non-hogel-based CGH explained in section 2.4.
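A minimal sketch of the band-limited angular spectrum propagation used for the tile-wise reconstructions, following the band-limit rule of Ref. [22]; this is our simplified implementation for a square grid, not the authors' code:

```python
import numpy as np

def asm_propagate(u0, z, lam, dx):
    """Band-limited angular spectrum propagation (after Ref. [22], sketch).

    u0 : complex field on a square grid with pixel pitch dx
    z  : propagation distance (m); lam : wavelength (m)
    """
    N = u0.shape[0]
    fx = np.fft.fftfreq(N, d=dx)
    FX, FY = np.meshgrid(fx, fx, indexing="ij")
    under = 1.0 / lam**2 - FX**2 - FY**2
    kernel = np.exp(2j * np.pi * z * np.sqrt(np.maximum(under, 0.0)))
    kernel[under < 0] = 0.0                      # drop evanescent components
    # band limit of Ref. [22]: suppress frequencies that would alias at distance z
    f_limit = 1.0 / (lam * np.sqrt((2 * z / (N * dx))**2 + 1))
    kernel[(np.abs(FX) > f_limit) | (np.abs(FY) > f_limit)] = 0.0
    return np.fft.ifft2(np.fft.fft2(u0) * kernel)
```

Applying this per tile keeps the FFT sizes manageable for a 25K×25K field, mirroring the tiled reconstruction described above.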

Fig. 9. Amplitude of the holograms generated from the light field in Fig. 8. (a) Holographic stereogram, (b) hologram synthesized by the non-hogel-based CGH.

Fig. 10. Numerical reconstructions at different depths of the holograms in Fig. 9. (a) Holographic stereogram (Visualization 1), (b) hologram synthesized by the non-hogel-based CGH (Visualization 2).

Figure 11 shows another light field data set for the large pixel resolution hologram synthesis. A chessboard scene spanning the depth range −4zo ≤ z ≤ 4zo, or −30.8 mm ≤ z ≤ 30.8 mm, was rendered at various orthographic projection angles using the ray tracing software POV-Ray, giving a light field of Ntx×Nty×Nu×Nv = 1563 × 1081 × 64 × 64 data size with a Δtx=Δty=64 µm spatial sampling pitch. The non-hogel-based hologram was synthesized from this light field using the spatial interpolation factor M=16, giving a hologram of (Ntx×M)×(Nty×M) = 25K×17K resolution with a Δx=Δy=4 µm (=Δtx/M=Δty/M) pixel pitch. The angular range and the angular sampling pitch of the light field were determined by the hologram viewing angle and the number of orthographic views to be |u|=|v|=1/2Δx=1/2Δy=1.25 × 10⁵ m⁻¹ and Δu=Δv=2|u|/Nu=2|v|/Nv=3.9 × 10³ m⁻¹, respectively. In the hologram synthesis, this light field was spatially divided into 9 × 7 tiles of Ntx×Nty×Nu×Nv = 180 × 180 × 64 × 64 data size and processed independently. A random complex distribution was used as the complex weight W(xc, yc). Figure 12 shows the amplitude of the synthesized hologram.

Fig. 11. Light field used for the 25K×17K resolution hologram synthesis. (a) Single orthographic view in the light field, (b) collection of 64 × 64 orthographic views.

Fig. 12. Amplitude of the hologram generated from the light field in Fig. 11.

Figure 13 and Visualization 3 show the numerical reconstructions of the synthesized hologram at various depths. Figure 13 confirms that the hologram synthesized by the proposed technique reconstructs the scene with correct focusing effects, at a resolution as high as the spatial resolution of the original light field, over the extended depth range −4.0zo ≤ z ≤ 4.0zo. It is also interesting to observe in the magnified red rectangle portion that the chess pawn (green arrow) and its reflection (yellow arrow) from the chessboard are focused at different depths following their physical depths. Moreover, the chessboard itself (blue arrow) and the reflection of the chess pawn (yellow arrow) are focused independently even though they are superimposed at the same spatial position in the view. Note that this case cannot be handled properly by previously reported techniques which enhance the reconstruction resolution by matching the phase between the hogels using a depth map, because the scene depth cannot be determined uniquely in this superimposed area. The non-hogel-based CGH of this paper generates coherent parabolic wavefronts only from the light field without requiring a depth map, which enables high resolution reconstruction with individual focusing even in this superimposed area.

Fig. 13. Numerical reconstructions at different depths of the hologram in Fig. 12 (Visualization 3).

Finally, several perspective views were extracted from the synthesized hologram to verify the parallax reconstruction. Figure 14 shows the geometry used in the perspective view extraction. The hologram has 25K×17K resolution with a Δx=Δy=4 µm pixel pitch, which corresponds to a physical size of 10 cm × 6.9 cm. The diffraction angle is |θx|=|θy|=λ/2Δx=λ/2Δy=3.81°. The perspective views were extracted by scanning a virtual camera within the diffraction angle. More specifically, a virtual camera with a 2 cm diameter aperture was placed at a 150 cm distance from the hologram as shown in Fig. 14. The transverse position of the camera was scanned from −5 cm to +5 cm in both the horizontal and vertical directions. The camera lens focal length and the lens-to-image-plane distance were adjusted to focus on the hologram plane. For the perspective view extraction at a given camera location, the hologram field is numerically propagated to the camera lens plane, clipped by the lens aperture, multiplied by the lens function, and again numerically propagated to the image plane.
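The four-step extraction pipeline can be sketched as below. This is a simplified paraxial version in which a Fresnel transfer-function propagator stands in for the numerical propagation; all names and the geometry parameters are our assumptions:

```python
import numpy as np

def extract_view(H, cam_xy, z_cam, f, z_img, lam, dx, aperture_d):
    """Perspective-view extraction from a hologram field (sketch).

    H          : hologram field (square grid); dx : pixel pitch; lam : wavelength
    cam_xy     : (x, y) camera centre on the lens plane (m)
    z_cam      : hologram-to-lens distance; f : lens focal length
    z_img      : lens-to-image-plane distance; aperture_d : aperture diameter
    """
    def fresnel(u, z):
        # Fresnel propagation via the transfer-function method (paraxial)
        N = u.shape[0]
        fx = np.fft.fftfreq(N, d=dx)
        FX, FY = np.meshgrid(fx, fx, indexing="ij")
        Hk = np.exp(-1j * np.pi * lam * z * (FX**2 + FY**2))
        return np.fft.ifft2(np.fft.fft2(u) * Hk)

    N = H.shape[0]
    x = (np.arange(N) - N / 2) * dx
    X, Y = np.meshgrid(x, x, indexing="ij")

    u_lens = fresnel(H, z_cam)                                   # 1) to lens plane
    pupil = (X - cam_xy[0])**2 + (Y - cam_xy[1])**2 <= (aperture_d / 2)**2
    u_lens = u_lens * pupil                                      # 2) clip by aperture
    u_lens = u_lens * np.exp(-1j * np.pi / (lam * f) * (X**2 + Y**2))  # 3) lens phase
    return fresnel(u_lens, z_img)                                # 4) to image plane
```

Scanning `cam_xy` over a grid of positions reproduces the parallax scan described in the text; the small pupil relative to the full hologram aperture is also what enlarges the depth of focus and speckle grains noted below.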

Fig. 14. Geometry used in the perspective view extraction from the hologram in Fig. 12.

The extracted perspective views are shown in Fig. 15 and Visualization 4. Unlike the numerical reconstructions shown in Fig. 13 which use the full hologram NA, the 2 cm aperture of the virtual camera located at 150 cm distance significantly limits the NA, increasing the depth of focus of the extracted views. The enlarged speckle grains shown in Fig. 15 are also due to the limited NA of the virtual camera. In spite of the enlarged speckle grain size, Fig. 15 and Visualization 4 clearly demonstrate the reproduction of the horizontal and vertical parallax of the 3D scene as expected.

Fig. 15. Perspective views extracted from the hologram in Fig. 12 (Visualization 4).

Figure 16 shows a magnified portion of the extracted perspective views. It is observed from Fig. 16 and Visualization 4 that the relative positions of the chess pawn and its reflection with respect to the chessboard change in opposite directions, which again verifies the correct parallax reproduction even in this case where the usual depth-map-based techniques are not applicable because the depth map is not uniquely determined.

Fig. 16. Magnified portion of the perspective views extracted from the hologram in Fig. 12.

4. Conclusion

In this paper, we proposed an efficient calculation scheme for the non-hogel-based CGH technique, which synthesizes a hologram from light field data without requiring depth map information. Unlike usual holographic stereograms, which generate each hogel only from the corresponding local view, the non-hogel-based CGH generates the hologram by globally processing the light field data, producing a continuous parabolic wavefront for each object point with the desired initial phase distribution determined by the carrier wave used in the calculation. The global processing, however, increases the computational load of the non-hogel-based CGH significantly, prohibiting the synthesis of high-resolution holograms. The angular-frequency-slice-based processing scheme proposed in this paper enables independent and concurrent processing of each angular frequency slice of the light field, making it suitable for GPU implementation with a reduced memory requirement. The tiled processing scheme was also proposed, enabling the synthesis of arbitrarily large resolution holograms without being limited by the physical memory of the processing device. Using these two techniques, the synthesis of a 25K×25K resolution hologram from light field data having 64 × 64 orthographic views of 1563 × 1563 resolution was successfully demonstrated. The enhanced depth range and the realistic depth and parallax reproduction of object reflections in the hologram reconstruction were also demonstrated with the synthesized high pixel resolution hologram.

Funding

National Research Foundation of Korea (NRF-2017R1A2B2011084).

Disclosures

The authors declare no conflicts of interest.

References

1. J.-H. Park, “Recent progress in computer generated holography for three-dimensional scene,” J. Inf. Disp. 18(1), 1–12 (2017). [CrossRef]  

2. S.-C. Kim and E.-S. Kim, “Effective generation of digital holograms of three-dimensional objects using a novel look-up table method,” Appl. Opt. 47(19), D55–D62 (2008). [CrossRef]  

3. T. Shimobaba, H. Nakayama, N. Masuda, and T. Ito, “Rapid calculation algorithm of Fresnel computer-generated-hologram using look-up table and wavefront-recording plane methods for three-dimensional display,” Opt. Express 18(19), 19504–19509 (2010). [CrossRef]  

4. L. Wei and Y. Sakamoto, “Fast calculation method with foveated rendering for computer-generated holograms using an angle-changeable ray-tracing method,” Appl. Opt. 58(5), A258–A266 (2019). [CrossRef]  

5. H. Zhang, L. Cao, and G. Jin, “Computer-generated hologram with occlusion effect using layer-based processing,” Appl. Opt. 56(13), F138–F143 (2017). [CrossRef]  

6. H. Kim, J. Hahn, and B. Lee, “Mathematical modeling of triangle-mesh-modeled three-dimensional surface objects for digital holography,” Appl. Opt. 47(19), D117–D127 (2008). [CrossRef]  

7. K. Matsushima and S. Nakahara, “Extremely high-definition full-parallax computer-generated hologram created by the polygon-based method,” Appl. Opt. 48(34), H54–H63 (2009). [CrossRef]  

8. J. H. Park, S. B. Kim, H. J. Yeom, H. J. Kim, H. J. Zhang, B. N. Li, Y. M. Ji, S. H. Kim, and S. B. Ko, “Continuous shading and its fast update in fully analytic triangular-mesh-based computer generated hologram,” Opt. Express 23(26), 33893–33901 (2015). [CrossRef]  

9. T. Yatagai, “Stereoscopic approach to 3-D display using computer-generated holograms,” Appl. Opt. 15(11), 2722–2729 (1976). [CrossRef]  

10. M. Yamaguchi, “Light-field and holographic three-dimensional displays [Invited],” J. Opt. Soc. Am. A 33(12), 2348–2364 (2016). [CrossRef]  

11. R. Ng, M. Levoy, M. Bredif, G. Duval, M. Horowitz, and P. Hanrahan, “Light field photography with a handheld plenoptic camera,” Stanford Tech. Rep. CTSR 2005–02 (Stanford University, 2005).

12. S.-K. Lee, S.-I. Hong, Y.-S. Kim, H.-G. Lim, N.-Y. Jo, and J.-H. Park, “Hologram synthesis of three-dimensional real objects using portable integral imaging camera,” Opt. Express 21(20), 23662–23670 (2013). [CrossRef]  

13. S. Igarashi, T. Nakamura, K. Matsushima, and M. Yamaguchi, “Efficient tiled calculation of over-10-gigapixel holograms using ray-wavefront conversion,” Opt. Express 26(8), 10773–10786 (2018). [CrossRef]  

14. N. Padmanaban, Y. Peng, and G. Wetzstein, “Holographic near-eye displays based on overlap-add stereograms,” ACM Trans. Graph. 38(6), 1–13 (2019). [CrossRef]  

15. K. Wakunami and M. Yamaguchi, “Calculation for computer generated hologram using ray-sampling plane,” Opt. Express 19(10), 9086–9101 (2011). [CrossRef]  

16. H. Kang, E. Stoykova, and H. Yoshikawa, “Fast phase-added stereogram algorithm for generation of photorealistic 3D content,” Appl. Opt. 55(3), A135 (2016). [CrossRef]  

17. H. Zhang, Y. Zhao, L. Cao, and G. Jin, “Fully computed holographic stereogram based algorithm for computer-generated holograms with accurate depth cues,” Opt. Express 23(4), 3901–3913 (2015). [CrossRef]  

18. H. Zhang, Y. Zhao, L. Cao, and G. Jin, “Layered holographic stereogram based on inverse Fresnel diffraction,” Appl. Opt. 55(3), A154–A159 (2016). [CrossRef]  

19. H. Zhang, L. Cao, and G. Jin, “Three-dimensional computer-generated hologram with Fourier domain segmentation,” Opt. Express 27(8), 11689–11697 (2019). [CrossRef]  

20. J.-H. Park and M. Askari, “Non-hogel-based computer generated hologram from light field using complex field recovery technique from Wigner distribution function,” Opt. Express 27(3), 2562–2574 (2019). [CrossRef]  

21. J. W. Goodman, Introduction to Fourier Optics (McGraw-Hill, 1996).

22. K. Matsushima and T. Shimobaba, “Band-limited angular spectrum method for numerical simulation of free-space propagation in far and near fields,” Opt. Express 17(22), 19662–19673 (2009). [CrossRef]  

Supplementary Material (4)

Visualization 1: Numerical propagation results of the 25K×25K resolution hologram of the 4 × 4 resolution target objects to different depths. The hologram was synthesized from the light field data using the holographic stereogram technique.
Visualization 2: Numerical propagation results of the 25K×25K resolution hologram of the 4 × 4 resolution target objects to different depths. The hologram was synthesized from the light field data using the non-hogel-based CGH technique.
Visualization 3: Numerical propagation results of the 25K×17K resolution hologram of the chessboard object to different depths. The hologram was synthesized from the light field data using the non-hogel-based CGH technique.
Visualization 4: Reconstructions of the 25K×17K resolution hologram of the chessboard object observed at different virtual camera positions. The hologram was synthesized from the light field data using the non-hogel-based CGH technique.



Figures (16)

Fig. 1. Angular frequency slice processing.
Fig. 2. Spatial extent of the hologram corresponding to a given light field.
Fig. 3. Hologram tiling process.
Fig. 4. Light field and synthesized holograms with different interpolation factor M. (a) Light field data used for the hologram synthesis, (b) synthesized holograms (top two rows) and their numerical reconstructions (bottom two rows).
Fig. 5. Measured computation time for three implementations of the non-hogel-based CGH technique, i.e. the conventional direct implementation of Eq. (1) using the CPU, the proposed implementation based on the angular frequency slice processing using the CPU, and the proposed implementation of the same algorithm using the GPU.

Tables (1)

Table 1. Computation time comparison.

Equations (10)

$$H(x,y)=\iint \tilde{L}\left(\frac{x+x_c}{2},\frac{y+y_c}{2},\,x-x_c,\,y-y_c\right)W(x_c,y_c)\,dx_c\,dy_c, \tag{1}$$
$$H(x,y)=\iint H_{\tau_x,\tau_y}(x,y)\,d\tau_x\,d\tau_y, \tag{2}$$
$$H_{\tau_x,\tau_y}(x,y)=\tilde{L}\left(x-\frac{\tau_x}{2},\,y-\frac{\tau_y}{2},\,\tau_x,\tau_y\right)W(x-\tau_x,\,y-\tau_y), \tag{3}$$
$$H_{\tau_x,\tau_y}\left(t_x+\frac{\tau_x}{2},\,t_y+\frac{\tau_y}{2}\right)=\tilde{L}(t_x,t_y,\tau_x,\tau_y)\,W\left(t_x-\frac{\tau_x}{2},\,t_y-\frac{\tau_y}{2}\right)=\tilde{I}_{\tau_x,\tau_y}(t_x,t_y)\,W\left(t_x-\frac{\tau_x}{2},\,t_y-\frac{\tau_y}{2}\right), \tag{4}$$
$$L(t_{xi},t_{yi},u,v)=\sum_{m,n}a_{m,n}\,L(t_{x,m},t_{y,n},u,v), \tag{5}$$
$$\tilde{L}(t_{xi},t_{yi},\tau_x,\tau_y)=\sum_{m,n}a_{m,n}\iint L(t_{x,m},t_{y,n},u,v)\,e^{-j2\pi(u\tau_x+v\tau_y)}\,du\,dv=\sum_{m,n}a_{m,n}\,\tilde{L}(t_{x,m},t_{y,n},\tau_x,\tau_y), \tag{6}$$
$$\Delta_u=\frac{1}{N_u\Delta_x},\qquad \Delta_v=\frac{1}{N_v\Delta_y}. \tag{7}$$
$$|z|\le\min\left(\frac{1}{\Delta_{\theta x}B_x},\frac{1}{\Delta_{\theta y}B_y}\right)=\min\left(\frac{M_x\Delta_x}{\lambda\Delta_u},\frac{M_y\Delta_y}{\lambda\Delta_v}\right), \tag{8}$$
$$|z|\le\min\left(\frac{N_uM_x\Delta_x^2}{\lambda},\frac{N_vM_y\Delta_y^2}{\lambda}\right). \tag{9}$$
$$|z|\le\min\left(\frac{M_x^2\Delta_x^2}{\lambda},\frac{M_y^2\Delta_y^2}{\lambda}\right). \tag{10}$$