Non-hogel-based computer generated hologram from light field using complex field recovery technique from Wigner distribution function

Open Access

Abstract

We propose a novel method that synthesizes computer-generated holograms from light field data. The light field, or ray space, is the spatio-angular distribution of the light rays coming from a three-dimensional scene; it can also be represented as a large number of views from different observation directions. The proposed method synthesizes a hologram by applying to the light field data the technique of recovering a complex field from its Wigner distribution function. Unlike conventional approaches, the proposed method synthesizes holograms without the hogel configuration, generating a converging parabolic wave with a continuous wavefront for each object point. The proposed method does not trade spatial resolution for angular resolution as conventional hogel-based approaches do. Moreover, the proposed method works not only for random-phase light fields, as conventional approaches do, but also for arbitrary phase distributions with corresponding carrier waves. Therefore, the proposed method is useful for synthesizing holographic content for a wide range of applications. The proposed method is verified by simulations and optical experiments, showing successful reconstruction of three-dimensional objects.

© 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

A computer generated hologram (CGH) is a complex amplitude distribution, or complex field, of light which is synthesized computationally [1–4]. CGH is vital for generating content for holographic three-dimensional (3D) displays, holographic projectors, and holographic printers. In CGH techniques, the 3D data of objects is represented in various forms and the corresponding complex field is synthesized. One form of 3D data representation is the light field. The light field, also called ray space, is the spatio-angular distribution of the light rays coming from 3D objects [5]; equivalently, it is a collection of a large number of views of the 3D objects observed from slightly different directions. Light field based CGH techniques have the advantage that light field data can be readily obtained for computer graphics objects using rendering software, or for real 3D scenes using light field cameras [6,7]. Easier processing of occlusion and of surface angular reflectance is another advantage of light field based CGH techniques.

Most light field based CGH techniques divide the hologram plane into many non-overlapping sub-areas called hogels [1,8–10]. The complex field within each hogel is then calculated such that it reconstructs the light rays, or the corresponding view, from the hogel position. In a simple implementation, the view from the hogel position is multiplied by a random phase distribution and then Fourier transformed to obtain the corresponding complex field. Various sophisticated techniques have been developed, including the introduction of a ray sampling plane to reduce the spatial resolution loss for large-depth objects [11,12], consideration of the depth distribution in each view to represent the 3D objects more realistically [13,14], and iterative optimization of the complex field to match its Wigner distribution function (WDF), smoothed over the hogel area, to the corresponding view [15]. However, these techniques still rely on the hogel configuration and are thus accompanied by its drawbacks.

One drawback of the hogel based approaches is that they trade the spatial resolution of the reconstruction for the angular resolution [15]. As an individual hogel functions as a 3D pixel of the reconstruction, a smaller hogel size is desirable for better spatial resolution. Conversely, the angular resolution of the reconstructed light rays is determined by the number of hologram sampling points inside each hogel. Therefore, for a given hologram resolution, i.e., the total number of sampling points in the hologram plane, there is a trade-off between the spatial and angular resolution of the reconstruction.

Another drawback of the hogel based approaches is that they reconstruct not a continuous wavefront for each object point but discrete light rays converging toward it from multiple hogels. The converging light rays only approximate the ideal spherical or parabolic wave, and thus the reconstruction is not the physically exact complex field of the objects. Moreover, the phase relation between the light rays reconstructed from different hogels is usually not considered. Although there are approaches that create the ideal parabolic wave [13] or converging rays with phase matching [16–18], they require explicit depth information for each object point.

Note that there exists another type of light field based CGH technique which does not rely on the hogel configuration [19,20]. These techniques reconstruct a continuous parabolic wave for each object point. However, they can only reconstruct 3D objects with a uniform surface phase distribution, and thus the CGH synthesis of 3D objects with an arbitrary surface phase distribution or with arbitrary carrier waves is not possible.

In this paper, we propose a novel light field based CGH technique. The proposed technique processes the light field globally, rather than processing it locally at each hogel position without any global relationship across hogels. It generates a continuous wavefront for each object point and is free from the spatial and angular resolution tradeoff of the conventional techniques. The proposed technique can also handle an arbitrary object surface phase distribution or arbitrary carrier waves without explicitly extracting depth information of the scene. Therefore, the proposed technique is versatile in various applications which require different types of carrier waves. The proposed method is, in essence, the application to the light field of the technique that recovers a complex field from its WDF. Unlike in the WDF, the absence of cross terms in the light field enables the carrier wave control in the synthesized CGH. In the following, the principle of the proposed method is explained with an analysis of the required sampling rate of the light field. Numerical simulation and optical experimental results are also presented for the verification of the proposed method.

2. Principle

2.1 Proposed method

The light field in a certain plane, which is assumed to be the hologram plane hereafter, can be represented as a four-dimensional (4D) distribution. Let us denote the real-valued intensity of a light ray that passes through the hologram plane at a spatial position (x,y) with a direction corresponding to a spatial frequency pair (u,v) as L(x,y,u,v). The spatial frequency pair (u,v) is related to the angle pair (θx, θy) that the light ray makes with the x and y axes by sin θx = λu and sin θy = λv, where λ is the wavelength. Considering that each object point is associated with light rays converging at that point, the light field corresponding to 3D objects consisting of numerous object points is given under the paraxial approximation by [21]

$$L(x,y,u,v)=\sum_m a_m^2\,\delta\!\left(u+\frac{x-x_m}{\lambda z_m},\;v+\frac{y-y_m}{\lambda z_m}\right),\tag{1}$$
where the m-th object point with real amplitude $a_m$ is located at $(x_m, y_m, z_m)$.
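As an illustration (a minimal sketch, not the authors' implementation), the light field of Eq. (1) can be sampled on a discrete grid in a 1D (x, u) abstraction; the object points, grid parameters, and nearest-sample binning of the delta ridges below are assumptions for illustration only.

```python
# Sampling the light field of Eq. (1) in a 1D (x, u) abstraction.
# Each object point contributes a ray ridge u = -(x - x_m) / (lambda * z_m),
# realized here by nearest-sample binning (a sketch, not the authors' code).
import numpy as np

wavelength = 532e-9            # wavelength (m)
dx = 6e-6                      # spatial sampling interval (m)
Nx = 300                       # number of spatial samples
Nu = 40                        # number of angular (spatial frequency) samples
du = 1.0 / (Nu * dx)           # angular sampling interval, see Eq. (13) below

x = (np.arange(Nx) - Nx // 2) * dx
u = (np.arange(Nu) - Nu // 2) * du

# hypothetical object points: (amplitude a_m, lateral position x_m, depth z_m)
points = [(1.0, -0.2e-3, 2e-3), (0.8, 0.3e-3, -3e-3)]

L = np.zeros((Nx, Nu))                                # light field L(x, u)
for a_m, x_m, z_m in points:
    u_ray = -(x - x_m) / (wavelength * z_m)           # ray direction at each x
    idx = np.round((u_ray - u[0]) / du).astype(int)
    ok = (idx >= 0) & (idx < Nu)                      # keep rays inside the u range
    L[np.arange(Nx)[ok], idx[ok]] += a_m ** 2

print(L.shape, L.sum())
```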

Given the light field data L(x,y,u,v) of 3D objects, the proposed method synthesizes the hologram H(x,y) by

$$H(x,y)=\iint H(x,y;x_c,y_c)\,W(x_c,y_c)\,dx_c\,dy_c,\tag{2}$$
where H(x,y;xc,yc) is given by
$$H(x,y;x_c,y_c)=\iint L\!\left(\frac{x+x_c}{2},\,\frac{y+y_c}{2},\,u,v\right)e^{\,j2\pi\{u(x-x_c)+v(y-y_c)\}}\,du\,dv.\tag{3}$$
In Eq. (2), W(xc, yc) is a complex-valued weight related to the desired carrier wave, which will be explained later.

For the verification of the proposed method given by Eqs. (2) and (3), we substitute Eq. (1) into Eq. (3), which gives

$$\begin{aligned}H(x,y;x_c,y_c)&=\iint\sum_m a_m^2\,\delta\!\left(u+\frac{\tfrac{x+x_c}{2}-x_m}{\lambda z_m},\;v+\frac{\tfrac{y+y_c}{2}-y_m}{\lambda z_m}\right)e^{\,j2\pi\{u(x-x_c)+v(y-y_c)\}}\,du\,dv\\&=\sum_m a_m^2\exp\!\left[-j\frac{\pi}{\lambda z_m}\left\{(x-x_m)^2+(y-y_m)^2\right\}\right]\exp\!\left[\,j\frac{\pi}{\lambda z_m}\left\{(x_c-x_m)^2+(y_c-y_m)^2\right\}\right].\end{aligned}\tag{4}$$
The second row of Eq. (4) is a coherent accumulation of the weighted lens profiles of the voxels excited by a virtual point source at (xc, yc) in the hologram plane. Equivalently, it can be interpreted that each voxel of the 3D objects re-emits the illumination parabolic wave emanating from (xc, yc) in the hologram plane in the form of a backward parabolic wave. Therefore, H(x,y;xc,yc) in Eq. (3) is the complex field of the 3D objects illuminated by a parabolic carrier wave emanating from (xc, yc) in the hologram plane. The final hologram H(x,y) in Eq. (2) is the superposition of H(x,y;xc,yc) with a complex weight distribution W(xc,yc), and thus it is possible to obtain a hologram H(x,y) with an arbitrary carrier wave by controlling W(xc,yc) in Eq. (2).
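The reduction from Eq. (3) to Eq. (4) can be spot-checked numerically. The following 1D sketch (with assumed point parameters and the sign convention used in the equations above) evaluates the delta-sifted integrand of Eq. (3) and compares it with the closed form of Eq. (4).

```python
# 1D sanity check of Eq. (3) -> Eq. (4) for a single object point: the delta of
# Eq. (1) sifts the exponential at u = -((x + x_c)/2 - x_m) / (lambda * z_m),
# which should equal the product of the two parabolic phase factors in Eq. (4).
import numpy as np

wavelength, a_m, x_m, z_m = 532e-9, 1.0, 0.1e-3, 2e-3    # assumed parameters
x = np.linspace(-0.9e-3, 0.9e-3, 301)                    # hologram coordinate
x_c = 0.25e-3                                            # virtual source position

u_sift = -((x + x_c) / 2 - x_m) / (wavelength * z_m)
lhs = a_m ** 2 * np.exp(1j * 2 * np.pi * u_sift * (x - x_c))            # sifted Eq. (3)
rhs = (a_m ** 2
       * np.exp(-1j * np.pi / (wavelength * z_m) * (x - x_m) ** 2)
       * np.exp(1j * np.pi / (wavelength * z_m) * (x_c - x_m) ** 2))    # Eq. (4)

print(np.max(np.abs(lhs - rhs)))   # agrees up to floating-point rounding
```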

Note that H(x,y;xc,yc) reconstructs not a set of discrete light rays, as conventional hogel based approaches do, but a parabolic wave with a continuous wavefront converging at each object point. Also note that the proposed method does not require any explicit depth extraction from the light field. The depth information contained in the light field as in Eq. (1) is used implicitly by Eq. (3) to form a complex field with a continuous wavefront as in Eq. (4). Therefore, the proposed method is free from any errors caused by explicit depth extraction from the light field data. One drawback of the proposed method is that it generates the parabolic wave for each object point, or voxel, with an amplitude $a_m^2$ rather than the actual amplitude $a_m$. This artifact can be remedied by taking the square root of the light field before processing.

2.2 Calculation using Fourier transform

The proposed method given by Eqs. (2) and (3) requires 4D integration over the xc, yc, u, and v axes, which is computationally expensive. However, the implementation of the proposed method can be modified, as explained in this sub-section, to reduce the computational load using the Fourier transform.

By changing the variables x, y, xc, and yc to tx, ty, τx, and τy as

$$x=t_x+\frac{\tau_x}{2},\quad y=t_y+\frac{\tau_y}{2},\quad x_c=t_x-\frac{\tau_x}{2},\quad y_c=t_y-\frac{\tau_y}{2}.\tag{5}$$
H(x,y;xc,yc) in Eq. (3) can be obtained by
$$\begin{aligned}H(x,y;x_c,y_c)&=H\!\left(t_x+\frac{\tau_x}{2},\,t_y+\frac{\tau_y}{2};\,t_x-\frac{\tau_x}{2},\,t_y-\frac{\tau_y}{2}\right)\\&=\iint L(t_x,t_y,u,v)\exp\!\left[\,j2\pi(u\tau_x+v\tau_y)\right]du\,dv\\&=\tilde{L}(t_x,t_y,\tau_x,\tau_y)=\tilde{L}\!\left(\frac{x+x_c}{2},\,\frac{y+y_c}{2},\,x-x_c,\,y-y_c\right),\end{aligned}\tag{6}$$
where L˜(tx,ty,τx,τy) is the 2D inverse Fourier transform of L(tx,ty,u,v) over the u and v axes, which can be obtained by applying the 2D fast Fourier transform (FFT) algorithm to the 4D light field L(tx,ty,u,v). The final hologram H(x,y) in Eq. (2) is then obtained by
$$H(x,y)=\iint \tilde{L}\!\left(\frac{x+x_c}{2},\,\frac{y+y_c}{2},\,x-x_c,\,y-y_c\right)W(x_c,y_c)\,dx_c\,dy_c,\tag{7}$$
which is an accumulation of L˜(tx,ty,τx,τy) with different shifts and complex weights. With this modification given in Eqs. (6) and (7), the computational load can be reduced in comparison with the original computational scheme given in Eqs. (2) and (3). Figure 1 shows the computational scheme of the proposed method in comparison with the conventional hogel based method.

Fig. 1. Computation scheme of the proposed method in comparison with conventional hogel based methods.
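A minimal 1D sketch of this computation scheme is given below, under several assumptions: the object points are hypothetical, the transform over u is written as an explicit DFT matrix for clarity (the FFT is the fast equivalent), and the hologram is accumulated on a grid of pitch Δx/2 to handle the half-sample positions (x + xc)/2; the authors' actual discretization may differ in these details.

```python
# 1D sketch of Eqs. (6) and (7): transform the light field L(t_x, u) over the
# u axis to get L~(t_x, tau), then accumulate its shifted copies with the
# complex weight W(x_c).  Not the authors' code.
import numpy as np

wavelength, dx, Ntx, Nu = 532e-9, 6e-6, 300, 40
du = 1.0 / (Nu * dx)                        # Eq. (13): du * N_u = 1 / dx
tx = (np.arange(Ntx) - Ntx // 2) * dx
u = (np.arange(Nu) - Nu // 2) * du
tau = (np.arange(Nu) - Nu // 2) * dx        # d(tau) = 1 / (N_u * du) = dx

# light field of two hypothetical object points (nearest-sample binning of Eq. (1))
L = np.zeros((Ntx, Nu))
for a_m, x_m, z_m in [(1.0, -0.2e-3, 2e-3), (0.8, 0.3e-3, -3e-3)]:
    idx = np.round((-(tx - x_m) / (wavelength * z_m) - u[0]) / du).astype(int)
    ok = (idx >= 0) & (idx < Nu)
    L[np.arange(Ntx)[ok], idx[ok]] += a_m ** 2

# Eq. (6): L~(t_x, tau) = integral of L(t_x, u) exp(+j 2 pi u tau) du
kernel = np.exp(1j * 2 * np.pi * np.outer(tau, u))      # (N_tau, N_u)
L_tilde = L @ kernel.T * du                             # (N_tx, N_tau)

# Eq. (7): H(x) = sum over x_c of L~((x + x_c)/2, x - x_c) W(x_c)
W = np.exp(1j * 2 * np.pi * np.random.rand(2 * Ntx))    # e.g. a random-phase carrier weight
H = np.zeros(2 * Ntx, dtype=complex)                    # hologram on a dx/2 grid
for n in range(Ntx):                                    # t_x index
    for k in range(Nu):                                 # tau index
        i_x = 2 * n + (k - Nu // 2)                     # x   = t_x + tau / 2
        i_xc = 2 * n - (k - Nu // 2)                    # x_c = t_x - tau / 2
        if 0 <= i_x < 2 * Ntx and 0 <= i_xc < 2 * Ntx:
            H[i_x] += L_tilde[n, k] * W[i_xc]

print(H.shape, np.abs(H).max())
```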

Note that even with this modification, the computational load of the proposed method is still large. With the number of samples along the x, y, u, and v axes denoted by Nx, Ny, Nu, and Nv, respectively, the 2D FFT of the 4D light field in Eq. (6) requires NxNy 2D FFTs of size Nu × Nv, and the accumulation in Eq. (7) requires NuNv accumulations of an Nx × Ny 2D matrix. Overall, the computational load is higher than that of the conventional hogel based approaches, which require only (Nx/Nu)(Ny/Nv) 2D FFTs of size Nu × Nv. The large computational load of the proposed method is caused by the global processing of the light field data, and its reduction is a topic of further research.

2.3 Relation to complex field recovery from its WDF

The proposed CGH synthesis from light field data is closely related to the inverse WDF, i.e., the recovery of a complex field from its WDF. In this sub-section, the proposed method is explained in the context of the WDF to show its physical meaning more clearly. For brevity, a 2D abstraction is used in this sub-section, for instance WDF(tx,u) rather than WDF(tx,ty,u,v). Its extension to the original 4D representation is straightforward.

Suppose that a complex field corresponding to many object points is given by

$$U(x)=\sum_m U_m(x)=\sum_m a_m e^{j\phi_m}\exp\!\left[-j\frac{\pi}{\lambda z_m}(x-x_m)^2\right],\tag{8}$$
where ϕm represents the phase of the m-th object point and Um(x) is the complex field of the m-th object point. The WDF of this complex field is given by [22,23]
$$\begin{aligned}\mathrm{WDF}(t_x,u)&=\int U\!\left(t_x+\frac{\tau_x}{2}\right)U^{*}\!\left(t_x-\frac{\tau_x}{2}\right)e^{-j2\pi u\tau_x}\,d\tau_x\\&=\sum_m\int U_m\!\left(t_x+\frac{\tau_x}{2}\right)U_m^{*}\!\left(t_x-\frac{\tau_x}{2}\right)e^{-j2\pi u\tau_x}\,d\tau_x+\sum_{m,n(\neq m)}\int U_m\!\left(t_x+\frac{\tau_x}{2}\right)U_n^{*}\!\left(t_x-\frac{\tau_x}{2}\right)e^{-j2\pi u\tau_x}\,d\tau_x\\&=\sum_m a_m^2\,\delta\!\left(u+\frac{t_x-x_m}{\lambda z_m}\right)+\sum_{m,n(\neq m)}\int U_m\!\left(t_x+\frac{\tau_x}{2}\right)U_n^{*}\!\left(t_x-\frac{\tau_x}{2}\right)e^{-j2\pi u\tau_x}\,d\tau_x\\&=L(t_x,u)+\sum_{m,n(\neq m)}\int U_m\!\left(t_x+\frac{\tau_x}{2}\right)U_n^{*}\!\left(t_x-\frac{\tau_x}{2}\right)e^{-j2\pi u\tau_x}\,d\tau_x,\end{aligned}\tag{9}$$
where the first term L(tx,u) in the last row of Eq. (9) is the light field given in Eq. (1). The second term in the last row of Eq. (9) represents the cross terms between different object points caused by the quadratic operation of the WDF [22,23]. Recovering the complex field U(x) from its WDF(tx,u) can be performed by taking the inverse Fourier transform of WDF(tx,u) over the u axis and changing variables [22], i.e.,
$$\left.\int \mathrm{WDF}(t_x,u)\,e^{\,j2\pi u\tau_x}\,du\,\right|_{t_x=\frac{x+x_c}{2},\,\tau_x=x-x_c}=U(x)\,U^{*}(x_c)=\sum_m U_m(x)\sum_n U_n^{*}(x_c),\tag{10}$$
which, however, recovers the complex field U(x) only up to a complex constant and does not allow adjusting the phase of individual object points or, equivalently, imposing a desired carrier wave.

The proposed method instead starts from the light field L(tx,u), which does not contain the cross terms. It again takes the inverse Fourier transform of the light field L(tx,u) over the u axis and changes the variables in the same way, i.e.,

$$\begin{aligned}\left.\int L(t_x,u)\,e^{\,j2\pi u\tau_x}\,du\,\right|_{t_x=\frac{x+x_c}{2},\,\tau_x=x-x_c}&=\left.\tilde{L}(t_x,\tau_x)\right|_{t_x=\frac{x+x_c}{2},\,\tau_x=x-x_c}=\sum_m U_m(x)\,U_m^{*}(x_c)\\&=\sum_m a_m^2\exp\!\left[-j\frac{\pi}{\lambda z_m}(x-x_m)^2\right]\exp\!\left[\,j\frac{\pi}{\lambda z_m}(x_c-x_m)^2\right]=H(x;x_c),\end{aligned}\tag{11}$$
which now enables control of the carrier wave by the weighted addition explained in Eqs. (2) and (7). Therefore, the proposed method can be thought of as the application of the standard inverse WDF technique to the light field. The absence of the cross terms in the light field provides the carrier wave control capability.
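The presence of the cross terms in the WDF, and their absence in the light field, can be illustrated numerically. The following 1D sketch (with assumed point parameters and a straightforward discrete WDF; not the authors' code) computes the WDF of a two-point field as in Eq. (9) and compares it with the sum of the individual WDFs, i.e., the light-field part.

```python
# 1D illustration of Eq. (9): WDF(U1 + U2) = light field terms + cross terms,
# whereas the light field itself (sum of the individual WDFs) has no cross terms.
import numpy as np

wavelength, dx, N = 532e-9, 6e-6, 256
x = (np.arange(N) - N // 2) * dx

def chirp(a_m, x_m, z_m):
    # complex field of one object point as in Eq. (8), with phi_m = 0
    return a_m * np.exp(-1j * np.pi / (wavelength * z_m) * (x - x_m) ** 2)

def wdf(U, S=64):
    # discrete WDF: correlate U(t_x + tau/2) U*(t_x - tau/2), then DFT over tau
    Upad = np.pad(U, S)
    shifts = np.arange(-S, S)                          # tau = 2 * shifts * dx
    corr = np.empty((N, 2 * S), dtype=complex)
    for i, s in enumerate(shifts):
        corr[:, i] = Upad[S + np.arange(N) + s] * np.conj(Upad[S + np.arange(N) - s])
    u_axis = (np.arange(2 * S) - S) / (2 * S * 2 * dx)
    kernel = np.exp(-1j * 2 * np.pi * np.outer(2 * shifts * dx, u_axis))
    return (corr @ kernel).real * 2 * dx               # WDF is real up to discretization

U1 = chirp(1.0, -0.2e-3, 2e-3)
U2 = chirp(0.8, 0.3e-3, -3e-3)
W_total = wdf(U1 + U2)             # WDF of the superposed field, Eq. (9)
W_light = wdf(U1) + wdf(U2)        # light field part only (no cross terms)

print("max |light field terms|:", np.abs(W_light).max())
print("max |cross terms|      :", np.abs(W_total - W_light).max())
```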

2.4 Sampling analysis

In order to obtain the final hologram H(x) with a sampling interval Δx and Nx samples, the spatial sampling interval Δtx and the number of samples Ntx of the light field L(tx,u) should satisfy

$$\Delta t_x=\Delta x,\qquad N_{t_x}=N_x.\tag{12}$$
Considering the Fourier transform relation given in Eqs. (6) and (7), the product of the sampling interval Δu and the number of samples Nu along the angular, or spatial frequency, axis of the light field L(tx,u) should satisfy
$$\Delta u\,N_u=\frac{1}{\Delta\tau_x}=\frac{1}{\Delta x}=\frac{\sin\theta_{\mathrm{full}}}{\lambda},\tag{13}$$
where θfull is the full angular range of the hologram reconstruction or the light field.

The sampling requirement of the light field can also be found from its relationship with the objects. It is known that the spatial bandwidth Btx and the angular bandwidth Bu of the light field L(tx,u) corresponding to a planar object of spatial bandwidth Bx at depth z are given by [6,24]

$$B_{t_x}=B_x,\qquad B_u=\lambda|z|\,B_x,\tag{14}$$
which leads to
$$\Delta t_x\le\frac{1}{B_x},\qquad \Delta u\le\frac{1}{\lambda|z|\,B_x}.\tag{15}$$
From Eqs. (12), (13), and (15), the required number of samples Nu in the angular direction of the light field L(tx,u) is found to be
$$N_u\ge\lambda|z|\,B_x^{2}.\tag{16}$$
In summary, with the light field L(tx,u) satisfying Eqs. (15) and (16), the hologram H(x) of the corresponding objects can be synthesized without aliasing by the proposed method.
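A small helper (an illustration of Eqs. (15) and (16), not part of the paper) that evaluates these sampling requirements could look as follows; the example bandwidth is an assumed value.

```python
# Sampling requirements of the light field from Eqs. (15) and (16), and the
# maximum object depth allowed by a given N_u (cf. Fig. 2).
def light_field_sampling(wavelength, B_x, z):
    d_tx_max = 1.0 / B_x                            # Eq. (15): spatial interval
    d_u_max = 1.0 / (wavelength * abs(z) * B_x)     # Eq. (15): angular interval
    N_u_min = wavelength * abs(z) * B_x ** 2        # Eq. (16)
    return d_tx_max, d_u_max, N_u_min

def max_depth(wavelength, B_x, N_u):
    # Eq. (16) rearranged: |z_max| = N_u / (lambda * B_x^2)
    return N_u / (wavelength * B_x ** 2)

wl, B_x = 532e-9, 1.0 / 12e-6                # assumed object spatial bandwidth
print(light_field_sampling(wl, B_x, 2e-3))   # requirements at |z| = 2 mm
print(max_depth(wl, B_x, 40))                # |z_max| for N_u = 40 views
```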

Note that Eqs. (15) and (16) indicate that the angular sampling requirements on Δu and Nu depend on the absolute depth |z| of the object from the hologram plane. Figure 2 shows the maximum absolute depth |z_max| calculated using Eq. (16) at a wavelength λ = 532 nm for different spatial bandwidths Bx of the object. In order to cover a wide depth range without significantly increasing the angular sampling requirement, it is possible to synthesize the complex field around individual objects using the proposed method and then propagate and aggregate them in the common hologram plane, as suggested in [11,12] for the conventional hogel based approach.

Fig. 2. Maximum depth for different spatial bandwidths of the object (λ = 532 nm).

3. Numerical simulation

In the numerical simulation, a 3D scene consisting of two socket objects at −3 mm and 2 mm distances from the hologram plane was used. Using the rendering software POV-Ray, 40 × 40 orthographic projection images were generated with 300 × 300 pixel resolution and 6 um pixel pitch, i.e., Nu = Nv = 40, Nx = Ny = 300, and Δx = Δy = 6 um. The angular sampling interval measured in spatial frequency is Δu = Δv = 4.17 × 10^3 m^−1 (or cycles per meter), which satisfies Eq. (13). Figure 3 shows 9 examples out of the 40 × 40 projection images.

Fig. 3. Examples of orthographic view images used for the hologram synthesis.
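As a quick arithmetic check of the parameters above, the stated angular sampling interval follows directly from Eq. (13) (a check added here, not part of the paper):

```python
# Check that the simulated light field sampling satisfies Eq. (13): du * N_u = 1 / dx.
dx, Nu = 6e-6, 40
du = 1.0 / (Nu * dx)
print(du)                  # about 4.17e3 m^-1, matching the stated interval
print(du * Nu, 1.0 / dx)   # both equal 1 / dx, as required by Eq. (13)
```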

The holograms synthesized from this light field data by the proposed method are shown in Fig. 4 along with their numerical reconstructions. In the hologram synthesis and numerical reconstructions, the wavelength λ was assumed to be 532 nm. All holograms have 300 × 300 pixel resolution with a 6 um pixel pitch like the light field data. The computation time to synthesize a hologram for each carrier wave was measured to be 12.2 sec in our Matlab implementation running on a PC without a GPU. In Fig. 4, the first and second columns show the amplitude and phase of the synthesized holograms. The third column shows the amplitude of the Fourier transform of the holograms, i.e., the amplitude of the angular spectrum. The last two columns are the numerical reconstructions at z = −3 mm and +2 mm. Each row corresponds to a different carrier wave, which is selected by controlling the complex weight W(xc, yc) in Eq. (2). In Fig. 4, five examples of carrier waves are shown: a normal plane wave, a random phase, two different slanted plane waves, and a spherical carrier wave. It can be confirmed from Fig. 4 that the proposed method can generate a hologram from the given light field data with an arbitrary carrier wave and that the generated hologram reconstructs the objects at their original depths.

Fig. 4. Holograms synthesized by the proposed method and their reconstructions. 1st and 2nd columns: amplitude and phase of the synthesized holograms. 3rd column: angular spectrum, i.e., amplitude of the Fourier transform of the hologram. 4th and 5th columns: numerical reconstructions at z = −3 mm and 2 mm. Each row corresponds to a different carrier wave.
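The exact weights W(xc, yc) used for each row of Fig. 4 are not spelled out in the text, but plausible 1D choices consistent with the interpretation of Eq. (4) are sketched below; all parameters are assumptions for illustration.

```python
# Plausible weight distributions W(x_c) in Eq. (2) for different carrier waves
# (my assumptions, not the authors' values).
import numpy as np

wavelength, dx, N = 532e-9, 6e-6, 300
x_c = (np.arange(N) - N // 2) * dx

W_plane = np.ones(N, dtype=complex)                     # uniform weight: plane-wave-like carrier
W_random = np.exp(1j * 2 * np.pi * np.random.rand(N))   # random-phase carrier
theta = np.deg2rad(1.0)                                 # assumed tilt angle
W_slanted = np.exp(-1j * 2 * np.pi * np.sin(theta) / wavelength * x_c)  # slanted plane wave
W_spherical = np.zeros(N, dtype=complex)
W_spherical[N // 2] = 1.0                               # single point source: spherical (parabolic) carrier
```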

Figure 5 shows the resolution comparison between the reconstructions of the holograms synthesized by the proposed method and by the conventional hogel based method. A resolution target pattern at a 1.9 mm distance from the hologram plane was used as the object. Nu × Nv = 10 × 10 orthographic projection images were generated for the resolution target pattern with a Δu = Δv = 1.0 × 10^4 m^−1 spatial frequency sampling interval at a λ = 532 nm wavelength. Each orthographic projection L(x, y, uo, vo) has Nx × Ny = 1000 × 1060 pixel resolution and a Δx = Δy = 10 um pixel pitch. In the proposed method, this light field data generates the hologram H(x,y) with Nx × Ny = 1000 × 1060 pixel resolution and a Δx = Δy = 10 um pixel pitch. In the conventional hogel based method, 100 × 106 hogels of 10 × 10 pixel resolution were synthesized and tiled to form a hologram with the same 1000 × 1060 pixel resolution as in the proposed method. For the synthesis of each hogel, the corresponding perspective projection image L(xo, yo, u, v) of 10 × 10 pixel resolution was multiplied with a random phase distribution of the same size and Fourier transformed to obtain the complex field within the hogel, following the method in [1,11], as sketched below. The computation time was measured to be 123.1 sec for the proposed method and 0.53 sec for the hogel based method.
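For reference, the conventional hogel based baseline described above can be sketched as follows; the views here are random placeholders and the code is my reading of the procedure, not the authors' implementation.

```python
# Conventional hogel based synthesis used as the baseline: each hogel's view is
# multiplied by a random phase and Fourier transformed, and the hogels are tiled.
import numpy as np

hogel, Hx, Hy = 10, 100, 106                       # hogel size and hogel counts
views = np.random.rand(Hx, Hy, hogel, hogel)       # placeholder perspective views L(x_o, y_o, u, v)

hologram = np.zeros((Hx * hogel, Hy * hogel), dtype=complex)
for i in range(Hx):
    for j in range(Hy):
        field = views[i, j] * np.exp(1j * 2 * np.pi * np.random.rand(hogel, hogel))
        hogel_field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(field)))
        hologram[i * hogel:(i + 1) * hogel, j * hogel:(j + 1) * hogel] = hogel_field

print(hologram.shape)    # (1000, 1060), the hologram resolution used in Fig. 5
```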

Fig. 5. Comparison of reconstruction resolution between the proposed method and the conventional hogel based method. In both cases, 10 × 10 orthographic view images are used and the pixel resolution of the hologram is 1000(H) × 1060(V). The object, a resolution target pattern, is 1.9 mm from the hologram plane. Amplitude and phase of the hologram and its numerical reconstruction at z = 1.9 mm for (a) the proposed method and (b) the conventional hogel based method.

Figure 5 shows the amplitude and phase of the generated holograms and their numerical reconstructions. It is clear from Fig. 5 that the reconstruction of the hologram synthesized by the proposed method is much crisper than that of the conventional hogel based hologram, even though the total pixel resolutions of the holograms are the same. This is because the proposed method is free from the 1/10× spatial resolution loss of the conventional hogel based method caused by its hogels of 10 × 10 pixel size.

4. Optical experiment

Figure 6 shows the optical experimental setup. The hologram is loaded onto a spatial light modulator (SLM) after encoding. The light reflected from the SLM is filtered using 4f optics with an aperture to block the DC and conjugate terms, as shown in Fig. 6, and the reconstructions are captured around the image plane of the 4f optics. The wavelength of the laser used in the experiment is 532 nm. A reflection-type phase-modulating SLM with an 8 um pixel pitch and 1920 × 1080 resolution was used. All holograms were synthesized with a random phase carrier wave. In the encoding process for the phase-modulating SLM, the complex field hologram was Fourier transformed, and the left half of the angular spectrum was forced to zero. The result was then inverse Fourier transformed, and its phase part was loaded onto the SLM.
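The encoding step described above can be sketched as follows (my rendering of the described procedure, with a random test field; not the authors' code):

```python
# Phase-only encoding: Fourier transform the complex hologram, zero the left
# half of the angular spectrum, inverse transform, and keep only the phase.
import numpy as np

def encode_phase_only(hologram):
    """hologram: 2D complex field; returns the phase pattern for the SLM."""
    spectrum = np.fft.fftshift(np.fft.fft2(hologram))
    spectrum[:, : spectrum.shape[1] // 2] = 0       # zero the left half of the spectrum
    half_band = np.fft.ifft2(np.fft.ifftshift(spectrum))
    return np.angle(half_band)                      # phase part loaded onto the SLM

test = np.random.randn(1080, 1920) + 1j * np.random.randn(1080, 1920)
phase = encode_phase_only(test)
print(phase.shape, float(phase.min()), float(phase.max()))
```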

Fig. 6. Experimental setup.

In the first experiment, the 40 × 40 orthographic projection images of the two socket objects shown in Fig. 3 were used to generate the hologram. Unlike in the numerical simulation, however, the resolution of each orthographic projection image was upscaled to 1021 × 1021 before the processing to obtain a hologram with the same resolution, i.e., 1021 × 1021. Figure 7 shows the optical reconstruction results captured with different focal distances of the camera. Since only a half plane of the angular spectrum is used in the optical reconstruction to block the DC and conjugate terms, the depth of field of the reconstruction is enlarged in comparison with the numerical simulation shown in the second row of Fig. 4. Figure 7, however, clearly shows that the holographic images of the two socket objects are formed at different distances, demonstrating that the proposed method can successfully generate a 3D hologram from the light field data.

Fig. 7. Optical experimental result for 3D reconstruction. The reconstruction was captured with the camera focused on the (a) front and (b) rear object.

In the second experiment, the resolution comparison in Fig. 5 was verified optically. The holograms were generated under the same conditions as in the numerical simulation, except that the pixel pitch of the hologram is now 8 um. The optical reconstruction results are shown in Fig. 8. In Fig. 8, horizontal blur, which might be caused by the encoding process and the 4f optics filtering, is visible in both cases. Nevertheless, it is clear that the reconstruction of the hologram synthesized by the proposed method has much higher resolution than that of the conventional hogel based method, which agrees well with the numerical simulation result in Fig. 5. Therefore, the resolution advantage of the proposed method over the conventional hogel based method is confirmed experimentally.

Fig. 8. Optical experimental result for resolution comparison. Reconstructions of the holograms synthesized by (a) the proposed method and (b) the conventional hogel based method.

5. Conclusion

In this paper, a novel hologram synthesis technique from light field data was proposed. The proposed technique processes the light field globally by applying the technique of recovering a complex field from its WDF to the light field. Since the proposed method does not rely on the hogel configuration, it is free from the spatio-angular resolution tradeoff of the conventional hogel based techniques. An arbitrary carrier wave can also be used in the hologram synthesis without any explicit depth extraction from the light field, unlike the conventional hogel based techniques, which only work for a random phase carrier wave. It was verified by numerical simulations and optical experiments that the hologram synthesized by the proposed technique can reconstruct 3D images with higher resolution than the hologram synthesized by the conventional hogel based technique.

Funding

IITP, MSIT, Openholo library technology development for digital holographic contents and simulation (2017-0-00417); Basic Science Research Program, NRF (NRF-2017R1A2B2011084); ITRC Support Program, IITP, MSIT (IITP-2018-2015-0-00448); The Cross-Ministry Giga KOREA Project, IITP, MSIT (GK18D0100).

References

1. J.-H. Park, “Recent progresses in computer-generated holography for three-dimensional scenes,” J. Inform. Display 18(1), 1–12 (2017).

2. T. Shimobaba, H. Nakayama, N. Masuda, and T. Ito, “Rapid calculation algorithm of Fresnel computer-generated-hologram using look-up table and wavefront-recording plane methods for three-dimensional display,” Opt. Express 18(19), 19504–19509 (2010).

3. K. Matsushima, M. Nakamura, and S. Nakahara, “Silhouette method for hidden surface removal in computer holography and its acceleration using the switch-back technique,” Opt. Express 22(20), 24450–24465 (2014).

4. M. Askari, S.-B. Kim, K.-S. Shin, S.-B. Ko, S.-H. Kim, D.-Y. Park, Y.-G. Ju, and J.-H. Park, “Occlusion handling using angular spectrum convolution in fully analytical mesh based computer generated hologram,” Opt. Express 25(21), 25867–25878 (2017).

5. M. Yamaguchi, “Light-field and holographic three-dimensional displays [Invited],” J. Opt. Soc. Am. A 33(12), 2348–2364 (2016).

6. R. Ng, M. Levoy, M. Bredif, G. Duval, M. Horowitz, and P. Hanrahan, “Light field photography with a handheld plenoptic camera,” Stanford Tech. Rep. CTSR 2005-02 (Stanford University, 2005).

7. S.-K. Lee, S.-I. Hong, Y.-S. Kim, H.-G. Lim, N.-Y. Jo, and J.-H. Park, “Hologram synthesis of three-dimensional real objects using portable integral imaging camera,” Opt. Express 21(20), 23662–23670 (2013).

8. T. Ichikawa, K. Yamaguchi, and Y. Sakamoto, “Realistic expression for full-parallax computer-generated holograms with the ray-tracing method,” Appl. Opt. 52(1), A201–A209 (2013).

9. Z. Wang, G. Lv, Q. Feng, A. Wang, and H. Ming, “Simple and fast calculation algorithm for computer-generated hologram based on integral imaging using look-up table,” Opt. Express 26(10), 13322–13330 (2018).

10. Y. Ichihashi, R. Oi, T. Senoh, K. Yamamoto, and T. Kurita, “Real-time capture and reconstruction system with multiple GPUs for a 3D live scene by a generation from 4K IP images to 8K holograms,” Opt. Express 20(19), 21645–21655 (2012).

11. K. Wakunami and M. Yamaguchi, “Calculation for computer generated hologram using ray-sampling plane,” Opt. Express 19(10), 9086–9101 (2011).

12. K. Wakunami, H. Yamashita, and M. Yamaguchi, “Occlusion culling for computer generated hologram based on ray-wavefront conversion,” Opt. Express 21(19), 21811–21822 (2013).

13. H. Zhang, Y. Zhao, L. Cao, and G. Jin, “Fully computed holographic stereogram based algorithm for computer-generated holograms with accurate depth cues,” Opt. Express 23(4), 3901–3913 (2015).

14. H. Zhang, Y. Zhao, L. Cao, and G. Jin, “Layered holographic stereogram based on inverse Fresnel diffraction,” Appl. Opt. 55(3), A154–A159 (2016).

15. S. Hamann, L. Shi, O. Solgaard, and G. Wetzstein, “Time-multiplexed light field synthesis via factored Wigner distribution function,” Opt. Lett. 43(3), 599–602 (2018).

16. Y. Ohsawa, K. Yamaguchi, T. Ichikawa, and Y. Sakamoto, “Computer-generated holograms using multiview images captured by a small number of sparsely arranged cameras,” Appl. Opt. 52(1), A167–A176 (2013).

17. H. Kang, E. Stoykova, and H. Yoshikawa, “Fast phase-added stereogram algorithm for generation of photorealistic 3D content,” Appl. Opt. 55(3), A135–A143 (2016).

18. S. Ding, S. Cao, Y. F. Zheng, and R. L. Ewing, “From image pair to a computer generated hologram for a real-world scene,” Appl. Opt. 55(27), 7583–7592 (2016).

19. N. T. Shaked, J. Rosen, and A. Stern, “Integral holography: white-light single-shot hologram acquisition,” Opt. Express 15(9), 5754–5760 (2007).

20. J.-H. Park, M.-S. Kim, G. Baasantseren, and N. Kim, “Fresnel and Fourier hologram generation using orthographic projection images,” Opt. Express 17(8), 6320–6334 (2009).

21. H.-S. Kim, K.-M. Jeong, S.-I. Hong, N.-Y. Jo, and J.-H. Park, “Analysis of image distortion based on light ray field by multi-view and horizontal parallax only integral imaging display,” Opt. Express 20(21), 23755–23768 (2012).

22. H. O. Bartelt, K.-H. Brenner, and A. W. Lohmann, “The Wigner distribution function and its optical production,” Opt. Commun. 32(1), 32–38 (1980).

23. Z. Zhang and M. Levoy, “Wigner distributions and how they relate to the light field,” in 2009 IEEE International Conference on Computational Photography (ICCP) (2009), pp. 1–10.

24. J.-H. Park, S. K. Lee, N. Y. Jo, H. J. Kim, Y. S. Kim, and H. G. Lim, “Light ray field capture using focal plane sweeping and its optical reconstruction using 3D displays,” Opt. Express 22(21), 25444–25454 (2014).
