
Performance enhancement of integral imaging based Fresnel hologram capturing by the intermediate view reconstruction

Open Access

Abstract

A method for improving the performance of integral imaging (II) based Fresnel holograms, generated using intermediate view reconstruction (IVR), is proposed. Conventional integral holograms are generally generated by Fourier transforming the elemental images (EIs) of II into hogels. However, a trade-off between the angular resolution and the spatial resolution of II is inevitable in the generation of an integral hologram. IVR is introduced to enhance the angular spectrum of the II-based Fresnel hologram while keeping a compact image size and avoiding any movement of the lenslet array. Multiple elemental image array (EIA) sequences are generated with IVR and transformed into the corresponding holograms. All of the generated hologram sequences are shifted according to the relative position of the virtual lens array and added together to synthesize a Fresnel hologram with a dense angular spectrum. The synthesized hologram can reconstruct the 3D image with the combined light fields of all the integral hologram sequences. Finally, both simulations with multiple objects and experiments on a real 3D object are conducted numerically and optically. The close agreement among the results confirms that this work outperforms the conventional methods.

© 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Holographic display is a widely discussed diffraction-based display technology with the advantages of full parallax, autostereoscopy, a wide viewing angle, and full provision of the phase and amplitude of the recorded object [1–3]. It is even considered the most promising technique for the next generation of 3-D television. Breakthrough achievements on holographic display have been published in Science and Nature [4–9]. However, acquiring the hologram pattern is still difficult because an interferometric optical system is required, which demands an extremely stable optical platform, a coherent light source, and a darkroom. In order to record the hologram of a real 3D object more easily, methods for mitigating these strict requirements have been researched. J. Rosen et al. proposed the Fresnel incoherent correlation holography (FINCH) technique to record the hologram under incoherent illumination [10]. However, as the optical field is coherent only within the coherence length, only objects thinner than the coherence length can be clearly recorded [10]. Computer-generated hologram (CGH) algorithms combined with a depth camera were also proposed to capture the hologram of a real object under natural light [11,12]. However, the occlusion and shading information of the 3D scene is hard to present because each point or slice is processed independently during the optical propagation.

Integral holography is an effective way to record the hologram under natural light and is able to address the above issues [13–15]. By inserting a 2-D lenslet array between the captured object and the camera, multi-view projection images can be captured and reproduced in space. The Fresnel hologram can be generated by tiling the hogels obtained from an FFT of the corresponding views according to the Fresnel diffraction theory [16], and the Fourier hologram can be obtained from multiple orthographic projection-view images [17]. Because of this simplicity, the research group at NICT even developed a real-time capture and reconstruction system that generates 8K holograms from 4K integral imaging images [16,18]. In fact, according to the angular spectrum propagation (ASP) theory, the optical field can be regarded as a combination of various plane-wave components [19]. That is, the hogels obtained from the FFT of the EIs represent the directional angular information and can reproduce the optical field of the 3-D object.

Figure 1 shows two integral Fresnel holograms generated from EIAs with the same pixel resolution but different numbers of EIs, together with their optical reconstructions. The results in Figs. 1(b-4) and 1(a-4) show clearly that the hologram generated from the EIA with a large number of EIs, captured with a small lenslet pitch, provides better reconstruction performance.


Fig. 1. Schematic of integral imaging pick-ups with different lenslet arrays and their corresponding integral holograms: (a) Large pitch capture. (b) Relatively small pitch capture.


However, the fundamental trade-off between the spatial resolution and the angular resolution in integral imaging extends to integral holography. That is, a small lens pitch together with the pixel pitch of the digital camera may limit the spatial resolution of the captured EIs and degrade the 3D image quality in the reconstruction. For the integral hologram, insufficient spatial resolution of the EIs leads to an aliasing effect in the FFT process. On the other hand, a large lens pitch increases the angular interval of the integral hologram and causes unsatisfactory performance because of the insufficient angular information [20]. Therefore, several methods have been proposed to improve the performance of the integral hologram within the spatio-angular resolution trade-off of integral imaging. N. Chen et al. analyzed the parameters that affect the reconstruction performance of a Fourier hologram generated from multiple orthographic views. Based on this analysis, they proposed shifting the lenslet array to increase the accuracy of the angular information [21]. However, with this method, dynamic scenes are difficult to capture in real time. They also compared the reconstruction performance of integral holograms obtained from a rectangular lenslet array and a hexagonal lenslet array; from sampling theory, the results showed that the hexagonal lens array performs better because of its denser sampling rate [22]. K. Wakunami et al. introduced a dense ray-sampling (RS) plane set near the object and applied it to integral hologram capturing in association with the light field rendering technique. This method enhances the reconstruction performance of the integral hologram without multiple captures, but the angular-spatial resolution trade-off of integral imaging still applies at the RS plane [23,24]. Y. Chen et al. proposed an integral Fourier holographic imaging method based on micro-scanning of the lenslet array. The micro-scanning increases the sampling rate in the spatial frequency domain and eliminates the overlapping effect in the reconstructed 3D image [20]. However, this method increases the total pixel number of the generated hologram and is therefore not suited to current holographic display devices.

Accordingly, in this paper, a method for improving the performance of the integral Fresnel hologram is proposed that requires neither lenslet shifting nor an increased pixel number. Based on the superposition principle, by which the total complex amplitude in the hologram plane is the sum of all plane-wave contributions, an integral Fresnel hologram containing more angular information is generated by superimposing IVR-based integral holograms. Simulations are performed with a single planar object to analyze the advantages of the proposed method. Finally, color hologram reconstruction of multiple objects and optical holographic display of a real 3D object captured by a commercial light field camera are demonstrated with improved performance.

2. Proposed method

Figure 2 depicts the overall block diagram of the proposed method, which includes three phases. First, the real EIA is captured by the integral imaging direct-pickup system in a single shot; then, the IVR algorithm is applied to generate EIs at different sampling positions, which are merged into EIAs. Second, all EIAs at their relative virtual capturing positions are transformed into the corresponding integral Fresnel holograms through an FFT of each EI; afterwards, the resulting hologram with more angular information is generated by superimposing these integral holograms with their relative displacements. In the final step, reconstruction based on Fresnel diffraction is performed at the calculated depth.


Fig. 2. The whole structure of the proposed performance enhanced integral holography.


2.1 EIAs acquisition

The data collection procedure of the proposed method is straightforward. In a critical distinction from the conventional holographic interference technique, we only need to acquire the intensity of the EIA with the integral imaging direct pick-up system in a single shot. Then, instead of shifting the lenslet array, the other EIAs are generated by synthesizing intermediate EIs through the IVR algorithm [25,26]. The concept of the IVR used in the intermediate EI generation is shown in Fig. 3.


Fig. 3. The principle of the intermediate EI generation at any sampling position.


Two elemental images Ep,q(x, y) and Ep,q+1(x, y) together with their disparity d(x, y) can generate the intermediate elemental image Ep,q+α(x, y), as shown in Code 1 [30]. The position of the corresponding intermediate elemental image (EI) is defined by a normalized distance α from the left EI. The distance from the left to the right view plane is normalized to 1, so 0 ≤ α ≤ 1. The intermediate EI can therefore be synthesized as a linear combination of the two images using interpolation. Here, interpolation is a technique for representing an arbitrary continuous function as a discrete sum of weighted and shifted synthesis functions. Equation (1) gives the interpolation as a weighted mean using the positions of the viewpoints:

$${E_{p,q + \alpha }}(x,y) = (1 - \alpha ){E_{p,q}}(x + \alpha d(x,y),y) + \alpha {E_{p,q + 1}}(x - (1 - \alpha )d(x,y),y).$$
Assume that the (p, q)th elemental image is denoted by Ep,q(x, y), where x and y are pixel positions within the EI, p = 1, 2, …, P and q = 1, 2, …, Q. The IVR algorithm is applied to two neighboring EIs, Ep,q(x, y) and Ep,q+1(x, y). The intermediate EI of the next row, Ep+1,q+α(x, y), is obtained by applying the IVR process to Ep+1,q(x, y) and Ep+1,q+1(x, y).

To obtain the intermediate EI at an arbitrary virtual sampling position, the intermediate views at the desired horizontal position are synthesized first from the neighboring left and right EIs; then, since the IVR algorithm operates on left and right views, two EIs with the same horizontal coordinate and vertical adjacency are rotated 90 degrees to configure them as horizontally adjacent:

$${E_{p + \beta ,q + \alpha }}(y,x) = (1 - \beta ){E_{p + 1,q + \alpha }}(y + \beta d(x,y),x) + \beta {E_{p,q + \alpha }}(y - (1 - \beta )d(x,y),x),$$
where Ep+1,q+α(y, x) and Ep,q+α(y, x) are the 90-degree rotations of Ep+1,q+α(x, y) and Ep,q+α(x, y), and d(x, y) is their disparity. The position of the corresponding intermediate EI is defined by a normalized distance β from the bottom EI, 0 ≤ β ≤ 1. The EIA composed of EIs with the same β and α values is denoted EIAβ,α.
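The pixel-level interpolation of Eq. (1) can be sketched as follows. This is a minimal illustration, not the authors' implementation (Code 1 [30]); it assumes a precomputed disparity map and rounds the disparity shifts to the nearest pixel:

```python
import numpy as np

def intermediate_ei(e_left, e_right, d, alpha):
    """Synthesize the intermediate elemental image E_{p,q+alpha} from two
    horizontally adjacent EIs and their disparity map d(x, y), per Eq. (1).
    Shifts are rounded to the nearest pixel for simplicity."""
    h, w = e_left.shape
    out = np.zeros((h, w), dtype=float)
    for y in range(h):
        for x in range(w):
            # sample the left EI at x + alpha*d and the right EI at x - (1-alpha)*d
            xl = min(max(int(round(x + alpha * d[y, x])), 0), w - 1)
            xr = min(max(int(round(x - (1 - alpha) * d[y, x])), 0), w - 1)
            out[y, x] = (1 - alpha) * e_left[y, xl] + alpha * e_right[y, xr]
    return out
```

The vertical intermediates of Eq. (2) follow by rotating the vertically adjacent EIs 90 degrees and reapplying the same function with β in place of α.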

To test the performance of the IVR-based integral Fresnel hologram, a numerical simulation is first performed. A planar object ‘3’, shown in Fig. 4(a), at a depth of 95.5 mm was captured with an integral imaging system using a 30 × 30 lenslet array, in which the lens pitch and the focal length were 5 mm and 3 mm, respectively, and each lenslet covers 40 × 40 pixels, as shown in Fig. 4(b). To generate another EIA virtually captured at a half-pitch displacement both vertically and horizontally, i.e. β = 1/2, α = 1/2, the IVR algorithm is applied to generate a sequence of Ep+1/2,q+1/2 according to Eqs. (1) and (2); the EIA composed of these synthetic EIs, marked EIA1/2,1/2, is shown in Fig. 4(c). It can be regarded as the EIA that would be captured after moving the original lens array by half a lenslet pitch along both the horizontal and vertical axes. As seen from the magnified segments of Figs. 4(b) and 4(c), the EIs in the virtually captured EIA1/2,1/2 differ slightly from those of the captured EIA.

2.2 EIA-to-hologram and hologram fusion

After obtaining the EIAs with their relative virtual capturing displacements, the conversion from 2D EIA to hologram is performed. The EIA is placed at the front focal plane of the lens array, and the hologram plane is placed at the back focal plane. According to the Fresnel diffraction theory, the optical fields at the back and front focal planes are related by a Fourier transform [16,18]. Thus, each hogel of the optical field at the back focal plane can be obtained by Fourier transforming the corresponding EI in front of it, as shown in Code 2 [31]. The hogel can be expressed as:

$${H_{p\textrm{ + }\beta ,q\textrm{ + }\alpha }}(u,v) ={-} \frac{{{e^{ - 2ikf}}}}{{\lambda f}}\; \int_{ - \infty }^\infty {\int_{ - \infty }^\infty {{E_{p\textrm{ + }\beta ,q\textrm{ + }\alpha }}(x,y)} } {e^{i2\pi \left( {\frac{{xu + yv}}{{\lambda f}}} \right)}}dxdy,$$
where Ep+β,q+α(x, y) is the light intensity distribution of an EI from the captured EIA or the generated intermediate EIAs; when both α and β equal 0, the EI is an originally captured EI, otherwise it is a synthesized intermediate EI. Hp+β,q+α(u, v) is the hogel at the hologram plane, k and λ are the wavenumber and wavelength of the object light, respectively, and f is the focal length of the lenslet array. Because the image is collected and displayed digitally, the discrete form can be expressed as:
$${H_{p\textrm{ + }\beta ,q\textrm{ + }\alpha }}(u\Delta {p_m},v\Delta {p_n}) = \sum\limits_{x = 1}^M {\sum\limits_{y = 1}^N {{E_{p\textrm{ + }\beta ,q\textrm{ + }\alpha }}(x,y){e^{i2\pi (xu\frac{{\Delta {p_m}^2}}{{\lambda f}} + yv\frac{{\Delta {p_n}^2}}{{\lambda f}})}}} } ,$$
where M and N are the pixel numbers of an EI in the horizontal and vertical directions, respectively, and Δpm and Δpn are the horizontal and vertical pixel pitches of the EIs. The integral Fresnel hologram obtained from the corresponding EIAβ,α is denoted HHβ,α and is made up of P×Q hogels transformed from the P×Q EIs. Here, it should be noted that the following Eq. (5) must be satisfied so that the hogels tile without gaps [20].
$$M\Delta {p_m}^2 = \lambda f,N\Delta {p_n}^2 = \lambda f,$$
where λ is the wavelength of the hologram illumination, and f is the focal length of the lenslet array. The Fresnel hologram is tiled with hogels, which are Fourier transforms of the EIs, so the physical size of each hogel must be set properly to avoid overlapping or gaps in the tiling. Thus, the pixel number is determined by the pixel pitch of the holographic display device and the focal length of the lenslet according to Eq. (5), and is easily obtained by resizing the EI to M×N. Conversely, the equivalent focal length can also be calculated from a given M and N.
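The EIA-to-hologram step of Eqs. (3)–(4) can be sketched as below. This is a hedged illustration (not the authors' Code 2 [31]); it assumes the EIA is stored as a single array of P×Q EIs of M×N pixels each, that Eq. (5) holds so the hogels tile exactly, and it omits the constant prefactor of Eq. (3):

```python
import numpy as np

def eia_to_hologram(eia, P, Q, M, N):
    """Tile an integral Fresnel hologram from a P*Q grid of M*N-pixel EIs.
    Each hogel is the centered 2-D FFT of its EI (discrete form of Eq. (4))."""
    holo = np.zeros((P * M, Q * N), dtype=complex)
    for p in range(P):
        for q in range(Q):
            ei = eia[p * M:(p + 1) * M, q * N:(q + 1) * N]
            # centered FFT so the zero spatial frequency sits mid-hogel
            hogel = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(ei)))
            holo[p * M:(p + 1) * M, q * N:(q + 1) * N] = hogel
    return holo
```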

Figure 5 shows the integral Fresnel holograms transformed from the EIAs of Figs. 4(b) and 4(c). Figures 5(a) and 5(b) are the amplitude and phase of the integral complex matrix from the captured EIA shown in Fig. 4(b). Figures 5(c) and 5(d) are the amplitude and phase of the integral complex matrix from the IVR-generated EIA shown in Fig. 4(c). Each hologram is made up of 30×30 hogels of 40×40 pixels each.


Fig. 4. Captured EIA and synthetic EIA with different relative virtual capturing displacements: (a) Planar object ‘3’. (b) EIA from capture. (c) EIA1/2,1/2 from the IVR synthesis.



Fig. 5. Integral Fresnel holograms generated from the two EIAs. (a) Amplitude of integral Fresnel hologram from EIA. (b) Phase of integral Fresnel hologram from EIA. (c) Amplitude of integral Fresnel hologram from EIA1/2,1/2. (d) Phase of integral Fresnel hologram from EIA1/2,1/2.


To enhance the performance of the generated hologram without increasing the total pixel number, a fusion approach that increases the angular spectrum of the integral hologram is applied. Because each hogel is an angular spectrum component in a unique direction, the more angular spectrum components a hologram contains, the higher the quality of the reconstructed image. Thus, the integral Fresnel holograms with relative virtual capturing displacements of a sub-lens pitch are superimposed to obtain a resulting hologram with a denser angular spectrum.

As shown in Fig. 6, the integral imaging reconstruction stage is the inverse of integral imaging capturing. With the same system parameters, the depth of the reconstructed 3-D image equals that of the captured object. The integral Fresnel hologram generated from an EIA can be regarded as the optical field of the reconstructed 3-D image at the back focal plane of the lenslet, as shown in Fig. 6(a). This means the generated hologram can reconstruct the 3-D image by optical diffraction at a distance of D − f. The optical field, composed of M×N angular spectrum hogels, gains accuracy through the superimposition of multiple initial integral holograms with relative capture displacements, as shown in Fig. 6(b).


Fig. 6. The concept of hologram fusion for providing a sufficient angular spectrum for the reconstructed 3D image.


As mentioned above, a large number of EIs is needed to provide a sufficient angular spectrum of the captured object. However, because of pixelation limits, the EI should not be made too small, so that a sufficient number of pixels remains for the FFT process. Thus, the integral holograms generated from the captured EIA and the synthetic EIAs are superimposed at their relative virtual capturing displacements to form a more accurate hologram with richer angular information. The hologram superimposition is processed in the spatial frequency domain of the integral hologram, as shown in Fig. 7.


Fig. 7. The flowchart of the hologram fusion process.


The hologram superimposition in the spatial domain with the relative capture position displacements can be expressed as:

$$HH = \frac{1}{S}\sum\limits_{s = 1}^S {H{H_{{\beta _s},{\alpha _s}}}(u - {\beta _s}M\Delta {p_m},v - {\alpha _s}N\Delta {p_n})} ,$$
where HHβs,αs are the holograms generated from EIAβs,αs, s indexes the β and α values of the S holograms in total, and βsMΔpm and αsNΔpn are the relative virtual capturing displacements of the corresponding hologram HHβs,αs. However, because the overlap region and the final hologram size must be considered for superposition in the spatial domain, the superimposition is instead performed in the frequency domain of the holograms, as shown in Fig. 7. First, the sequence of holograms is Fourier transformed. Second, since a shift in the spatial domain corresponds to a phase multiplication in the frequency domain, each transformed hologram is multiplied by the phase factor corresponding to its displacement:
$$HH = {F^{ - 1}}\left\{ {\frac{1}{S}\sum\limits_{s = 1}^S {F\{{H{H_{{\beta_s},{\alpha_s}}}} \}{e^{ - j\frac{{2\pi }}{{PM}}{\beta_s}M\Delta {p_m}{k_u}}}{e^{ - j\frac{{2\pi }}{{QN}}{\alpha_s}N\Delta {p_n}{k_v}}}} } \right\},$$
where F{HHβs,αs} is the frequency-domain representation of the corresponding HHβs,αs(u, v) with pixel number PM×QN. Finally, the summed spectrum is transformed back to the spatial domain, yielding a hologram with a denser angular spectrum, as shown in Code 3 [32].
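The frequency-domain fusion of Eq. (7) can be sketched as below. This is a minimal illustration (not the authors' Code 3 [32]): each hologram's spectrum is multiplied by a linear phase ramp encoding its displacement before the spectra are averaged and inverse transformed. The same phase ramp realizes fractional (sub-pixel) shifts as well as integer ones:

```python
import numpy as np

def fuse_holograms(holos, shifts):
    """Superimpose holograms with relative displacements via Eq. (7).
    `shifts` holds (row, col) displacements in pixels of the hologram
    plane, i.e. (beta_s*M, alpha_s*N) in units of the pixel pitch."""
    rows, cols = holos[0].shape
    ku = np.fft.fftfreq(rows)[:, None]   # normalized vertical frequencies
    kv = np.fft.fftfreq(cols)[None, :]   # normalized horizontal frequencies
    acc = np.zeros((rows, cols), dtype=complex)
    for h, (du, dv) in zip(holos, shifts):
        # linear phase ramp = spatial shift by (du, dv) after inverse FFT
        ramp = np.exp(-2j * np.pi * (ku * du + kv * dv))
        acc += np.fft.fft2(h) * ramp
    return np.fft.ifft2(acc / len(holos))
```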

Figure 8 presents the resulting hologram superimposed from the holograms of Figs. 5(a) and 5(c): Fig. 8(a) is the amplitude of the resulting hologram and Fig. 8(b) is its phase. Figure 8(c) is a magnification of the yellow square in Fig. 8(a), and Fig. 8(d) is the magnified same area of Fig. 5(a). Comparing Figs. 8(c) and 8(d), the synthetic hologram of Fig. 8(a) shows more pronounced stripes than the conventional one of Fig. 8(d); such stripes usually carry the optical field of the captured 3-D objects.


Fig. 8. (a) Amplitude of the resulting hologram. (b) Phase of the resulting hologram. (c) Magnification of the yellow square in (a). (d) Magnification of the same position in the hologram of Fig. 5(a).


2.3 Reconstruction

Unlike integral imaging reconstruction, which is based on geometrical optics, the reconstruction of the integral imaging based hologram is performed via Fresnel diffraction based on wave optics, formulated as [19]:

$$I(x^{\prime},y^{\prime}) = \frac{1}{{i\lambda {d^{\prime}}}}[{e^{ikd^{\prime}}}{e^{i\pi d^{\prime}\lambda ({{(\frac{{x^{\prime}}}{{d^{\prime}\lambda }})}^2} + {{(\frac{{y^{\prime}}}{{d^{\prime}\lambda }})}^2})}}]\int\!\!\!\int {HH(u,v)r(u,v)} {e^{\frac{{i\pi }}{{d^{\prime}\lambda }}[{u^2} + {v^2}]}}{e^{ - 2\pi i[u\frac{{x^{\prime}}}{{d^{\prime}\lambda }} + v\frac{{y^{\prime}}}{{d^{\prime}\lambda }}]}}dudv,$$
where (x′, y′) are the coordinates of the reconstructed image plane, (u, v) are the coordinates of the hologram plane, HH(u, v) is the resulting hologram, r(u, v) is the reference wave, and d′ is the reconstruction distance. It is worth noting that the reconstruction distance is not always equal to the capture distance. As mentioned above, Eq. (5) should be satisfied so that, by sampling theory, no aliasing occurs [20]. Originally, the focal length of the lenslet array is f and the capture distance is D, as shown in Fig. 6. However, since the pixel numbers M and N of the EI may change, the equivalent focal length also changes according to Eq. (5):
$$f^{\prime} = \frac{{M\Delta {p_m}^2}}{\lambda }.$$
Thus, the object distance from the lenslet to the reconstructed object changes to D′ = Df′/f; accordingly, the reconstruction distance from the hologram plane to the reconstructed object is d′ = D′ − f′, which can be written as:
$$d^{\prime} = D\frac{{M\Delta {p_m}^2}}{{\lambda f}} - \frac{{M\Delta {p_m}^2}}{\lambda }.$$
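Equation (8) can be sketched as a single-FFT Fresnel propagation. This is a minimal illustration assuming a unit plane reference wave r(u, v) = 1 and dropping the constant phase factor in front of the integral (it does not affect the intensity):

```python
import numpy as np

def fresnel_reconstruct(holo, wavelength, pitch, d):
    """Reconstruct the intensity from the fused hologram HH over distance d
    via single-FFT Fresnel propagation (Eq. (8)). `pitch` is the hologram
    pixel pitch; note the output sampling is lambda*d/(N*pitch), as usual
    for the single-FFT method."""
    n, m = holo.shape
    v, u = np.meshgrid(np.arange(m) - m / 2, np.arange(n) - n / 2)
    # quadratic phase inside the integral: exp(i*pi*(u^2 + v^2)/(lambda*d))
    chirp = np.exp(1j * np.pi * ((u * pitch) ** 2 + (v * pitch) ** 2)
                   / (wavelength * d))
    field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(holo * chirp)))
    return np.abs(field) ** 2   # reconstructed intensity
```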
The numerical reconstruction results of the above simulations are shown in Fig. 9. To verify the superiority of the IVR-based integral Fresnel hologram generation method, the intensity profiles of the reconstructed planar object ‘3’ are also plotted in Figs. 9(e)–9(h). Figure 9(a) is the computational integral imaging reconstruction (CIIR) result of the original EIA of Fig. 4(b) reconstructed at 95.5 mm; it is used as the ground truth for comparing the integral hologram reconstructions. The intensity profiles along the lines in Fig. 9(a) are very smooth, and the width across the ‘3’ is thin, as plotted in Fig. 9(e). Figure 9(b) is the numerical reconstruction of a hologram generated from an EIA consisting of 150×150 EIs with 8×8 pixels each. The capturing distance D is 477 mm and the reconstruction distance is 13.1 cm according to Eq. (8) and Eq. (10), in which both M and N equal 8, Δpm is 8.1 µm, and λ is 632.8 nm. The intensity profiles along the lines in the image are very rough, as plotted in Fig. 9(f), because aliasing occurs due to the insufficient pixel number of the EIs. Figure 9(c) is the numerical reconstruction of the hologram of Fig. 5(a), generated from an EIA consisting of 30×30 EIs with 40×40 pixels each; the reconstruction distance is 12.79 cm according to Eq. (8) and Eq. (10), in which M equals 40, Δpm is 8.1 µm, λ is 632.8 nm, and the capturing distance D = 95.5 mm. The intensity profiles along the lines in the image are much better than in Fig. 9(f), but the background is still somewhat noisy and the width across the ‘3’ is wide compared with the CIIR result of Fig. 9(e). Figure 9(d) is the numerical reconstruction of the proposed IVR-based resulting hologram of Fig. 8(a); the reconstruction distance is 12.79 cm, the same as for Fig. 9(c), and the resulting hologram is the superimposition of the two holograms transformed from the captured EIA and the IVR-generated EIA. The intensity profile along the line is better than that of the conventional case in Fig. 9(f): the background is flat and the width across the ‘3’ is thinner than in Figs. 9(f) and 9(g). Thus, it is preliminarily verified that the reconstruction quality of the integral Fresnel hologram is improved by the IVR-based superposition.
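The reconstruction distances quoted above can be checked directly from Eqs. (9) and (10):

```python
def recon_distance(D, M, dp, wavelength, f):
    """Reconstruction distance d' of Eq. (10): the capture distance D is
    rescaled by the equivalent focal length f' = M*dp^2/lambda (Eq. (9))."""
    f_eq = M * dp ** 2 / wavelength
    return D * f_eq / f - f_eq

# parameters of the Fig. 9 simulations (units: metres)
d_b = recon_distance(0.477, 8, 8.1e-6, 632.8e-9, 3e-3)    # Fig. 9(b): ~0.131 m
d_c = recon_distance(0.0955, 40, 8.1e-6, 632.8e-9, 3e-3)  # Fig. 9(c): ~0.1279 m
```

Both values agree with the 13.1 cm and 12.79 cm quoted for Figs. 9(b) and 9(c).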


Fig. 9. Numerical reconstruction results of (a) CIIR of the EIA shown in Fig. 4(b). (b) Integral Fresnel hologram generated from EIs with insufficient pixel number. (c) Integral Fresnel hologram of Fig. 5(a). (d) The resulting hologram of Fig. 8(a). (e) The intensity profile along the white line in Fig. 9(a). (f) The intensity profile along the white line in Fig. 9(b). (g) The intensity profile along the white line in Fig. 9(c). (h) The intensity profile along the white line in Fig. 9(d).


3. Experiment

Numerical simulations with two objects and experiments on a real 3-D object were performed to further verify the reliability of the proposed method. First, an Olympic-rings object with a mosaic background was built in modeling software (3ds Max). A virtual lenslet array was also built to capture the EIA of the 3D scene. The lenslet array consisted of 75×75 elemental lenses, with the pitch and focal length set to 5 mm and 3 mm, respectively. The resolution of each EIA sequence was 1200×1200 pixels. Three EIAs were generated with the IVR algorithm at relative displacements of 1/2 lenslet pitch horizontally, 1/2 lenslet pitch vertically, and 1/2 lenslet pitch in both directions. The numerical reconstructions were performed using Eq. (8). To avoid speckle noise, no random phase was added to the elemental images in the FFT calculation. Figures 10(a) and 10(d) are the original two objects, the Olympic rings and the mosaic, placed at 204 mm and 252 mm, respectively. Figures 10(b) and 10(e) are the reconstructed images focused on the Olympic rings and the mosaic from the conventional integral Fresnel hologram. Figures 10(c) and 10(f) are the reconstructed images focused on the Olympic rings and the mosaic background from the final hologram with IVR-based superimposition. In Figs. 10(b) and 10(e), the image quality is low, with jagged edges caused by the insufficient angular frequency content, while in Figs. 10(c) and 10(f) the images have brighter intensity due to the increased angular frequency content of the resulting hologram. The peak signal-to-noise ratio (PSNR) between the reconstructed images and the original images was also calculated. The PSNR values for Figs. 10(b) and 10(c) are 12.9231 dB and 13.4769 dB, respectively, an improvement of about 0.5 dB. The PSNR values for Figs. 10(e) and 10(f) are 4.9933 dB and 4.98 dB, respectively; these values run contrary to our subjective assessment because of the occlusion by the Olympic rings. The image quality improvement is confirmed in the comparison, and the advantage of the proposed method is that no scanning technique is needed in the capturing stage and no time-multiplexing technique is needed in the display stage.
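The PSNR values above follow the standard definition; a minimal sketch, assuming 8-bit images with a peak value of 255:

```python
import numpy as np

def psnr(ref, img, peak=255.0):
    """Peak signal-to-noise ratio (dB) between a reference image and a
    reconstruction, as used for the Fig. 10 comparison."""
    mse = np.mean((ref.astype(float) - img.astype(float)) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)
```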


Fig. 10. Simulation of two color objects: (a) Image of Olympic rings. (b) Numerical reconstruction from the conventional integral hologram focused at 11.2 cm. (c) Numerical reconstruction from the resulting hologram focused at 11.2 cm. (d) Image of the background mosaic. (e) Numerical reconstruction from the conventional integral hologram focused at 13.9 cm. (f) Numerical reconstruction from the resulting hologram focused at 13.9 cm.


Second, practical experiments on integral imaging capture and holographic display of a real 3-D object were conducted. As shown in Fig. 11(a), a 3D doll is captured by the light field camera. After rearranging the original light field image and cropping the effective area of the 3-D doll, as shown in Code 4 [33], an EIA containing 45×45 EIs with 45×45 pixels each was obtained, as shown in Fig. 11(c); the CIIR reconstruction distance was 30 times the focal length, and the CIIR result based on ray optics is shown in Fig. 11(d). Then, the sequences of EIs were generated by setting the step of Eq. (2) to 1/3 of the lens pitch vertically and horizontally, respectively. One can imagine that the nine EIAs were captured by shifting the lenslet array in steps of 1/3 lens pitch. The optical reconstructions were performed on the holographic display system, in which a GCL-200-S laser with 532 nm wavelength illuminates a reflective LETO SLM from HOLOEYE with a pixel pitch of 6.4 µm. To obtain an amplitude-type hologram, the complex matrix was encoded into an intensity distribution. The hologram was then cropped from 2025×2025 to 1920×1080 to fit the pixel resolution of the SLM. The distance of the 3-D image from the SLM is calculated to be around 10 cm according to Eq. (10), where M = 45, Δpm is 6.4 µm, and λ is 532 nm, after subtracting the length of the 4f system. In Fig. 11(e), the image quality is so low that the object can hardly be seen, while in Fig. 11(f) the image has brighter intensity and clear profiles of the 3-D object. Of course, multi-facet structures still exist in Fig. 11(f) because only nine holograms are superimposed. Meanwhile, due to the imperfection of the applied IVR algorithm, the number of virtual EIAs cannot be increased continuously; otherwise, the low quality of the IVR-generated EIAs would degrade the resulting holograms. In this experiment, we also demonstrate the feasibility of hologram generation using a commercial light field camera; to the best of our knowledge, we are the first to apply a commercial light field camera to hologram generation, which may serve as a reference for a holographic camera.


Fig. 11. Optical experiment with a real 3D object: (a) Capturing setup with the light field camera. (b) Setup of the holographic display. (c) EIA resampled from the original light field image. (d) CIIR of the EIA at a depth of 30 times the focal length. (e) Optical reconstruction from the conventional integral Fresnel hologram. (f) Optical reconstruction of the resulting hologram.


4. Conclusion and discussion

In this paper, a performance-enhanced integral Fresnel hologram using the IVR algorithm was proposed to break through the inherent angular-spatial resolution trade-off of integral imaging. Without lenslet shifting or an increased pixel number, the integral holograms at different virtual capturing positions were generated from the IVR-based EIAs and superimposed into one resulting hologram with more angular information. Simulations with a single planar object showed the superiority of the proposed method. Experiments with a commercial light field camera demonstrated the feasibility of hologram generation and can serve as a reference for a holographic camera.

Motivated by developing a device that can capture the hologram of real scene in an easy way, the integral holography comes to our eye as this technique can generate the hologram from the 2D intensity image based EIA. By simply performing the FFT on the EIs, corresponding hogels presenting directions of 3D information are obtained and are merged into a Fresnel hologram at the lenslet back focal plane. However, the integral hologram suffers from a low accuracy comparing with the interference based hologram because of the sparse angular information on the integral hologram. Meanwhile, the angular sampling interval cannot reduce too much in a single shot of EIA for providing sufficient EI’s spatial resolution for FFT process. Thus, for generating the integral hologram with dense angular information, our method was proposed based on the IVR algorithm. As a result, we can generate the integral hologram with better reconstruction performance using the Lytro camera. Even though the proposed method has been successfully demonstrated, some issues for the final holographic camera should be considered. One of the issues is the effectiveness and efficiency of our applied IVR algorithm. According to Eq. (1), the disparity map d(x, y) directly affect the image quality of generated intermediate EI, in this paper, we used the disparity estimation method from [27], hence a more efficient algorithm can be applied for better performance [28]. In addition, when the elemental lens baseline becomes wider, there will be significant errors and blurs on synthesized views, due to scene depth discontinuity. Thus, convolutional neural network (CNN) can be used to synthesize novel views based on supervised learning [29]. Another issue is the correction of aberrations introduced by the main lens of the camera at the capturing stage. The EIs lying behind their corresponding elemental lens may shift as much as a full microlens pitch. 
Moreover, the pixels of the raw image are monochromatic, each belonging to one specific RGB wavelength; the colorful pixel values are mixtures of neighboring RGB pixels, so the raw integral image before mixing is not fully incorporated into the reconstruction. Aberration correction of the integral-imaging-based optical device for the degraded captured EIA should therefore be considered further. These are interesting open issues for developing a holographic camera with high reconstruction performance.
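The capture pipeline summarized in this discussion, FFT-ing each EI into a hogel and averaging the sub-pixel-shifted hologram sequences in the spatial frequency domain, can be sketched as follows. This is a simplified illustration (function names and the periodic-shift handling are assumptions), not the released HoloSuper.m code:

```python
import numpy as np

def eia_to_fresnel_hologram(eia, m, n):
    """Tile-wise FFT of an EIA (elemental images of m x n pixels)
    into hogels, merged into one integral Fresnel hologram at the
    lenslet back focal plane."""
    H, W = eia.shape
    holo = np.zeros((H, W), dtype=complex)
    for i in range(0, H, m):
        for j in range(0, W, n):
            ei = eia[i:i + m, j:j + n]
            # centered FFT -> one hogel per elemental image
            holo[i:i + m, j:j + n] = np.fft.fftshift(
                np.fft.fft2(np.fft.ifftshift(ei)))
    return holo

def superimpose(holos, shifts):
    """Average hologram sequences shifted in the frequency domain via
    the Fourier shift theorem (cf. Eqs. (6)-(7)); shifts in pixels."""
    H, W = holos[0].shape
    ku = np.fft.fftfreq(W)[None, :]   # cycles/sample along u
    kv = np.fft.fftfreq(H)[:, None]   # cycles/sample along v
    acc = np.zeros((H, W), dtype=complex)
    for holo, (du, dv) in zip(holos, shifts):
        acc += np.fft.fft2(holo) * np.exp(-2j * np.pi * (ku * du + kv * dv))
    return np.fft.ifft2(acc / len(holos))
```

Applying the phase ramp in the frequency domain allows the fractional shifts of the IVR-generated hologram sequences without resampling them on a common grid.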

Funding

Natural Science Foundation of Jiangsu Province (BK20180597); National Natural Science Foundation of China (61703185); Fundamental Research Funds for the Central Universities (252050205171780); The 111 project (B12018).

References

1. P. W. M. Tsang and T.-C. Poon, “Review on the state-of-the-art technologies for acquisition and display of digital holograms,” IEEE Trans. Ind. Inf. 12(3), 886–901 (2016). [CrossRef]  

2. S.-F. Lin and E.-S. Kim, “Single SLM full-color holographic 3-D display based on sampling and selective frequency-filtering methods,” Opt. Express 25(10), 11389–11404 (2017). [CrossRef]  

3. Y. Sando, D. Barada, and T. Yatagai, “Holographic 3D display observable for multiple simultaneous viewers from all horizontal directions by using a time division method,” Opt. Lett. 39(19), 5555–5557 (2014). [CrossRef]  

4. A. P. Mosk, A. Lagendijk, G. Lerosey, and M. Fink, “Controlling waves in space and time for imaging and focusing in complex media,” Nat. Photonics 6(5), 283–292 (2012). [CrossRef]  

5. P.-A. Blanche, A. Bablumian, R. Voorakaranam, C. Christenson, W. Lin, T. Gu, D. Flores, P. Wang, W.-Y. Hsieh, M. Kathaperumal, B. Rachwal, O. Siddiqui, J. Thomas, R. A. Norwood, M. Yamamoto, and N. Peyghambarian, “Holographic three-dimensional telepresence using large-area photorefractive polymer,” Nature 468(7320), 80–83 (2010). [CrossRef]  

6. D. E. Smalley, E. Nygaard, K. Squire, J. Van Wagoner, J. Rasmussen, S. Gneiting, K. Qaderi, J. Goodsell, W. Rogers, M. Lindsey, K. Costner, A. Monk, M. Pearson, B. Haymore, and J. Peatross, “A photophoretic-trap volumetric display,” Nature 553(7689), 486–490 (2018). [CrossRef]  

7. L. Huang, X. Chen, H. Mühlenbernd, H. Zhang, S. Chen, B. Bai, Q. Tan, G. Jin, K.-W. Cheah, C.-W. Qiu, J. Li, T. Zentgraf, and S. Zhang, “Three-dimensional optical holography using a plasmonic metasurface,” Nat. Commun. 4(1), 2808 (2013). [CrossRef]  

8. K. Wakunami, P.-Y. Hsieh, R. Oi, T. Senoh, H. Sasaki, Y. Ichihashi, and K. Yamamoto, “Projection-type see-through holographic three-dimensional display,” Nat. Commun. 7(1), 12954 (2016). [CrossRef]  

9. M. Ozaki, J.-I. Kato, and S. Kawata, “Surface-Plasmon Holography with White-Light Illumination,” Science 332(6026), 218–220 (2011). [CrossRef]  

10. J. Rosen and G. Brooker, “Non-scanning motionless fluorescence three-dimensional holographic microscopy,” Nat. Photonics 2(3), 190–195 (2008). [CrossRef]  

11. G. Li, A. H. Phan, N. Kim, and J. H. Park, “Synthesis of computer-generated spherical hologram of real object with 360 field of view using a depth camera,” Appl. Opt. 52(15), 3567–3575 (2013). [CrossRef]  

12. Y. Zhao, K. Kwon, Y. Piao, Y. Lim, S. Jeon, and N. Kim, “Computer generated holograms of real object from depth camera using polygon-based method,” in Imaging and Applied Optics 2016, OSA Technical Digest (online) (Optical Society of America, 2016), paper DW5I.8.

13. J.-S. Chen and D.-P. Chu, “Improved layer-based method for rapid hologram generation and real-time interactive holographic display applications,” Opt. Express 23(14), 18143–18155 (2015). [CrossRef]  

14. Q. Smithwick, “Coarse integral holographic display,” U.S. Patent 9,310,769 (2016).

15. H. Yoshikawa and H. Kameyama, “Integral holography,” Proc. SPIE 2406, 226–234 (1995). [CrossRef]  

16. Y. Ichihashi, R. Oi, T. Senoh, K. Yamamoto, and T. Kurita, “Real-time capture and reconstruction system with multiple GPUs for a 3D live scene by a generation from 4 K IP images to 8 K holograms,” Opt. Express 20(19), 21645–21655 (2012). [CrossRef]  

17. T. Mishina, M. Okui, and F. Okano, “Calculation of holograms from elemental images captured by integral photography,” Appl. Opt. 45(17), 4026–4036 (2006). [CrossRef]  

18. H. Sasaki, K. Yamamoto, K. Wakunami, Y. Ichihashi, R. Oi, and T. Senoh, “Large size three-dimensional video by electronic holography using multiple spatial light modulators,” Sci. Rep. 4(1), 6177 (2015). [CrossRef]  

19. M.-K. Kim, Digital Holographic Microscopy (Springer, 2011), Chap. 2.

20. C. Yang, X. Wang, J. Zhang, M. Martinez-Corral, and B. Javidi, “Reconstruction Improvement in Integral Fourier Holography by Micro-Scanning Method,” J. Display Technol. 11(9), 709–714 (2015). [CrossRef]  

21. N. Chen, J.-H. Park, and N. Kim, “Parameter analysis of integral Fourier hologram and its resolution enhancement,” Opt. Express 18(3), 2152–2167 (2010). [CrossRef]  

22. N. Chen, J. Yeom, J.-H. Jung, J.-H. Park, and B. Lee, “Resolution comparison between integral-imaging-based hologram synthesis methods using rectangular and hexagonal lens arrays,” Opt. Express 19(27), 26917–26927 (2011). [CrossRef]  

23. K. Wakunami and M. Yamaguchi, “Calculation for computer generated hologram using ray-sampling plane,” Opt. Express 19(10), 9086–9101 (2011). [CrossRef]  

24. K. Wakunami, M. Yamaguchi, and B. Javidi, “High-resolution three-dimensional holographic display using dense ray sampling from integral imaging,” Opt. Lett. 37(24), 5103–5105 (2012). [CrossRef]  

25. D.-C. Hwang, J.-S. Park, S.-C. Kim, D.-H. Shin, and E.-S. Kim, “Magnification of 3D reconstructed images in integral imaging using an intermediate-view reconstruction technique,” Appl. Opt. 45(19), 4631–4637 (2006). [CrossRef]  

26. J.-S. Park, D.-C. Hwang, D.-H. Shin, and E. S. Kim, “Enhanced-resolution computational integral imaging reconstruction using an intermediate-view reconstruction technique,” Opt. Eng. 45(11), 117004 (2006). [CrossRef]  

27. M.-H. Kim and K.-H. Sohn, “Edge-preserving directional regularization technique for disparity estimation of stereoscopic images,” IEEE Trans. on Consumer Electron. 45(3), 804–811 (1999). [CrossRef]  

28. Z. Pan, Y. Zhang, and S. Kwong, “Efficient Motion and Disparity Estimation Optimization for Low Complexity Multiview Video Coding,” IEEE Trans. on Broadcast. 61(2), 166–176 (2015). [CrossRef]  

29. N. Kalantari, T. Wang, and R. Ramamoorthi, “Learning-based view synthesis for light field cameras,” ACM Trans. Graph. 35(6), 1–10 (2016). [CrossRef]  

30. L. Ai, “Intermediate view image reconstruction,” Figshare. (2019). https://doi.org/10.6084/m9.figshare.9735725

31. L. Ai, “Hologram generation from elemental image array,” Figshare. (2019) https://doi.org/10.6084/m9.figshare.9735728

32. L. Ai, “Hologram superimposition,” Figshare. (2019) https://doi.org/10.6084/m9.figshare.9735731

33. L. Ai and X. Shi, “Light field image processing,” Figshare. (2019). https://doi.org/10.6084/m9.figshare.9735746

Supplementary Material (4)

Code 1 — the algorithm of intermediate view reconstruction, named IVRsoteware.zip
Code 2 — hologram generation from EIA
Code 3 — the algorithm of hologram superimposition in the spatial frequency domain, named HoloSuper.m
Code 4 — the algorithm of EIA acquisition from the light field camera's raw data, named LF-code.zip



Figures (11)

Fig. 1.
Fig. 1. Schematic of integral imaging pick-ups with different lenslet arrays and their corresponding integral holograms: (a) Large pitch capture. (b) Relatively small pitch capture.
Fig. 2.
Fig. 2. The whole structure of the proposed performance-enhanced integral holography.
Fig. 3.
Fig. 3. The principle of the intermediate EI generation at any sampling position.
Fig. 4.
Fig. 4. Captured EIA and synthetic EIA with different relative virtual capturing displacement, (a) Plane object of ‘3’. (b) EIA from capture. (c) EIA1/2,1/2 from the IVR synthesis.
Fig. 5.
Fig. 5. Integral Fresnel hologram generated from the two EIAs. (a) Amplitude of integral Fresnel hologram from EIA. (b) Phase of integral Fresnel hologram from EIA. (c) Amplitude of integral Fresnel hologram from EIA1/2, 1/2. (d) Phase of integral Fresnel hologram from EIA1/2, 1/2.
Fig. 6.
Fig. 6. The concept of hologram fusion for providing a sufficient angular spectrum of the reconstructed 3D image.
Fig. 7.
Fig. 7. The flowchart of the holograms fusion process.
Fig. 8.
Fig. 8. (a) The amplitude of the resulting hologram. (b) The phase of the resulting hologram. (c) The magnification of the yellow square in (a). (d) The magnification of the same position in the hologram of Fig. 5(a).
Fig. 9.
Fig. 9. Numerical reconstruction results of (a) CIIR of EIA shown in Fig. 4(b). (b) Integral Fresnel hologram generated from insufficient pixel number of EI. (c) Integral Fresnel hologram of Fig. 5(a). (d) The resulting hologram of Fig. 8(a). (e) The intensity profile along the white line in Fig. 9(a). (f) The intensity profile along the white line in Fig. 9(b). (g) The intensity profile along the white line in Fig. 9(c). (h) The intensity profile along the white line in Fig. 9(d).
Fig. 10.
Fig. 10. Simulation for two color objects: (a) Image of Olympic rings. (b) The numerical reconstruction result focused on 11.2 cm of the conventional integral hologram. (c) The numerical reconstruction result focused on 11.2 cm of the resulting hologram. (d) Image of background mosaic. (e) The numerical reconstruction result focused on 13.9 cm of the conventional integral hologram. (f) The numerical reconstruction result focused on 13.9 cm of the resulting hologram.
Fig. 11.
Fig. 11. Optical experiment of real 3D object: (a) Capturing setup with the light field camera. (b) Setup of the holographic display. (c) EIA resampled from the original light field image. (d) CIIR of the EIA at the depth of 30 times the focal length. (e) Optical reconstruction result from the conventional integral Fresnel hologram. (f) Optical reconstruction of the resulting hologram.

Equations (10)


$$E_{p,q+\alpha}(x,y) = (1-\alpha)\,E_{p,q}\big(x+\alpha d(x,y),\,y\big) + \alpha\,E_{p,q+1}\big(x-(1-\alpha)d(x,y),\,y\big). \tag{1}$$

$$E_{p+\beta,q+\alpha}(y,x) = (1-\beta)\,E_{p+1,q+\alpha}\big(y+\beta d(x,y),\,x\big) + \beta\,E_{p,q+\alpha}\big(y-(1-\beta)d(x,y),\,x\big), \tag{2}$$

$$H_{p+\beta,q+\alpha}(u,v) = \frac{e^{2ikf}}{\lambda f}\iint E_{p+\beta,q+\alpha}(x,y)\,e^{-i2\pi\left(\frac{xu+yv}{\lambda f}\right)}\,dx\,dy, \tag{3}$$

$$H_{p+\beta,q+\alpha}(u\Delta p_m,\,v\Delta p_n) = \sum_{x=1}^{M}\sum_{y=1}^{N} E_{p+\beta,q+\alpha}(x,y)\,e^{-i2\pi\left(\frac{xu\Delta p_m^2}{\lambda f}+\frac{yv\Delta p_n^2}{\lambda f}\right)}, \tag{4}$$

$$M\Delta p_m^2 = \lambda f,\qquad N\Delta p_n^2 = \lambda f, \tag{5}$$

$$HH = \frac{1}{S}\sum_{s=1}^{S} HH_{\beta_s,\alpha_s}\big(u-\beta_s M\Delta p_m,\; v-\alpha_s N\Delta p_n\big), \tag{6}$$

$$HH = \mathcal{F}^{-1}\left\{\frac{1}{S}\sum_{s=1}^{S}\mathcal{F}\{HH_{\beta_s,\alpha_s}\}\,e^{-j\frac{2\pi}{PM}\beta_s M\Delta p_m k_u}\,e^{-j\frac{2\pi}{QN}\alpha_s N\Delta p_n k_v}\right\}, \tag{7}$$

$$I(x,y) = \frac{1}{i\lambda d}\left[e^{ikd}\,e^{i\pi d\lambda\left(\left(\frac{x}{d\lambda}\right)^2+\left(\frac{y}{d\lambda}\right)^2\right)}\right]\iint HH(u,v)\,r(u,v)\,e^{\frac{i\pi}{d\lambda}\left[u^2+v^2\right]}\,e^{-2\pi i\left[u\frac{x}{d\lambda}+v\frac{y}{d\lambda}\right]}\,du\,dv, \tag{8}$$

$$f = \frac{M\Delta p_m^2}{\lambda}. \tag{9}$$

$$d = \frac{DM\Delta p_m^2}{\lambda f} - \frac{M\Delta p_m^2}{\lambda}. \tag{10}$$
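The reconstruction integral of Eq. (8) can be evaluated numerically with a single FFT. Below is a minimal Python sketch, assuming a unit plane-wave reference r(u, v) = 1; the constant prefactor and the leading unit-modulus chirp are omitted because only the intensity is returned, and all names are illustrative:

```python
import numpy as np

def fresnel_reconstruct(holo, pitch, wavelength, d):
    """Single-FFT Fresnel back-propagation of a hologram HH(u, v)
    to the image plane at distance d (sketch of Eq. (8))."""
    H, W = holo.shape
    # physical hologram-plane coordinates (u, v), centered
    v = (np.arange(H) - H / 2)[:, None] * pitch
    u = (np.arange(W) - W / 2)[None, :] * pitch
    # quadratic phase factor exp(i*pi*(u^2 + v^2)/(lambda*d))
    chirp = np.exp(1j * np.pi * (u**2 + v**2) / (wavelength * d))
    field = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(holo * chirp)))
    return np.abs(field)**2  # reconstructed intensity I(x, y), up to scale
```

As a sanity check, a hologram that is the conjugate of the chirp focuses to a single bright pixel at the center of the image plane.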