
Digital holographic content manipulation for wide-angle holographic near-eye displays

Open Access

Abstract

In recent years, the development of holographic near-eye displays (HNED) has outpaced the progress of digital hologram recording systems, especially in terms of wide-angle viewing capabilities. The resulting incompatibility between capture and display parameters makes it impossible to reconstruct recorded objects in wide-angle displays. This paper presents a complete imaging chain that extends the available content for wide-angle HNEDs of pupil and non-pupil configuration with narrow-angle digital holograms of real objects. To this end, a new framework based on the phase-space approach is proposed that includes the set of affine transformations required to account for all differences between the capture and display cases. The developed method allows free manipulation of the geometry of reconstructed objects, including axial and lateral positioning and size scaling, while requiring low computational effort. The presented work is supported with non-paraxial formulas developed using the phase-space approach, enabling accurate tracing of the holographic signal, its reconstruction, and measurement of the deformations that appear. The applicability of the proposed hologram manipulation method is proven with experimental results of digital hologram reconstruction in wide-angle HNED.

Published by Optica Publishing Group under the terms of the Creative Commons Attribution 4.0 License. Further distribution of this work must maintain attribution to the author(s) and the published article's title, journal citation, and DOI.

1. Introduction

Virtual and augmented reality (VR/AR) are technologies that are gradually entering our lives thanks to the continued development of electronic devices such as smartphones, video game consoles, and computers [1]. These visual technologies open new ways of working and entertainment, enabling interaction with virtual objects [2]. Among many technologies, the near-eye display (NED) [3,4] is envisioned as the next VR/AR platform because it enables a new level of immersion [3]. To become an ideal VR/AR platform, NED faces serious technical challenges, as the display parameters must match human vision; the main requirements are a wide viewing angle and the absence of the vergence-accommodation conflict (VAC) [4]. Holographic NED (HNED), considered the future of VR/AR [5], reconstructs the entire optical field with all perceptual cues, making it VAC-free. A wide field of view (FOV), comparable to that of human vision, has already been successfully implemented in HNEDs [5,6]. HNEDs usually work with laser illumination, which degrades image quality through high-contrast speckle noise. The use of LED illumination [7] reduces speckles and provides a high level of eye safety. In addition, it has been shown that LED illumination does not reduce HNED image resolution [8]. A key element in the success of HNED technology is the availability of high-quality content. Such holographic content can come from either real [9,10] or virtual [11] objects. While computer-generated holographic (CGH) content for HNED is broadly researched, there are no studies on applying wide-angle holographic content from real objects in HNEDs.

High-quality holographic content for HNEDs is challenging because it requires a large space-bandwidth product (SBP). Thus, large computational effort is demanded when considering large FOV, hologram accuracy, and high quality. One source of content is CGH methods, which are classified by the units used for hologram encoding [12], for example: clouds of points [13], collections of polygons [14], or sets of axial slices [15,16]. Independently of the chosen method, the ideal CGH algorithm should fulfill two contradictory conditions: fast calculation and wide-angle hologram generation, including perceptual cues. Regarding calculation time, it has been shown that real-time hologram generation can be achieved, but only for small, simple objects and narrow angles [17,18]. Obtaining high-quality wide-angle holograms requires high-resolution objects composed of millions of units and involves non-paraxial dependencies. As a consequence, the calculation speed is low. Moreover, high image quality means considering 3D effects such as occlusion [19], shadows [20], uneven object intensity [21], and reflections [21]. Unfortunately, CGH methods still cannot simulate all of them simultaneously [22], so further development of CGH methods is necessary. The other source of image content for HNED is real-life objects. For this, digital holograms of real objects are recorded in physical setups, where the complex optical field with all the information about the 3D scene is captured. Thus, they are the only ones showing real-like 3D reconstructions with all the perceptual cues. However, digital holography (DH) faces further challenges that come from the use of physical setups. The FOV of DH setups and the SBP of digital holograms are directly related to the parameters of the camera sensor and the illumination wavelength. The available hardware limits DH capabilities: because sensor pixel sizes are relatively large, the FOV of capturing setups is narrow, up to 15° [23]. These discrepancies between DH capturing setups and HNEDs call into question the possibility of using digital holograms as content in wide-angle HNEDs.

Direct attempts to reconstruct narrow-angle digital holograms in wide-angle HNEDs without any geometrical manipulations lead to a significant change in the geometry of the reconstructed object, which, as shown in this paper, results in an imperceptible image. In order to adapt digital holograms to serve as content for HNED, it is necessary to apply numerical transformations to the holographic data. In Ref. [24] it was shown that the digital hologram can be manipulated with the affine transformation, giving a zoomable effect of the reconstructed object. The affine transformation was also applied for color image magnification [25,26] and holographic image manipulation [27] in Fresnel holograms, as well as in Fourier DH [7,28]. Notably, these tools were designed for narrow-angle conditions in the paraxial regime, and their usage was not investigated in wide-angle conditions. Hence, there is a lack of proper manipulation methodology that describes hologram transformations for wide-angle HNED. Since wide-angle HNEDs are built in two configurations, pupil and non-pupil, which are based on different imaging approaches, the proper numerical manipulation method must consider different sets of affine transformations for each configuration.

In this paper, we develop and investigate theoretically, numerically, and experimentally hologram manipulations based on linear transformations for the case of using narrow-angle digital holograms in wide-angle HNEDs built in pupil [7] and non-pupil [8] configurations. It is worth noting that this is the first work concerning geometrical manipulation of wide-angle holographic content, which extends the applicability of paraxial content, including narrow-angle CGH, to wide-angle displays. This work employs the phase-space approach (PSA) to set a framework for tracking the paraxial and non-paraxial holographic signal under 3D geometrical transformations and its reconstruction process, which is novel in this context. The PSA analysis of the direct use of narrow-angle digital holograms in wide-angle HNED shows that the reconstruction distance is altered and the object cannot be seen. Using PSA, we have developed a hologram manipulation method that allows the position and size of reconstructed monochromatic and color objects to be set freely. The method is based on the use of a defocusing spherical wave and wavefront propagation, taking into account pupil and non-pupil HNEDs. The solution is supported with theoretical formulas defining the parameters of the spherical defocusing wave. It is important to note that, thanks to the PSA analysis, the developed hologram manipulation tool considers all the differences between the reconstruction and capture setups, i.e., the axial location of the hologram, magnification, spherical reference wave, and wavelengths. Further, we provide a non-paraxial tool to investigate the accuracy of, and deformations induced by, the presented hologram manipulation method. It is shown that the main aberration is field curvature, which is generally acceptable in near-eye setups due to the accommodation and foveated nature of human vision. Notably, the wave error introduced to the hologram by the applied defocusing wave is low. Thus, any point of the captured 3D object is sharply reconstructed. Finally, experimental results prove the applicability and flexibility of the developed holographic image manipulation methods.

2. Geometry of optical hologram recording and reconstruction

2.1. Digital hologram recording setup

Digital holograms used in this work are recorded in the lensless Fourier capture setup shown in Fig. 1(a). In this article, optically recorded holograms are referred to as digital holograms. A holographic pattern created by the interference of a wave scattered from the surface of an object with a reference wave is recorded by a digital camera. The scheme in Fig. 1(a) shows the use of a single-color light source only; however, the setup can be modified to capture color digital holograms sequentially using RGB light sources, as in [28,29]. A laser source generates reference and object waves, marked with orange and green colors, respectively. The interference between these two waves creates the holographic pattern at a distance zo away from the object, which is captured by a CCD camera with a resolution of 2448 × 2050 pixels and pixel pitch Δc = 3.45 µm. The FOV of the recorded digital hologram is related to the parameters of the camera sensor and the light source wavelength as FOV = 2sin−1(λ/2Δc). Thus, for hologram registration with wavelength λgreen = 532 nm, the FOV = 8.8°. For an RGB hologram, the FOV is limited by the shortest wavelength; for the wavelengths λred = 637 nm, λgreen = 532 nm, and λblue = 457 nm, FOV = 7.6°. This work investigates two digital holograms: one RGB – λred, green, blue, and one monochromatic – λgreen. The monochrome digital hologram encodes a reindeer figurine at a distance zo = 770 mm [30], while the RGB digital hologram encodes a lowiczanka doll for zo = 1060 mm [28]. The physical size in which both objects have to fit is given by Bxo = Byo = 2zotan(FOV/2); thus, Bxo = Byo = 119 mm and 159 mm for the reindeer and lowiczanka objects, respectively. The resolution of the CCD camera is much smaller than the resolution of the display setups described in the following subsection. However, the size of the recorded digital holograms can be extended by applying the synthetic aperture (SA) concept in DH [28,29]. Thus, the digital holograms investigated in this work have a size of 4160 × 2464 pixels, which corresponds to the resolution of the presented display setups.
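The capture FOV and the admissible object extent follow directly from the sensor pitch and wavelength; a minimal Python check of the quoted numbers (function and variable names are ours, not from the paper):

```python
import math

PITCH_C = 3.45e-6  # CCD pixel pitch [m]

def capture_fov(wavelength, pitch=PITCH_C):
    """Diffraction-limited capture FOV: FOV = 2*asin(lambda / (2*pitch))."""
    return 2.0 * math.asin(wavelength / (2.0 * pitch))

def object_extent(z_o, fov):
    """Transverse size the object must fit into: B = 2*z_o*tan(FOV/2)."""
    return 2.0 * z_o * math.tan(fov / 2.0)

fov_green = capture_fov(532e-9)  # monochromatic hologram, ~8.8 deg
fov_blue = capture_fov(457e-9)   # shortest RGB wavelength limits the color FOV, ~7.6 deg

b_reindeer = object_extent(0.770, fov_green)   # reindeer figurine, zo = 770 mm, ~119 mm
b_lowiczanka = object_extent(1.060, fov_blue)  # lowiczanka doll, zo = 1060 mm
```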


Fig. 1. (a) Scheme of lensless Fourier capturing system. Green and orange colors correspond to reference and object waves, respectively. HWP – half-wave plate, BS – beam splitter, MO – microscopic objective, PH – pinhole, Lc – collimator, L1 – focusing lens, M1, M2 – mirrors, DP – diffuser plate, SF – spatial filter, Po – object point, H – digital hologram. (b) Photo of recorded objects: Lowiczanka and Reindeer figurines.


2.2. Wide-angle pupil and non-pupil holographic near-eye displays

There are two main HNED architectures with wide FOV: pupil [7] and non-pupil [8]. In the first case, the hologram is reconstructed at the eyebox plane (in the eye) while the object is far from the eyebox. In contrast, in the non-pupil display both the hologram and the object are far from the eyebox plane. Figure 2 shows diagrams of both setups: Fig. 2(a) the pupil configuration and Fig. 2(b) the non-pupil one. The HoloEye GAEA2 phase-only spatial light modulator (SLM) with a resolution of 4160 × 2464 pixels and pixel size ΔSLM = 3.74 µm is employed in both setups. In the pupil HNED setup, a single laser source with λred = 640 nm is employed for a monochromatic reconstruction. The SLM with loaded holographic data is illuminated with a linearly polarized plane wave. Then, the hologram is an SLM plane image formed by a 4F system in the plane of the user's eye pupil. The filter mask FM in the 4F system is used to cut out the zero order of the reconstruction. The 4F system composed of lenses L1 and Lep, with focal lengths F1 = 200 mm and Fep = 33 mm, introduces hologram magnification mh = Fep/F1 = 0.165. As a result, the pixel size in the hologram plane is Δrp = 0.62 µm, which gives a pupil-display FOV of 62°. In the non-pupil display, RGB LED sources with wavelengths λred = 635 nm, λgreen = 515 nm, and λblue = 465 nm are employed to obtain full-color reconstruction. Owing to the decreased light coherence, it profits from speckle noise reduction and is safer for the user's eye than a system with a laser source. The SLM is illuminated with linearly polarized collimated beams. The 4F system made of lenses L1 and L2 (focal lengths F1 = 100 mm and F2 = 122 mm) includes the filter mask required by the complex coding method. The 4F system forms the intermediate hologram image at a distance s = 31 mm in front of the eyepiece [8]. Then, with the eyepiece Lep (focal length Fep = 33 mm), the intermediate hologram is magnified, and its image appears at a distance zH away from the eyebox. The zH value is calculated using the lens equation. In total, the hologram magnification equals mh = F2Fep/[F1(Fep − s)] = 20.13 [8], and a large virtual hologram with a size of 313.2 × 185.5 mm2 and pixel pitch Δrnp = 75.28 µm is obtained, while the setup FOV = 31.6°.
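The quoted magnifications, hologram pixel pitches, and virtual hologram size follow directly from the focal lengths; a short Python verification (variable names are ours):

```python
DELTA_SLM = 3.74e-6  # SLM pixel size [m]

# Pupil configuration: 4F image of the SLM formed at the eye pupil.
F1_P, FEP_P = 0.200, 0.033           # focal lengths of L1 and Lep [m]
m_pupil = FEP_P / F1_P               # hologram magnification, 0.165
pitch_pupil = m_pupil * DELTA_SLM    # pixel size in the hologram plane, ~0.62 um

# Non-pupil configuration: 4F relay (L1, L2) plus eyepiece Lep.
F1, F2, FEP, S = 0.100, 0.122, 0.033, 0.031   # [m]
m_nonpupil = F2 * FEP / (F1 * (FEP - S))      # total magnification, 20.13
pitch_nonpupil = m_nonpupil * DELTA_SLM       # virtual-hologram pitch, ~75.3 um

# Virtual hologram size for the 4160 x 2464 SLM
size_x = 4160 * pitch_nonpupil   # ~313.2 mm
size_y = 2464 * pitch_nonpupil   # ~185.5 mm
```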


Fig. 2. Scheme of the (a) pupil and (b) non-pupil HNED setup. Lc – collimator, L1,2 – lenses, Lep – eyepiece, P – linear polarizer, BS – beam splitter, SLM – spatial light modulator, FM – filter mask, Pr – object's point reconstruction.


2.3. Elementary paraxial hologram transformations in phase-space

Comparing the parameters of the display and capture setups, there are large differences in the pixel pitches and corresponding FOVs. Thus, the direct reconstruction of a digital hologram is impractical: when such a hologram is directly reconstructed in a wide-angle HNED, the holographic image is formed very close to or behind the eyebox, making observation of the object impossible. In conclusion, there is a need for a tool for geometric manipulation of holographic images.

To investigate the case of narrow-angle digital hologram reconstruction in wide-angle HNED and introduce the necessary image manipulation, we built a PSA framework in which four transformations are considered: hologram reconstruction with a different wavelength, magnification, propagation, and the use of a spherical wave. This PSA framework not only enables paraxial analysis of direct hologram reconstruction, introducing a tool for hologram manipulation, but also generalizes to the non-paraxial case, enabling the study of the effect of wide-angle reconstruction of the geometrically manipulated hologram.

For clarity of presentation, the effect of each of the transformations considered on the reconstruction of a paraxial hologram in a wide-angle HNED will be shown using 1D analysis in the x-axis since the 2D case is derived analogically. Let us consider a paraxial hologram of a point object Po(xo, zo), recorded with wavelength λo, which is represented as:

$${u_o}(x) = exp \left\{ { - \frac{{i{k_o}{c_o}{{({x - {x_o}} )}^2}}}{2}} \right\},$$
where ko is the wavenumber and co = zo−1 is the curvature of the recorded spherical wave. While applying one or a sequence of the elementary transformations, the reconstruction point is changed to Pr(xr, zr). This point, when reconstructed with wavelength λr, generates a spherical wave of phase-space (PS) distribution equal to:
$${f_r} ={-} \lambda _r^{ - 1}{c_r}({x - {x_r}} ),$$
where cr = zr−1 is the curvature of the reconstructed spherical wave. Table 1 presents the influence of the four transformations on the coordinates of the reconstructed point Pr when reconstructing the hologram of point Po. These elementary transformations are: hologram propagation by zp, transverse image magnification by a factor m = Δr/Δc, hologram multiplication by a spherical wave of curvature cd, and change of the reconstruction wavelength to λr. For clarity, the first three transformations are given under the assumption of equal recording and reconstruction wavelengths (λo = λr).


Table 1. Effect of elementary transformations on coordinates of reconstruction point in the paraxial regime.
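Since the table itself is not reproduced here, the following sketch encodes standard paraxial forms of these four transformations consistent with the display equations of Secs. 3 and 4; the sign and ordering conventions are our assumptions:

```python
def magnify(x, z, m):
    """Transverse magnification by m (lambda_o = lambda_r): x -> m*x, z -> m^2*z."""
    return m * x, m * m * z

def propagate(x, z, z_p):
    """Hologram propagation by z_p toward the object: z -> z - z_p
    (cf. co' = 1/(zo - zH) in the non-pupil analysis)."""
    return x, z - z_p

def add_spherical_wave(x, z, c_d):
    """Multiplication by a spherical wave of curvature c_d: c -> c - c_d,
    with the transverse center rescaled by c/(c - c_d)."""
    c = 1.0 / z
    return x * c / (c - c_d), 1.0 / (c - c_d)

def change_wavelength(x, z, lam_o, lam_r):
    """Reconstruction with lam_r instead of lam_o: c -> (lam_r/lam_o)*c."""
    return x, z * lam_o / lam_r

# Chaining magnification and a wavelength change reproduces the direct pupil
# reconstruction of Sec. 3: P_r = (m_p*x_o, m_p^2*(lam_o/lam_r)*z_o).
x_r, z_r = change_wavelength(*magnify(0.19e-3, 1.0, 0.179), 532e-9, 640e-9)
```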

2.4. Phase-space approach for non-paraxial analysis of hologram transformations

The methodology presented above for manipulating the geometry of a 3D image using linear transformations is simple and efficient. However, it is based on the paraxial approximation, which is not valid for the studied case of wide-angle display [31]. Here, the non-paraxial framework based on PSA is introduced. It allows for calculating the accurate coordinates of the reconstructed point when the paraxial hologram is reconstructed in the wide-angle HNED. For this, a hologram of a single-point object at (xo, zo) given by:

$${u_c}(x) = exp \{{ - i{k_o}R[{x - {x_o},{z_o}} ]} \}$$
is considered, where R[x,z] = (x2+ z2)1/2. The basic case of direct hologram reconstruction of the point Po(xo, zo) takes into account two issues. The first one is the difference of FOVs, considered as an effect of large transverse magnification by m. The second is hologram reconstruction with wavelength λr while it was captured with λo. Thus, the distribution of the reconstructed hologram can be written as
$${u_r}(x) = exp \left\{ { - \frac{{i{k_r}{\lambda_r}R[{x - m{x_o},m{z_o}} ]}}{{{\lambda_o}m}}} \right\}.$$

To determine the coordinates of the reconstructed point Pr(xr, zr), we use two PSA tools: local spatial frequency [32] and local curvature [8]. The local spatial frequency of the reconstructed hologram given by Eq. (4) is

$${f_r} = \frac{{\partial ARG[{{u_r}(x)} ]}}{{2\pi \partial x}} ={-} \frac{{x - m{x_o}}}{{m{\lambda _o}R[{x - m{x_o},m{z_o}} ]}},$$
while the local spatial curvature of this wave can be calculated using
$${c_r} = {\lambda _r}\frac{{\partial {f_r}}}{{\partial x}} ={-} \frac{{m{\lambda _r}z_o^2}}{{{\lambda _o}R{{[{x - m{x_o},m{z_o}} ]}^3}}}.$$

Notably, the local spatial frequency does not depend on the reconstruction wavelength, while the local curvature does. To estimate the coordinates of the reconstructed point Pr(xr, zr), the local spatial frequency ${f_{r0}}$ and local curvature ${c_{r0}}$ are used, which are the values obtained from Eqs. (5) and (6) at the hologram center coordinate x = 0. The desired coordinates of the reconstruction point can be calculated via the formula:

$$({{x_r},{z_r}} )= {R_{r0}} \times [{{\lambda_r}{f_{r0}},{{({1 - \lambda_r^2f_{r0}^2} )}^{1/2}}} ],$$
where ${R_{r0}} ={-} ({1 - \lambda_r^2f_{r0}^2} )c_{r0}^{ - 1}.$ The formula was obtained by inspecting the equation of a spherical wave for m = 1, for which λrfr0 = xoR[xo,zo]−1 and cr0 = −zo2R[xo,zo]−3. Thus, from the local spatial frequency, we obtain the direction of the point object as the value of the sine. Using the corresponding value of the cosine in the formula for the local spatial curvature, we get the distance Rr0 from the hologram to the object point and hence the coordinates of the object point.
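A minimal numerical sketch of this recovery procedure (our function, following Eqs. (5)-(7) evaluated at x = 0):

```python
import math

def reconstruct_point(x_o, z_o, m, lam_o, lam_r):
    """Recover the non-paraxial reconstruction point P_r from the local
    spatial frequency (Eq. (5)) and local curvature (Eq. (6)) at x = 0,
    combined through Eq. (7)."""
    R_o = math.hypot(x_o, z_o)
    f_r0 = x_o / (lam_o * R_o)                        # Eq. (5) at x = 0
    c_r0 = -lam_r * z_o**2 / (lam_o * m**2 * R_o**3)  # Eq. (6) at x = 0
    s = lam_r * f_r0                                  # sine of the point direction
    R_r0 = -(1.0 - s * s) / c_r0                      # distance hologram -> point
    return R_r0 * s, R_r0 * math.sqrt(1.0 - s * s)    # Eq. (7)
```

For m = 1 and equal wavelengths the original point is recovered exactly, which is the consistency check mentioned in the text; on axis and with m ≠ 1, λr ≠ λo, the result reduces to the paraxial distance mp2λoλr−1zo.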

3. Numerical manipulation of holographic image for pupil HNED

This section first presents a detailed analysis of the hologram reconstruction in the pupil HNED, shown in Fig. 2(a). Then, to correct the resulting too-short reconstruction distance, a tool for image geometry manipulation is developed. It was found that the reconstruction distance can be corrected by manipulating the hologram with a parabolic defocusing wave. Its parameter, calculated using the developed equation, takes into account all differences between the capture and display setups. The capabilities of the technique are further extended with axial shift and transverse zoom features.

To illustrate the process of deriving a hologram manipulation algorithm for pupil HNED, the example CGH hologram of the object point Po(xo, zo) = (0.19, 1000) [mm] is examined. The PS diagram of the CGH hologram of point Po with wavelength λo = 532 nm, shown in Fig. 3(a)-(c) with the blue line, is described by fxo = −λo−1co(x − xo), where co = zo−1 is the curvature of the encoded spherical wave. For a better view, Fig. 3(b) presents the zoomed area of the PS diagram from the black-dashed box in Fig. 3(a). When this hologram is directly reconstructed in pupil HNED, its reconstructed signal, marked in Fig. 3(a)-(b) with orange color, can be described by the PS distribution:

$${f_{xr}} ={-} \lambda _r^{ - 1}{c_r}({x - {m_p}{x_o}} ),$$
where cr = mp−2λrλo−1co is the curvature of the reconstructed spherical wave in the display setup and mp = Δrp/Δc is the transverse magnification, which according to Secs. 2.1 and 2.2 is mp = 0.179. Equation (8) allows us to determine the position of the reconstructed image as Pr(mpxo, cr−1). Thus, the transverse position is magnified directly by mp, and the axial position changes with mp2. For the considered CGH hologram image of point Po, the reconstruction distance is zr = 27 mm and the transverse point position xr = 0.03 mm when the reconstruction wavelength is λr = 640 nm. This reconstruction distance is too short, and the obtained image cannot be viewed in the HNED. Therefore, it is necessary to modify zr. The proposed technique involves numerical manipulation of the input hologram with a spherical wave of PS distribution fd = −cdλo−1x, where cd is the curvature of this wave. For the example case presented in Fig. 3, the defocusing wave is marked with a purple dotted line. This means the hologram is multiplied by exp(iπx2cd/λo). When applying such a defocusing wave to the input CGH hologram of point Po, the PS distribution of the reconstructed wave in the HNED, shown in Fig. 3(c) with green color, is given by:
$${f_{xrd}} ={-} \lambda _r^{ - 1}{c_{rd}}\left( {x - {m_p}\frac{{{c_o}}}{{{c_o} - {c_d}}}{x_o}} \right),$$
where crd = mp−2λrλo−1(co − cd) is the curvature of the reconstructed spherical wave. Thus, the new axial reconstruction distance after the application of the numerical defocusing is zrd = crd−1. The expression for crd contains the factor co − cd. To obtain a reconstructed real object, the condition co > cd must be satisfied; otherwise, the image is formed behind the eye. This limits the depth of the recorded object. The relationship for crd can be transformed into an expression for the defocus curvature needed to achieve the expected reconstruction distance:
$${c_d} = {c_o} - m_p^2{\lambda _o}\lambda _r^{ - 1}{c_{rd}}.$$

This equation allows the axial position of the reconstruction to be modified freely. For the investigated example of a CGH hologram, to obtain a reconstruction distance of zrd = 1 m, according to Eq. (10), a spherical wave of zd = 1.027 m should be used. The transverse position of the object in the example was chosen so that the reconstructed signal (Fig. 3(c)) is at the edge of the HNED hologram (xrd = 1.26 mm).
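Equation (10) and the worked example can be checked numerically (a sketch with the paper's numbers; names are ours):

```python
m_p, lam_o, lam_r = 0.179, 532e-9, 640e-9

def defocus_curvature_pupil(c_o, c_rd):
    """Eq. (10): defocus curvature that places the reconstruction at curvature c_rd."""
    return c_o - m_p**2 * lam_o / lam_r * c_rd

# Direct reconstruction of the zo = 1 m example hologram (Eq. (8)):
z_r = m_p**2 * lam_o / lam_r * 1.0  # ~27 mm -- far too close to be viewed

# Defocusing wave that restores a 1 m reconstruction distance:
z_d = 1.0 / defocus_curvature_pupil(1.0, 1.0)  # ~1.027 m
```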


Fig. 3. PS diagrams in hologram plane x of (a) recorded hologram signal fxo, defocusing wave fd and signal of direct hologram reconstruction fxr in pupil HNED, (b) the zoom of the black-dashed area from (a), and (c) the recorded and defocused signals fxo and fxrd.


Notably, the defocusing wave does not change the angular coordinate of the reconstruction since it does not modify the carrier frequency of the encoded object spherical wave. Thus, the angular coordinate is altered only by the magnification mp, meaning that xr/zr = xrd/zrd = λrxo(λompzo)−1. Also, the display magnification and the added defocusing spherical wave do not change the hologram SBP, since the frequency support is divided by mp and the spatial support is multiplied by mp. Thus, while the SBP of the hologram does not change during the geometrical transformation and transfer of the hologram to the display, an additional transverse magnification is possible, allowing separate control over the axial position of the reconstruction and its transverse size. The technique from [28], which involves interpolation of the reconstructed image, is used for the transverse magnification of the image. The whole hologram manipulation process is presented in Fig. 4, where the sphere represents the applied defocus correction. In order to manipulate the size and position of the image independently, the hologram is reconstructed in the first step, and transverse magnification is applied in this plane. Then, the manipulated image is propagated back to the hologram. For propagation, the Fourier Transform Fresnel Diffraction (FT-FD) solution is used [33]. Finally, in the second step, the position of the reconstructed image is modified by the applied spherical wave.
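The FT-FD propagation step can be illustrated with a minimal single-FFT Fresnel routine; this is a 1D sketch under our own sampling assumptions, not the authors' implementation, and constant amplitude prefactors are omitted:

```python
import numpy as np

def ftfd_propagate(u, pitch, lam, z):
    """Single-FFT Fresnel diffraction: input chirp, FFT, output chirp.
    The output pixel pitch becomes lam*z / (N*pitch)."""
    n = u.size
    x = (np.arange(n) - n // 2) * pitch
    pitch_out = lam * z / (n * pitch)
    x_out = (np.arange(n) - n // 2) * pitch_out
    chirp_in = np.exp(1j * np.pi * x**2 / (lam * z))
    chirp_out = np.exp(1j * np.pi * x_out**2 / (lam * z))
    return chirp_out * np.fft.fftshift(np.fft.fft(np.fft.ifftshift(u * chirp_in)))

# Sanity check: a parabolic wave converging toward the axis focuses after distance z.
lam, pitch, n, z = 532e-9, 10e-6, 1024, 0.3
x = (np.arange(n) - n // 2) * pitch
u_conv = np.exp(-1j * np.pi * x**2 / (lam * z))
focus = ftfd_propagate(u_conv, pitch, lam, z)  # peak appears at the array center
```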


Fig. 4. Diagram of manipulation method process for pupil HNED.


4. Numerical manipulation of holographic image for non-pupil HNED

This section delivers a paraxial algorithm similar to that of Sec. 3, except that it considers hologram reconstruction in the non-pupil HNED. For this display, the direct reconstruction is analyzed, and then a method for correcting the axial position of the reconstruction is developed. Similarly as in Sec. 3, an example of hologram manipulation is included, and its PS diagram is shown in Fig. 5. The chosen example corrects the reconstruction distance in the non-pupil HNED to zrd = 1 m for an input CGH hologram of zo = 1 m. In the non-pupil HNED, the input recorded hologram is located at the eyebox, while the hologram of the display is placed at the distance zH from the eyebox (see Fig. 2(b)). Therefore, the input hologram must be numerically modified before loading it into the non-pupil HNED. For this purpose, two elementary transformations are needed: first, the hologram is propagated by a distance zH, and second, the spherical wave of a lens with focal length zH is added to the propagated signal. The effect of using each manipulation independently is shown in Table 1 (first and third columns, respectively). The hologram modified by both linear transformations has the PS distribution

$${f_{xo}} ={-} \lambda _o^{ - 1}({c_o^{\prime} + {c_H}} )\left( {x - \frac{{c_o^{\prime}}}{{c_o^{\prime} + {c_H}}}{x_o}} \right),$$
where co′ = 1/(zo − zH) and cH = zH−1. The applied propagation [33] changes the pixel pitch of the hologram to λozH/(ΔcNxc), where Nxc is the number of pixels of the hologram. The PS diagram of the propagated hologram, calculated with Eq. (11), is presented in Fig. 5(a). The bandwidth of the propagated hologram is marked with the yellow dash-dotted lines, and the blue line presents the PS distribution of the reconstructed wave. Then, when such a hologram of λo = 637 nm and Nxc = 4160 is directly reconstructed in the non-pupil HNED with wavelength λr = 635 nm, first, the magnification mnp = ΔrnpΔcNxc/(λozH) = 3.66 of the non-pupil configuration is introduced. Second, the direct reconstruction adds the phase of a spherical wave of focal length zH, which is the reference wave of the non-pupil HNED [8]. As a result, the reconstructed hologram can be described by the PS distribution
$${f_{xr}} ={-} \lambda _r^{ - 1}{c_r}({x - {m_r}{x_o}} ),$$
where cr = mnp−2λrλo−1(co′ + cH) − cH is the curvature of the reconstructed spherical wave and mr = mnpco′(co′ + cH − cHλoλr−1mnp2)−1 is the transverse magnification. This equation determines the position of the reconstructed point Pr(mrxo, cr−1). In Fig. 5(b), the PS diagram of the reconstructed hologram is presented with an orange line. It can be seen that the reconstructed wave of the magnified hologram forms the image behind the eye of the viewer. It should be noted that the axial position of the reconstruction is determined relative to the hologram plane of the non-pupil HNED. For the configuration of the investigated display, the direct hologram reconstruction is obtained at the distance zr = −676 mm. This means that the reconstruction is obtained 132 mm behind the eyebox. As with the pupil HNED, the solution ensuring correct axial localization of the holographic image is to apply a parabolic defocusing wave of PS distribution fd = −cdλo−1x to the input hologram. When multiplying the input hologram by exp(iπx2cd/λo), the resulting hologram reconstruction is obtained at Prd(mrdxo, crd−1) and can also be defined by the PS distribution described by Eq. (12) with coefficients
$${c_{rd}} = \frac{{{\lambda _r}c_H^2}}{{m_{np}^2{\lambda _o}({{c_H} - {c_o} + {c_d}} )}} - {c_H}$$
and
$${m_{rd}} = {m_{np}}\frac{{{c_H}{c_o}}}{{{c_H} - {c_o} + {c_d}}} \times {\left( {\frac{{c_H^2}}{{{c_H} - {c_o} + {c_d}}} - \frac{{{c_H}m_{np}^2{\lambda_o}}}{{{\lambda_r}}}} \right)^{ - 1}}.$$

Similarly to the pupil setup, in the non-pupil configuration there is a limit on the depth of the recorded object. The condition can be derived from Eq. (13) and is given as co > cH + cd − λrcH/(mnp2λo). The relationship for crd can be transformed into an equation for the defocus curvature needed to achieve the desired reconstruction distance, which is given as:

$${c_d} = \frac{{{c_H}{c_o} - K{c_H} + K{c_o}}}{{{c_H} + K}},$$
where K = mnp2λoλr−1(cr + cH) − cH. Using Eq. (15), the position of the reconstruction can be freely modified. For example, to obtain the object reconstruction at the same distance as it was encoded in the input hologram (zrd = zo = 1 m), a spherical wave with zd = −1.285 m should be used. The PS diagram of the reconstructed hologram after adding the defocusing wave is presented in Fig. 5(c) with the green line, where it is focused at xrd = 3 mm. The non-pupil configuration also shares the SBP-preserving property of the pupil HNED. Thus, as discussed in Sec. 3, the transverse object magnification technique can be applied, giving freedom in object scaling and positioning.
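Equations (13) and (15) are mutual inverses, which can be checked numerically. In the sketch below the value zH = 0.5 m is an assumed placeholder (in the real setup zH follows from the lens equation), and the function names are ours:

```python
def defocus_curvature_nonpupil(c_o, c_H, c_r, m_np, lam_o, lam_r):
    """Eq. (15): defocus curvature for a target reconstruction curvature c_r."""
    K = m_np**2 * lam_o / lam_r * (c_r + c_H) - c_H
    return (c_H * c_o - K * c_H + K * c_o) / (c_H + K)

def reconstruction_curvature(c_o, c_H, c_d, m_np, lam_o, lam_r):
    """Eq. (13): reconstruction curvature produced by a defocus curvature c_d."""
    return lam_r * c_H**2 / (m_np**2 * lam_o * (c_H - c_o + c_d)) - c_H

lam_o, lam_r, m_np = 637e-9, 635e-9, 3.66
c_o, c_H = 1.0, 1.0 / 0.5   # zo = 1 m; zH = 0.5 m (assumed placeholder)
c_target = 1.0              # desired reconstruction distance zrd = 1 m

c_d = defocus_curvature_nonpupil(c_o, c_H, c_target, m_np, lam_o, lam_r)
c_back = reconstruction_curvature(c_o, c_H, c_d, m_np, lam_o, lam_r)
```

Substituting the defocus curvature from Eq. (15) back into Eq. (13) must return exactly the target curvature, independently of the assumed display parameters.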


Fig. 5. PS diagrams in hologram plane x of parabolic waves of: (a) point hologram without magnification, (b) point hologram in non-pupil HNED, (c) point hologram in non-pupil HNED with corrected axial position.


The procedure of hologram manipulation for non-pupil configuration, including transverse scaling and shifting of an object, is presented in Fig. 6.


Fig. 6. Diagram of manipulation method process for non-pupil HNED.


5. Errors of reconstruction of geometrically manipulated holograms

Sections 3 and 4 examine the imaging properties of the two HNED configurations in which paraxial holograms are reconstructed. It has been shown that reconstructing such holograms without any modifications in both systems results in images at distances at which they cannot be seen. Therefore, Secs. 3 and 4 present a simple and effective paraxial tool for correcting these reconstruction distances. In this section, the non-paraxial framework introduced in Sec. 2.4 is applied to study the effects of image manipulations in both HNED geometries. The performed analysis calculates the 3D locations and aberrations of reconstructed paraxial holograms in wide-angle HNEDs when the paraxial image manipulation tools are applied.

5.1 Analysis for pupil HNED

To analyze reconstruction in the pupil HNED, the non-paraxial imaging analysis has to include the numerical defocus since its necessity was shown in Sec. 3. Thus, to estimate the coordinates of the defocus-corrected reconstruction in the pupil HNED, we first find the effect of the applied parabolic defocus wave with phase kocdx2/2 on the input CGH hologram of point Po. Applying the defocus correction method to the input hologram gives a reconstruction at coordinates zod = cod−1 = (co – cd)−1 and xod= xococod−1. Introducing these coordinates, the single-point hologram reconstructed in the pupil display is given as

$${u_p}(x) = exp \left\{ { - \frac{{i{k_r}{\lambda_r}R[{x - {m_p}{x_{od}},{m_p}{z_{od}}} ]}}{{{\lambda_o}{m_p}}}} \right\},$$
where parameter cd of the defocusing wave is determined from Eq. (10) to obtain the desired reconstruction distance zrd. Note that the distance zrd is correct only for paraxial angles. For larger angles, the coordinates of the reconstructed point should be found using Eq. (7), with the hologram's local spatial frequency and curvature described by Eq. (16). Notably, the applied defocus-correction method does not modify the local spatial frequency at x = 0; therefore, only the local spatial curvature needs to be determined. Consequently, the coordinates of the reconstructed hologram are given by Eq. (7), where the spatial curvature at x = 0 is
$${c_{r0}} ={-} \frac{{{\lambda _r}{c_{od}}}}{{{\lambda _o}m_p^2R{{[{{x_o}{c_o},1} ]}^3}}}.$$
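Equation (17) can be cross-checked numerically against the hologram model of Eq. (16). The sketch below is a minimal illustration under stated assumptions: R[x, z] = (x2 + z2)1/2, λo = λr, an illustrative pupil magnification mp = 3.45 taken as the pixel-pitch ratio (an assumption, not stated in this section), and Rr0 = 1/|cr0| in Eq. (7). It compares the analytic curvature cr0 with a finite-difference estimate from the hologram phase and then evaluates the reconstruction point:

```python
import numpy as np

def R(x, z):
    # Assumed shorthand from the paper: R[x, z] = sqrt(x^2 + z^2)
    return np.hypot(x, z)

lam_o = lam_r = 640e-9      # capture and replay wavelengths [m]
m_p = 3.45                  # assumed pupil magnification (pixel-pitch ratio)
z_o = z_rd = 0.75           # object and desired reconstruction distances [m]
x_o = 3e-3                  # transverse coordinate of the point object [m]

c_o = 1.0 / z_o
c_od = m_p**2 * (lam_o / lam_r) / z_rd        # c_od = c_o - c_d, from Eq. (10)
z_od, x_od = 1.0 / c_od, x_o * c_o / c_od     # defocus-corrected coordinates

# Analytic local curvature at x = 0, Eq. (17)
c_r0 = -lam_r * c_od / (lam_o * m_p**2 * R(x_o * c_o, 1.0)**3)

# Numerical cross-check: differentiate the phase of the hologram in Eq. (16)
x = np.linspace(-1e-3, 1e-3, 201)
k_r = 2 * np.pi / lam_r
phase = -k_r * lam_r * R(x - m_p * x_od, m_p * z_od) / (lam_o * m_p)
f = np.gradient(phase, x) / (2 * np.pi)       # local spatial frequency f_r(x)
c_num = lam_r * np.gradient(f, x)[x.size // 2]

# Reconstruction coordinates via Eq. (7), assuming R_r0 = 1/|c_r0|
f_r0 = f[x.size // 2]
R_r0 = 1.0 / abs(c_r0)
x_r = R_r0 * lam_r * f_r0
z_r = R_r0 * np.sqrt(1.0 - (lam_r * f_r0)**2)
print(c_r0, c_num, x_r, z_r)
```

For the on-axis-dominated paraxial geometry chosen here, the reconstruction distance zr returned by Eq. (7) lands at the requested zrd, which is the behavior the text describes.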
With the developed non-paraxial analysis, the geometric distortions and aberrations of the hologram reconstruction are studied numerically. The input CGH, of size Nxc = 4096 pixels, encodes a set of 21 points evenly distributed over the entire image space along the x-axis at the coordinate zo = 0.75 m. Two cases of object defocusing are considered: first, the initial axial position of the points is preserved (zrd = zo); second, the points are moved to the distance zrd = 2zo = 1.5 m. The capture and reconstruction wavelengths are the same, λo = λr = 640 nm, and the pixel pitches are Δc = 3.45 µm and Δrp = 1 µm. For these parameters, the FOV of the analyzed display is 39°. This value was chosen because it corresponds to the angular size of the reindeer object reconstructed in the experimental section. Figure 7(a) and (d) show the wave aberrations in the hologram plane introduced by the defocusing wave for four different transverse coordinates of the point objects.


Fig. 7. Analysis for pupil HNED for a point source object with initial distance zo = 750 mm. Hologram defocusing wavefront error for reconstruction distance (a) zrd = 750 mm and (d) zrd = 1500 mm. The non-paraxial reconstruction positions of the points for (b) zrd = 750 mm and (e) zrd = 1500 mm, with their magnification nonuniformity in (c) and (f), respectively.


The obtained aberrations are negligibly small; their maximum absolute value does not exceed λ/20 in both cases, and they are smaller for zrd = 2zo. However, the geometries of the reconstructions [xr, zr] are noticeably altered, as shown in Fig. 7(b) and (e) by the blue points indicating the locations of the reconstructed points. The light blue area with a dotted outline corresponds to the local depth of field [34]. As can be seen, the reconstructed points lie on a curved surface instead of a plane. Notably, this deformation will not blur the image during visual perception, and the effect will be imperceptible: due to the foveated nature of human vision, the eye can focus on only one part of the image at a time, so it accommodates the deformation as the gaze shifts to a different part of the reconstruction. Also, the magnification of the reconstructed image is non-uniform, as shown in Fig. 7(c) and (f), but only slightly, indicating that some image distortion appears. However, its magnitude is low enough not to change the visual perception of the reconstruction significantly.

5.2 Analysis for non-pupil HNED

As in Section 5.1, to analyze the image reconstruction in the non-pupil HNED, we first apply the defocus-correction method to the input hologram. This changes the coordinates of the reconstruction to zod = cod−1 = (co − cd)−1 − zH and xod = xoco(co − cd)−1. Notably, these coordinates are given relative to the hologram plane of the display, which is axially separated from the eyebox plane by the distance zH, as shown in Fig. 2(b). After applying these geometrically modified coordinates, the distribution of the reconstructed hologram in the non-pupil display can be written as

$${u_{np}}(x) = exp \left\{ { - \frac{{i{k_o}({R[{x - {m_{np}}{x_{od}},{m_{np}}{z_{od}}} ]+ R[{x,{m_{np}}{z_H}} ]} )}}{{{m_{np}}}} + i{k_r}R[{x,{z_H}} ]} \right\}.$$

To estimate the coordinates of the reconstructed point in the non-pupil HNED, the hologram's local spatial frequency and curvature, given by Eq. (18) at the coordinate x = 0, need to be evaluated. The local spatial frequency fr0 and curvature cr0 are then given as follows:

$${f_{r0}} = \frac{{{m_{np}}{x_{oH}}}}{{{\lambda _r}R[{{m_{np}}{x_{oH}},{z_H}} ]}},$$
$${c_{r0}} = \frac{{{\lambda _r}({ - {c_o} - {c_H} + {c_d}} )}}{{{\lambda _o}m_{np}^2R{{[{{x_o}{c_o},1} ]}^3}}} - \frac{{{c_H}}}{{R{{[{{m_{np}}{x_o}{c_o},1} ]}^3}}},$$
where xoH = xozH/zo. Consequently, the coordinates of the reconstruction can be calculated as
$$({{x_r},{z_r}} )= [{{R_{r0}}{\lambda_r}{f_{r0}} + m{x_{oH}},{R_{r0}}{{({1 - \lambda_r^2f_{r0}^2} )}^{1/2}}} ],$$
and Rr0 is defined by Eq. (7).
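Equations (19)-(21) can be evaluated directly. The sketch below is a hedged illustration under stated assumptions: R[x, z] = (x2 + z2)1/2, λo = λr, m = mnp in Eq. (21), Rr0 = 1/|cr0|, and an illustrative magnification mnp = 26 chosen as roughly the pixel-pitch ratio 89.72/3.45 (an assumption, not a value given in the text). With no defocus wave (cd = 0), the on-axis point lands close to the hologram plane zH rather than near zo, illustrating why the defocus correction of Sec. 4 is needed:

```python
import numpy as np

def R(x, z):
    # Assumed shorthand from the paper: R[x, z] = sqrt(x^2 + z^2)
    return np.hypot(x, z)

def nonpupil_point(x_o, z_o, z_H, m_np, c_d, lam_o, lam_r):
    """Reconstructed point (x_r, z_r) in the non-pupil HNED, Eqs. (19)-(21)."""
    c_o, c_H = 1.0 / z_o, 1.0 / z_H
    x_oH = x_o * z_H / z_o
    # Eq. (19): local spatial frequency at x = 0
    f_r0 = m_np * x_oH / (lam_r * R(m_np * x_oH, z_H))
    # Eq. (20): local curvature at x = 0
    c_r0 = (lam_r * (-c_o - c_H + c_d) / (lam_o * m_np**2 * R(x_o * c_o, 1.0)**3)
            - c_H / R(m_np * x_o * c_o, 1.0)**3)
    # Eq. (21), assuming R_r0 = 1/|c_r0| and m = m_np
    R_r0 = 1.0 / abs(c_r0)
    x_r = R_r0 * lam_r * f_r0 + m_np * x_oH
    z_r = R_r0 * np.sqrt(1.0 - (lam_r * f_r0)**2)
    return x_r, z_r

lam = 640e-9
z_o, z_H, m_np = 0.75, 0.544, 26.0   # m_np is an assumed illustrative value

x_r0, z_r0 = nonpupil_point(0.0, z_o, z_H, m_np, c_d=0.0, lam_o=lam, lam_r=lam)
x_r1, z_r1 = nonpupil_point(3e-3, z_o, z_H, m_np, c_d=0.0, lam_o=lam, lam_r=lam)
print(z_r0, z_r1)   # without defocus, both points sit near z_H, not near z_o
```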

In the case of the non-pupil HNED, an analysis of hologram manipulation similar to that performed in Section 5.1 for the pupil configuration is carried out. The investigated input CGH and the encoded object are the same, and the analysis uses a similar FOV for comparison purposes. The parameters of the analyzed non-pupil display are Δrnp = 89.72 µm and zH = 544 mm, giving an FOV of 37.9°. Figure 8(a) and (d) show the wave aberration generated by applying the defocused wave. Note that the calculated aberration does not apply to the entire hologram, since only a small region Ωh centered at xhc encodes information about the object point [8]. The presented wave aberrations do not exceed λ/100. As in the pupil HNED, and as illustrated in Fig. 8(c) and (f), the reconstructed image exhibits non-uniform magnification, so an unavoidable but relatively low image distortion appears. Additionally, the reconstructed object points lie on a curved surface and the distances between them vary, indicating that field curvature is present, as in the pupil HNED case. The positions of the reconstructed points are shown in Fig. 8(b) and (e) with blue dots, while the light blue area marks the depth of field. The same comment made for the pupil HNED applies here: the field curvature will not be perceived owing to the wide depth of field and the foveated nature of human vision, so the eye accommodates the deformation as the gaze changes.


Fig. 8. Analysis for non-pupil HNED for a point source object with initial distance zo = 750 mm. Hologram defocusing wavefront error for reconstruction distance (a) zrd = 750 mm and (d) zrd = 1500 mm. The non-paraxial reconstruction positions of the points for (b) zrd = 750 mm and (e) zrd = 1500 mm, with their magnification nonuniformity in (c) and (f), respectively.


5.3. Comparison of pupil and non-pupil HNEDs

A comparison of the investigated HNED configurations based on the performed analyses shows that, for the first case of correcting the reconstruction position (zrd = zo), the non-pupil display exhibits less deformation: smaller field curvature and more uniform magnification. The difference between the reconstruction distances of the central on-axis and corner off-axis points is 35 mm, while for the pupil HNED it is 104 mm. Interestingly, for the second case of zrd = 2zo, this difference grows significantly for the non-pupil setup, reaching 288 mm, while it increases to 208 mm for the pupil setup. For the non-pupil configuration, this larger deformation between the central and off-axis parts of the reconstruction is also visible in the magnification diagrams shown in Fig. 8(c) and (f), indicating increased distortion.

6. Experimental results

This section presents the optical reconstructions of narrow-angle digital holograms in the two investigated wide-angle HNEDs, whose schemes were presented in Section 2.2. The reconstructions were done in AR mode to emphasize the changes of the focus planes. Images of the reconstructed scenes were captured with a smartphone whose parameters were set to imitate human vision.

6.1. Optical reconstruction in pupil HNED

The application of the image manipulation method in a pupil HNED is demonstrated on the monochromatic SA digital hologram of a reindeer figurine of size 75 K × 2 K pixels [30]. In this hologram, the reindeer figurine of physical size 110 × 40 × 105 mm3 (deep × wide × high) was captured with a spherical reference wave whose point source was at a distance zo = 770 mm. When the manipulation method is applied to the digital hologram with the reconstruction distance set to zrd = zo, the height of the reconstructed object is 670 mm, according to the pupil display parameters from Section 2.2. However, the eyepiece diameter physically cuts the visible FOV, limiting the actual FOV of the display, so the entire reconstructed object cannot be seen. Thus, an additional digital transverse magnification is applied to decrease the size of the reconstructed reindeer to 524 × 190 × 500 mm3 (deep × wide × high) at a distance of zrd = 800 mm.

The prepared scene contains the reconstructed reindeer beside a real reindeer figurine and a bush. The front legs of the reindeer and the bush are located 800 mm from the observer. The full range of features developed in the manipulation method is presented in Visualization 1. There, the holographic reindeer travels from a distance of 4 m towards the observer up to 0.8 m, with a single stop at a distance of 2 m. Taking advantage of the large SA digital hologram, different perspectives are shown; these appear as object rotation in the horizontal direction at distances of 2 m and 0.8 m from the observer. The hologram reconstructions shown in Fig. 9, as well as those in the second experiment with the non-pupil HNED, are captured using time-multiplexing of several holograms to reduce speckle noise. Other techniques for speckle noise suppression in holographic displays, such as [35,36], can also be used. Figure 9 presents the preservation of object depth after the introduced manipulations, where the focus is set on (a) the front horns and (b) the back leg. Zooms of these elements, with one in focus and the other out of focus, are shown in Fig. 9(c)-(f). Also, no noticeable distortion can be seen, since the nonlinear magnification of the reconstructed object is around 10%, as shown in the investigation from Section 5.1 and Fig. 7(c)-(f).


Fig. 9. Reconstruction of digital hologram in wide-angle HNED setup in AR mode for camera focus set at (a) 500 mm and (b) 800 mm. Zooms of the front horns and back leg are shown in (c),(d) for a focus distance of 500 mm and in (e),(f) for a focus distance of 800 mm, respectively.


6.2. Optical reconstruction in non-pupil HNED

The second experiment shows the reconstruction of a color SA digital hologram of size 31 K × 30 K pixels [28] in the non-pupil HNED. The input digital hologram shows the lowiczanka doll of physical size 53 × 56 × 116 mm3 (deep × wide × high), captured at a distance of zo = 1060 mm. For visualization purposes, the display setup from Fig. 2(b) is modified into AR mode by placing a beam splitter in front of the eyebox. The reconstruction is displayed in a scene containing the lowiczanin doll (male), 127 mm high, at a distance of 750 mm within the FOV. In Fig. 10, reconstructions at two axial positions of the lowiczanka are shown, with the camera focused on the reconstructed object each time. In Fig. 10(a), the positions of the lowiczanka and lowiczanin dolls are the same (750 mm); as the photo shows, both objects are in focus. At this distance, the reconstructed doll image is 210 mm high, which is set as the constant physical size of this object. In the second scene, presented in Fig. 10(b), the lowiczanka doll is moved 1500 mm away from the eyebox (observer) and is digitally demagnified to simulate the sensation of the object moving away. The lowiczanin doll, at the same distance as in the first scene, is now out of focus. Figures 10(c)-(f) present zooms of the heads and hands of the reconstructed lowiczanka doll images. As can be seen in the zoomed parts of the reconstruction, some speckle noise remains, though it is minimized by the applied LED illumination. The main source of speckles is the laser of the DH capture setup; thus, this speckle noise can potentially be minimized at the recording stage using the approach presented in [37]. As for the pupil configuration, the reconstructed image in the non-pupil display also experiences a certain distortion due to nonlinear image magnification. Nonetheless, the relative change of image magnification is quite low (∼10%) and is not noticeable on the reconstructed object.


Fig. 10. Color reconstructions of digital holograms (a) at 750 mm, (b) 1500 mm. (c)-(f) Zooms of details.


7. Conclusions

In this work, we have shown that the direct reconstruction of narrow-angle holograms in wide-angle HNEDs significantly alters the reconstruction geometry, making it impossible to observe the image. To develop the hologram manipulation method, a framework was established that includes the set of affine transformations needed to describe capture/display differences. The required hologram manipulation tools were created using the PSA, which is novel owing to the wide angles considered. On this basis, a hologram manipulation method for HNEDs in pupil and non-pupil configurations was created. Our method allows the reconstruction distance to be changed freely, so paraxial holograms can be adapted to serve as a holographic content source in wide-angle HNEDs. The presented investigation and application tests were done for two HNED configurations, pupil and non-pupil, whose experimental setups have FOVs of 62° and 31.6°, respectively.

The proposed method of free manipulation of the image geometry consists of two independent operations: an axial image shift, and transverse shifting and scaling. For both operations and both display setups, paraxial approximations are employed. Accurate axial displacement of the object to a chosen distance zrd is achieved by adding a defocusing spherical wave whose parameters are found through the given formulas. Transverse movement and scaling of the object are achieved by a simple linear digital transformation in the Fourier domain.
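The Fourier-domain translation rests on the shift theorem: multiplying the hologram spectrum by a linear phase ramp translates the reconstructed field. A minimal 1-D sketch of this principle follows; it is illustrative only and does not reproduce the paper's full transformation chain, which also handles scaling and the capture/display parameter differences:

```python
import numpy as np

def fourier_shift(u, shift_px):
    """Translate a sampled complex field by shift_px pixels (sub-pixel
    shifts allowed) via a linear phase ramp in the Fourier domain."""
    fx = np.fft.fftfreq(u.size)                       # normalized frequencies
    return np.fft.ifft(np.fft.fft(u) * np.exp(-2j * np.pi * fx * shift_px))

u = np.exp(-np.linspace(-3, 3, 256) ** 2).astype(complex)  # toy object field
u_shifted = fourier_shift(u, 5.0)                          # move 5 pixels
```

For an integer shift this reproduces a circular np.roll exactly; scaling the reconstruction is obtained analogously by resampling (stretching) the frequency grid.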

Additionally, an accurate non-paraxial analysis of the wavefront errors introduced by the added spherical defocusing wave and the reconstruction geometry is provided, with the necessary formulas derived from the PSA framework. Our analysis shows that under these conditions, the wave errors introduce only a change of the focusing distance of single points, creating a certain field curvature. In the non-pupil display, the field curvature is smaller than in the pupil configuration. For both configurations, the field curvature decreases with farther object positions. However, there are no other significant aberrations; thus, reconstructions of single points are accurate across the entire FOV.

Lastly, the accuracy of the proposed manipulation method is confirmed by the performed optical reconstructions of recorded paraxial digital holograms in the pupil (monochromatic) and non-pupil (color) setups, presenting object axial displacement and digital scaling in the AR observation mode. With these results and the given analyses, we are convinced that the developed framework and DH manipulation methodology widen the available holographic content with real-object representations, enriching the attractiveness and usefulness of wide-angle HNEDs. Moreover, this study also applies to narrow-angle CGHs; thus, paraxial CGH methods can now be used in wide-angle displays. To our knowledge, this is the first work concerning the geometrical manipulation of wide-angle holographic content; hence, further studies are expected.

Funding

Narodowe Centrum Nauki (UMO-2018/31/B/ST7/02980); Politechnika Warszawska.

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are available from the authors upon reasonable request.

References

1. P. Cipresso, I. A. C. Giglioli, M. A. Raya, et al., “The past, present, and future of virtual and augmented reality research: A network and cluster analysis of the literature,” Front. Psychol. 9(11), 1 (2018). [CrossRef]  

2. M. Mekni and A. Lemieux, “Augmented Reality: Applications, Challenges and Future Trends,” Applied Computational Science 1, 1 (2014).

3. T. Zhan, K. Yin, J. Xiong, et al., “Augmented Reality and Virtual Reality Displays: Perspectives and Challenges,” iScience 23(8), 101397 (2020). [CrossRef]  

4. D. M. Hoffman, A. R. Girshick, K. Akeley, et al., “Vergence-accommodation conflicts hinder visual performance and cause visual fatigue,” J. Vis. 8(3), 33 (2008). [CrossRef]  

5. X. Duan, J. Liu, X. Shi, et al., “Full-color see-through near-eye holographic display with 80° field of view and an expanded eye-box,” Opt. Express 28(21), 31316 (2020). [CrossRef]  

6. A. Maimone, A. Georgiou, and J. S. Kollin, “Holographic near-eye displays for virtual and augmented reality,” ACM Trans. Graph. 36(4), 1–16 (2017). [CrossRef]  

7. T. Kozacki, M. Chlipala, and P. L. Makowski, “Color Fourier orthoscopic holography with laser capture and an LED display,” Opt. Express 26(9), 12144 (2018). [CrossRef]  

8. T. Kozacki, M. Chlipala, J. Martinez-Carranza, et al., “LED near-eye holographic display with a large non-paraxial hologram generation,” Opt. Express 30(24), 43551 (2022). [CrossRef]  

9. T. Kozacki, J. Martinez-Carranza, R. Kukołowicz, et al., “Fourier horizontal parallax only computer and digital holography of large size,” Opt. Express 29(12), 18173 (2021). [CrossRef]  

10. K. Matsushima and N. Sonobe, “Full-color digitized holography for large-scale holographic 3D imaging of physical and nonphysical objects,” Appl. Opt. 57(1), A150 (2018). [CrossRef]  

11. J. H. Park, “Recent progress in computer-generated holography for three-dimensional scenes,” J. Inf. Disp. 18(1), 1–12 (2017). [CrossRef]  

12. E. Sahin, E. Stoykova, J. Mäkinen, et al., “Computer-Generated Holograms for 3D Imaging: A Survey,” ACM Comput. Surv. 53(2), 1–35 (2021). [CrossRef]  

13. P. W. M. Tsang, T.-C. Poon, and Y. M. Wu, “Review of fast methods for point-based computer-generated holography [Invited],” Photonics Res. 6(9), 837 (2018). [CrossRef]  

14. Y. Zhang, H. Fan, F. Wang, et al., “Polygon-based computer-generated holography: a review of fundamentals and recent progress [Invited],” Appl. Opt. 61(5), B363 (2022). [CrossRef]  

15. H. Zhang, L. Cao, and G. Jin, “Computer-generated hologram with occlusion effect using layer-based processing,” Appl. Opt. 56(13), F138 (2017). [CrossRef]  

16. N. Okada, T. Shimobaba, Y. Ichihashi, et al., “Fast calculation of a computer-generated hologram for RGB and depth images using a Wavefront recording plane method,” Photonics Lett Pol 6(3), 1 (2014). [CrossRef]  

17. D. Blinder, T. Nishitsuji, and P. Schelkens, “Real-Time Computation of 3D Wireframes in Computer-Generated Holography,” IEEE Trans. on Image Process. 30, 9418–9428 (2021). [CrossRef]  

18. Y. Wang, X. Sang, Z. Chen, et al., “Real-time photorealistic computer-generated holograms based on backward ray tracing and wavefront recording planes,” Opt. Commun. 429, 12–17 (2018). [CrossRef]  

19. J. Martinez-Carranza, T. Kozacki, R. Kukołowicz, et al., “Occlusion culling for wide-angle computer-generated holograms using phase added stereogram technique,” Photonics 8(8), 298 (2021). [CrossRef]  

20. J. Dong, B.-R. Yang, and Z. Qin, “Fast shadow casting algorithm in analytical polygon-based computer-generated holography,” Opt. Express 31(9), 14821 (2023). [CrossRef]  

21. A. Symeonidou, D. Blinder, and P. Schelkens, “Colour computer-generated holography for point clouds utilizing the Phong illumination model,” Opt. Express 26(8), 10282 (2018). [CrossRef]  

22. D. Blinder, T. Birnbaum, T. Ito, et al., “The state-of-the-art in computer generated holography for 3D display,” Light: Advanced Manufacturing 3(3), 1 (2022). [CrossRef]  

23. B. M. Hennelly, D. P. Kelly, D. S. Monaghan, et al., “Digital holographic capture and optoelectronic reconstruction for 3D displays,” International Journal of Digital Multimedia Broadcasting 2010, 1–14 (2010). [CrossRef]  

24. P. Memmolo, V. Bianco, M. Paturzo, et al., “Numerical Manipulation of Digital Holograms for 3-D Imaging and Display: An Overview,” Proc. IEEE 105(5), 892–905 (2017). [CrossRef]  

25. P. Memmolo, A. Finizio, M. Paturzo, et al., “Multi-wavelengths digital holography: reconstruction, synthesis and display of holograms using adaptive transformation,” Opt. Lett. 37(9), 1445 (2012). [CrossRef]  

26. P. Memmolo, M. Leo, C. Distante, et al., “Coding Color Three-Dimensional Scenes and Joining Different Objects by Adaptive Transformations in Digital Holography,” J. Display Technol. 11(10), 854–860 (2015). [CrossRef]  

27. M. Paturzo, P. Memmolo, A. Finizio, et al., “Synthesis and display of dynamic holographic 3D scenes with real-world objects,” Opt. Express 18(9), 8806–8815 (2010). [CrossRef]  

28. P. L. Makowski, W. Zaperty, and T. Kozacki, “Digital hologram transformations for RGB color holographic display with independent image magnification and translation in 3D,” Appl. Opt. 57(1), A76 (2018). [CrossRef]  

29. A. Gołoś, W. Zaperty, G. Finke, et al., “Fourier RGB synthetic aperture color holographic capture for wide angle holographic display,” in Optics and Photonics for Information Processing X, 9970 (2016), p. 99701E.

30. T. Kozacki, J. Martinez-Carranza, R. Kukolowicz, et al., “Accurate reconstruction of horizontal parallax-only holograms by angular spectrum and efficient zero-padding,” Appl. Opt. 59(27), 8450 (2020). [CrossRef]  

31. T. Kozacki, W. Finke, J.-M. Carranza, et al., “Numerical reconstruction of large HPO Fourier holograms,” in Proceedings of SPIE - The International Society for Optical Engineering (2020), 11353.

32. T. Kozacki, “Numerical errors of diffraction computing using plane wave spectrum decomposition,” Opt. Commun. 281(17), 4219–4223 (2008). [CrossRef]  

33. J. W. Goodman, Introduction to Fourier Optics (Roberts and Company Publishers, 2005).

34. R. W. Meier, “Depth of Focus and Depth of Field in Holography,” J. Opt. Soc. Am. 55(12), 1693 (1965). [CrossRef]  

35. Z.-S. Li, Y.-W. Zheng, Y.-L. Li, et al., “Method of color holographic display with speckle noise suppression,” Opt. Express 30(14), 25647 (2022). [CrossRef]  

36. P. Memmolo, V. Bianco, M. Paturzo, et al., “Encoding multiple holograms for speckle-noise reduction in optical display,” Opt. Express 22(21), 25768 (2014). [CrossRef]  

37. V. Bianco, P. Memmolo, M. Paturzo, et al., “Quasi noise-free digital holography,” Light: Sci. Appl. 5(9), e16142 (2016). [CrossRef]  

Supplementary Material (1)

Visualization 1: Presentation of the capabilities of the developed hologram manipulation method using a paraxial digital hologram in a wide-angle holographic near-eye display.


