
On resolution and viewing of holographic image generated by 3D holographic display

Open Access

Abstract

In this paper a Wigner Distribution (WD) representation analysis of a holographic display is presented. The display reconstructs a holographic image by means of a Spatial Light Modulator (SLM). Two major aspects are covered: imaging and viewing. Optically reconstructed images are characterized by low and spatially variant resolution. Utilizing the WD representation we present a simple formula for resolution as a function of both coordinates: transverse and longitudinal. The analysis of the aliasing effect allows for a meaningful extension of the field of view. All theoretical results are verified experimentally. The WD representation of the angularly and spatially limited holographic image is extended to cover its visual perception as well. Angular resolution and field of view are theoretically examined. Both monocular and binocular perception are studied and illustrated experimentally.

©2010 Optical Society of America

1. Introduction

Human perception of the natural world is three dimensional. It is no wonder that people want to bring the third dimension into entertainment such as TV and please their eyes with 3D images. One of the techniques that allows 3D images to be viewed is holography [1]. The holographic technique works with a complex optical wave, such as that generated in the real world, and therefore correctly reconstructs 3D object wavefronts [2]. Recently, liquid crystal based Spatial Light Modulators (SLM) have been recognized as the most feasible devices for designing holographic displays [3–5]. Due to rapid progress in liquid crystal display technology, SLM devices provide high accuracy in optical wave reproduction [6].

In holographic displays, the extent of angular view-ability remains one of the most important factors affecting the experience of observing optically reconstructed holographic images. It is limited by the finite pixel pitch of the particular SLM device used in the holographic display implementation. To overcome this problem, several attempts have been made to enhance this feature by employing innovative optical holographic display modules with several tiled SLMs [7, 8]. Recently, work on monocular focusing of objects of variable depth was reported [9]. However, little attention has been given to the visual perception of an angularly and spatially limited holographic image. Therefore in this paper we present an analysis of the real holographic image limitation in the joint space-spatial frequency domain (Wigner domain) [10]. Such an analysis gives both spatial and angular information about the generated holographic image. The results are then used to study the visual perception of the holographic image. This analysis is performed in Wigner space as well.

In the paper we analyze the SLM based holographic display presented schematically in Fig. 1, where a reflective phase-only liquid crystal on silicon SLM (model HEO 1080P) is illuminated by a linearly polarized plane wave. Thus, in a phase SLM display, amplitude information is disregarded. However, the image of a scattering object is well approximated by a phase-only distribution at a plane distant from the object [11]. The real holographic image is formed at a distance z from the phase SLM. This image is viewed from a distance zo. The SLM, due to its low space bandwidth product [12] (pixel pitch 8 µm, HD resolution), produces an on-axis, low resolution field only. The low resolution of the reconstructed holographic image presents a problem when considering human visual perception, which is why we present a study of the limited holographic image resolution and its effect on human perception.

Fig. 1 The experimental setup for observation of holographic image generated by SLM based holographic display.

In a holographic display we reconstruct either synthetic or digital holograms. Certainly, imaging of real world objects captured in digital holographic systems is more attractive. In order to reconstruct a captured digital hologram directly on a phase-only modulator, we convert the intensity fringes into phase ones and introduce them into the SLM. In this way we obtain an optical reconstruction of the digital hologram with a strong zero order. It is therefore better to first filter the complex object beam from the captured hologram using numerical techniques [13] or phase shifting digital holography [14]. Then, in the reconstruction, we use the phase of the object beam only. However, the lack of modulation of both amplitude and phase is problematic. It limits the class of imaged objects to scattering ones. Moreover, the phase SLMs on the market are characterized by lower than 100% diffraction efficiency. This disturbs the reconstructed image.
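For illustration only, the sketch below (ours, not from the paper; the 256-level quantization is an assumed SLM parameter) shows how a complex object beam recovered from a digital hologram could be converted into such a phase-only frame, with the amplitude information discarded as described above.

```python
import numpy as np

def phase_only_pattern(u_obj, levels=256):
    """Convert a recovered complex object beam into a phase-only SLM frame.

    u_obj  : 2-D complex array (object beam filtered from a digital hologram)
    levels : number of addressable phase levels of the SLM (assumed value)
    The amplitude of u_obj is discarded; only its wrapped phase is kept.
    """
    phase = np.mod(np.angle(u_obj), 2 * np.pi)               # phase in [0, 2*pi)
    return np.round(phase / (2 * np.pi) * levels) % levels   # integer SLM levels

# example with a synthetic, scatterer-like object field (HD frame, unit amplitude)
rng = np.random.default_rng(0)
u = np.exp(1j * 2 * np.pi * rng.random((1080, 1920)))
slm_frame = phase_only_pattern(u)
```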

In the experimental setup an asymmetric diffuser is placed at the longitudinal center of the real object reconstruction space. The diffuser scatters light in one direction only (Y). It therefore makes observation of the holographic image more convenient, i.e., the viewing eye does not have to be positioned precisely in both transverse coordinates. Every image point generates a pencil beam, leaving the X perspective of the holographic image unaffected.

In the paper we present a theoretical analysis of the resolution of an image generated by a planar SLM display. The considerations are given in one dimension only. However, we study the holographic image as an assembly of point images, and it was proven [15] that the distribution of a point image generated by a spherical beam apodized with a rectangular aperture is a product of two independent (X and Y) functions. Therefore the presented analysis is a general one. On the basis of this analysis we study the visual perception of the holographic image.

2. Spatial evolution of resolution of holographic image

2.1 Theory

In a hologram reconstruction experiment the phase SLM scatters the illuminating plane wave, and the real holographic image is formed at a distance z (Fig. 1). According to classical sampling theory [16] the holographic signal leaving the SLM (z = 0) is frequency and spatially limited, i.e., Bf = Δ−1, Bx = NΔ, where Δ is the pixel pitch and N is the number of pixels. The discussion here and in what follows is limited to paraxial optics (Fresnel approximation), where the transverse distributions (Fig. 1) of the holographic signal u(x,z) at two parallel planes are related by

u(x,z) = \frac{\exp(ikz)}{i\lambda z}\int u(x_s,0)\,\exp\!\left(\frac{ik(x-x_s)^2}{2z}\right)dx_s, \qquad (1)
In the Fresnel domain the WD of the signal u(x,z) undergoes a linear transformation [17]

W_{u(z=d)}(x,f) = W_{u(z=0)}(x - \lambda d f,\; f). \qquad (2)

The WD confinement of a holographic signal for three distances z is presented in Fig. 2. The solid line represents the WD of the signal leaving the SLM. We assume that the SLM generates a signal limited in space and frequency. Higher diffraction orders are not treated, since they do not contribute to the generation of the holographic image.

Fig. 2 The evolution of WD of holographic signal during propagation.

In Fig. 2 we present the imaging process of an axial spherical beam at the minimum distance z = dmin(axial) (dashed line). At this distance the full SLM aperture participates in the generation of the on-axis point image. The inclined line represents the spherical beam at the SLM plane, while the line perpendicular to the x axis represents the point image at the reconstruction plane. In the Wigner chart, free space propagation over the distance dmin(axial) corresponds to the shift of the marginal space-frequency point to the on-axis location, i.e., W(Bx/2, Bf/2) → W(0, Bf/2). This gives the minimum hologram reconstruction distance

d_{\min}(\mathrm{axial}) = B_x \Delta \lambda^{-1}. \qquad (3)
For an off-axis point at position x0, applying the full aperture criterion, the minimal distance is

d_{\min}(\mathrm{off\text{-}axis}) = (B_x + 2x_0)\,\Delta\lambda^{-1}. \qquad (4)

At distances shorter than specified by this condition, one spherical beam generates multiple secondary point sources. This results in a quality loss of the holographic image. In Fig. 2 the dotted line presents the imaging of the point x = NΔ/2 for the minimum off-axis condition. It can be seen that with increasing propagation distance the resolution of the holographic image decreases. Moreover, the resolution is spatially variant. The spatial frequency span can be deduced from the WD chart. In Fig. 3 the generation of point images is presented for a distance larger than dmin (a) and for distances smaller than dmin (b, c).
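As a quick numerical illustration of Eqs. (3) and (4), the short sketch below (ours, assuming the display parameters quoted in the text: 8 µm pitch, 1920 pixels horizontally, λ = 533 nm) evaluates the minimum reconstruction distances; the axial value of about 230 mm is consistent with the 461 mm = 2dmin(axial) distance used later in Sec. 2.2.

```python
# Minimum reconstruction distances, Eqs. (3)-(4), for the parameters quoted
# in the text (HEO 1080P: 8 um pitch, HD resolution, lambda = 533 nm).
wavelength = 533e-9          # [m]
pitch      = 8e-6            # [m], pixel pitch (Delta)
N          = 1920            # horizontal pixel count
Bx         = N * pitch       # spatial extent of the SLM [m]

def d_min(x0=0.0):
    """Minimum hologram reconstruction distance for a point at x0."""
    return (Bx + 2 * abs(x0)) * pitch / wavelength

print(f"d_min(axial)    = {d_min() * 1e3:.1f} mm")       # ~230.5 mm
print(f"2*d_min(axial)  = {2 * d_min() * 1e3:.1f} mm")   # ~461 mm (Sec. 2.2)
print(f"d_min(x0=4 mm)  = {d_min(4e-3) * 1e3:.1f} mm")   # off-axis example
```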

Fig. 3 Generation of holographic point images: (a) for distance z > dmin, (b) for distance z < dmin, (c) for highly off-axis points.

In the first case (no aliasing) the frequency spectrum span of the point image is entirely determined by the size of the SLM. The inclined line sampled within the SLM bandwidth is converted into a line at the spatial position x0. Within the zero diffraction order a single spherical beam generates a single point image. In Sec. 4.1 we prove that the width of the Fourier spectrum of the point image is Bx/λz, with carrier frequency x0/λz. This result fully agrees with the graphical WD representation of the problem in Fig. 3a.

At distances shorter than dmin(axial), the spectrum width of the point image is determined by the pixel pitch (Bf). The aliased point images repeat at spatial intervals Bfλz. Both conclusions are proven in Sec. 4.2 and visualized in Fig. 3b. We have thus proven and justified the further use of the graphical WD analysis.

In off-axis regions the Fourier spectrum of the point image is delimited by both the spatial extent and the pixel pitch (Fig. 3c). In this case we apply both results given in Sec. 4. There are now two point images within the zero diffraction order area, i.e., the original image (at spatial position x0) and an aliased one. The images are separated by the distance Bfλz. The Fourier spectrum of both images can be deduced from the WD chart by analyzing the lengths of the corresponding lines.

The transverse spatial resolution of a point image generated with the SLM is a function of both coordinates: transverse (x0) and longitudinal (z). We summarize the above discussion with a formula for the intensity distribution of a point image generated by the SLM:

I(x,z) = \begin{cases}
\operatorname{sinc}^2\!\big(B_f\,(x-x_0)\big) & \text{for } |x_0| \le (B_x - B_f\lambda z)/2,\\[4pt]
\operatorname{sinc}^2\!\big(\tfrac{B_x}{\lambda z}(x-x_0)\big) & \text{for } |x_0| < (B_f\lambda z - B_x)/2,\\[4pt]
\operatorname{sinc}^2\!\big(\tfrac{B_x + B_f\lambda z - 2|x_0|}{2\lambda z}(x-x_0)\big) & \text{for } |B_f\lambda z - B_x|/2 < |x_0| < (B_x + B_f\lambda z)/2,\\[4pt]
0 & \text{otherwise.}
\end{cases} \qquad (5)

Equation (5) describes the spatial distribution of the original point image, without considering the aliased sources. In Fig. 4 the transverse resolution and field of view of the holographic image are visualized. The position of the first zero of the sinc(...) distribution of a point image is used as the measure of resolution. Within the blue triangle closest to the SLM, point images with the highest, constant resolution can be formed. However, at least two aliased sources are generated within this region as well. The 'no aliasing' region confined by the dmin line is characterized by a resolution that is constant in the transverse direction and drops with the distance z. The image points within the two green confined areas, upper and lower, are characterized by a resolution decreasing with the distance from the optical axis. Within this area the power and resolution of the aliased image decrease as the resolution of the primary point image increases. This is a positive effect which allows a part of this region to be used for holographic image generation. We therefore propose an extension of the field of view to the 'Holographic Imaging Area' (red dashed line). The allowed holographic image area then has the size Bfλz, where z > dmin(axial). The aliased point images are generated within the region outside the 'Holographic Imaging Area'.
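The sketch below (our own illustration, with the same assumed display parameters) evaluates the first-zero resolution criterion of Eq. (5) and the size of the proposed 'Holographic Imaging Area' Bfλz at z = 2dmin(axial); the resulting 30.72 mm area, twice the SLM width, matches the value used in the experiment of Sec. 2.2.

```python
import numpy as np

wavelength, pitch, N = 533e-9, 8e-6, 1920
Bx, Bf = N * pitch, 1.0 / pitch

def first_zero_width(x0, z):
    """Distance from a point image at x0 (reconstruction distance z) to the
    first zero of its sinc^2 profile, following the cases of Eq. (5)."""
    x0 = abs(x0)
    if x0 <= (Bx - Bf * wavelength * z) / 2:     # pitch-limited (blue triangle)
        return 1.0 / Bf
    if x0 < (Bf * wavelength * z - Bx) / 2:      # aperture-limited ('no aliasing')
        return wavelength * z / Bx
    if x0 < (Bx + Bf * wavelength * z) / 2:      # mixed, off-axis green regions
        return 2 * wavelength * z / (Bx + Bf * wavelength * z - 2 * x0)
    return np.inf                                # outside the imaging area

z = 2 * Bx * pitch / wavelength                  # 2*d_min(axial), ~461 mm
print(f"Holographic Imaging Area = {Bf * wavelength * z * 1e3:.2f} mm")  # 30.72 mm
print(f"on-axis resolution       = {first_zero_width(0, z) * 1e6:.1f} um")
print(f"resolution at x0 = 12 mm = {first_zero_width(12e-3, z) * 1e6:.1f} um")
```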

Fig. 4 Transverse resolution and field of view of holographic image.

2.2 Experiment

The theoretical discussion concluded with Eq. (5) is focused on the resolution of the holographic image and on its spatial evolution. Here we present the results of experiments aimed at verifying two aspects: the space dependent resolution and the extension of the field of view. Before performing the experiments the SLM phase modulation was calibrated. We characterized both the nonlinearity of the SLM phase response [18] and the wave aberration [19]. The calibration measurements were accomplished with two different interferometric techniques.

Then we generated and displayed spherical waves with the corrected phase SLM response in the setup presented in Fig. 5. The series of spherical waves generated point sources at various distances z. We captured the intensity distributions of these point sources and characterized the positions of the first zero. The results are plotted in Fig. 6. We obtained good agreement with the theoretical line. It has to be mentioned that the accurate calibration of the SLM is a crucial factor; without calibration the measured results were far from the theoretical expectations.
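A minimal sketch of how such a converging spherical wave can be encoded as a wrapped phase pattern on the SLM grid is given below; it is illustrative only and does not reproduce the calibration corrections [18,19], which would be added to this phase before display.

```python
import numpy as np

def spherical_wave_phase(z, x0=0.0, y0=0.0, wavelength=533e-9, pitch=8e-6,
                         shape=(1080, 1920)):
    """Wrapped phase of a spherical wave converging to (x0, y0) at distance z,
    sampled on the SLM pixel grid (HD frame assumed)."""
    ny, nx = shape
    x = (np.arange(nx) - nx / 2) * pitch
    y = (np.arange(ny) - ny / 2) * pitch
    X, Y = np.meshgrid(x, y)
    phase = -np.pi * ((X - x0) ** 2 + (Y - y0) ** 2) / (wavelength * z)
    return np.mod(phase, 2 * np.pi)

# point source reconstructed at roughly twice the minimum axial distance
pattern = spherical_wave_phase(z=0.461)
```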

Fig. 5 The experimental setup for capturing a real holographic image.

Fig. 6 The resolution of a real holographic image (position of the first zero of point image as a function of reconstruction distance).

To verify the theoretically predicted extension of the field of view we designed a computer generated hologram with the Gerchberg-Saxton algorithm [20]. The algorithm applies an accurate plane wave spectrum decomposition method [21], so no numerically induced aliasing is present in the hologram. The hologram target and the result of its optoelectronic reconstruction are shown in Figs. 7(a) and 7(b), respectively. The reconstruction was captured in the setup presented in Fig. 5. The image was captured with a long CCD exposure time, so the central reconstruction region is saturated and all aliased images are well visible. The hologram was designed for a reconstruction distance twice the minimum on-axis reconstruction distance, z = 2dmin(axial) = 461 [mm]. For this distance the 'no aliasing' area (Fig. 4) has the size of the SLM (15.36 [mm]). This region (letters EFGH) is enclosed by the two inner lines of the hologram reconstruction. The reconstructions from the target area (letters EFGH) are free of aliased images. The middle region (letters CD-EFGH-IJ, the 'Holographic Imaging Area') has twice the size of the SLM. The images of the letters CD-IJ produce aliased images. However, these aliased images appear outside of the 'Holographic Imaging Area' (Fig. 4) and are of small intensity. We computed the ratio of the intensities of the original images of the letters C and D and their aliased images; the values are 0.34 for letter C and 0.22 for letter D. The results justify the extension of the holographic imaging area to the region CD-EFGH-IJ. The images of the target parts outside of the 'Holographic Imaging Area' (letters AB-KL) produce aliased images close to the image center.

Fig. 7 The image of a generated target (a) and its image reconstruction (b) from computer generated hologram.

3. Viewing of holographic image

The above considerations are focused on the reconstruction of a real holographic image. We have seen that the image generated by the display system has a limited and spatially variant resolution. In this section we try to answer questions concerning the human perception (viewing) of a holographic image scene. This problem is crucial for the effective operation of holographic display systems. The analysis is performed by means of a graphical WD representation of the holographic imaging process. The accuracy of such an analysis is proven in Section 4.

3.1 Monocular viewing

To understand the viewing of a holographic image by an eye, we introduce single eye perception into the WD confinement of the holographic image. When viewing a holographic scene, we characterize the eye perception in the Wigner chart by the size (3-8 [mm]) and position of the eye aperture. This is an accurate approximation, since the resolution of the eye receptors is well matched to the effect of the eye aperture [22].

The eye aperture in the one dimensional Wigner chart is approximated such that it limits the optical field at the aperture plane spatially only (not in spatial frequency). Within one WD chart we plot the bound of the eye aperture and that of the holographic image. We represent both of them in one plane, namely at the holographic image plane. In order to obtain this representation we back propagate the WD of the eye aperture to the holographic image plane within the Fresnel approximation. This results in a slit of spatial width ϕo and tilt −λzo (Fig. 8). The parameters ϕo and zo represent the width of the eye aperture (its diameter) and the distance from the aperture to the holographic image plane (the observation distance). The analysis is one dimensional, since the asymmetric diffuser present in the system scatters in one direction only and leaves the X perspective unaffected (Fig. 1). Moreover, the WD of a circular aperture [23] would not permit a graphical WD study of the full imaging system.

Fig. 8 The WD representation of monocular visual perception of holographic image.

In Fig. 8 the single eye observation of the real on-axis holographic image is represented in the WD chart. The eye slit is frequency unbounded, since its frequency bounds are much higher than those of the holographic image. In Fig. 8 the WDs of the eye aperture and of the holographic image are drawn to scale (eye aperture 5.5 [mm], SLM size in the horizontal direction 15.36 [mm]). From the WD chart we can estimate the resolution of the viewed image and its field of view.

The observed image resolution is a function of the two distances z and zo. We can select z and zo such that either the eye or the SLM limits the resolution. We assume, however, that the eye limits the resolution, i.e., the image can contain details that will not be observed.

Thus we formulate the observation condition Bx/λz > ϕo/λzo, where ϕo/λzo is then the observed spatial resolution. This specifies the minimal observation distance between the viewer and the real holographic image:

z_o > \varphi_o B_x^{-1} z. \qquad (6)

In the experimental configuration (Fig. 1) ϕo/Bx = 5.5/15.36 = 0.36. Therefore the eye limits the resolution whenever the holographic image is viewed from a distance larger than 0.36z. Taking the minimum observation distance for a human as 250 [mm], we see that for reconstruction distances shorter than 694 [mm] the eye limits the perceived resolution.

Analyzing the WD chart in Fig. 8 we obtain the formula for the field of view for monocular observation:

FOV = \frac{B_x z_o + \varphi_o z}{z + z_o}. \qquad (7)
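The two results can be illustrated numerically (a sketch with values quoted in the text; the 8.2 mm pupil and the 500 mm and 693 mm distances are those of the camera experiment described in Sec. 3.2, and the code reproduces the 11 mm X-directional field of view quoted there):

```python
Bx  = 1920 * 8e-6          # SLM width, 15.36 mm
phi = 5.5e-3               # eye pupil diameter assumed in the text [m]
z   = 0.693                # reconstruction distance of the Sec. 3.2 experiment [m]

# Eq. (6): the eye limits the perceived resolution for observation distances
# larger than (phi / Bx) * z
print(f"eye-limited viewing for z_o > {phi / Bx * z * 1e3:.0f} mm")

# Eq. (7): monocular field of view for the camera experiment (8.2 mm pupil)
phi_cam, z_o = 8.2e-3, 0.5
fov = (Bx * z_o + phi_cam * z) / (z + z_o)
print(f"X-directional FOV = {fov * 1e3:.1f} mm")   # ~11 mm
```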

3.2 Binocular viewing

Monocular observation of a frequency limited holographic image results in a limited observation field of view. Now we turn our attention to binocular observation. We derive conditions for two observation cases: binocular observation and binocular stereoscopic observation. In binocular observation the two views do not have to coincide; the images observed by the individual eyes can be different. For binocular stereoscopic vision the common image regions are viewed with both eyes.

In Fig. 9 we present the WD chart analyzing binocular observation of a holographic image. The two hatched areas present the views obtained for the minimum binocular observation distance zob (do denotes the separation of the eyes):

Fig. 9 The WD representation of binocular visual perception of holographic image.

z_{ob} = (d_o + \varphi_o)\,B_f^{-1}\lambda^{-1} - z. \qquad (8)

This means that the viewing distance zo has to be larger than zob. We assume that the holographic reconstruction covers the entire available field of view and that the individual eyes see the marginal areas of the reconstruction; each eye sees the opposite marginal area of the holographic image. For smaller images the observation distance zob will be larger.

Let us now consider the case when we want to observe the same area with both eyes, i.e., stereoscopic binocular observation. This case is presented in Fig. 9 by the inclined hatched line (of slope λ−1zobs−1). In this case the projections onto the x axis of the WD areas of the seen image regions have common parts. Each eye sees the same region but different frequency information, and thus a different perspective. For the representation in Fig. 9 the observation distance has to exceed the minimum distance of stereoscopic observation:

z_{obs} = z\, d_o B_x^{-1}, \qquad \text{where } z > (B_x + 2x_o)\,\Delta\lambda^{-1}, \qquad (9)
and xo is the transverse coordinate of the observed point.

This condition is valid for the holographic imaging area denoted in Fig. 4 as 'no aliasing'. For the other imaging regions we have to consider the holographic image resolutions given by Eq. (5). In Fig. 10 we relate the above discussion to our experimental display. In the figure the lines limiting both binocular vision cases are plotted. We see that for larger hologram reconstruction distances the minimum distance of binocular observation becomes smaller. For z ≈ 0.8 [m] it drops to 0.250 [m], which is the minimal viewing distance for humans. The stereoscopic limit, however, grows rapidly with reconstruction distance; for a reconstruction distance close to 1 [m], zobs exceeds 4 [m]. Visual perception from that distance is of no use for such a small image.
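The limits plotted in Fig. 10 can be reproduced with the sketch below; the interocular distance do = 65 mm and pupil diameter 5.5 mm are our assumptions (the 65 mm value is consistent with the 976 mm figure quoted in the next paragraph).

```python
wavelength, pitch = 533e-9, 8e-6
Bx = 1920 * pitch
d_o, phi_o = 65e-3, 5.5e-3      # assumed interocular distance and eye pupil [m]

def z_ob(z):
    """Minimum binocular observation distance, Eq. (8)."""
    return (d_o + phi_o) * pitch / wavelength - z

def z_obs(z):
    """Minimum stereoscopic observation distance, Eq. (9)."""
    return z * d_o / Bx

for z in (0.5, 0.8, 1.0):
    print(f"z = {z:.1f} m : z_ob = {z_ob(z):.2f} m, z_obs = {z_obs(z):.2f} m")

# at z = d_min(axial) the stereoscopic limit reduces to d_o * pitch / wavelength
print(f"z_obs at d_min(axial) = {d_o * pitch / wavelength * 1e3:.0f} mm")  # ~976 mm
```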

Fig. 10 Binocular observation limitations: zob and zobs as a function of the hologram reconstruction distance z.

From the above discussion we see that the binocular stereoscopic vision limit is the most critical feature of an SLM based holographic display. The distance zobs can be decreased by extending Bx, i.e., in a display configuration with multiple SLMs arranged side by side in a plane. However, the validity condition (Eq. (4)) shows that the minimum distance of the holographic image dmin(axial) increases linearly with the extension of Bx. Therefore in this configuration we can view more distant holographic images with improved binocular perception. Consider, for example, on-axis binocular observation of a hologram generated for dmin(axial). In this case the binocular stereoscopic limit is independent of Bx and equal to doΔλ−1. For an 8 [µm] pixel pitch and a wavelength of 533 [nm] this distance is equal to 976 [mm]. Therefore we also have to decrease the pixel pitch, extending the angular field of view. This can be obtained in a circular display configuration.

To illustrate the above discussion, a synthetic hologram, generated with the Gerchberg-Saxton algorithm from a photo projection of a numerical model of a statue, was reconstructed in the setup shown in Fig. 1. The hologram is reconstructed at the distance z = 693 [mm] and was viewed through the asymmetric diffuser. The size of the image is 31 × 41 [mm] (width and height). To present the views of the image as an eye perceives them, we used a digital camera. The camera was placed in the reconstruction system at the eye position, at the observation distance zo = 500 [mm]. The entrance pupil diameter of the digital camera was set to 8.2 [mm]. This value is close to the human observation condition in a dark room (eye aperture diameter 8 [mm]). For the specified observation conditions the X-directional FOV is 11 [mm]. In Fig. 11 the reconstruction views taken with the digital camera are presented: (a) left, (b) central and (c) right. The width of each captured image is 11 [mm]; with three views we are able to assemble the entire image. Photos (a-c) were taken with the digital camera at the transverse positions xo = −19, 0, 19 [mm], respectively. For quality comparison we captured the real image directly with the CCD matrix in the setup shown in Fig. 5. The image is presented in Fig. 12.

Fig. 11 The views of holographic image captured with asymmetric diffuser and digital camera (entrance pupil diameter 8.2 [mm]): (a) left perspective, (b) central perspective, (c) right perspective (Media 1).

Fig. 12 Real image captured with CCD.

We were unable to view the image presented in Fig. 11 with two eyes. The image is too small in the X direction; it covers only a part of the available imaging space at this distance. Therefore we generated a wide test hologram giving a reconstructed image of width 46.08 [mm] at the same distance (z = 693 [mm]). Binocular observation was now possible. The observer viewing the image saw two narrow images of width 11 [mm], separated by 34.5 [mm]. The observer's eye separation was 60 [mm]. This gives a theoretical separation of the seen pencil images of 34.85 [mm], which agrees with the experimental value.
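For reference, this value presumably follows from projecting the eye separation through the observation geometry onto the image plane, i.e., do·z/(z + zo) = 60 × 693/(693 + 500) ≈ 34.85 [mm]; this relation is our reading of the geometry rather than a formula stated in the paper.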

In the central part of Figs. 11 and 12 there is a rectangle of increased intensity. This is the result of the reflected, non-diffracted part of the beam illuminating the SLM. The diffraction efficiency of our SLM is at the level of 80% of the reflected beam.

4. Imaging with SLM in Wigner Domain

In this Section we analyze the focusing of a spherical beam generated by the SLM. We show that the WD representation at the focus plane gives the confinement of the holographic image resolution. We consider the sampling of a spherical beam for two cases separately: z ≥ dmin and z < dmin. In the first case (Sec. 4.1) there is no aliasing and the line representing the sampled spherical beam in the WD chart is limited by the lines x = ±Bx/2; in the second case (Sec. 4.2) it is limited by the lines f = ±Bf/2.

4.1. Derivation of transverse distribution at focus for distances z ≥ dmin

Let us consider an ideally sampled version of a spherical beam

g(\xi) = \exp\!\left\{\frac{i\pi(\xi-\xi_0)^2}{\lambda z}\right\}\mathrm{comb}\!\left(\frac{\xi}{\Delta}\right)\Pi\!\left(\frac{\xi}{B_x}\right), \qquad (10)
where comb(...) is a train of Dirac delta functions with interval Δ. The function g at the focal plane of the spherical beam can be described in Wigner space as
W_g(x,f) = \int g\!\left(x - \lambda z f + \tfrac{x'}{2}\right) g^{*}\!\left(x - \lambda z f - \tfrac{x'}{2}\right) \exp\{-2\pi i f x'\}\, dx', \qquad (11)
which, after introducing Eq. (10) and the new variable f′ = f − x/λz − x′/(2λz), gives

W_g(x,f') = \int \mathrm{comb}\!\left(\frac{\lambda z f'}{\Delta}\right)\Pi\!\left(\frac{\lambda z f'}{B_x}\right)\mathrm{comb}\!\left(\frac{\lambda z}{\Delta}\!\left(f' + \frac{x'}{\lambda z}\right)\right)\Pi\!\left(\frac{\lambda z}{B_x}\!\left(f' + \frac{x'}{\lambda z}\right)\right)\exp\!\left\{-\frac{2\pi i x'(x - x_0)}{\lambda z}\right\} dx'. \qquad (12)

Integration over x′ and then over f′ gives the intensity distribution at the focal plane of the sampled spherical beam:

|g(x)|^2 = \mathrm{comb}\!\left(\frac{\Delta (x - x_0)}{\lambda z}\right) \otimes \operatorname{sinc}^2\!\left(\frac{B_x (x - x_0)}{\lambda z}\right), \qquad (13)
where ⊗ indicates convolution.
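Equation (13) can be cross-checked numerically with the sketch below (ours, using the assumed display parameters): a directly summed Fresnel integral of the sampled converging beam gives a focal intensity whose first zero falls at λz/Bx, as the sinc² term predicts; at this distance the comb replicas lie far outside the plotted window.

```python
import numpy as np

wavelength, pitch, N = 533e-9, 8e-6, 1920
Bx = N * pitch
z  = 0.461                                           # ~2*d_min(axial)

xi = (np.arange(N) - N / 2 + 0.5) * pitch            # SLM sample positions
u  = np.exp(-1j * np.pi * xi**2 / (wavelength * z))  # sampled converging beam, x0 = 0

x = np.linspace(-40e-6, 40e-6, 1601)                 # observation points at the focus
# direct (discrete) Fresnel sum over the SLM samples
g = np.exp(1j * np.pi * (x[:, None] - xi[None, :])**2 / (wavelength * z)) @ u
I = np.abs(g)**2 / np.max(np.abs(g))**2

centre = len(x) // 2
first_zero = x[centre + np.argmin(I[centre:centre + 500])]
print(f"numerical first zero: {first_zero * 1e6:.2f} um")
print(f"lambda*z/Bx         : {wavelength * z / Bx * 1e6:.2f} um")   # ~16 um
```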

4.2. Derivation of transverse distribution at focus for distances z < dmin

The above discussion concerns the case of a signal that is unlimited in frequency. We now analyze a spherical beam limited in frequency only (unlimited spatially); therefore it is convenient to represent the sampled point image by its Fourier spectrum:

G(f) = \exp\{i\pi\lambda z f^2\}\,\mathrm{comb}(\Delta^{-1} f)\,\Pi(B_f^{-1} f). \qquad (14)
For simplicity of presentation the image of an axial point is analyzed here. Since the convolution with the comb function gives a train of replicas, we analyze only a single replica:
G(f) = \exp\{i\pi\lambda z f^2\}\,\Pi(B_f^{-1} f). \qquad (15)
The effect of the comb function was discussed above and is identical for the case analyzed here. The function G(f), represented in the WD at the focal plane, is therefore
W_G(x,f) = \int \Pi\!\left(B_f^{-1}\!\left(f + \tfrac{f'}{2}\right)\right)\Pi\!\left(B_f^{-1}\!\left(f - \tfrac{f'}{2}\right)\right)\exp(2\pi i x f')\, df', \qquad (16)
which upon integration results in
|g(x)|^2 = \operatorname{sinc}^2(B_f x). \qquad (17)
The effect of the disregarded comb function is a repetition of the above distribution at spatial intervals Bfλz, consistent with Sec. 2.1.

5. Conclusions

The paper presents a new approach to the analysis of the Spatial Light Modulator based holographic display. Two major aspects are covered: imaging and viewing. The analysis of these aspects employs a joint space-spatial frequency representation. Based on the WD analysis we derive a formula for the spatial evolution of the space bandwidth product (resolution and field of view). Aliased imaging is studied as well. This provides a valuable extension of the display field of view. All of the results are verified experimentally.

The WD representation analysis of the holographic image is extended to cover the effect of visual perception. The angular resolution and field of view of human observation of the reconstructed image are theoretically examined. Both monocular and binocular perception are studied. With these results we can predict how a human perceives the holographically reconstructed scene. Moreover, the presented analysis can be easily extended to study multi-SLM holographic display configurations. The conclusions are accompanied by experimental results.

Acknowledgments

The authors would like to thank the OGX group (http://ogx.mchtr.pw.edu.pl) for access to the digital model of the statue presented as a holographic view in Figs. 11 and 12. The research leading to the described results has received funding from the EU 7th Framework Programme FP7/2007-2013 under agreement 216105 ('Real 3D' Project), from the Ministry of Science and Higher Education within project N505 359536 and, in part, from statutory funds.

References and links

1. D. Gabor, “A new microscopic principle,” Nature 161(4098), 777–778 (1948).

2. H. M. Ozaktas and L. Onural, Three-Dimensional Television (Springer, Berlin, 2008).

3. D. P. Kelly, D. S. Monaghan, N. Pandey, T. Kozacki, A. Michałkiewicz, B. M. Hennelly, and M. Kujawinska, “Digital holographic capture and optoelectronic reconstruction for 3D displays,” Int. J. Digit. Multimed. Broadcast., Article ID 759323 (2010).

4. A. Michałkiewicz, M. Kujawińska, T. Kozacki, X. Wang, and P. J. Bos, “Holographic three-dimensional displays with liquid crystal on silicon spatial light modulator,” Proc. SPIE 5531, 85–94 (2004).

5. K. Choi, H. Kim, and B. Lee, “Full-color autostereoscopic 3D display system using color-dispersion-compensated synthetic phase holograms,” Opt. Express 12(21), 5229–5236 (2004).

6. J. Otón, P. Ambs, M. S. Millán, and E. Pérez-Cabré, “Dynamic calibration for improving the speed of a parallel-aligned liquid-crystal-on-silicon display,” Appl. Opt. 48(23), 4616–4624 (2009).

7. J. Hahn, H. Kim, Y. Lim, G. Park, and B. Lee, “Wide viewing angle dynamic holographic stereogram with a curved array of spatial light modulators,” Opt. Express 16(16), 12372–12386 (2008).

8. K. Maeno, N. Fukaya, O. Nishikawa, K. Sato, and T. Honda, “Electro-holographic display using 15 mega pixels LCD,” Proc. SPIE 2652, 15–23 (1996).

9. S.-K. Kim, D.-W. Kim, Y. M. Kwon, and J.-Y. Son, “Evaluation of the monocular depth cue in 3D displays,” Opt. Express 16(26), 21415–21422 (2008).

10. M. J. Bastiaans, “Wigner distribution function and its application to first-order optics,” J. Opt. Soc. Am. 69, 1710–1716 (1980).

11. J. W. Goodman, Introduction to Fourier Optics, 2nd ed. (McGraw-Hill, New York, 1996).

12. A. W. Lohmann, R. G. Dorsch, D. Mendlovic, Z. Zalevsky, and C. Ferreira, “Space-bandwidth product of optical signals and systems,” J. Opt. Soc. Am. A 13, 470–473 (1996).

13. J. J. Stamnes, Waves in Focal Regions (Hilger, Bristol, 1986).

14. T. Latychevskaia and H.-W. Fink, “Solution to the twin image problem in holography,” Phys. Rev. Lett. 98(23), 233901 (2007).

15. I. Yamaguchi and T. Zhang, “Phase-shifting digital holography,” Opt. Lett. 22(16), 1268–1270 (1997).

16. C. E. Shannon, “Communication in the presence of noise,” Proc. IRE 37, 447–457 (1949).

17. A. Stern and B. Javidi, “Sampling in the light of Wigner distribution,” J. Opt. Soc. Am. A 21(3), 360–366 (2004).

18. A. Bergeron, J. Gauvin, F. Gagnon, D. Gingras, H. H. Arsenault, and M. Doucet, “Phase calibration and applications of a liquid-crystal spatial light modulator,” Appl. Opt. 34, 5133–5139 (1995).

19. X. Xun and R. W. Cohn, “Phase calibration of spatially nonuniform spatial light modulators,” Appl. Opt. 43(35), 6400–6406 (2004).

20. J. R. Fienup, “Phase retrieval algorithms: a comparison,” Appl. Opt. 21(15), 2758–2769 (1982).

21. T. Kozacki, “Numerical errors of diffraction computing using plane wave spectrum decomposition,” Opt. Commun. 281, 4219–4223 (2008).

22. Y. Le Grand, Physiological Optics (Springer, New York, 1980).

23. M. J. Bastiaans and P. G. J. van de Mortel, “Wigner distribution function of a circular aperture,” J. Opt. Soc. Am. A 13, 1698–1703 (1996).

Supplementary Material (1)

Media 1: MOV (3776 KB)     
