
Fresnel and Fourier hologram generation using orthographic projection images


Abstract

A novel technique for synthesizing a hologram of three-dimensional objects from multiple orthographic projection view images is proposed. The three-dimensional objects are captured under incoherent white illumination and their orthographic projection view images are obtained. The orthographic projection view images are multiplied by the corresponding phase terms and integrated to form a Fourier or Fresnel hologram. Using simple manipulation of the orthographic projection view images, it is also possible to shift the three-dimensional objects by an arbitrary amount along the three axes in the reconstruction space or invert their depths with respect to the given depth plane. The principle is verified experimentally.

©2009 Optical Society of America

1. Introduction

Holography has been exploited for various applications since it was first proposed by Gabor in 1948. For three-dimensional (3D) displays, holography has been considered the ideal technique, since it can provide flawless 3D images with complete human depth cues. One problem that needs to be solved for realizing holographic 3D displays is the complicated hologram capture process. In order to obtain a hologram of 3D objects, we need to build a coherent optical system with laser illumination, which is generally complicated. Moreover, optical hologram acquisition is possible only for moderately sized objects and not for distant objects or background scenes, owing to the difficulty of laser illumination. Computer generated holograms (CGH) can be one solution [1,2]. CGH calculates the hologram numerically, eliminating the need for a coherent optical system. CGH, however, requires full 3D information about the objects to perform the hologram calculation. Therefore, CGH is applicable only to computer-generated (CG) objects and not to objects in the real world. The hologram acquisition of real-world 3D objects still requires a complicated coherent optical system.

In order to address this problem, various methods for generating holograms from multiple view images taken under regular incoherent white illumination have been reported [3-9]. Mishina et al. proposed a calculation method for holograms from elemental images captured by integral photography [3]. They numerically simulated the 3D image integration process of integral photography using Fresnel diffraction theory and obtained the complex field of an integrated 3D image from the elemental images. Their method is based on the elemental image, which has perspective projection geometry, and is limited to Fresnel holograms only. Abookasis et al. and Sando et al. proposed hologram calculation methods from angular projections of 3D objects [4,5]. The angular projection used in their methods is the image obtained by rotating the 3D object about its local coordinate origin and projecting it onto the central transverse plane, i.e. the XY plane [4]. In order to capture those images with a camera, the camera must be moved on a curved surface whose curvature radius is much larger than the object thickness [5]. N. T. Shaked et al., Abookasis et al., and Sando et al. extended these methods to make angular view image acquisition easier using a lens array [6], to generate a Fresnel hologram [7,8], and to generate a full-color Fourier hologram [9]. None of this previous work, however, produces an exact hologram; all of it requires a small-angle approximation, which originates inherently from the projection geometry of the angular view images used. In particular, the Fresnel hologram methods require indirect synthesis [8] or generate not an exact but a modified hologram [7]. Also, in these methods the 3D location of the reconstructed image is fixed at the original location of the 3D object, and no manipulation method has been proposed.

In this paper, we propose a novel method for generating a hologram from multiple view images. The distinctive feature of the proposed method is the use of orthographic projection geometry instead of the perspective projection geometry or angular projection used in previous methods [3-9]. We first reported an initial idea for generating holograms using orthographic projection images in [10]. The use of orthographic projection geometry enables us to generate an exact Fourier hologram in a straightforward way without any approximation. Our previous report, however, was limited to the Fourier hologram, and the orthographic projection images were obtained by simulation. In this paper, we extend our previous method to generate not only a Fourier hologram but also a Fresnel hologram. The use of a lens array for capturing the orthographic projection images is also proposed and experimentally verified. Moreover, a novel method for manipulating the 3D image location in the reconstruction space is proposed. The shift of the reconstructed image in 3D space and the inversion along the depth direction are achieved by exchanging or shifting the orthographic projection view images. To the best of the authors' knowledge, this feature has not been addressed before. In the following, we explain the principle of the proposed method and present experimental results for its verification.

2. Orthographic projection geometry

Projection geometry is the geometrical relationship between 3D objects and the view image at the image plane. Figure 1 shows three different types of projection geometry. Figure 1(a) is the angular orthogonal projection geometry used by Abookasis et al. and Sando et al. [4,5]. The projection lines are parallel to each other and the image plane is slanted, with a normal vector having angles φ and θ as defined in Fig. 1(a). The projection image coordinates (xp, yp) and the object point (x, y, z) are related by

x_p = x \cos\varphi - z \sin\varphi,
y_p = y \cos\theta - z \sin\theta \cos\varphi - x \sin\varphi \sin\theta.   (1)
Fig. 1. Projection geometry

Figure 1(b) is the perspective projection geometry used by N. T. Shaked et al. [6]. The projection lines converge at a vanishing point that corresponds to the principal point of the camera lens. The projection image coordinates (xp, yp) in this projection geometry are given by

x_p = x_o - (x - x_o) f / z,
y_p = y_o - (y - y_o) f / z,   (2)

where (xo, yo) is the camera position and f is the focal length of the camera lens. Figure 1(c) is the orthographic projection geometry that we use in the proposed method. The projection lines are all parallel, as in the angular orthogonal projection of Fig. 1(a), but the image plane is not slanted, as in the perspective projection geometry of Fig. 1(b). If we let r denote one of the projection lines, the angle φ shown in Fig. 1(c) is the angle that the projection of r onto the x-z plane makes with the z-axis. Similarly, the angle θ is the angle that the projection of r onto the y-z plane makes with the z-axis. The projection image coordinates (xp, yp) can be written as

x_p = x + z \tan\varphi = x + \frac{zs}{l},
y_p = y + z \tan\theta = y + \frac{zt}{l},   (3)

where s, t, and l are defined as shown in Fig. 1(c) in order to represent the projection direction more conveniently [11,12]. The use of the orthographic projection geometry given by Eq. (3) leads to the exact Fourier or Fresnel hologram calculation and easy 3D manipulation of the reconstructed 3D images, as will be explained in the following sections.
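As a quick illustration (not part of the original paper), Eq. (3) can be evaluated directly; the following minimal Python sketch maps an object point to its orthographic projection coordinates for a projection direction specified by s, t, and l. The function name and example values are ours.

```python
def orthographic_projection(x, y, z, s, t, l):
    """Eq. (3): project the object point (x, y, z) onto the image plane
    along the parallel projection direction defined by (s, t, l)."""
    xp = x + z * s / l   # x_p = x + z*tan(phi)
    yp = y + z * t / l   # y_p = y + z*tan(theta)
    return xp, yp

# Example: a point 10 mm behind the image plane, viewed at tan(phi) = 0.05
print(orthographic_projection(x=2.0, y=1.0, z=10.0, s=0.165, t=0.0, l=3.3))
```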

One efficient way to capture the orthographic images is to use a lens array [11]. From a single capture using the lens array, a number of orthographic projection images can be obtained. Figure 2 shows the configuration. When a 3D object is imaged through the lens array, each elemental lens of the lens array forms an image of the 3D object at its focal plane, which is called an elemental image. If the pixels at the same local position are collected from each elemental image, the assembled pixels form an orthographic projection image. For example, in Fig. 2, the pixel at the position of the red dot is collected from every elemental image to form an orthographic projection image with a projection angle of tan⁻¹(s₁/l) = tan⁻¹(s₁/f_la), where f_la is the focal length of the lens array. In the same manner, the pixels at the green dots are assembled to form another orthographic image with a projection angle of tan⁻¹(s₂/l). Since one pixel is extracted from each elemental image, the number of pixels in the synthesized orthographic image equals the number of elemental lenses in the lens array. The total number of synthesized orthographic images is given by the number of pixels in one elemental image.
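A minimal sketch of this pixel-rearrangement step, assuming the elemental images are already cropped and stacked into a four-dimensional NumPy array; the array layout, function name, and centring convention are illustrative assumptions, not details given in the paper.

```python
import numpy as np

def synthesize_orthographic_views(elemental, pitch, f_la):
    """Rearrange elemental images into orthographic projection images.

    elemental : array (Ny_lens, Nx_lens, Ny_pix, Nx_pix)
        One elemental image per elemental lens.
    pitch     : elemental lens pitch (same length unit as f_la).
    f_la      : focal length of the lens array (l = f_la).

    Returns
    -------
    ortho  : array (Ny_pix, Nx_pix, Ny_lens, Nx_lens)
        ortho[j, i] is the orthographic view built from pixel (j, i) of every
        elemental image; its resolution equals the number of elemental lenses.
    angles : array (Ny_pix, Nx_pix, 2)
        (theta, phi) projection angles in radians of each view.
    """
    ny_l, nx_l, ny_p, nx_p = elemental.shape
    # Collect the pixel at the same local position (j, i) from every lens.
    ortho = np.transpose(elemental, (2, 3, 0, 1))
    # Local pixel offsets s, t measured from the centre of an elemental image.
    dpix = pitch / nx_p
    s = (np.arange(nx_p) - nx_p / 2 + 0.5) * dpix
    t = (np.arange(ny_p) - ny_p / 2 + 0.5) * dpix
    theta = np.arctan(t / f_la)      # vertical projection angles
    phi = np.arctan(s / f_la)        # horizontal projection angles
    angles = np.stack(np.meshgrid(theta, phi, indexing="ij"), axis=-1)
    return ortho, angles
```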

Fig. 2. Orthographic image acquisition using a lens array

Although the lens array provides a convenient way to obtain orthographic projection images in a single capture, it should be noted that this method also has several limitations. One limitation is the range of the projection angle. Due to the limited field of view and paraxial imaging of the lens array, the angular range that can be captured is limited to a small value. The resolution of the captured orthographic projection images is also limited in the lens array method. The sampling rate of the captured orthographic projection images is determined dominantly by the elemental lens pitch, which may not be small enough to capture fine details of the object. This low sampling rate limits the maximum object spatial bandwidth that can be processed in the Fourier and Fresnel hologram generation.

3. Fourier hologram generation using multiple orthographic view images

Figure 3 illustrates Fourier hologram generation using the orthographic projection images. The Fourier hologram of the 3D objects is generated by the following steps. First, the orthographic projection images of the 3D objects are captured. Second, each orthographic projection image is multiplied by the phase factor of a slanted plane wave, whose slant angle is determined by the projection angle of that orthographic projection image. Third, the product is integrated over the image plane, yielding a single complex value; this value is the complex field of the 3D object at a single point in the Fourier plane. Finally, by repeating these steps for all orthographic projection images, the entire Fourier hologram is obtained.

The procedure of the proposed method is intuitively straightforward. In general, the light field of the 3D object propagates by Fresnel diffraction and is refracted by the lens, producing a redistributed light field in the Fourier plane. If we concentrate on the parallel rays as shown in Fig. 3, we can see that each point on the Fourier plane corresponds to the integration of one set of parallel rays. Since the orthographic projection image is the intensity distribution of the parallel projection rays, the complex field at a point in the Fourier plane can be calculated by integrating the orthographic projection image multiplied by the slanted plane-wave phase factor that accounts for the projection angle.

Fig. 3. Fourier hologram generation from the orthographic projection images

Let us present the proposed method mathematically. Denoting by P_{st}(x_p, y_p) the orthographic projection image corresponding to the projection angles φ and θ, or equivalently to s, t, and l as defined in Fig. 1(c) and Fig. 3, the proposed method calculates the Fourier hologram H as

H(s,t) = \iint P_{st}(x_p, y_p) \exp[ j 2\pi b (x_p s + y_p t) ] \, dx_p \, dy_p,   (4)

where l is assumed to be a constant so that the orthographic projection direction is completely described by s and t, and b is a positive constant that will be determined later.

The Fourier hologram of the 3D object O(x,y,z) is given by [13]

H(u,v) = \iiint O(x,y,z) \exp\!\left[ j \frac{\pi}{\lambda f} \left( \frac{z}{f}u^2 + \frac{z}{f}v^2 - 2xu - 2yv \right) \right] dx \, dy \, dz,   (5)

where f is the focal length of the lens and λ is the wavelength. We now show that the proposed method given by Eq. (4) produces the exact Fourier hologram given by Eq. (5) without any approximation, using the single-source-point method [6]. Let us consider one infinitesimal object point of size (Δx, Δy, Δz), located at coordinates (x, y, z), and having the value O(x, y, z). The orthographic projection image P_{st}^{SSP}(x_p, y_p) that corresponds to this infinitesimal object point is, from Eq. (3),

P_{st}^{SSP}(x_p, y_p) = O(x,y,z) \, \delta\!\left( x_p - x - \frac{sz}{l}, \; y_p - y - \frac{tz}{l} \right) \Delta x \, \Delta y \, \Delta z,   (6)

where δ is the Dirac delta function. Substituting Eq. (6) into Eq. (4) leads to

H^{SSP}(s,t) = \iint O(x,y,z) \, \delta\!\left( x_p - x - \frac{sz}{l}, \; y_p - y - \frac{tz}{l} \right) \Delta x \, \Delta y \, \Delta z
\times \exp[ j 2\pi b (x_p s + y_p t) ] \, dx_p \, dy_p
= O(x,y,z) \exp\!\left[ j 2\pi b \left( xs + yt + \frac{z}{l}s^2 + \frac{z}{l}t^2 \right) \right] \Delta x \, \Delta y \, \Delta z,   (7)

where H^{SSP}(s,t) is the hologram that corresponds to the infinitesimal object point. The hologram for the entire 3D object scene is the volume integral of H^{SSP}(s,t) over all 3D object points. Hence we get

H(s,t) = \iiint H^{SSP}(s,t) \, dx \, dy \, dz
= \iiint O(x,y,z) \exp\!\left[ j 2\pi b \left( xs + yt + \frac{z}{l}s^2 + \frac{z}{l}t^2 \right) \right] dx \, dy \, dz.   (8)

In practice, the orthographic projection images are captured at discrete s and t values. If we rewrite Eq. (8) using continuous coordinates u=Ms and v=Mt at the Fourier hologram plane, finally we get

H(u,v) = \iiint O(x,y,z) \exp\!\left[ j \frac{2\pi b}{M} \left( xu + yv + \frac{z}{lM}u^2 + \frac{z}{lM}v^2 \right) \right] dx \, dy \, dz,   (9)

where M is a magnification factor. Equations (5) and (9) are the same, provided that

M = \frac{2f}{l}, \qquad b = \frac{2}{\lambda l}.   (10)

Therefore, the exact Fourier hologram of the 3D object can be generated using the orthographic projection images. Note that no approximation is necessary in the process, unlike other methods using different projection geometry.
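To make the procedure concrete, the following sketch discretizes Eq. (4) with b = 2/(λl) from Eq. (10). It assumes the orthographic views are stored in the (t, s, y_p, x_p) layout produced by the sketch in Section 2 and that the captured view values are used directly as the integrand P_st, as in Eq. (4); the variable names and discretization are our illustrative choices, not prescriptions of the paper.

```python
import numpy as np

def fourier_hologram(ortho, s_vals, t_vals, pixel_pitch, wavelength, l):
    """Discrete version of Eq. (4) with b = 2/(lambda*l) from Eq. (10).

    ortho          : array (Nt, Ns, Ny, Nx) of orthographic views P_st(xp, yp),
                     index order (t, s, yp, xp).
    s_vals, t_vals : 1-D arrays of the s and t values of each view.
    pixel_pitch    : sampling interval of xp and yp in the views.
    wavelength, l  : generation wavelength and l (= f_la for a lens array).

    Returns H[t, s]: one complex Fourier-hologram sample per view direction.
    """
    nt, ns, ny, nx = ortho.shape
    b = 2.0 / (wavelength * l)
    xp = (np.arange(nx) - nx / 2) * pixel_pitch
    yp = (np.arange(ny) - ny / 2) * pixel_pitch
    XP, YP = np.meshgrid(xp, yp)                        # (Ny, Nx) coordinate grids
    H = np.empty((nt, ns), dtype=complex)
    for it, t in enumerate(t_vals):
        for i_s, s in enumerate(s_vals):
            # Slanted plane-wave phase factor for this projection direction
            phase = np.exp(1j * 2 * np.pi * b * (XP * s + YP * t))
            # Integrate the view against the phase factor (Eq. (4))
            H[it, i_s] = np.sum(ortho[it, i_s] * phase) * pixel_pitch**2
    return H
```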

4. Fresnel hologram generation using multiple orthographic view images

Figure 4 illustrates Fresnel hologram generation using the proposed method. The steps for generating a Fresnel hologram with the proposed method are as follows. First, the orthographic projection images of the 3D objects are captured as before. Second, each orthographic projection image is shifted and multiplied by a constant phase term. The amount of the shift and the phase angle of the constant phase term are determined by the projection angle, or equivalently s and t, of each orthographic projection image. Finally, the Fresnel hologram is obtained by adding all shifted and multiplied orthographic projection images. As in the Fourier hologram case, the proposed method for Fresnel hologram generation is intuitively straightforward. The orthographic projection image represents the intensity distribution of one set of parallel rays penetrating the projection image plane. Since the parallel rays undergo the same amount of phase change and lateral position shift when they propagate a distance D, as shown in Fig. 4, we can obtain the complex field contributed by one set of parallel rays by laterally shifting the orthographic image and multiplying it by a phase factor. By repeating this process for all sets of parallel rays, or equivalently for all orthographic projection images, and adding the results, we obtain the complex field of the 3D objects.

Letting the Fresnel hologram plane be at distance D from the projection image plane as shown in Fig. 4, the complex field Hs,t(u,v) contributed by an orthographic projection image Ps,t(xp, yp) of the projection angles s and t is calculated by the proposed method as

H_{s,t}(u,v) = P_{s,t}\!\left( u - \frac{csD}{l}, \; v - \frac{ctD}{l} \right) \exp\!\left\{ j 2\pi b \left[ \left(\frac{s}{l}\right)^2 + \left(\frac{t}{l}\right)^2 \right] \right\},   (11)

where b and c are constants to be determined later. The proposed method generates the Fresnel hologram by adding all H_{s,t}(u,v), i.e.

H(u,v) = \sum_{s,t} H_{s,t}(u,v).   (12)

Now we show that the Fresnel hologram calculated using the orthographic projection images by Eq. (12) is equivalent to the Fresnel hologram of the 3D object that is given by [13]

H(u,v) = \frac{1}{j\lambda(D+z)} \iiint O(x,y,z) \exp\!\left\{ j \frac{\pi}{\lambda(D+z)} \left[ (u-x)^2 + (v-y)^2 \right] \right\} dx \, dy \, dz,   (13)

where the phase term exp[jkz] is omitted, since we can assume an arbitrary phase distribution on the object surface. We start from Eq. (12). If the orthographic projection images are obtained with sufficiently small angular separation, i.e. sufficiently small Δs and Δt, Eq. (12) can be written in integral form. From Eqs. (11) and (12), we get

H(u,v) = \iint P_{s,t}\!\left( u - \frac{csD}{l}, \; v - \frac{ctD}{l} \right) \exp\!\left\{ j 2\pi b \left[ \left(\frac{s}{l}\right)^2 + \left(\frac{t}{l}\right)^2 \right] \right\} ds \, dt.   (14)

Again, we consider an infinitesimal object point of value O(x, y, z) located at (x, y, z). Using Eq. (6), the hologram H^{SSP}(u,v) for this infinitesimal object point is given by

H^{SSP}(u,v) = \iint P_{s,t}^{SSP}\!\left( u - \frac{csD}{l}, \; v - \frac{ctD}{l} \right) \exp\!\left\{ j 2\pi b \left[ \left(\frac{s}{l}\right)^2 + \left(\frac{t}{l}\right)^2 \right] \right\} ds \, dt
= \iint O(x,y,z) \, \delta\!\left( u - \frac{csD}{l} - x - \frac{sz}{l}, \; v - \frac{ctD}{l} - y - \frac{tz}{l} \right) \Delta x \, \Delta y \, \Delta z
\times \exp\!\left\{ j 2\pi b \left[ \left(\frac{s}{l}\right)^2 + \left(\frac{t}{l}\right)^2 \right] \right\} ds \, dt
= \frac{l^2}{(cD+z)^2} O(x,y,z) \exp\!\left\{ j \frac{2\pi b}{(cD+z)^2} \left[ (u-x)^2 + (v-y)^2 \right] \right\} \Delta x \, \Delta y \, \Delta z,   (15)

The hologram for the entire object scene is the volume integral of H^{SSP}(u,v) over all 3D object points. Hence we get

H(u,v) = \iiint H^{SSP}(u,v) \, dx \, dy \, dz
= \iiint \frac{l^2}{(cD+z)^2} O(x,y,z) \exp\!\left\{ j \frac{2\pi b}{(cD+z)^2} \left[ (u-x)^2 + (v-y)^2 \right] \right\} dx \, dy \, dz.   (16)

If we assume that cD ≫ z and that the object function O(x,y,z) varies slowly in comparison to the quadratic phase exponential in Eq. (16), we can approximate Eq. (16) as

H(u,v) = \iiint \frac{1}{cD+2z} O(x,y,z) \exp\!\left\{ j \frac{2\pi b}{cD(cD+2z)} \left[ (u-x)^2 + (v-y)^2 \right] \right\} dx \, dy \, dz,   (17)

where the constant term is ignored [13]. Equation (17) is the same as the Fresnel hologram of the 3D object given by Eq. (13), provided that

b = \frac{2D}{\lambda}, \qquad c = 2.   (18)

Therefore, the hologram generated by the proposed method is equivalent to the Fresnel hologram of the 3D object.
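A minimal sketch of this shift-and-add procedure, following Eqs. (11), (12), and (18). The shift csD/l is rounded to an integer number of hologram pixels, the hologram plane is zero-padded to hold the shifted views, and the view values are used directly as in Eq. (11); these discretization choices are ours, not details specified in the paper.

```python
import numpy as np

def fresnel_hologram(ortho, s_vals, t_vals, pixel_pitch, wavelength, l, D):
    """Discrete version of Eqs. (11)-(12) with b = 2D/lambda and c = 2 (Eq. (18)).

    Each orthographic view is shifted by (c*s*D/l, c*t*D/l), multiplied by a
    constant phase factor, and accumulated on a zero-padded hologram plane
    whose pixel pitch equals the view sampling pitch.
    """
    nt, ns, ny, nx = ortho.shape
    b, c = 2.0 * D / wavelength, 2.0
    # Padding large enough to hold the maximum lateral shift of any view
    max_shift = c * D * max(np.max(np.abs(s_vals)), np.max(np.abs(t_vals))) / l
    pad = int(np.ceil(max_shift / pixel_pitch))
    H = np.zeros((ny + 2 * pad, nx + 2 * pad), dtype=complex)
    for it, t in enumerate(t_vals):
        for i_s, s in enumerate(s_vals):
            # Lateral shift of this view on the hologram plane, in pixels
            du = int(round(c * s * D / l / pixel_pitch))
            dv = int(round(c * t * D / l / pixel_pitch))
            # Constant phase factor of Eq. (11)
            phase = np.exp(1j * 2 * np.pi * b * ((s / l) ** 2 + (t / l) ** 2))
            H[pad + dv: pad + dv + ny, pad + du: pad + du + nx] += ortho[it, i_s] * phase
    return H
```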

Fig. 4. Fresnel hologram generation from orthographic projection images

5. Hologram generation for 3D location shifted or depth inverted object

A simple method to shift the 3D location or invert the depth of the captured 3D objects is proposed in this section. The depth inversion and the location shift are achieved by modifying the captured orthographic projection images. With the modified orthographic projection images, the method for generating Fourier or Fresnel holograms is performed as explained in sections 3 and 4, resulting in a hologram for the depth inverted or 3D location shifted 3D objects. Figure 5 shows the concept.

The capability of the 3D location shift and the depth inversion comes from the simple relation between the projection image coordinates and the 3D object coordinates of the orthographic projection geometry, which is given by Eq. (3). First, the lateral shift of the 3D objects in the reconstruction volume is achieved by shifting all orthographic projection images by the same amount. From Eq. (3), shifting the orthographic projection image Ps,t(xp, yp) by (δx, δy) to form the modified orthographic projection image P's,t(xp, yp)=Ps,t(xp-δx, yp-δy) as shown in Fig. 5(a) leads to shifted coordinates (x'p, y'p) given by

x'_p = x_p + \delta x = (x + \delta x) + \frac{zs}{l},
y'_p = y_p + \delta y = (y + \delta y) + \frac{zt}{l},   (19)

implying that the 3D image is reconstructed with lateral shift (δx, δy).

The longitudinal shift of the 3D image is achieved by shifting each orthographic image according to its projection view angle, i.e. s and t. For the longitudinal shift of δz, each orthographic projection image Ps,t(xp, yp) is shifted by (δzs/l, δzt/l) to form a modified orthographic projection image P's,t(xp, yp)=Ps,t(xp-δzs/l, yp-δzt/l) as shown in Fig. 5(b), giving new coordinates (x'p, y'p) as

x'_p = x_p + \frac{\delta z \, s}{l} = x + \frac{(z + \delta z) s}{l},
y'_p = y_p + \frac{\delta z \, t}{l} = y + \frac{(z + \delta z) t}{l}.   (20)

Equation (20) reveals that the new coordinates correspond to the shifted object depth z+δz. Note that shifting the orthographic projection images by a constant amount results in a lateral shift of the 3D image, whereas a projection-angle-dependent shift, i.e. one depending on s and t, produces a longitudinal shift of the 3D image.

Fig. 5. Modification of orthographic images for manipulating the 3D object: (a) lateral shift by (δx, δy), (b) depth shift by δz, and (c) depth inversion

Finally, the depth inversion is performed by exchanging each orthographic image of the projection angle s and t with the orthographic image corresponding to −s and -t, i.e. P's,t(xp, yp)=P-s,-t(xp, yp) as shown in Fig. 5(c). The new coordinates are given by

x'_p = x - \frac{zs}{l},
y'_p = y - \frac{zt}{l},   (21)

which indicates that the new coordinates correspond to the inverted depth −z, i.e. the depth is inverted with respect to the z=0 plane. Depth inversion with respect to an arbitrary depth plane is also possible by sequentially applying the depth shift and the depth inversion. For example, shifting the depth by δz=-d using Eq. (20), inverting the depth with respect to the z=0 plane, and then shifting the depth again by δz=d gives a 3D image whose depth is inverted with respect to the z=d plane.
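All three manipulations reduce to simple index operations on the stack of orthographic views. Below is a sketch under the same (t, s, y_p, x_p) array layout assumed earlier; np.roll is used for the per-view shifts, so pixels wrap around at the image borders, which is a simplification of ours.

```python
import numpy as np

def shift_lateral(ortho, dx_pix, dy_pix):
    """Eq. (19): shift every view by the same (dx, dy) -> lateral 3D shift."""
    return np.roll(ortho, shift=(dy_pix, dx_pix), axis=(2, 3))

def shift_depth(ortho, s_vals, t_vals, dz, l, pixel_pitch):
    """Eq. (20): shift each view by (dz*s/l, dz*t/l) -> longitudinal shift by dz."""
    out = np.empty_like(ortho)
    for it, t in enumerate(t_vals):
        for i_s, s in enumerate(s_vals):
            dx = int(round(dz * s / l / pixel_pitch))
            dy = int(round(dz * t / l / pixel_pitch))
            out[it, i_s] = np.roll(ortho[it, i_s], shift=(dy, dx), axis=(0, 1))
    return out

def invert_depth(ortho):
    """Eq. (21): exchange the (s, t) view with the (-s, -t) view -> depth inversion
    about z = 0 (assumes the s and t samplings are symmetric about zero)."""
    return ortho[::-1, ::-1]

# Depth inversion about z = d: shift by -d, invert about z = 0, shift back by +d.
```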

6. Experimental result

We verified the proposed method experimentally. In the experiment, two plane objects ‘C’ and ‘B’ at different depths are captured using a lens array. From the elemental images, the orthographic images are synthesized by collecting the pixels at the same position in each elemental image [11]. Using the synthesized orthographic images, the Fourier and Fresnel holograms are generated with and without depth shift and depth inversion based on the proposed method. Finally, the holograms are numerically reconstructed at various depths.

The experimental setup used to capture the elemental images is shown in Fig. 6. The objects are located 30 mm (‘C’) and 50 mm (‘B’) away from the lens array. The lens array consists of identical elemental lenses with a 1 mm (H) × 1 mm (V) lens pitch and a focal length of f_la = 3.3 mm. The valid number of elemental lenses is 67(H) × 59(V). The elemental images formed by the lens array are captured through an imaging lens (Nikon AF Nikkor 28-80 mm) by a CCD with 3288(H) × 2470(V) resolution.

Fig. 6. Experimental setup to capture the elemental images

Figure 7 shows the elemental images captured by the CCD. The resolution of each elemental image is 41(H) × 41(V) pixels. Since the elemental lens pitch is 1 mm, the pixel size of the elemental image is Δs = Δt = 1 mm/41 = 24.4 µm. Due to the limited field of view of the elemental lens, each elemental image contains only a part of the object. There are also elemental images that do not contain any object image, since the object is out of the field of view of the corresponding elemental lenses. Figure 8 shows the orthographic images generated from the elemental images of Fig. 7. In Fig. 8, it is observed that the disparity of the closer object, object ‘C’, is smaller than that of the farther object, object ‘B’, which confirms that the generated images have orthographic projection geometry [12]. The resolution of each generated orthographic image is 67(H) × 59(V) pixels. The total number of generated orthographic images is 41(H) × 41(V), but only the 34(H) × 35(V) orthographic images in the central part are used in generating the holograms. The angular separation between the projection lines of adjacent orthographic images is Δs/f_la = Δt/f_la ≈ 0.42°, and the whole angular range is -7.2°~ +7.2° in the horizontal direction and -7.2°~ +7.6° in the vertical direction. The sampling rate of each orthographic image is given by the elemental lens pitch; hence the object is sampled at a 1 mm interval in the orthographic image.
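For reference, the sampling figures quoted above follow directly from the lens-array geometry; a short numerical check using the values of this experiment (the script is ours):

```python
import numpy as np

pitch, f_la, n_pix = 1.0e-3, 3.3e-3, 41            # lens pitch [m], focal length [m], pixels per lens
d_st = pitch / n_pix                                # elemental-image pixel size
print(d_st * 1e6)                                   # ~24.4 um
print(np.degrees(np.arctan(d_st / f_la)))           # ~0.42 deg angular separation between views
print(np.degrees(np.arctan(17 * d_st / f_la)))      # ~7.2 deg: half range of the 34 central horizontal views
```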

Fig. 7. Captured elemental images

Fig. 8. Generated orthographic images

The Fourier and Fresnel holograms are generated using the orthographic images shown in Fig. 8. Since the number of orthographic images, 34(H) × 35(V), is not sufficient and the sampling rate of each orthographic image, 1 mm (H) × 1 mm (V), is low, two techniques were used in the experiment. First, each orthographic image is repeated, doubling the number of orthographic images to 68(H) × 70(V). Note that the use of the intermediate view reconstruction (IVR) technique, which synthesizes intermediate images by interpolating neighboring images, could enhance the result [14,15]. In our experiment, however, the intermediate image is, for simplicity, generated by repetition without any interpolation. By this repetition, the angular separation between the projection lines of adjacent orthographic images is decreased from 0.42° to 0.21°, or equivalently the effective pixel size of the elemental image is reduced from Δs = Δt = 24.4 µm to Δs = Δt = 12.2 µm, while the whole angular range is unchanged, i.e. -7.2°~ +7.2° in the horizontal direction and -7.2°~ +7.6° in the vertical direction.
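The view doubling described above is just a nearest-neighbour upsampling of the view stack along the s and t axes; a short sketch follows (a proper IVR implementation [14,15] would replace the repetition with intermediate-view interpolation). The array shape corresponds to the 34(H) × 35(V) views of 67 × 59 pixels used here; the placeholder array is ours.

```python
import numpy as np

# Placeholder for the measured view stack: (t, s, y_p, x_p) = (35, 34, 59, 67)
ortho = np.zeros((35, 34, 59, 67))
# Repeat each view twice along t and s: 34(H) x 35(V) -> 68(H) x 70(V) views
ortho2 = np.repeat(np.repeat(ortho, 2, axis=0), 2, axis=1)
print(ortho2.shape)   # (70, 68, 59, 67)
```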

Second, different sets of parameters were used in the generation and the reconstruction of the holograms. In the case of the Fourier hologram, the parameters used in the generation process with Eqs. (4) and (10) are λ = 1064 µm, l = f_la = 3.3 mm, b = 2/(λl) = 5.7×10⁵, and Δs = Δt = 12.2 µm. Note that the wavelength is set to a large value in order to alleviate aliasing induced by the low sampling rate of the orthographic images in the generation process. Assuming the pixel pitch of the generated hologram is Δu = Δv = 22.4 µm, the corresponding focal length f of the Fourier transform lens is f = lΔu/(2Δs) = 3.03 mm by Eq. (10). In the reconstruction stage, another set of parameters with more realistic values, i.e. λ = 532 nm, f = 135.5 mm, Δu = Δv = 22.4 µm, is used. From Eq. (5), one can easily verify that the use of these different reconstruction parameters scales the lateral coordinates x and y of the object space, i.e. decreases the lateral size of the object, by a factor of 135.5/3.03 = 44.72 while leaving the axial coordinate z nearly unchanged. Note that these reconstruction parameters were chosen in the experiment such that the axial coordinate is kept unchanged, for the purpose of a clear demonstration of the theory. One can choose a different focal length f of the Fourier transform lens to control the lateral and axial magnifications of the object space.

In the case of the Fresnel hologram, the parameters used in the generation process with Eqs. (11), (12), and (18) are λ = 1064 µm, l = f_la = 3.3 mm, D = 350 mm, b = 2D/λ = 657.9, c = 2, Δu = Δv = 1 mm, and Δs = Δt = 12.2 µm. Again, the wavelength was set to a large value to avoid aliasing in the generation process. Also note that Δu = Δv = 1 mm equals the elemental lens pitch of the lens array used in the experiment, since the lens pitch determines the sampling rate of the orthographic images. In the reconstruction stage, the wavelength and the pixel pitch of the hologram are changed to λ = 532 nm and Δu = Δv = 22.4 µm. One can verify from Eq. (13) that these changes scale the lateral coordinates x and y of the object by a factor of (1064 µm/532 nm)^1/2 ≈ 1 mm/22.4 µm ≈ 44.7 while leaving the axial coordinate z and the distance D unchanged. Note that the use of different parameter sets and the doubling of the orthographic projection images in our experiment are mainly due to the low sampling rate of the lens array method. If the orthographic images were obtained with a higher sampling rate using a different method, these steps would not be required.
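The parameter relations quoted in the last two paragraphs can be checked numerically. The square-root form of the Fresnel scale factor is our reading of Eq. (13): keeping the quadratic phase unchanged while changing λ requires the hologram pitch to scale as the square root of the wavelength ratio. The short script below is illustrative only.

```python
import numpy as np

# Fourier case: Eq. (10) with Delta_u = M * Delta_s
l, du, ds = 3.3e-3, 22.4e-6, 12.2e-6
f_gen = l * du / (2 * ds)                               # ~3.03e-3 m generation focal length
print(f_gen * 1e3, 135.5e-3 / f_gen)                    # ~3.03 mm, lateral scale ~44.7 at f = 135.5 mm

# Fresnel case: pitch ratio ~ sqrt(lambda_rec / lambda_gen) keeps z and D unchanged
lam_gen, lam_rec, du_gen = 1064e-6, 532e-9, 1e-3
du_rec = du_gen * np.sqrt(lam_rec / lam_gen)
print(du_rec * 1e6, du_gen / du_rec)                    # ~22.4 um reconstruction pitch, scale ~44.7
```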

Figure 9 shows the generated Fourier holograms. The resolution of the generated Fourier hologram is 68(H) × 70(V) pixels, which is the same as the number of the repeated orthographic images. The Fourier holograms are generated by the proposed method for three cases: (a) no shift (δx, δy, δz)=(0,0,0) and no depth inversion, (b) shift (δx, δy, δz)=(10mm,10mm,-20mm) and no depth inversion, and (c) depth shift (δx, δy, δz)=(0,0,80mm) after depth inversion. Figure 10 shows the numerical reconstruction results. In the numerical reconstruction, the focal length of the Fourier transform lens is assumed to be 135.5 mm as explained above. Using the Fresnel diffraction formula and the lens function [13], the intensity at 135.5 mm + z from the Fourier transform lens is calculated. Figure 10(a) shows that the Fourier hologram generated with the proposed method reconstructs the two plane objects successfully with the correct depth order. The effect of the lateral and depth shift is shown in Fig. 10(b). We can see that the lateral shift and the depth shift are reflected in the results, as desired. The depth inversion result is shown in Fig. 10(c). Note that the depths of the objects are originally 30 mm for object ‘C’ and 50 mm for object ‘B’. By the depth inversion, they are moved to -30 mm for ‘C’ and -50 mm for ‘B’. Then, by the depth shift of 80 mm, they are brought to 30 mm for ‘B’ and 50 mm for ‘C’. Figure 10(c) shows this final result. As expected, object ‘C’ is focused at 50 mm and object ‘B’ is focused at 30 mm, which confirms that the depth order is inverted.

Fig. 9. Amplitude (upper figure) and phase (lower figure) of the generated Fourier hologram: (a) without lateral/depth shift or inversion, (b) with a lateral shift of 10 mm along the x and y axes and a depth shift of -20 mm, (c) with depth inversion and a depth shift of 80 mm

Fig. 10. Numerical reconstruction of the generated Fourier hologram: (a) without lateral/depth shift or inversion, (b) with a lateral shift of 10 mm along the x and y axes and a depth shift of -20 mm, (c) with depth inversion and a depth shift of 80 mm

Figures 11 and 12 show the Fresnel holograms generated by the proposed method and their numerical reconstruction results. The distance D from the Fresnel hologram to the orthographic image plane is set at 350 mm. The resolution of the generated Fresnel holograms shown in Fig. 11 is 260(H) × 260(V) pixels, including a small zero padding around the active area. Note that, unlike the Fourier hologram case, the resolution of the generated Fresnel hologram is not the same as the number of orthographic projection images. In the Fresnel case, the orthographic images are shifted by csD/l and overlapped on the hologram plane, as shown in Eqs. (11), (12) and Fig. 4. Hence the resolution of the generated hologram is determined by the area covered by the shifted orthographic images and the pixel size on the hologram plane. Since the distance is D = 350 mm, the angular range, i.e. tan⁻¹(s/l), is about -7.2°~ +7.2°, and the size of one orthographic image is Δu (the value used in the generation process) multiplied by the number of pixels in one orthographic image, i.e. 1 mm × 67 = 67 mm, the covered area on the hologram plane can be estimated using the first term in Eq. (11) as around 67 + 350×2×2×tan(7.2°) ≈ 244 mm. The hologram pixel pitch used in the generation step is Δu = 1 mm, as explained before. Therefore, the resolution of the active area of the generated Fresnel hologram is around 244/1 = 244 pixels along the u-axis. A similar estimation gives 59 + 350×2×[tan(7.2°) + tan(7.6°)] ≈ 241 mm, or about 241 pixels, along the v-axis.
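The active-area estimate above is just the view size plus the maximum shift csD/l accumulated from both sides; in code, with the values of this experiment (an illustrative check, not part of the original paper):

```python
import numpy as np

c, D, du = 2, 350.0, 1.0                                  # c from Eq. (18), D and pitch in mm
span_u = 67 + c * D * (np.tan(np.radians(7.2)) + np.tan(np.radians(7.2)))
span_v = 59 + c * D * (np.tan(np.radians(7.2)) + np.tan(np.radians(7.6)))
print(round(span_u / du), round(span_v / du))             # ~244 x ~241 pixels of active area
```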

Using the Fresnel diffraction formula [13], the intensity image at 350mm+z from the Fresnel hologram plane is calculated. Figures 11 and 12 reveal that the proposed method successfully generates a Fresnel hologram of the 3D objects from their orthographic projection images; their lateral/axial shift and depth inversion can also be performed with the given set of orthographic projection images.

Fig. 11. Amplitude (upper figure) and phase (lower figure) of the generated Fresnel hologram: (a) without lateral/depth shift or inversion, (b) with a lateral shift of 10 mm along the x and y axes and a depth shift of -20 mm, (c) with depth inversion and a depth shift of 80 mm

Fig. 12. Numerical reconstruction of the generated Fresnel hologram: (a) without lateral/depth shift or inversion, (b) with a lateral shift of 10 mm along the x and y axes and a depth shift of -20 mm, (c) with depth inversion and a depth shift of 80 mm

7. Conclusion

A novel method to generate Fourier and Fresnel holograms of 3D objects from their orthographic projection images has been proposed. The lateral/axial shift and depth inversion of the 3D object can also be performed with the given set of orthographic projection images using the proposed method, making it possible to locate the 3D objects at any position in the reconstruction volume. The principle and the feasibility of the proposed method are verified experimentally by capturing the orthographic projection images using a lens array and generating Fourier and Fresnel holograms under various conditions. Consequently, the proposed method provides an efficient way to generate Fourier and Fresnel holograms of real, existing 3D objects without any need for a coherent holographic capture process.

Acknowledgment

This research was partly supported by the MKE (The Ministry of Knowledge Economy), Korea, under the ITRC (Information Technology Research Center) Support program supervised by the IITA (Institute for Information Technology Advancement) (IITA-2009-C1090-0902-0018).

This work was partly supported by a grant of the Korean Ministry of Education, Science and Technology (the Regional Core Research Program / Chungbuk BIT Research-Oriented University Consortium).

References and links

1. A. W. Lohmann and D. P. Paris, “Binary Fraunhofer holograms generated by computer,” Appl. Opt. 6, 1739–1748 (1967).

2. J. P. Waters, “Holographic image synthesis utilizing theoretical methods,” Appl. Phys. Lett. 9, 405–407 (1966).

3. T. Mishina, M. Okui, and F. Okano, “Calculation of holograms from elemental images captured by integral photography,” Appl. Opt. 45, 4026–4036 (2006).

4. D. Abookasis and J. Rosen, “Computer-generated holograms of three-dimensional objects synthesized from their multiple angular viewpoints,” J. Opt. Soc. Am. A 20, 1537–1545 (2003).

5. Y. Sando, M. Itoh, and T. Yatagai, “Holographic three-dimensional display synthesized from three-dimensional Fourier spectra of real existing objects,” Opt. Lett. 28, 2518–2520 (2003).

6. N. T. Shaked, J. Rosen, and A. Stern, “Integral holography: white-light single-shot hologram acquisition,” Opt. Express 15, 5754–5760 (2007), http://www.opticsinfobase.org/abstract.cfm?URI=oe-15-9-5754

7. N. T. Shaked and J. Rosen, “Modified Fresnel computer-generated hologram directly recorded by multiple-viewpoint projections,” Appl. Opt. 47, D21–D27 (2008).

8. D. Abookasis and J. Rosen, “Three types of computer-generated hologram synthesized from multiple angular viewpoints of a three-dimensional scene,” Appl. Opt. 45, 6533–6538 (2006).

9. Y. Sando, M. Itoh, and T. Yatagai, “Full-color computer-generated holograms using 3-D Fourier spectra,” Opt. Express 12, 6246–6251 (2004), http://www.opticsinfobase.org/oe/abstract.cfm?uri=OE-12-25-6246

10. M.-S. Kim, G. Baasantseren, N. Kim, and J.-H. Park, “Hologram generation of 3D objects using multiple orthographic view images,” J. Opt. Soc. Korea 12, 269–274 (2008).

11. J.-H. Park, J. Kim, and B. Lee, “Three-dimensional optical correlator using a sub-image array,” Opt. Express 13, 5116–5126 (2005), http://www.opticsinfobase.org/abstract.cfm?URI=oe-13-13-5116

12. J.-H. Park, S. Jung, H. Choi, Y. Kim, and B. Lee, “Depth extraction by use of a rectangular lens array and one-dimensional elemental image modification,” Appl. Opt. 43, 4882–4895 (2004).

13. J. W. Goodman, Introduction to Fourier Optics, 2nd ed. (McGraw-Hill, New York, 1996), Chap. 4–5, pp. 66–105.

14. L. Zhang, D. Wang, and A. Vincent, “Adaptive reconstruction of intermediate views from stereoscopic images,” IEEE Trans. Circuits Syst. Video Technol. 16, 102–113 (2006).

15. J.-H. Park, G. Baasantseren, N. Kim, G. Park, J.-M. Kang, and B. Lee, “View image generation in perspective and orthographic projection geometry based on integral imaging,” Opt. Express 16, 8800–8813 (2008), http://www.opticsinfobase.org/oe/abstract.cfm?URI=oe-16-12-8800
