
Calculation for computer generated hologram using ray-sampling plane


Abstract

We introduce a new algorithm for calculating computer generated holograms (CGHs) using a ray-sampling (RS) plane. The RS plane is set near the object, and the light rays emitted by the object are sampled at this plane. The light rays are then transformed into a wavefront using Fourier transforms. The wavefront on the CGH plane is calculated by simulating wavefront propagation from the RS plane to the CGH plane. The proposed method enables the reproduction of high-resolution images of deep 3D scenes with angular reflection properties such as gloss appearance.

©2011 Optical Society of America

1. Introduction

Holography can reproduce very realistic three-dimensional (3D) images that satisfy all the depth cues of human 3D perception without any special observation devices. For the electronic display of holography, the hologram pattern is calculated from 3D data using the technique called computer generated hologram (CGH). CGH for 3D image display requires a large amount of computation, so many studies deal with horizontal-parallax-only (HPO) 3D display. However, full-parallax display, which reproduces both horizontal and vertical parallax information, is desirable because it definitely surpasses other autostereoscopic 3D display techniques.

One of the common methods for CGH calculation simulates the light propagation from point sources on the object surface (point-based method) [1,2]. Thanks to progress in high-speed calculation techniques, it has become possible to define a huge number of point sources at high density on the object surface, allowing smooth surfaces to be rendered. However, the point-based technique has several issues, such as hidden surface removal, gloss reproduction, and view-dependent texture, which are important for realistic 3D image display. There have been several proposals for hidden surface removal and gloss reproduction in CGH [3–5], but further improvements are still required in the rendering techniques for CGH in comparison with the field of computer graphics (CG).

On the other hand, CGH calculation methods based on light-ray reproduction, similar to holographic stereograms (HS) [6] or integral photography (IP) [7], have also been proposed. In Ref. [6], Yatagai originally presented the concept and demonstrated an HPO CGH based on the stereoscopic approach. With this approach it is possible to apply CG rendering techniques such as hidden surface removal, surface shading, and gloss reproduction [8–10], which are important for realistic 3D display. A CGH of a real scene can also be easily generated from multi-view images captured by a camera array. It is possible to calculate a full-parallax CGH that reproduces light rays by applying a Fourier transform to the projection image generated by a CG rendering technique [10]. In the reconstruction of the CGH, the light rays from all elementary hologram cells are reproduced and the 3D image is reconstructed. However, the image far from the hologram plane is blurred due to the light-ray sampling and the diffraction at the hologram surface [11–15]. Details of the analysis are described in Section 2. These influences increase in proportion to the distance between the image and the CGH plane; thus this approach is not suitable for the display of deep scenes. Even though some techniques take account of the phase information in the stereogram to avoid image degradation by the diffraction effect [10,16], it is still not clear how to realize realistic reproduction of a deep scene using CG rendering techniques.

In this paper, we propose a new algorithm for calculating a full-parallax CGH with the use of a virtual “ray-sampling (RS) plane.” This method can be considered as a hybrid approach that integrates the advantages of the light-ray based and wavefront based methods. The proposed method can be applied both to virtual objects rendered by computer graphics and to real objects captured as multi-view image data by a camera array. Even if the objects are located far from the CGH plane, the resolution of the reconstructed image is not degraded, since the long-distance light propagation is calculated on the basis of diffraction theory. Therefore, the method makes it possible to display a deep scene and objects far from the CGH plane. Additionally, the method can reproduce the angular reflection properties of the object surface, such as glossy or metallic characteristics, using directional light-ray information; thus it will be possible to reproduce realistic images using various CG rendering techniques.

A similar idea of using an intermediate plane for hologram computation has been presented in previous papers [17,20]. In the reconfigurable image projection (RIP) technique [17], projective elements generated from holographic fringes project parallax views. Different arrangements of projective elements were presented, and when the projective elements populate a planar surface, that surface is quite similar to the RS plane proposed in this paper. However, RIP dealt only with the HPO case and was not a technique for converting parallax views into a wavefront. Since the proposed method employs full-parallax ray information, it is easy to implement the conversion between light-ray information and the wavefront. In addition, no arrangement of projective elements suitable for a deep 3D scene has been reported for the RIP technique.

Muffoletto presented partitioned holographic computation [20], in which the objects were grouped into subsets and an intermediate hologram was generated for each subset. The final hologram was then computed by superposing the wavefronts reconstructed from the intermediate holograms. The purpose of introducing the intermediate holograms was to reduce the amount of calculation. In contrast, the important point of this paper is to demonstrate that the introduction of the RS plane enables higher image resolution in a deep 3D image when a light-ray based method is used for CGH calculation. The intermediate plane is employed to derive a wavefront, not a holographic fringe, which leads to a higher potential for application. We can define a set of RS planes, and the wave propagation between RS planes can be used for occlusion processing, as briefly described in Subsection 3.4.

Matsushima and Nakahara demonstrated an extremely high-definition CGH in which an intermediate plane was used for occlusion processing [21]. Their method used polygon-based CGH calculation, but the rendering techniques of computer graphics still provide a much greater variety of methods for obtaining realistic images. The occlusion processing called the “silhouette-masking technique” presented in Ref. [21] is quite simple and applicable to the proposed method, even though the proposed method generates the wavefront from light-ray information.

In Section 2 we analyze the influence of light-ray sampling and diffraction on the image resolution of ray-based 3D displays, and describe how a wavefront-based approach can improve the resolution of objects far from the hologram plane. Section 3 introduces the new algorithm for calculating CGH using the RS plane. Section 4 shows the results of simulated and optical reconstruction of CGHs synthesized by the proposed method. Finally, discussions are given in Section 5, and Section 6 summarizes the paper.

2. Resolution limit in the image of CGHs based on the light-ray reconstruction

The imaging properties of 3D displays based on light-ray reconstruction, such as IP, pinhole arrays, and HS, have been analyzed in the literature. It is known that IP and full-parallax HS are quite similar if the image and the parallax information are recorded at high resolution [10–14], and that they can reproduce all the light rays from the objects within a certain viewing angle, which is sometimes called the “light field.” However, in ray-based 3D displays the resolution of the reconstructed image is degraded if the objects are located far from the hologram plane, compared with the wavefront-based approach.

3D display by CGH is capable of reconstructing the wavefront, so the limitations of the ray-based method can be overcome. Despite this capability, CGH calculation methods based on HS, which enable the use of conventional CG rendering techniques [6–10,16,17], have the same limitation as the ray-based 3D displays mentioned above. Figure 1 shows a typical model of full-parallax CGH calculation based on the principle of HS. This section reviews the factors that limit the image resolution in ray-based 3D displays.

Fig. 1 The CGH calculation model based on light-ray information. In (a), a projection image is obtained using light-ray based rendering, where the center of projection is each sampling point on the hologram plane. Each projection image is Fourier transformed to derive the wavefront at each elementary hologram cell shown in (b). Tiling each hologram cell calculated in this manner, the whole hologram pattern is obtained. In reconstruction, the light-rays from the objects are reconstructed from all the elementary hologram cells, and 3D image can be observed.

2.1 The influence of light-ray sampling on image resolution

In the method based on light-ray reconstruction, the resolution of the reproduced image is limited by the light-ray sampling, as shown in Fig. 2. The light rays are usually sampled at the display plane, which corresponds to the hologram plane in Fig. 2(a), while the angle of the light-ray direction is also sampled [Fig. 2(b)]. When the distance W of the observer from the display plane is large compared to the depth z of the image, the resolution of the image is affected mainly by the sampling on the display plane. Let p_rs be the sampling pitch of the elementary cells on the display plane; then the resolution limit δ_rs of the reconstructed image is given by

$$\delta_{rs} = p_{rs}\,\frac{z+W}{W}. \qquad (1)$$

The image resolution also depends on the angular resolution of the light rays, Δθ_a, as shown in Fig. 2(b). The influence of the angular resolution becomes serious especially when the image is far from the display plane. From Fig. 2(b), the resolution limit due to the angular resolution of the light rays, δ_a, is given by

$$\delta_{a} = |z|\,\Delta\theta_{a}. \qquad (2)$$

In real situations, the limitation of the image resolution is a mixture of the influences given in Eqs. (1) and (2); unless z << W, the angular sampling becomes critical. In any case, it is clear that the image resolution decreases as the image distance z from the display plane increases.

Fig. 2 Influence of light-ray sampling for the image resolution.

2.2 The influence of diffraction

When the light rays are reconstructed by small cells, the reconstructed rays are broadened by diffraction at the sampling plane and the reconstructed image is blurred. Let λ be the wavelength; then the resolution limit δ_d is given by

$$\delta_{d} \simeq \frac{\lambda\,|z|}{a_{rs}}, \qquad (3)$$

where a_rs is the size of the elementary hologram cell. The resolution of the reconstructed image thus decreases in proportion to the distance z, so light-ray based 3D displays, including HS-based CGH, cannot reproduce a high-resolution image located far from the display plane.
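As an illustration (not part of the original paper), the minimal Python sketch below evaluates Eqs. (1)–(3) for a given geometry; the function names and the example parameter values (64 μm cell pitch and size, λ = 532 nm, W = 200 mm) are assumptions chosen only to show the trend.

```python
def delta_rs(p_rs, z, W):
    """Eq. (1): blur due to the spatial sampling pitch p_rs on the display plane."""
    return p_rs * (z + W) / W

def delta_a(z, dtheta_a):
    """Eq. (2): blur due to the angular sampling step dtheta_a of the rays."""
    return abs(z) * dtheta_a

def delta_d(wavelength, z, a_rs):
    """Eq. (3): diffraction spread from an elementary cell of size a_rs."""
    return wavelength * abs(z) / a_rs

# Illustrative numbers only: all three limits grow with the image depth z.
for z in (0.01, 0.05, 0.2):   # image depth from the display plane [m]
    print(f"z = {z*1e3:5.0f} mm:  "
          f"delta_rs = {delta_rs(64e-6, z, 0.2)*1e3:.3f} mm,  "
          f"delta_a = {delta_a(z, 532e-9/64e-6)*1e3:.3f} mm,  "
          f"delta_d = {delta_d(532e-9, z, 64e-6)*1e3:.3f} mm")
```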

3. Method for CGH calculation

In this section, the principle and the algorithm of CGH calculation using the RS plane are described. Figure 3 illustrates the model of this method. The RS plane is defined near the object, and the light-ray information from the object is sampled at light-ray sampling points on the RS plane, as in the method based on light-ray reconstruction. The light-ray information corresponds to the projection images shown in Fig. 1; these images can be generated using conventional CG techniques such as ray tracing, or by image-based rendering (IBR) from multi-view images of a real or virtual object. Then each image is Fourier transformed and the resulting complex amplitude distribution is centered at the corresponding light-ray sampling point; thus the wavefront on the RS plane is obtained. This can be considered as the conversion of the light-ray information into the wavefront on the RS plane. The wavefront propagation from the RS plane to the CGH plane is calculated by two-dimensional Fresnel diffraction. Finally, by simulating the interference of the wavefront propagated from the RS plane with the reference wave at the CGH plane, we obtain the hologram pattern.

Fig. 3 Principle of the proposed CGH calculation using RS plane.

The proposed method can reproduce high-resolution images even for objects located far from the CGH plane, because the light rays are sampled near the object and the long-distance wavefront propagation is calculated on the basis of diffraction theory; thus δ_rs, δ_a, and δ_d in Eqs. (1)–(3) can be kept small, since z is replaced by the image distance from the RS plane. Therefore, the method is suitable for displaying a deep scene and objects far from the CGH plane. Since the image on the RS plane can be calculated by ray-based rendering techniques, it is easy to implement hidden surface removal, surface shading, texture mapping, and gloss appearance; hence the proposed method can reproduce images realistically. The details of the proposed method are described in the following subsections.

3.1 Calculation of the projection images on the RS plane

The model of the RS plane is illustrated in Fig. 4. On the RS plane, light rays are sampled at I × J points, where I and J are the total numbers of light-ray sampling points in the horizontal and vertical directions. In the first step, the projection images for all light-ray sampling points are calculated. The center of projection for the projection image p_ij[m,n] (of M × N pixels) corresponds to the sampling point (x_i, y_j), where i = 0, 1, …, I−1 and j = 0, 1, …, J−1. In the following explanation, M and N are assumed to be powers of 2 for simplicity, since the FFT (fast Fourier transform) is applied to p_ij[m,n]. If the target is a virtual object, the projection image can be rendered directly using common CG rendering software, as shown in Fig. 5(a). If the target is a real object, it is often difficult to capture p_ij[m,n] directly, since the camera would have to be placed very near the object. In this case, the projection images can be calculated from multi-view images captured by a camera array, as shown in Fig. 5(b), by applying an IBR technique.
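As a concrete illustration of this step, the sketch below renders one projection image p_ij[m,n] by pinhole projection of a simple point-cloud object with a z-buffer. It stands in for the CG or IBR rendering described above; the point-cloud representation, field of view, and function name are our assumptions.

```python
import numpy as np

def render_projection(points, values, center, fov, M, N):
    """Pinhole projection of a point cloud onto an M x N image whose center of
    projection is the RS sampling point `center` (a stand-in for the CG/IBR
    rendering that produces p_ij[m,n])."""
    img = np.zeros((M, N))
    depth = np.full((M, N), np.inf)
    half = np.tan(fov / 2.0)
    for (x, y, z), val in zip(points, values):
        dz = z - center[2]
        if dz <= 0:                          # keep only points in front of the RS plane
            continue
        u = (x - center[0]) / (dz * half)    # normalized image coordinates in [-1, 1]
        v = (y - center[1]) / (dz * half)
        m = int((u + 1.0) * 0.5 * (M - 1))
        n = int((v + 1.0) * 0.5 * (N - 1))
        if 0 <= m < M and 0 <= n < N and dz < depth[m, n]:
            depth[m, n] = dz                 # simple z-buffer for hidden-surface removal
            img[m, n] = val
    return img
```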

Fig. 4 The model of RS plane and projection images at each light-ray sampling point.

Fig. 5 Schematic of obtaining the projection images. (a) The projection images can be obtained by direct projection using general CG rendering technique; (b) The projection image can be obtained from multi-view images by applying IBR technique.

3.2 Conversion of the light-ray information into the wavefront

The process of converting the light-ray information into the wavefront is shown in Fig. 6. The projection image p_ij[m,n] at the point (x_i, y_j) is multiplied by a discrete random phase distribution φ_ij[m,n], where φ_ij[m,n] is uniformly distributed in the range [0, 2π). It is then transformed into the complex amplitude distribution P_ij[k,l] by the FFT as follows:

$$P_{ij}[k,l] = \mathrm{FFT}\bigl\{ p_{ij}[m,n]\,\exp\bigl(j\phi_{ij}[m,n]\bigr) \bigr\} = P_{ij}\bigl((k - M/2)\,\Delta k_{RS},\ (l - N/2)\,\Delta l_{RS}\bigr), \qquad (4)$$

where P_ij[k,l] is a discrete version of P_ij(x,y), the complex amplitude distribution of the small region around (x_i, y_j) on the RS plane, Δk_RS and Δl_RS are the sampling pitches on the RS plane, m, k = 0, 1, …, M−1, and n, l = 0, 1, …, N−1. As shown in Fig. 7, the discrete two-dimensional wavefront W_RS[k_RS, l_RS] is obtained by tiling P_ij[k,l] as

$$W_{RS}[k_{RS},l_{RS}] = \sum_{i=0}^{I-1}\sum_{j=0}^{J-1} P_{ij}\Bigl[k_{RS} - \frac{x_i}{\Delta k_{RS}} + \frac{M}{2},\ l_{RS} - \frac{y_j}{\Delta l_{RS}} + \frac{N}{2}\Bigr], \qquad (5)$$

where k_RS = 0, 1, …, IM−1 and l_RS = 0, 1, …, JN−1. Since P_ij[k_RS − x_i/Δk_RS + M/2, l_RS − y_j/Δl_RS + N/2] represents the wavefront within the small region (−M/2 ≤ k_RS − x_i/Δk_RS < M/2, −N/2 ≤ l_RS − y_j/Δl_RS < N/2), the wavefront on the RS plane is obtained by tiling P_ij[k,l] without overlap. This process corresponds to the recording process of the HS shown in Fig. 1.
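A minimal NumPy sketch of Eqs. (4) and (5) follows, assuming the projection images are stored as an (I, J, M, N) array; the centering convention with fftshift and the use of the image values directly as amplitudes are our choices, and the function name is hypothetical.

```python
import numpy as np

def rays_to_rs_wavefront(projections, rng=None):
    """Convert projection images p_ij[m,n] into the RS-plane wavefront W_RS.

    projections: array of shape (I, J, M, N) holding the light-ray (projection)
    images at the I x J sampling points.  Returns a complex (I*M, J*N) array.
    """
    rng = np.random.default_rng() if rng is None else rng
    I, J, M, N = projections.shape
    w_rs = np.zeros((I * M, J * N), dtype=complex)
    for i in range(I):
        for j in range(J):
            # Eq. (4): multiply by a uniform random phase and take a centered FFT.
            phase = rng.uniform(0.0, 2.0 * np.pi, size=(M, N))
            field = projections[i, j] * np.exp(1j * phase)
            P_ij = np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(field)))
            # Eq. (5): tile the M x N patch at its sampling point, without overlap.
            w_rs[i * M:(i + 1) * M, j * N:(j + 1) * N] = P_ij
    return w_rs
```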

Fig. 6 A diagram of converting the light-ray information into the wavefront on the RS plane.

Fig. 7 Discrete wavefront W_RS[k_RS, l_RS] on the RS plane.

3.3 Wavefront propagation from RS plane to CGH plane

Wavefront propagation from the RS plane to the CGH plane is calculated by two-dimensional Fresnel diffraction. Let the complex amplitude at (x, y) on the RS plane be W_RS(x, y), and the complex amplitude of the object wave at (X, Y) on the CGH plane be W_O(X, Y). When the RS plane is at a distance R from the CGH plane, the Fresnel diffraction integral is given by

$$W_{O}(X,Y) = \frac{1}{j\lambda R}\exp\Bigl(j\frac{2\pi R}{\lambda}\Bigr)\iint W_{RS}(x,y)\,\exp\Bigl[j\frac{2\pi}{\lambda}\frac{(X-x)^2 + (Y-y)^2}{2R}\Bigr]dx\,dy, \qquad (6)$$

where λ is the wavelength. W_RS(x, y) is sampled with the pitches Δk_RS and Δl_RS mentioned above. Let the sampling pitches on the CGH plane be Δk_H and Δl_H; then the discrete expression of the Fresnel diffraction becomes

$$W_{O}[k_{H},l_{H}] = \sum_{k_{RS}=0}^{IM-1}\sum_{l_{RS}=0}^{JN-1} W_{RS}[k_{RS},l_{RS}]\,\exp\Bigl[j\frac{2\pi}{\lambda}\frac{(k_{H}\Delta k_{H} - k_{RS}\Delta k_{RS})^2 + (l_{H}\Delta l_{H} - l_{RS}\Delta l_{RS})^2}{2R}\Bigr] = A_{O}[k_{H},l_{H}]\exp\bigl(j\varphi_{O}[k_{H},l_{H}]\bigr), \qquad (7)$$

where k_H = 0, 1, …, K_H−1 and l_H = 0, 1, …, L_H−1, K_H and L_H are the numbers of pixels along the horizontal and vertical directions on the CGH plane, and a constant coefficient is omitted. We calculate the wavefront on the CGH plane using Eq. (7), and finally, by simulating the interference with the reference wave, the hologram pattern H[k_H, l_H] is calculated as

$$H[k_{H},l_{H}] = c + 2\,\mathrm{Re}\Bigl\{A_{O}[k_{H},l_{H}]\exp\Bigl(j\varphi_{O}[k_{H},l_{H}] - j\frac{2\pi l_{H}\Delta l_{H}}{\lambda}\sin\theta_{ref}\Bigr)\Bigr\} = c + 2A_{O}\cos\Bigl(\varphi_{O}[k_{H},l_{H}] - \frac{2\pi l_{H}\Delta l_{H}}{\lambda}\sin\theta_{ref}\Bigr), \qquad (8)$$

where θ_ref is the angle of the reference wave from the z-axis toward the y-axis. The calculation cost can be reduced by using a look-up table or an FFT-based discrete Fresnel transform algorithm, as well as the shifted Fresnel diffraction [18], if the two-dimensional wavefront of the RS plane is parallel to the CGH plane.
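The sketch below illustrates one common FFT-based acceleration of this step, the Fresnel transfer-function (convolution) propagator, which keeps the sampling pitch unchanged; it is not the paper's implementation (the paper's Eq. (7) is the direct sum, and shifted Fresnel diffraction [18] is mentioned as another option), and both function names are hypothetical. A second helper encodes the interference with the tilted plane reference wave of Eq. (8).

```python
import numpy as np

def fresnel_propagate(field, pitch, wavelength, distance):
    """FFT-based Fresnel propagation (transfer-function / convolution form).

    One common way to accelerate the direct sum of Eq. (7); the output plane
    keeps the same sampling pitch as the input plane."""
    ny, nx = field.shape
    fy = np.fft.fftfreq(ny, d=pitch)            # spatial frequencies [1/m]
    fx = np.fft.fftfreq(nx, d=pitch)
    FY, FX = np.meshgrid(fy, fx, indexing="ij")
    # Fresnel transfer function (the constant phase exp(j*2*pi*R/lambda) kept).
    H = np.exp(1j * 2.0 * np.pi * distance / wavelength) \
        * np.exp(-1j * np.pi * wavelength * distance * (FX**2 + FY**2))
    return np.fft.ifft2(np.fft.fft2(field) * H)

def encode_hologram(w_o, pitch, wavelength, theta_ref, bias):
    """Interference with a plane reference wave tilted toward the y-axis, Eq. (8)."""
    ny, _ = w_o.shape
    y = np.arange(ny)[:, None] * pitch          # l_H * delta_l_H
    ref_phase = 2.0 * np.pi * y * np.sin(theta_ref) / wavelength
    return bias + 2.0 * np.real(w_o * np.exp(-1j * ref_phase))
```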

3.4 Application to objects at different distances from the CGH plane

When there are plural objects at different distances from the CGH plane and/or a background scene, it is necessary to set an RS plane near each object, since otherwise the image resolution is degraded for objects far from the RS plane. In such a case, as shown in Fig. 8, the wavefront propagation by Fresnel diffraction from RS plane 1 to RS plane 2 is calculated first. The wavefront propagated from RS plane 1 is masked in the region of object 2 on RS plane 2, and the wavefront of object 2 is substituted in the masked region; we thus obtain the wavefront combining object 1 and object 2 on RS plane 2. This procedure can also be performed using the “silhouette-masking technique” presented in Ref. [21]. The same procedure is then applied on RS plane 3, and finally the diffraction from RS plane 3 to the CGH plane is calculated to obtain the wavefront on the CGH plane. The details of the method that accounts for occlusion with plural RS planes will be reported in another paper.
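A minimal sketch of this occlusion step between two RS planes is given below; it reuses the fresnel_propagate() sketch from Subsection 3.3 and a simple boolean silhouette mask, and the function name and data layout are assumptions rather than the paper's implementation.

```python
import numpy as np

def combine_rs_planes(w_far, w_near_object, near_mask, pitch, wavelength, gap):
    """Occlusion handling between two RS planes (sketch of Subsection 3.4).

    w_far:          wavefront on the farther RS plane (e.g. RS plane 1)
    w_near_object:  wavefront of the nearer object, defined on the nearer plane
    near_mask:      boolean array, True inside the silhouette of the nearer object
    gap:            distance between the two RS planes
    """
    # Propagate the farther wavefront forward to the nearer RS plane.
    w_arriving = fresnel_propagate(w_far, pitch, wavelength, gap)
    # Silhouette masking [21]: block the light behind the nearer object and
    # substitute that object's own wavefront inside the masked region.
    return np.where(near_mask, w_near_object, w_arriving)
```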

Fig. 8 Calculation of light propagation for the objects located at various distances.

4. Experiment

In this section, we present the results of computer simulation and optical reconstruction of CGHs calculated with the proposed RS plane. The experiment consists of two parts. In the first part, to confirm that the proposed method is applicable to deep scenes, we calculated CGHs by the proposed method and by the conventional method based on light-ray reproduction, which corresponds to the CGH version of the HS (referred to as HSCGH hereafter), and compared the simulated reconstructed images of a deep scene. In the second part, objects with glossy surfaces were recorded in a CGH by the proposed method, and the reconstructed images were verified both numerically and optically. Glossy objects were chosen because they are especially difficult to reproduce by the conventional wavefront-based method. The optical reconstruction was carried out using the CGH printer described in Subsection 4.1.

4.1 CGH printing system

Figure 9 shows the configuration of the CGH printer used in this experiment. The CGH was divided into sub-images to match the resolution of a DLP (Digital Light Processing) projector (Marantz VP-12S2). Each sub-hologram was displayed on the DLP projector, whose projection lens was removed and substituted by one suitable for image reduction. The camera lens formed the demagnified image, which was exposed on a holographic recording material (ULTIMATE U-08). A mechanical shutter was used for the exposure. The holographic material was translated by an x-y translation stage to record the next sub-image, and in this way an entire CGH was recorded on a holographic plate. A color filter transmitted only blue light to avoid chromatic aberration. Table 1 shows the parameters of the DMD (Digital Micromirror Device) in the DLP projector and of the CGH printer. The resolution of the hologram pattern determines the visible angle, within which the observer can see the reconstructed image; it is determined by the maximum angle of the diffracted light with respect to the normal of the hologram plane. In Table 1, the maximum visible angles θ_x_max and θ_y_max were estimated as

$$\theta_{x\_max} = \sin^{-1}\frac{\lambda}{2\Delta k_{RS}}, \qquad \theta_{y\_max} = \sin^{-1}\frac{\lambda}{2\Delta l_{RS}}, \qquad (9)$$

where λ is the wavelength of the reconstruction light; λ = 532 nm was used in the following experiments. The system is similar to the fringe printer presented in Ref. [19], but it is simpler, since the lamp installed in the projector was used instead of an external laser light source.
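As a small worked example of Eq. (9), the snippet below evaluates the maximum visible angle; the 2 μm pixel pitch used here is a hypothetical placeholder (the actual printer parameters are listed in Table 1, not reproduced in this text).

```python
import numpy as np

wavelength = 532e-9                 # reconstruction wavelength [m]
pitch = 2e-6                        # hypothetical hologram pixel pitch [m]
theta_max = np.degrees(np.arcsin(wavelength / (2.0 * pitch)))
print(f"maximum visible angle: +/- {theta_max:.1f} deg")   # about 7.6 deg for a 2 um pitch
```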

Fig. 9 Schematic of the CGH printer.


Table 1. The Parameters of the CGH Printer

4.2 Reproduction of deep scene

The planar objects 1 and 2 were arranged at different distances from the CGH plane to demonstrate the resolution of a deep 3D image, as illustrated in Fig. 10. Table 2 shows the parameters of this CGH calculation. 128 × 128 projection images, each of 32 × 32 pixels, were Fourier transformed by the FFT, so the total number of pixels in the RS plane is 4096 × 4096. The pixel pitch on the RS plane was Δk_RS = Δl_RS = 64 μm/32 = 2 μm, and the RS plane size is IM·Δk_RS = 4096 × 2 μm = 128 × 64 μm = 8.2 mm. The RS plane was placed 5 mm in front of each object, while in the conventional HSCGH the light rays are sampled directly at the CGH plane. The RS plane could also be located behind the object, but it was placed 5 mm in front of each object in all experiments for simplicity. Note that the best location of the RS plane is the one that minimizes the distance between the object and the RS plane. However, the RS plane was deliberately set apart from the planar objects in this experiment; otherwise, the generation of projection images would not have been needed to derive the wavefront on the RS plane. The distances of the two objects from the CGH plane were 10 mm and 200 mm, respectively, as shown in Fig. 10.

Fig. 10 The object model of calculated CGH by using RS plane.


Table 2. Parameters of CGH Calculation

In the simulation of image reconstruction, the wavefront propagation from the CGH plane to an imaging lens was first calculated by discrete Fresnel diffraction. The imaging lens corresponds to the human eye located 200 mm from the CGH plane. The wavefront inside the lens pupil was then multiplied by the lens phase function, and finally the wavefront propagation from the lens pupil to the image plane was calculated. The pupil diameter of the imaging lens was 7 mm, which is almost equivalent to that of the human eye.
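The sketch below shows one way such an eye-model reconstruction could be coded, reusing the fresnel_propagate() sketch from Subsection 3.3; the thin-lens eye model, the assumed 25 mm eye length, and the function name are our assumptions, not the paper's implementation.

```python
import numpy as np

def simulate_eye_view(w_cgh, pitch, wavelength, eye_distance, pupil_diameter,
                      object_distance, eye_length=25e-3):
    """Propagate the CGH-plane wavefront to a thin-lens 'eye', apply a circular
    pupil and the lens phase, and propagate to the image plane."""
    w_pupil = fresnel_propagate(w_cgh, pitch, wavelength, eye_distance)
    ny, nx = w_pupil.shape
    y = (np.arange(ny) - ny // 2) * pitch
    x = (np.arange(nx) - nx // 2) * pitch
    yy, xx = np.meshgrid(y, x, indexing="ij")
    r2 = xx**2 + yy**2
    pupil = r2 <= (pupil_diameter / 2.0) ** 2              # e.g. a 7 mm aperture
    # Thin lens focused on the object plane: 1/f = 1/u + 1/v.
    f = 1.0 / (1.0 / object_distance + 1.0 / eye_length)
    lens_phase = np.exp(-1j * np.pi * r2 / (wavelength * f))
    w_img = fresnel_propagate(w_pupil * pupil * lens_phase, pitch, wavelength,
                              eye_length)
    return np.abs(w_img) ** 2                              # reconstructed intensity
```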

The reconstructed images are shown in Fig. 11. Comparing the images of object 2, which is far from the CGH plane, in (c) and (d), the edge of the star is blurred in (c), whereas a sharp image is reproduced in (d). When the object is near the CGH plane, the difference is small, although the reconstructed image of the HSCGH in (a) is slightly blurred. Note that if the object distance were 5 mm from the CGH plane, the HSCGH and the proposed method would become equivalent.

Fig. 11 Reconstructed images by numerical simulation. (a) object 1 reproduced by HSCGH; (b) object 1 reproduced by proposed method; (c) object 2 reproduced by HSCGH; (d) object 2 reproduced by proposed method.

4.3 Reproduction of 3D objects with glossy surface

In the second experiment, the 3D objects shown in Fig. 12 were recorded as CGHs by the proposed method. Their glossy surfaces, rendered by conventional CG software, were also verified. Again, the RS plane was defined 5 mm in front of the object, and the distance from the RS plane to the CGH plane was 200 mm in this case. Table 3 shows the parameters of these CGH calculations.

Fig. 12 The object models of calculated CGH by proposed method.


Table 3. Parameters of CGH Calculation

The reconstructed images were obtained numerically and optically. The numerical simulation was performed in the same manner as explained in Subsection 4.2. In the optical reconstruction, the CGHs were recorded by the CGH printer and reconstructed by a plane wave of laser light (wavelength 532 nm). The results are shown in Fig. 13. The reconstructed images, including the shades, gloss, and highlights, were observed, and the advantage of the proposed method was confirmed, even though the size and viewing angle were not sufficient for a realistic display. Speckle noise and defects were observed in the reconstructed images of Fig. 13. The defects were mainly due to errors in the CGH printer system, such as lens aberrations and the nonuniformity of the recording light intensity. This remains an issue for future work.

Fig. 13 Reconstructed image by numerical simulation and optical reconstruction. (a) and (d) Perspective image of each object; (b) and (e) Reconstructed image by simulation; (c) and (f) Optical reconstructed image.

5. Discussion

5.1 Discussion on the reconstructed image resolution

Figure 11 shows that the conventional method based on light-ray reconstruction (HSCGH) cannot reproduce high-resolution images of objects far from the hologram plane, while the proposed method achieves high resolution. In this section, we quantitatively examine the results using Eqs. (1)–(3).

The viewing distance W is 200 mm, as in Subsection 4.2. In HSCGH, z in Eqs. (1)–(3) corresponds to the distance between the object and the CGH plane, and the resolutions for objects 1 and 2 are estimated as in Table 4. The dominant component that limits the resolution for object 1 is δ_rs, which is equal to 0.06 mm, and that for object 2 is δ_a and δ_d, which are equal to 0.83 mm. The reason why δ_a and δ_d become the same is that the sampling pitch of the light rays was set equal to the spread caused by diffraction. In the experiment of Subsection 4.2, the object size was about 8 mm, and the resolution limit of the HSCGH was about 1/10 of the object size; thus the shape of the object located far from the CGH plane was hardly recognizable in the reconstructed image.


Table 4. Estimated Resolution of the Image in Subsection 4.2

In the proposed method, on the other hand, the light rays are sampled at the RS plane, 5 mm apart from each object, so z equals 5 mm for both objects. The dominant component for both objects is δ_rs, which is much smaller than in the case of HSCGH. This result shows that the proposed method can reproduce significantly higher resolution images, especially for images distant from the CGH plane.
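The comparison can be sketched with the resolution-limit helpers from Section 2. The cell pitch, cell size, and ray angular step below are placeholders inferred for illustration only (the actual values are in Table 2, not reproduced in this text), so the printed numbers show the trend rather than reproduce Table 4 exactly.

```python
# Rough re-derivation of the Table 4 trend, reusing delta_rs / delta_a / delta_d
# from the Section 2 sketch.
W = 0.2                        # viewing distance [m]
lam = 532e-9                   # wavelength [m]
p_rs, a_rs = 64e-6, 128e-6     # assumed sampling pitch and elementary-cell size [m]
dtheta = lam / a_rs            # assumed angular step of the sampled rays [rad]

cases = [("HSCGH, object 1 (z = 10 mm)", 0.01),
         ("HSCGH, object 2 (z = 200 mm)", 0.2),
         ("Proposed, both objects (z = 5 mm)", 0.005)]
for label, z in cases:
    print(f"{label}: "
          f"delta_rs = {delta_rs(p_rs, z, W)*1e3:.2f} mm, "
          f"delta_a = {delta_a(z, dtheta)*1e3:.2f} mm, "
          f"delta_d = {delta_d(lam, z, a_rs)*1e3:.2f} mm")
```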

5.2 Application of the RS plane defined on the object surface

As another application of the proposed approach, the RS plane can also be defined on the object surface instead of on a plane parallel to the CGH plane as described above. In this case, the light rays encoded at the RS plane represent the angular distribution of the light reflected from the surface (Fig. 14). For example, if we characterize the surface of an object with a bidirectional reflectance distribution function (BRDF) and define the illumination light source, we obtain the directional distribution of the reflected light intensity, which is equivalent to an image of directional light rays. This image of light rays can be transformed into the wavefront on the surface by a Fourier transform, in the same way as explained in Subsection 3.2. To obtain the wavefront propagating toward the hologram plane, the phase of the wavefront should be modified considering the inclination angle of the surface with respect to the optical axis, similar to the polygon-based method [4]. The CGH can then be calculated by Fresnel diffraction of the wavefront to the CGH plane. In this case, each sampling point on the object surface represents a point light source, and the amplitude and phase of each point source are defined for surfaces with arbitrary angular reflection properties, such as glossy or lustrous characteristics. This method has not been implemented in this paper and remains as future work.
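A hypothetical sketch of the first part of this idea is given below: a simple Phong-like BRDF stands in for an arbitrary measured BRDF, and the resulting directional image would play the role of p_ij[m,n] before conversion to a wavefront as in Subsection 3.2. The tilt-dependent phase correction of the polygon-based method is deliberately omitted, and all names are assumptions.

```python
import numpy as np

def directional_ray_image(normal, view_dirs, light_dir,
                          kd=0.3, ks=0.7, shininess=40.0):
    """Sample the reflected-light intensity over outgoing directions at one
    surface sampling point, using a simple Phong-like BRDF as a stand-in.

    view_dirs: array (M, N, 3) of unit outgoing directions; the returned
    (M, N) array is the directional light-ray image for that point."""
    n = normal / np.linalg.norm(normal)
    l = light_dir / np.linalg.norm(light_dir)
    r = 2.0 * np.dot(n, l) * n - l                         # mirror direction
    diffuse = kd * max(np.dot(n, l), 0.0)
    specular = ks * np.clip(view_dirs @ r, 0.0, None) ** shininess
    return diffuse + specular
```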

Fig. 14 The case of setting the RS plane on the object surface.

5.3 Discussion on the calculation cost

The computational cost of the CGH calculation is not the main subject of this paper, and the algorithm and its implementation should be optimized in the future. In this subsection, we tentatively discuss the calculation times of the experiments explained in Section 4. All CGHs in the experiments were calculated using a PC with an Intel Westmere-EP (2.93 GHz) CPU and 24 GB of shared memory.

In the first experiment, the calculation costs of the proposed method and HSCGH were compared. The total computation times of the proposed method and HSCGH were 682 s and 660 s, respectively. In both methods, about 655 s were consumed by the generation of the projection images at the ray sampling points; this cost was the same because the resolutions of the projection images were the same. About 5 s were spent on the 32 × 32 2D-FFT repeated 128 × 128 times for all projection images, which was also common to both methods. The proposed method required about 22 s more than HSCGH, most of which was spent on the calculation of the discrete Fresnel diffraction. In this experiment, the discrete Fresnel diffraction was calculated using a 4096 × 4096 2D-FFT, but the algorithm could be improved for better efficiency.

In the CGH calculation of Subsection 4.3, a 3D object was used. The time required for rendering the 256 × 256 projection images was 2621 s, and that for converting the light rays into the wavefront and propagating the wavefront was about 93 s. The total time for the CGH calculation was 2715 s.

In both experiments, the generation of the projection images and the calculation of the discrete Fresnel diffraction were performed without parallel processing or GPU computing. By applying these techniques, it is possible to decrease the calculation cost. Additionally, if the projection images can be obtained as light-ray information of the object in advance, a further reduction of time is expected. From these results, the reduction of the computation time for both projection image generation and Fresnel diffraction should be investigated in the future.

5.4 Distance range of objects

If the object is located very far from the hologram plane, i.e., in the case where the Fraunhofer approximation is applicable, it is not necessary to apply Fresnel diffraction; calculating the Fraunhofer diffraction is appropriate in such a case. If we define a background object located very far from the hologram plane, similar to the case shown in Fig. 8, the wavefront propagation from the object at infinity to the hologram plane (or to RS plane 2 when occlusion processing is applied) can be calculated by taking the Fourier transform of the background object instead of the Fresnel diffraction. This will be useful when the viewing field is large, because far objects are then easily observable. In the experiment presented in Section 4, this was not applied, since the viewing field was small due to the limited resolution of the hologram recording system.
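A minimal sketch of this far-background case follows, assuming a diffuse background (random phase) whose wavefront at the hologram or nearest RS plane is modelled as a Fourier transform of the background image; the function name is hypothetical.

```python
import numpy as np

def background_wavefront(background_image, rng=None):
    """For an object at effectively infinite distance, model the wavefront
    arriving at the hologram (or nearest RS plane) as the Fourier transform of
    the background image, so no Fresnel propagation is needed for it."""
    rng = np.random.default_rng() if rng is None else rng
    phase = rng.uniform(0.0, 2.0 * np.pi, size=background_image.shape)
    field = np.sqrt(background_image) * np.exp(1j * phase)   # diffuse background
    return np.fft.fftshift(np.fft.fft2(np.fft.ifftshift(field)))
```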

On the other hand, the depth range of near objects should also be considered. If it is not significantly large, such that δ_rs, δ_a, and δ_d determined by Eqs. (1)–(3) are smaller than a certain size required for the image resolution, the objects can be included in a single RS plane. In addition, in the case where all the objects are near the hologram plane, the ray-based calculation gives a reasonable resolution; thus the proposed method is meaningful when an RS plane should be placed at a depth different from that of the hologram plane. Optimizing the depth range of objects that should be included in a single RS plane is one of the future issues.

6. Summary

Although CGHs are capable of high-resolution 3D image reproduction, it is not easy to implement hidden surface removal and gloss reproduction, as in traditional CG techniques, with conventional methods for calculating CGH. In this paper we proposed a new algorithm for the calculation of CGH using an RS plane virtually defined near the object. The proposed method can reproduce high-resolution images of a deep scene with gloss appearance, whereas conventional 3D displays based on light-ray reproduction cannot achieve high resolution for objects located far from the display plane. In the experiments, we demonstrated by simulated and optical reconstruction that the proposed CGH calculation method enables high-resolution image display with gloss appearance in a deep scene.

As future work, large-sized CGHs should be recorded and reconstructed optically to confirm the advantages of the proposed method and to demonstrate highly realistic 3D display by holographic techniques.

References and links

1. M. Lucente, “Optimization of hologram computation for real-time display,” Proc. SPIE 1667, 32–43 (1992).

2. J. P. Waters, “Holographic image synthesis utilizing theoretical methods,” Appl. Phys. Lett. 9(11), 405–406 (1966).

3. J. S. Underkoffler, “Occlusion processing and smooth surface shading for fully computed synthetic holography,” Proc. SPIE 3011, 53–60 (1997).

4. K. Matsushima, “Exact hidden-surface removal in digitally synthetic full-parallax holograms,” Proc. SPIE 5742, 25–32 (2005).

5. K. Yamaguchi and Y. Sakamoto, “Computer generated hologram with characteristics of reflection: reflectance distributions and reflected images,” Appl. Opt. 48(34), H203–H211 (2009). [PubMed]  

6. T. Yatagai, “Stereoscopic approach to 3-D display using computer-generated holograms,” Appl. Opt. 15(11), 2722–2729 (1976). [PubMed]  

7. T. Mishina, M. Okui, and F. Okano, “Calculation of holograms from elemental images captured by integral photography,” Appl. Opt. 45(17), 4026–4036 (2006). [PubMed]  

8. P. W. McOwan, W. J. Hossack, and R. E. Burge, “Three-dimensional stereoscopic display using ray traced computer generated holograms,” Opt. Commun. 82(12), 6–11 (1993).

9. H. Yoshikawa and H. Kameyama, “Integral holography,” Proc. SPIE 2406, 226–234 (1995).

10. M. Yamaguchi, H. Hoshino, T. Honda, and N. Ohyama, “Phase-added stereogram: calculation of hologram using computer graphics technique,” Proc. SPIE 1914, 25–31 (1993).

11. J. T. McCrickerd, “Comparison of stereograms: pinhole, fly’s eye, and holographic types,” J. Opt. Soc. Am. A 62(1), 64–70 (1972).

12. L. E. Helseth, “Optical transfer function of three-dimensional display systems,” J. Opt. Soc. Am. A 23(4), 816–820 (2006).

13. P. S. Hilaire, “Modulation transfer function and optimum sampling of holographic stereograms,” Appl. Opt. 33(5), 768–774 (1994). [PubMed]  

14. I. Glaser and A. A. Friesem, “Imaging properties of holographic stereograms,” Proc. SPIE 120, 150–162 (1977).

15. M. Yamaguchi, N. Ohyama, and T. Honda, “Imaging characteristics of holographic stereogram,” Jpn. J. Opt. 22(11), 714–720 (1993) (in Japanese).

16. H. Kang, T. Yamaguchi, and H. Yoshikawa, “Accurate phase-added stereogram to improve the coherent stereogram,” Appl. Opt. 47(19), D44–D54 (2008). [PubMed]  

17. W. Plesniak, M. Halle, V. M. Bove Jr, J. Barabas, and R. Pappu, “Reconfigurable image projection holograms,” Opt. Eng. 45(11), 115801 (2006).

18. R. P. Muffoletto, J. M. Tyler, and J. E. Tohline, “Shifted Fresnel diffraction for computational holography,” Opt. Express 15(9), 5631–5640 (2007). [PubMed]  

19. H. Yoshikawa and K. Takei, “Development of a compact direct fringe printer for computer-generated holograms,” Proc. SPIE 5290, 114–121 (2004).

20. R. P. Muffoletto, “Numerical techniques for Fresnel diffraction in computational holography,” Ph. D. Thesis (Louisiana State University, 2006).

21. K. Matsushima and S. Nakahara, “Extremely high-definition full-parallax computer-generated hologram created by the polygon-based method,” Appl. Opt. 48(34), H54–H63 (2009). [PubMed]  
