
The modulation function and realizing method of holographic functional screen


Abstract

The modulation function of the holographic functional screen (HFS) in a real-time, large-size, full-color (RLF) three-dimensional (3D) display system is derived from angular spectrum analysis. A directional laser speckle (DLS) method to realize the HFS is proposed. An HFS was fabricated by the DLS method and used in the experiment. The experimental results show that the HFS is effective in the RLF 3D display and that the derived modulation function is valuable for the design of the HFS. These results are important for realizing the RLF 3D display system, which will find many applications such as holographic video.

©2010 Optical Society of America

1. Introduction

Three-dimensional (3D) display has attracted much attention because of its many applications, such as scientific research, industry, medical operation, military, architectural design, and virtual reality. Several 3D display techniques have been demonstrated, such as binocular parallax (including the parallax barrier) [1], integral photography (including multi-view imaging) with a lens board [2], and holography [3–8]. Binocular parallax and integral photography belong to parallax imaging; both their image resolution and brightness are relatively low, and there are blind areas between the viewing regions of the eyes. Recently, digital-technology-based holography has attracted wide interest, such as computer-generated holography [9–11], 3D holographic video with spatial light modulators (SLMs) [12–14], and others [15–19]. However, image qualities such as resolution, image depth, chromatic aberration, and fidelity still need to be improved. For conventional holography, it is difficult to realize real-time, full-color, large-size 3D display because of the limitations of recording materials and techniques. Volumetric displays realize 3D images through rotating mechanical components and persistence of vision, but they cannot provide a fully convincing 3D experience because of their limited color reproduction and small display volume. To overcome the above shortcomings and reduce system complexity, we proposed and developed a practical real-time, large-size, full-color (RLF) 3D display system [20,21]. However, the problems associated with the RLF 3D display system are not fully solved, especially the operating principle of the holographic functional screen (HFS). To further investigate the proposed technique, we study the modulation function of the HFS and a method for realizing it.

2. Modulation function of the HFS

As shown in Fig. 1, the RLF 3D display system is mainly composed of a color digital camera array (CA), a video server (VS), a color projector array (PA), and an HFS. The 3D information of the object is picked up and recorded by the CA. The VS connects the CA and the PA through conventional AV cables. The information transmitted by the AV cables is processed in the VS and then projected by the PA. The 3D information is modulated and displayed by the HFS. Here we analyze both the original and the recovered 3D information of the object in terms of the angular spectrum, which is a representation of spatial frequency. The light waves scattered from the object are given by Eq. (1):

\[
\psi(x,y,z;\lambda_l;t)=\iint \phi_0\!\left(\frac{\alpha}{\lambda_l},\frac{\beta}{\lambda_l};t\right)\exp\!\left[i2\pi\!\left(\frac{\alpha}{\lambda_l}x+\frac{\beta}{\lambda_l}y\right)\right]\mathrm{d}\!\left(\frac{\alpha}{\lambda_l}\right)\mathrm{d}\!\left(\frac{\beta}{\lambda_l}\right),\qquad l=1,\ldots,k \tag{1}
\]
where $\alpha/\lambda_l$ and $\beta/\lambda_l$ are the continuous angular spectrum components and $\phi_0(\alpha/\lambda_l,\beta/\lambda_l;t)$ is the angular spectrum distribution. These waves spread out and are picked up by each digital camera in the CA according to its viewing angle $[\alpha_{mn},\beta_{mn}]$, and can be represented as Eq. (2):
\[
\psi_s(x,y,z;\lambda_l;t)=\sum_{m}^{M}\sum_{n}^{N}\phi_{mn}\!\left(\frac{\alpha_{mn}}{\lambda_l},\frac{\beta_{mn}}{\lambda_l};t\right)\exp\!\left[i\frac{2\pi}{\lambda_l}\left(1-\alpha_{mn}^{2}-\beta_{mn}^{2}\right)^{1/2}z\right]\exp\!\left[i2\pi\!\left(\frac{\alpha_{mn}}{\lambda_l}x+\frac{\beta_{mn}}{\lambda_l}y\right)\right],\qquad (\alpha_{mn},\beta_{mn}\neq 0) \tag{2}
\]
where $\exp[i\frac{2\pi}{\lambda_l}(1-\alpha_{mn}^{2}-\beta_{mn}^{2})^{1/2}z]$ is an additional phase due to the different viewing angles and the propagation distance $z$. Here $\alpha_{mn}/\lambda_l$ and $\beta_{mn}/\lambda_l$ are the discrete angular spectrum components, and $\phi_{mn}(\alpha_{mn}/\lambda_l,\beta_{mn}/\lambda_l;t)$ is the angular spectrum distribution at the viewing angle $[\alpha_{mn},\beta_{mn}]$. The relation between them can be described as
\[
\phi_0\!\left(\frac{\alpha}{\lambda_l},\frac{\beta}{\lambda_l};t\right)=\sum_{m}^{M}\sum_{n}^{N}\phi_{mn}\!\left(\frac{\alpha_{mn}}{\lambda_l},\frac{\beta_{mn}}{\lambda_l};t\right) \tag{3}
\]
The intensities and colors of the waves picked up by each digital camera are recorded and form a 2D image $f(x',y';\lambda_l;t)_{mn}$, where $x'$ and $y'$ are the coordinates of the camera image plane. Essentially, $f(x',y';\lambda_l;t)_{mn}$ contains partial 3D information of the object, similar to the 2D images used in computer-generated holography [8]. $\phi_{mn}(\alpha_{mn}/\lambda_l,\beta_{mn}/\lambda_l;t)$ is the Fourier transform of $f(x',y';\lambda_l;t)_{mn}$, which is transmitted to the processing unit in the VS. The information processing includes coding, distortion correction, and synchronization of the image frames from the different cameras. The processed information is then projected by the PA; that is, the projected waves carrying the information $f(x',y';\lambda_l;t)_{mn}$ propagate to the point R' and can also be represented by the angular spectrum, as in Eq. (4):
\[
\psi_s'(x,y,z;\lambda_l;t)=\sum_{m}^{M}\sum_{n}^{N}\phi_{mn}\!\left(\frac{\alpha_{mn}}{\lambda_l},\frac{\beta_{mn}}{\lambda_l};t\right)\exp\!\left[i\frac{2\pi}{\lambda_l}\left(1-\alpha_{mn}^{2}-\beta_{mn}^{2}\right)^{1/2}z\right]\exp\!\left[i2\pi\!\left(\frac{\alpha_{mn}}{\lambda_l}x+\frac{\beta_{mn}}{\lambda_l}y\right)\right] \tag{4}
\]
Here, $x$, $y$, and $z$ simply represent the coordinates of the image space. Each angular spectrum component travels along the direction $[\alpha_{mn},\beta_{mn}]$, and the carried $f(x',y';\lambda_l;t)_{mn}$ can be observed individually with a diffusing screen. However, the superposed 3D image cannot be recognized with such a screen. It is therefore necessary to expand each angular spectrum component and synthesize all of them into a continuous spectrum, that is, to recover the original spatial angular spectrum distribution $\phi_0(\alpha/\lambda_l,\beta/\lambda_l;t)$. We designed an HFS with a special modulation function $T(x,y,0)$. The expansion of the angular spectrum within a small solid angle $\omega_{mn}$ can be represented by the integral $\iint_{\omega_{mn}}\exp[i2\pi(\frac{\alpha}{\lambda_l}x+\frac{\beta}{\lambda_l}y)]\,\mathrm{d}(\frac{\alpha}{\lambda_l})\,\mathrm{d}(\frac{\beta}{\lambda_l})$. The convolution ($\ast$) of this integral with the function $\delta(\frac{\alpha}{\lambda_l}-\frac{\alpha_{mn}}{\lambda_l},\frac{\beta}{\lambda_l}-\frac{\beta_{mn}}{\lambda_l})$ represents all expanded angular spectra in the reconstructed image. Thus, $T(x,y,0)$ expressed in terms of the angular spectrum is given by Eq. (5):
\[
\begin{aligned}
T(x,y,0)&=\sum_{m}^{M}\sum_{n}^{N}\left[\delta\!\left(\frac{\alpha}{\lambda_l}-\frac{\alpha_{mn}}{\lambda_l},\frac{\beta}{\lambda_l}-\frac{\beta_{mn}}{\lambda_l}\right)\ast\iint_{\omega_{mn}}\exp\!\left[i2\pi\!\left(\frac{\alpha}{\lambda_l}x+\frac{\beta}{\lambda_l}y\right)\right]\mathrm{d}\!\left(\frac{\alpha}{\lambda_l}\right)\mathrm{d}\!\left(\frac{\beta}{\lambda_l}\right)\right]\\
&=\sum_{m}^{M}\sum_{n}^{N}\iint_{\omega_{mn}}\exp\!\left\{i2\pi\!\left[\left(\frac{\alpha}{\lambda_l}-\frac{\alpha_{mn}}{\lambda_l}\right)x+\left(\frac{\beta}{\lambda_l}-\frac{\beta_{mn}}{\lambda_l}\right)y\right]\right\}\mathrm{d}\!\left(\frac{\alpha}{\lambda_l}\right)\mathrm{d}\!\left(\frac{\beta}{\lambda_l}\right)
\end{aligned} \tag{5}
\]
All $\omega_{mn}$ combine into a larger solid visual angle $\Omega$, which is the sum of the $M\times N$ angles $\omega_{mn}$:
\[
\Omega=\sum_{m}^{M}\sum_{n}^{N}\omega_{mn} \tag{6}
\]
Here, $\omega_{mn}$ is determined by the distance $z_0$ from the recorded object to the CA and the spacing $\Lambda$ between two neighboring color digital cameras, so it can be approximated as $\omega_{mn}=\arctan(\Lambda/z_0)$. The light waves modulated by the HFS are given by Eq. (7):
\[
\begin{aligned}
\psi_r(x,y,z;\lambda_l;t)&=\psi_s'(x,y,z;\lambda_l;t)\cdot T(x,y,0)\\
&=\left\{\sum_{m}^{M}\sum_{n}^{N}\phi_{mn}\!\left(\frac{\alpha_{mn}}{\lambda_l},\frac{\beta_{mn}}{\lambda_l};t\right)\exp\!\left[i\frac{2\pi}{\lambda_l}\left(1-\alpha_{mn}^{2}-\beta_{mn}^{2}\right)^{1/2}z\right]\exp\!\left[i2\pi\!\left(\frac{\alpha_{mn}}{\lambda_l}x+\frac{\beta_{mn}}{\lambda_l}y\right)\right]\right\}\\
&\quad\cdot\left\{\sum_{m}^{M}\sum_{n}^{N}\iint_{\omega_{mn}}\exp\!\left\{i2\pi\!\left[\left(\frac{\alpha}{\lambda_l}-\frac{\alpha_{mn}}{\lambda_l}\right)x+\left(\frac{\beta}{\lambda_l}-\frac{\beta_{mn}}{\lambda_l}\right)y\right]\right\}\mathrm{d}\!\left(\frac{\alpha}{\lambda_l}\right)\mathrm{d}\!\left(\frac{\beta}{\lambda_l}\right)\right\}\\
&=\sum_{m}^{M}\sum_{n}^{N}\phi_{mn}\!\left(\frac{\alpha_{mn}}{\lambda_l},\frac{\beta_{mn}}{\lambda_l};t\right)\exp\!\left[i\frac{2\pi}{\lambda_l}\left(1-\alpha_{mn}^{2}-\beta_{mn}^{2}\right)^{1/2}z\right]\iint_{\omega_{mn}}\exp\!\left[i2\pi\!\left(\frac{\alpha}{\lambda_l}x+\frac{\beta}{\lambda_l}y\right)\right]\mathrm{d}\!\left(\frac{\alpha}{\lambda_l}\right)\mathrm{d}\!\left(\frac{\beta}{\lambda_l}\right)
\end{aligned} \tag{7}
\]
where the multiplication symbol $\cdot$ means that $\psi_s'(x,y,z;\lambda_l;t)$ is modulated by $T(x,y,0)$. Substituting Eqs. (3) and (6) into Eq. (7), we obtain Eq. (8):
\[
\psi_r(x,y,z;\lambda_l;t)=\iint_{\Omega}\phi_0\!\left(\frac{\alpha}{\lambda_l},\frac{\beta}{\lambda_l};t\right)\exp\!\left[i\frac{2\pi}{\lambda_l}\left(1-\alpha_{mn}^{2}-\beta_{mn}^{2}\right)^{1/2}z\right]\exp\!\left[i2\pi\!\left(\frac{\alpha}{\lambda_l}x+\frac{\beta}{\lambda_l}y\right)\right]\mathrm{d}\!\left(\frac{\alpha}{\lambda_l}\right)\mathrm{d}\!\left(\frac{\beta}{\lambda_l}\right) \tag{8}
\]
Compared with Eq. (1), only an additional phase factor $\exp[i\frac{2\pi}{\lambda_l}(1-\alpha_{mn}^{2}-\beta_{mn}^{2})^{1/2}z]$ appears in Eq. (8); that is, the HFS realizes the modulation and synthesis of the angular spectrum information in Eq. (4). Thus we can observe the faithfully recovered 3D information of the recorded object within the visual angle $\Omega$.
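To make the effect of the modulation function concrete, the following one-dimensional numerical sketch builds a set of discrete plane waves standing in for the projected field $\psi_s'$, multiplies it by a screen function constructed from Eq. (5), and compares the angular spectra before and after the screen. It is only an illustration of Eqs. (5)–(8): the wavelength, the number of views, and the per-view angle $\omega_n$ used below are assumed values, not parameters of the actual system.

```python
import numpy as np

# 1-D sketch of Eqs. (5)-(8): discrete plane waves (standing in for the
# projected field psi_s') are multiplied by the screen function T(x, 0) so
# that each discrete direction is broadened into a band of width omega_n and
# the bands tile the visual angle Omega.  All numbers are assumed.

lam = 0.5e-6                                  # wavelength lambda_l (m), assumed
N = 16                                        # number of discrete views, assumed
omega_n = 0.01                                # angular width per view (rad), assumed
alpha_n = (np.arange(N) - N / 2) * omega_n    # discrete direction cosines alpha_mn

x = np.linspace(-2e-3, 2e-3, 4096)            # screen coordinate (m)

# Projected field at the screen: one unit-amplitude plane wave per view.
psi_s = sum(np.exp(1j * 2 * np.pi * a / lam * x) for a in alpha_n)

# Screen function from Eq. (5): after the frequency shift every term is the
# same band-limited window; integrating exp[i 2 pi (alpha - alpha_n) x / lam]
# over a band of width omega_n gives (omega_n / lam) * sinc(omega_n x / lam).
T = N * (omega_n / lam) * np.sinc(omega_n / lam * x)

psi_r = psi_s * T                             # Eq. (7): modulation by the HFS

# Angular spectra before and after the screen.
freq = np.fft.fftshift(np.fft.fftfreq(x.size, d=x[1] - x[0]))  # alpha / lam (1/m)
S_before = np.abs(np.fft.fftshift(np.fft.fft(psi_s)))
S_after = np.abs(np.fft.fftshift(np.fft.fft(psi_r)))

# Fraction of spectral bins inside Omega carrying significant energy: the
# discrete spectrum covers isolated peaks only; the modulated one fills Omega.
band = (freq * lam >= alpha_n.min() - omega_n / 2) & \
       (freq * lam <= alpha_n.max() + omega_n / 2)

def coverage(S):
    inside = S[band]
    return np.mean(inside > 0.1 * inside.max())

print(f"spectral coverage of Omega before the HFS: {coverage(S_before):.2f}")
print(f"spectral coverage of Omega after the HFS : {coverage(S_after):.2f}")
```

Before the screen only isolated discrete directions carry energy; after multiplication by $T$ the spectrum fills the visual angle $\Omega$ almost continuously, which is the synthesis expressed by Eq. (8).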

Fig. 1 Configuration of 3D scene pickup and display.

From the above, the HFS is one of the key devices for realizing the RLF 3D display, and Eq. (5) expresses the HFS's functions of spreading beams (transferring wavefronts) and synthesizing light waves.

3. Implementation of the HFS

Since human eyes readily adapt to horizontal-parallax-only (HPO) 3D display, we study an HFS suited to this case. A method to realize the HFS, called the directional laser speckle (DLS) method, is given as follows.

When a diffusing plate of size $a\times b$ with diffusing grain sizes $\delta_u$ and $\delta_v$, as shown in Fig. 2(a), is illuminated by a laser, a laser speckle field appears behind the plate. The average speckle size in the X-Y plane is $\delta_x=\lambda z_0/a$ and $\delta_y=\lambda z_0/b$. Assuming $a\ll b$ and hence $\delta_x\gg\delta_y$, a narrow strip-shaped speckle pattern is obtained in the X-Y plane. We record the speckle pattern at the distance $z=z_0$, where the overlay area of the speckles is $\Delta_x\times\Delta_y$, with $\Delta_x=\lambda z_0/\delta_u$ and $\Delta_y=\lambda z_0/\delta_v$. When the recorded speckle pattern is illuminated by a light wave, it diffuses the light within the angles $\theta_{\parallel}=\lambda/\delta_x=a/z_0$ and $\theta_{\perp}=\lambda/\delta_y=b/z_0$, as shown in Fig. 2(b). Since the diffusing angle $\theta_{\perp}$ is much larger than $\theta_{\parallel}$, the diffused light wave can be considered to be scattered only along the direction perpendicular to the strip speckles. The speckle screen thus provides direction selectivity and a well-defined diffusing angle, so we call it the DLS screen; it can spread beams and synthesize light waves within its diffusing angle.
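The speckle relations above can be checked numerically. The sketch below evaluates the speckle sizes, the overlay area, and the two diffusing angles for assumed values of the laser wavelength, the plate aperture $a\times b$, the grain sizes $\delta_u$ and $\delta_v$, and the distance $z_0$; the paper does not give the actual fabrication parameters, so these numbers are purely illustrative.

```python
import numpy as np

# Relations from Section 3 for the strip-shaped laser speckle, evaluated with
# illustrative numbers (the actual fabrication values are not given).
lam = 532e-9          # laser wavelength (m), assumed
a, b = 2e-3, 200e-3   # diffusing-plate aperture a x b (m), a << b, assumed
du, dv = 20e-6, 20e-6 # diffusing grain sizes delta_u, delta_v (m), assumed
z0 = 0.5              # recording distance z_0 (m), assumed

dx = lam * z0 / a     # average speckle size delta_x = lambda z0 / a
dy = lam * z0 / b     # average speckle size delta_y = lambda z0 / b  (dx >> dy)
Dx = lam * z0 / du    # overlay size Delta_x = lambda z0 / delta_u
Dy = lam * z0 / dv    # overlay size Delta_y = lambda z0 / delta_v

theta_par = lam / dx   # = a / z0, small diffusing angle along the strips
theta_perp = lam / dy  # = b / z0, large diffusing angle across the strips

print(f"speckle size     : {dx*1e6:.1f} um x {dy*1e6:.1f} um")
print(f"overlay area     : {Dx*1e3:.1f} mm x {Dy*1e3:.1f} mm")
print(f"diffusing angles : {np.degrees(theta_par):.2f} deg (parallel), "
      f"{np.degrees(theta_perp):.1f} deg (perpendicular)")
```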

Fig. 2 The speckle generation (a) and the diffusing of strip-shaped speckle (b).

Based on the above analysis, we propose the DLS method to realize the HFS. According to Eq. (5), the designed HFS is composed of $J\times K$ volume pixels (VPs), as shown in Fig. 3(a). Each VP is a small square DLS screen with side length $D_n$ and horizontal diffusing angle $\theta_{\parallel}=\omega_n/2$, as shown in Fig. 3(b) (here $\omega_n$ is the 1D form of $\omega_{mn}$). Its vertical diffusing angle $\theta_{\perp}$ satisfies the requirements of viewing by human eyes. When each VP receives the light waves from all projectors, it spreads them over the spatial angle $\Omega=\sum_n\omega_n$ in the horizontal direction. Thus the intensity and color information from all projectors is modulated and recovered by the HFS.
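As a rough design illustration, the following sketch evaluates $\omega_n=\arctan(\Lambda/z_0)$ from Section 2, the synthesized horizontal angle $\Omega$ in its one-dimensional (HPO) form, and the VP grid for a screen of the size reported in Section 4. The camera spacing $\Lambda$, the pickup distance $z_0$, and the VP side $D_n$ are assumed example values rather than the authors' design parameters.

```python
import numpy as np

# Rough VP design numbers for the HPO case.  Lambda, z0 and D_n are assumed;
# only the 64 views, the 1.8 m x 1.3 m screen and the ~45 deg viewing angle
# correspond to the system described in Section 4.
Lambda = 0.05                 # spacing between neighbouring cameras (m), assumed
z0 = 4.0                      # object-to-camera-array distance (m), assumed
N = 64                        # number of horizontal views (as in Section 4)

omega_n = np.arctan(Lambda / z0)   # per-view angle, omega_mn = arctan(Lambda/z0)
Omega = N * omega_n                # synthesized visual angle, Eq. (6) in 1-D form
theta_h = omega_n / 2              # horizontal diffusing angle of one VP, Fig. 3(b)

D_n = 1e-3                         # VP side length D_n (m), assumed
W, H = 1.8, 1.3                    # screen size (m), as reported in Section 4
J, K = round(W / D_n), round(H / D_n)

print(f"omega_n = {np.degrees(omega_n):.2f} deg, Omega = {np.degrees(Omega):.1f} deg")
print(f"horizontal diffusing angle per VP: {np.degrees(theta_h):.2f} deg")
print(f"VP grid: {J} x {K} volume pixels")
```

With these assumed values, $\Omega$ comes out close to the 45° total horizontal visual angle reported for the fabricated screen in Section 4.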

Fig. 3 Diagram of the HFS modulation light beams (a) and realizing sub-HML (b).

In addition, the holographic micro-lens (HML) and the holographic micro-cylinder lens (HMCL) can also spread beams and synthesize light waves, so we considered implementing the HFS with them as well. Through theoretical analysis and preliminary experiments, we found that the HML method is suitable for full-parallax 3D display, while the HMCL method can be applied to HPO 3D display.

4. Experiment and analysis

To demonstrate the proposed scheme and simplify the experimental system, an HPO 3D display system with 64 color digital cameras and color projectors is constructed. Each digital camera or projector has a resolution of 480 × 640 pixels. An HFS is designed according to Eqs. (5) and (6) and realized with the DLS method, owing to its simpler design and fabrication compared with the HML and HMCL methods. The setup for manufacturing the VPs is based on digital holographic technology and is shown in Fig. 4 [22]. The VP parameters ($D_n$, $\omega_n/2$, $f'$) are realized by selecting a proper diaphragm aperture at D and adjusting the devices in the setup. Under computer control, a VP master on a photoresist recording medium is first fabricated, and the HFS is then replicated by embossing holography. HFSs of different sizes can also be produced according to the requirements of the reconstructed 3D display.

Fig. 4 The diagram of the setup to fabricate the DLS screen, where Li (i = 1,2,3)-Lens, S-Shutter, DP-Diffusion plate, D-Diaphragm, RP-Recording plate, BF-Brace flat, SD-Shutter drive, DPD-Diffusion plate drive, DD-Diaphragm drive, BFSD-Brace flat shifting drive, CH-Control switch, 1-Optics system and 2-Service system.

The fabricated HFS has a total horizontal visual angle of 45° and a size of 1.8 × 1.3 m². When it is placed in the plane near the point R', two groups of pictures of the displayed 3D images, a kangaroo and a recorded object, are taken, as shown in Fig. 5. Pictures (a) and (b) are each composed of five photographs taken at five viewing angles from left to right. The recovered 3D images show high definition, rich color, high brightness, and high diffraction efficiency, which may be seen as a reasonable approximation to full-parallax 3D display. The experimental results show that the DLS method can effectively implement a large-size HFS and that the established display system is effective for RLF 3D display.
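As a simple consistency check on these reported numbers, 45° of horizontal visual angle shared among 64 projected views corresponds to an angular pitch of roughly 0.7° per view. The sketch below also converts this pitch into the lateral width of a single view zone at a viewing distance of 3 m, which is an assumed value chosen only for illustration.

```python
import numpy as np

# Quick check on the reported screen: 45 deg of horizontal viewing angle
# shared by 64 projected views.  The 3 m viewing distance is assumed.
Omega = np.radians(45.0)   # total horizontal visual angle (reported)
N = 64                     # number of projectors / views (reported)
view_dist = 3.0            # assumed viewing distance (m)

omega_n = Omega / N                          # angular pitch per view
zone = 2 * view_dist * np.tan(omega_n / 2)   # lateral width of one view at 3 m

print(f"angular pitch per view : {np.degrees(omega_n):.2f} deg")
print(f"width of one view zone : {zone*100:.1f} cm at {view_dist:.0f} m")
```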

Fig. 5 (a) and (b) correspond to the pictures of 3D-object displayed by the HFS (color online).

5. Conclusions

In summary, the HFS in the RLF 3D display system is analyzed in detail. The modulation function of the HFS is derived for the first time, which reveals the basic principle of how it modulates light waves. The DLS method for realizing the HFS is proposed and used to design and fabricate the HFS. 3D display experiments with the fabricated HFS are successfully demonstrated, and the results show that the 3D information of the recorded object can be well recovered. Therefore, the proposed methods of analyzing and realizing the HFS are effective, and the derived modulation function, Eq. (5), is valuable for the design of the HFS. Moreover, the presented results help to implement the RLF 3D display system, which will find many applications such as holographic video.

Acknowledgments

The research was supported by the Fundamental Research Funds for the Central Universities under grant no. 2009RC0414. The authors thank all the staff of AFC Technology Co., Ltd. for their support.

References and links

1. S. A. Benton, ed., Selected Papers on Three-Dimensional Displays (SPIE Optical Engineering Press, 2001).

2. C. B. Burckhardt and E. T. Doherty, “Beaded plate recording of integral photographs,” Appl. Opt. 8(11), 2329–2331 (1969). [CrossRef]   [PubMed]  

3. M. Lucente, “Interactive three-dimensional holographic displays,” Comput. Graph. 31(2), 63–67 (1997). [CrossRef]  

4. L. Huff and R. L. Fusek, “Color holographic stereograms,” Opt. Eng. 19, 691–695 (1980).

5. D. L. Marks and D. J. Brady, “Three-dimensional source reconstruction with a scanned pinhole camera,” Opt. Lett. 23(11), 820–822 (1998). [CrossRef]  

6. N. T. Shaked and J. Rosen, “Multiple-viewpoint projection holograms synthesized by spatially incoherent correlation with broadband functions,” J. Opt. Soc. Am. A 25(8), 2129–2138 (2008). [CrossRef]  

7. M. L. Huebschman, B. Munjuluri, and H. R. Garner, “Dynamic holographic 3-D image projection,” Opt. Express 11(5), 437–445 (2003). [CrossRef]   [PubMed]  

8. http://www.holografika.com/Company/Awards.html

9. D. Abookasis and J. Rosen, “Computer-generated holograms of three-dimensional objects synthesized from their multiple angular viewpoints,” J. Opt. Soc. Am. A 20(8), 1537–1545 (2003). [CrossRef]  

10. C. W. Slinger, C. D. Cameron, S. D. Coomber, R. J. Miller, D. A. Payne, A. P. Smith, M. G. Smith, M. Stanley, and P. J. Watson, “Recent development in computer generated holography: toward a practical electroholography system for interactive 3D visualization,” Proc. SPIE 5290, 27–41 (2004). [CrossRef]  

11. Zebra Imaging, Inc., http://www.zebraimaging.com, 2008.

12. M. Alcaraz-Rivera, J. J. Báez-Rojas, and K. Der-Kuan, “Development of a fully functioning digital hologram system,” Proc. SPIE 6912, 69120S (2008). [CrossRef]  

13. V. M. Bove, “Progress in holographic video displays based on guided-wave acousto-optical device,” in Practical Holography XXII: Materials and Applications, Proc. SPIE 6912, 69120H (2008).

14. H. D. Zheng, Y. J. Yu, and C. X. Dai, “A novel three-dimensional holographic display system based on LC-R2500 spatial light modulator,” J. Light Electron. Opt. 120(9), 431–436 (2009). [CrossRef]

15. P. St. Hilaire, S. A. Benton, M. Lucente, J. Underkoffler, and H. Yoshikawa, “Real-time holographic display: improvements using a multichannel acousto-optic modulator and holographic optical elements,” in Practical Holography V, Proc. SPIE 1461, 254–261 (1991).

16. N. T. Shaked, J. Rosen, and A. Stern, “Integral holography: white-light single-shot hologram acquisition,” Opt. Express 15(9), 5754–5760 (2007). [CrossRef]   [PubMed]  

17. M. A. Klug, C. Newswanger, Q. Huang, and E. Holzbach, “Active digital hologram display,” U.S. Patent 6,859,293 (Feb 22, 2005).

18. S. Tay, P. A. Blanche, R. Voorakaranam, A. V. Tunç, W. Lin, S. Rokutanda, T. Gu, D. Flores, P. Wang, G. Li, P. St Hilaire, J. Thomas, R. A. Norwood, M. Yamamoto, and N. Peyghambarian, “An updatable holographic three-dimensional display,” Nature 451(7179), 694–698 (2008). [CrossRef]   [PubMed]  

19. St. Reichelt, H. Sahm, N. Leister, and A. Schwerdtner, “Capability of diffractive optical elements for real-time holographic displays,” Proc. SPIE 6912, 69120P (2008). [CrossRef]  

20. X. Z. Sang, F. C. Fan, C. C. Jiang, S. Choi, W. H. Dou, C. X. Yu, and D. X. Xu, “Demonstration of a large-size real-time full-color three-dimensional display,” Opt. Lett. 34(24), 3803–3805 (2009). [CrossRef]   [PubMed]  

21. F. C. Fan, S. Choi, and C. C. Jiang, “Use of spatial spectrum of light to recover three-dimensional holographic nature,” Appl. Opt. 49(14), 2676–2685 (2010). [CrossRef]

22. F. C. Fan and S. Choi, “Equipment and method to make digital speckle hologram,” Chinese patent no. ZL200410022193.7.
