Optica Publishing Group

Interactive floating full-parallax digital three-dimensional light-field display based on wavefront recomposing

Open Access

Abstract

Advanced three-dimensional (3D) imaging techniques can acquire high-resolution 3D biomedical and biological data, but available digital display methods present these data in only two dimensions. 3D light-field displays optically reconstruct realistic 3D images by carefully tailoring light fields, promising a natural and comfortable 3D perception of real objects or scenes. An interactive floating full-parallax 3D light-field display with all depth cues is demonstrated with 3D biomedical and biological data, achieving high efficiency and high image quality. A compound lens-array with two lens elements per unit is designed and fabricated to suppress the aberrations and increase the viewing angle. An optimally designed holographic functional screen recomposes the light distribution from the lens-array. The imaging distortion is decreased from more than 20% to less than 1.9%. A real-time interactive floating full-parallax 3D light-field image with a clear displayed depth of 30 cm is perceived with correct geometric occlusion and smooth parallax within a 45° viewing angle, where 9216 viewpoints are used.

© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Recently, three-dimensional (3D) display technology has attracted much attention, and various efforts have been made to achieve a natural and realistic 3D sense [1–11]. Recent developments in advanced imaging techniques make it possible to acquire 3D imaging data of micro-3D structures, small creatures, and organs at high resolution [12–15]. Unfortunately, current 3D display technology has not met these expectations. A high-quality 3D display requires a large amount of 3D spatial information, along with a suitable mechanism to optically redistribute that information in space uniformly and efficiently, producing a 3D image similar to how humans observe a real 3D scene. Most 3D displays are based on stereoscopic vision [1], which provides two slightly different images with parallax to the viewer's eyes. Stereoscopic 3D displays suffer from the convergence–accommodation conflict, which may cause discomfort and visual fatigue due to the discrepancy between the actual focal distance and the perceived depth. In addition, correct geometric occlusion with smooth full parallax over a large viewing angle is expected, which is difficult to realize with a stereoscopic 3D display. Spatially accurate representations of a 3D image with full parallax and a large viewing angle are difficult to realize because of these complex requirements. Holographic 3D displays have been considered an alternative to current stereoscopic displays, but they remain challenging owing to the unavailability of dynamic devices with high information throughput and to limited information-processing capability [16]. A 3D light-field display supports all depth cues; it can be regarded as reconstructing a vector light field, in which light is distributed with different colors and variable brightness in different directions [8–11].
If the relative direction and intensity information of the light field originating from a 3D scene is recorded, the 3D light-field information of the scene can be recovered by generating beams with the same relative directions and intensities based on the recorded information. Light-field displays can realize a true 3D display for multiple users, providing a natural and comfortable experience of the realistic 3D information of objects or scenes [9–11]; they are normally based on scanning, multiple projectors, or multi-layer liquid crystal panels. However, currently available 3D light-field display methods suffer from low display quality due to the inherent tradeoff among viewing angle, resolution, depth range, and spatial information capacity (viewpoint number).

Here, an interactive floating full-parallax 3D light-field display with 96 × 96 viewpoint perspectives for 3D biomedical and biological data is demonstrated with acceptable resolution. To the best of our knowledge, this is the largest number of viewpoint perspectives used for a 3D light-field display based on wavefront recomposing, which ensures correct 3D perception and occlusion within a 45° viewing angle along both the vertical and horizontal directions.

2. Experimental configuration

Our 3D light-field display configuration is similar to integral imaging (InI), but it differs in several respects. In an InI display, as shown in Fig. 1(a), elemental images are loaded on a two-dimensional liquid crystal (LC) display panel in front of a lens-array. Each pixel of the LC panel generates a conical light-ray bundle through the lens-array, and the intersection of many light-ray bundles produces a 3D optical reconstruction. However, a low-spatial-resolution 3D image is perceived because the pattern of the lens-array is noticeably observable. In addition, when an elemental image is observed through a lens, serious distortion occurs at the observer's eye, as shown in Fig. 1(a), which limits the viewing angle and the continuous displayed depth. Moreover, micro-lenses are normally used in the lens-array to improve the lateral spatial resolution, so the 3D information capacity is small. Different from the traditional InI system, the demonstrated 3D light-field display uses a compound lens to suppress the aberrations and a holographic functional screen [9] to re-modulate the light distribution from the compound lens-array. To increase the viewpoint number and the viewing angle, the compound lens diameter is increased to 9 mm and the pitch of the lens-array is 11.20 mm, which would be unacceptable for InI because the observer's eyes could not resolve the 3D image. In our experimental setup, a continuous and clear 3D light-field image can be perceived, since the light wavefronts are recomposed by the holographic functional screen and the aberrations are well suppressed by the compound lens-array. In the simulation, the observed distorted elemental image is corrected in the observation plane, as shown in Fig. 1(b).
The coded light-field elemental image array on the 27-inch LC panel with a resolution of 5120 × 2880 is generated with the backward ray-tracing technique, where each elemental image consists of 96 × 96 pixels in a matrix format from 96 × 96 viewpoint perspectives. The different perspectives are integrated into a 3D image by the 53 × 30 compound lens-array assisted by the holographic functional screen.
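The interleaving that such a coding ultimately produces can be sketched as follows. The array shapes and the simple mapping (pixel (u, v) of the elemental image behind lens (i, j) sampling viewpoint (u, v) at pixel (i, j)) are illustrative assumptions following the standard integral-imaging convention, not the authors' exact backward ray tracer:

```python
import numpy as np

def code_elemental_array(viewpoint_images):
    """Interleave per-viewpoint renderings into one elemental image array.

    viewpoint_images: shape (V, V, Ly, Lx, C) -- one Lx x Ly rendering
    per (u, v) viewpoint, with C color channels.  Pixel (u, v) of the
    elemental image behind lens (i, j) samples viewpoint (u, v) at
    pixel (i, j); this standard interleaving stands in for the paper's
    per-ray backward ray tracing.
    """
    Vv, Vu, Ly, Lx, C = viewpoint_images.shape
    out = np.zeros((Ly * Vv, Lx * Vu, C), dtype=viewpoint_images.dtype)
    for v in range(Vv):
        for u in range(Vu):
            # Strided assignment places this viewpoint's pixels at the
            # (u, v) offset inside every elemental image at once.
            out[v::Vv, u::Vu] = viewpoint_images[v, u]
    return out
```

For the paper's parameters (V = 96, a 53 × 30 lens-array), the coded array is 5088 × 2880 pixels, which fits on the 5120 × 2880 panel.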


Fig. 1 Comparison of the integral imaging display and the light-field display. (a) The observed discontinuous 3D image and distorted elemental image; (b) the clear 3D image and corrected elemental image.


Light through the lens-array from all pixels in the LC panel illuminates every point on the holographic functional screen. Each point of the holographic functional screen emits multiple light beams of various intensities and colors in different directions in a controlled way, as if they were emitted from the corresponding point of a real 3D object at a fixed spatial position. As shown in Fig. 2, when light beams from the 53 × 30 lens-array simultaneously project 96 × 96 viewpoint perspectives of the 3D object onto one position Hij on the holographic functional screen with a specific spreading function, the light beams are generated in a specifically arranged geometry, and the holographic functional screen performs the necessary optical transformation. To recover the 3D information correctly, it is important to spread the light beams from the lens-array with the right spatial angle ωmn. The output light beams from Hij are the integral result given in Eq. (1). All light through all compound lenses originating from the 96 × 96 viewpoint perspectives contributes to the reconstructed 3D scene without sharp boundaries between viewpoints, so the full-parallax 3D display changes continuously and smoothly across viewing directions.


Fig. 2 Schematic diagram of the holographic functional screen to recompose light.


$$\Omega_{ij}=\sum_{n=1}^{N}\sum_{m=1}^{M}\omega_{mn} \tag{1}$$

The diffusion characteristic of the holographic functional screen is important for precisely recomposing the light from the lens-array according to the system geometry. The holographic functional screen is holographically printed with speckle patterns exposed on a suitable sensitive material [17]. The screen's diffusion angle is determined by the shape and size of the speckles, which are set by controlling the mask aperture, so that angular distributions of the light beams with a diffusion angle close to ωmn are realized. The fully random speckle structure is wavelength-independent and free of chromatic aberration, which enables high transmission efficiency.

The aberrations of the lens-array reduce the reconstruction accuracy of the light beams from the lens-array, which degrades the 3D imaging quality. Thus, the aberrations of the lens-array should be suppressed. A compound lens consisting of two lenses is designed to decrease the aberrations. The damped least-squares method is used to optimize both the primary and the higher-order aberrations [18]. After the aberrations are balanced, an optimized structure and the corresponding parameters are obtained, as shown in Fig. 3(a). As shown in Fig. 3(b), compared with a traditional single lens with the same focal length and diameter, the modulation transfer function of the compound lens is obviously improved, which shows that the aberrations of the compound lens are well suppressed within the designed viewing angle of 45°.
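The damped least-squares update at the heart of such lens optimizers can be sketched as below. The toy two-parameter merit function and the fixed damping factor are illustrative assumptions, not the actual aberration merit function of the compound lens design:

```python
import numpy as np

def damped_least_squares_step(residual_fn, x, damping=1e-2, eps=1e-6):
    """One damped least-squares (Levenberg-Marquardt) update.

    residual_fn maps a parameter vector x to a residual vector (e.g.
    weighted aberration coefficients); the step solves
    (J^T J + damping * I) dx = -J^T r for the parameter change dx.
    """
    r = residual_fn(x)
    # Finite-difference Jacobian of the residuals w.r.t. the parameters.
    J = np.empty((r.size, x.size))
    for k in range(x.size):
        xp = x.copy()
        xp[k] += eps
        J[:, k] = (residual_fn(xp) - r) / eps
    A = J.T @ J + damping * np.eye(x.size)
    dx = np.linalg.solve(A, -J.T @ r)
    return x + dx

# Toy merit function: drive two "aberration" residuals to zero.
f = lambda x: np.array([x[0] - 1.0, 10.0 * (x[1] - 2.0)])
x = np.array([0.0, 0.0])
for _ in range(50):
    x = damped_least_squares_step(f, x)
```

The damping term keeps the normal-equation matrix well conditioned, trading step size for stability when the merit function is far from quadratic.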


Fig. 3 (a) The optimized structure and corresponding parameters for the compound lens, (b) comparison of modulation transfer function for the compound lens and the single lens.


With the recomposing functionality of the holographic functional screen and the aberration-suppressed compound lens-array, the imaging distortion can be decreased. Here, the imaging distortion is calculated as [D(x, y) − D(xI, yI)]/D(xI, yI) × 100% at a given imaging distance, i.e., the percentage difference between the distance of the actual image point (x, y) from the optical axis and that of the ideal image point (xI, yI), where D(xI, yI) = [(xI)² + (yI)²]^(1/2). For correct imaging, a comparison of the imaging distortion at different imaging distances from the lens-array is given in Fig. 4. The distortion for the traditional single lens-array without the holographic functional screen is above 20% at the boundary of the viewing area for the different distances from the lens-array. With the same parameters, the distortion is decreased to less than 1.9% for the compound lens-array with the holographic functional screen over the whole viewing area, which facilitates a correct 3D light-field display with a large viewing angle and a large displayed depth. The holographic functional screen shapes the light distribution and also contributes to the correction of the imaging distortion. In addition, the presence of the holographic functional screen makes the structural pattern of the compound lens-array invisible to observers, so only the clear 3D light-field image is perceived. Comparative experiments are carried out for the light-field display with the two kinds of lens-array. Figure 5(a) shows the blurry 3D image obtained with the traditional single lens-array; with the compound lens-array, a clearer 3D image is achieved, as shown in Fig. 5(b).
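The distortion metric defined above can be written directly. The sample coordinates below are illustrative, not measured values from the experiment:

```python
import numpy as np

def distortion_percent(x, y, x_ideal, y_ideal):
    """Imaging distortion as defined in the text: the difference between
    the actual and ideal image-point distances from the optical axis,
    relative to the ideal distance, in percent."""
    d_actual = np.hypot(x, y)          # D(x, y)
    d_ideal = np.hypot(x_ideal, y_ideal)  # D(xI, yI)
    return (d_actual - d_ideal) / d_ideal * 100.0

# A point imaged at (10.2, 0) mm instead of the ideal (10.0, 0) mm
# shows about 2% outward (pincushion-like) distortion.
d = distortion_percent(10.2, 0.0, 10.0, 0.0)
```

A positive value means the point lands farther from the axis than ideal; a negative value means barrel-like compression toward the axis.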


Fig. 4 Comparison of distortion for two cases with different imaging distances: (a) distortion for the traditional single lens-array without the holographic functional screen, (b) distortion for the compound lens-array with the holographic functional screen.



Fig. 5 (a) The 3D image with the traditional single lens-array, (b) the 3D image with the compound lens-array.


3. Experimental results

In the demonstrated 3D light-field display, light beams emitted from the LC panel pixels are recomposed through the compound lens-array and the holographic functional screen. The distance between the LC panel and the lens-array is 8.3 mm, and the distance between the lens-array and the holographic functional screen is 200 mm; both are determined by the optical parameters of the compound lens-array and the imaging relationship. The coded elemental image array is generated from the mapping relation between elemental images and viewpoint images, based on the backward ray-tracing technique. Intuitively, a large number of viewpoints is required to generate the encoded light-field image and achieve a high-quality floating 3D image over a large viewing angle; here, 96 × 96 viewpoint parallax images are used for each elemental image. When the coded elemental image array is loaded on the LC panel, the correct 3D light-field display is presented to the observers. An example elemental image array and the corresponding 3D light-field display results for a 3D image of a Buddha head are shown in Figs. 6(a) and 6(b).
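A quick consistency check of the information budget quoted in the text (panel resolution, elemental image size, lens count, and viewpoint count):

```python
panel_w, panel_h = 5120, 2880  # 27-inch LC panel resolution
views = 96                     # viewpoints per direction

lenses_x = panel_w // views    # elemental images fitting horizontally
lenses_y = panel_h // views    # and vertically
print(lenses_x, lenses_y)      # → 53 30, matching the 53 x 30 lens-array
print(views * views)           # → 9216 viewpoints, as stated in the abstract
```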


Fig. 6 (a) Coded elemental image array, and (b) its corresponding 3D light-field display results of a 3D image of a Buddha head captured from upper 22°, left 22°, center, right 22°, and lower 22° positions (see Visualization 1).


One important application of the floating full-parallax 3D light-field display is medical and biological analysis and diagnosis. Static and interactive 3D displays of medical data of heart blood vessels with the proposed display are shown in Figs. 7(a) and 7(b). The structure and the relative 3D positions of the different parts can be seen clearly, and more than 30 cm of displayed depth can be perceived. Interactive operations such as rotation and zoom are realized in real time at a frame rate of 25 fps. Full-color structured illumination microscopy can obtain optical sectioning images in 3D with the full natural color of fluorescent specimens of small creatures [15]. With the 3D reconstruction data of a red stained mite with 105 layers at 800 nm axial intervals, the 3D light-field display result observed from different angles is shown in Fig. 8(a) and Visualization 4. Figure 8(b) and Visualization 5 show an interactive display of a mixed pollen grain specimen from 3D reconstruction data with 251 layers at 200 nm axial intervals, which shows the different colors of autofluorescence captured under the same excitation.


Fig. 7 (a) 3D light-field display of static 3D image of heart blood vessels (see Visualization 2), and (b) interactive dynamic 3D light-field display of heart blood vessels (see Visualization 3).



Fig. 8 (a) 3D light-field display of a red stained mite (see Visualization 4), and (b) interactive dynamic 3D light-field display of a mixed pollen grain specimen (see Visualization 5), with multilayer 3D data from full-color structured illumination microscopy.


4. Conclusion

It is normally difficult to realize a full-parallax 3D light-field display with high spatial information capacity and acceptable resolution over a relatively large viewing angle. Here, a real-time interactive floating full-parallax 3D light-field display with 96 × 96 viewpoint perspectives is demonstrated within a 45° viewing angle. The holographic functional screen recomposes the light distribution from the lens-array to achieve a clear and natural 3D light-field image without the influence of the lens-array pattern. Real-time interactive 3D light-field displays of medical and scientific 3D data are realized. We believe the demonstrated interactive full-parallax 3D display system can find many potential applications in medical and scientific areas.

Funding

National Key Research and Development Program (2017YFB1002900); National Natural Science Foundation of China (NSFC) (61575025); BUPT Excellent Ph.D. Students Foundation (CX2016306).

Acknowledgment

We thank Professor Baoli Yao’s Group at Xi’an Institute of Optics and Precision Mechanics for providing the slice data of little creatures acquired by the full-color structured illumination optical sectioning microscopy.

References and links

1. J. Geng, “Three-dimensional display technologies,” Adv. Opt. Photonics 5(4), 456–535 (2013).

2. D. Fattal, Z. Peng, T. Tran, S. Vo, M. Fiorentino, J. Brug, and R. G. Beausoleil, “A multi-directional backlight for a wide-angle, glasses-free three-dimensional display,” Nature 495(7441), 348–351 (2013).

3. X. Li, H. Ren, X. Chen, J. Liu, Q. Li, C. Li, G. Xue, J. Jia, L. Cao, A. Sahu, B. Hu, Y. Wang, G. Jin, and M. Gu, “Athermally photoreduced graphene oxides for three-dimensional holographic images,” Nat. Commun. 6(1), 6984 (2015).

4. H. Yu, K. Lee, J. Park, and Y. Park, “Ultrahigh-definition dynamic 3D holographic display by active control of volume speckle fields,” Nat. Photonics 11(3), 186–192 (2017).

5. P.-A. Blanche, A. Bablumian, R. Voorakaranam, C. Christenson, W. Lin, T. Gu, D. Flores, P. Wang, W.-Y. Hsieh, M. Kathaperumal, B. Rachwal, O. Siddiqui, J. Thomas, R. A. Norwood, M. Yamamoto, and N. Peyghambarian, “Holographic three-dimensional telepresence using large-area photorefractive polymer,” Nature 468(7320), 80–83 (2010).

6. D. E. Smalley, Q. Y. Smithwick, V. M. Bove Jr., J. Barabas, and S. Jolly, “Anisotropic leaky-mode modulator for holographic video displays,” Nature 498(7454), 313–317 (2013).

7. K. Wakunami, P.-Y. Hsieh, R. Oi, T. Senoh, H. Sasaki, Y. Ichihashi, M. Okui, Y.-P. Huang, and K. Yamamoto, “Projection-type see-through holographic three-dimensional display,” Nat. Commun. 7, 12954 (2016).

8. S. Xing, X. Sang, X. Yu, C. Duo, B. Pang, X. Gao, S. Yang, Y. Guan, B. Yan, J. Yuan, and K. Wang, “High-efficient computer-generated integral imaging based on the backward ray-tracing technique and optical reconstruction,” Opt. Express 25(1), 330–338 (2017).

9. X. Sang, F. C. Fan, C. C. Jiang, S. Choi, W. Dou, C. Yu, and D. Xu, “Demonstration of a large-size real-time full-color three-dimensional display,” Opt. Lett. 34(24), 3803–3805 (2009).

10. D. Chen, X. Sang, X. Yu, X. Zeng, S. Xie, and N. Guo, “Performance improvement of compressive light field display with the viewing-position-dependent weight distribution,” Opt. Express 24(26), 29781–29793 (2016).

11. N. Balram and I. Tosic, “Light-field imaging and display systems,” Inf. Disp. 32(4), 2–9 (2016).

12. X. Liu and H. Li, “The progress of light field 3-D displays,” Inf. Disp. 30(6), 6–14 (2014).

13. M. Holler, M. Guizar-Sicairos, E. H. Tsai, R. Dinapoli, E. Müller, O. Bunk, J. Raabe, and G. Aeppli, “High-resolution non-destructive three-dimensional imaging of integrated circuits,” Nature 543(7645), 402–406 (2017).

14. K. Xu, H. P. Babcock, and X. Zhuang, “Dual-objective STORM reveals three-dimensional filament organization in the actin cytoskeleton,” Nat. Methods 9(2), 185–188 (2012).

15. J. Qian, M. Lei, D. Dan, B. Yao, X. Zhou, Y. Yang, S. Yan, J. Min, and X. Yu, “Full-color structured illumination optical sectioning microscopy,” Sci. Rep. 5, 14513 (2015).

16. Y. Zhao, L. Cao, H. Zhang, D. Kong, and G. Jin, “Accurate calculation of computer-generated holograms using angular-spectrum layer-oriented method,” Opt. Express 23(20), 25440–25449 (2015).

17. X. Sang, F. Fan, S. Choi, C. Jiang, C. Yu, B. Yan, and W. Dou, “Three-dimensional display based on the holographic functional screen,” Opt. Eng. 50(9), 091303 (2011).

18. R. Siew, “Practical automated glass selection and the design of apochromats with large field of view,” Appl. Opt. 55(32), 9232–9236 (2016).

Supplementary Material (5)

Visualization 1: 3D light-field display results of a 3D image of Buddha head captured from different positions
Visualization 2: 3D light-field display of static 3D image of heart blood vessels
Visualization 3: Interactive dynamic 3D light-field display of heart blood vessels
Visualization 4: 3D light-field display of a red stained mite
Visualization 5: Interactive dynamic 3D light-field display of a mixed pollen grain specimen
