
Pixel-density and viewing-angle enhanced integral 3D display with parallel projection of multiple UHD elemental images

Open Access

Abstract

This paper presents an integral three-dimensional (3D) display that efficiently enhances both the pixel densities and viewing angles of 3D images with parallel projection of elemental images. In the proposed method, ultra-high-definition (UHD) elemental images are projected and superimposed as parallel light rays from densely arranged compact UHD projectors onto a lens array. Three-dimensional images with enhanced pixel densities and viewing angles can be displayed by optimizing the projector positions and system design. The prototype yielded a horizontal pixel density of 63.5 ppi, approximately 97,000 pixels, and a viewing angle of approximately 30°, making it superior to previous integral 3D display systems.

© 2020 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

The integral three-dimensional (3D) display method is an autostereoscopic display method based on integral photography [1], which was invented as a 3D photography technique by Lippmann in 1908. This method reconstructs an optical image of objects in space by faithfully reproducing the light rays from the objects. Viewers can perceive natural 3D images with full and smooth motion parallax without the need for glasses. Consequently, applications in various fields such as 3D television, medicine, education, and advertisement are expected. In a typical integral 3D display method, elemental images are displayed on a display device and a two-dimensional (2D) micro lens array consisting of small convex lenses is placed in front of the display device. The distance between the lens array and the display plane of the elemental images is set to the focal length of the lens array. Light rays emitted from each pixel of the elemental images are collimated by the elemental lenses and refracted in predetermined directions. These high-density light rays form an optical image of the objects in space. A liquid crystal or organic light-emitting diode display is generally used as the display device. A pinhole or point-light-source array can also be used instead of a lens array. Various display techniques based on the integral 3D display method have been reported [2–7].

An issue in integral 3D display methods is the difficulty of enhancing the display characteristics of 3D images, such as their pixel densities and viewing angles. In typical integral 3D display methods, the number of pixels in a 3D image equals the number of lenses in the lens array. Therefore, both an increased number of lenses and a decreased lens size are required to enhance the pixel density of a 3D image while keeping the same display size. However, decreasing the lens size significantly degrades the resolution characteristics of 3D images due to diffraction [8,9]. Additionally, the display device must have a large number of pixels and a high pixel density. In conventional research, 3D display systems with approximately 100,000 pixels and horizontal pixel densities of approximately 35 pixels per inch (ppi) have been reported using an ultra-high-definition (UHD) display device with full 8K resolution [5]. By using image information exceeding 8K resolution, the pixel density of a 3D image can be enhanced further; however, few display devices with more pixels than 8K exist. Therefore, when using one display device, it is difficult to reproduce high-quality 3D images. To solve these issues and enhance the display characteristics of 3D images, methods involving multiple direct-view display panels [10–12] or the application of time-division techniques to direct-view display panels [13,14] have been reported. The former methods, using multiple direct-view display panels, can enhance the display size or pixel density of a 3D image by combining elemental images or displayed 3D images using a dedicated optical system [10,11] or by superimposing displayed 3D images using a half mirror [12]. The latter methods, using time-division techniques, can enhance the pixel density or viewing angle of a 3D image by controlling a mask layer [13] or a point-light-source backlight [14] at high speed while synchronously reproducing the corresponding elemental images. However, in these methods, the side lobe occurring in the displayed 3D images restricts the viewing zone. This side lobe occurs in integral 3D display methods using direct-view display panels because each elemental image is incident not only on the corresponding elemental lens but also on adjacent elemental lenses. Consequently, 3D images are repeatedly displayed outside the correct viewing zone. When superimposing 3D images with different viewing zones, multiple images are seen because of the side lobe. Therefore, in methods using multiple direct-view display panels, the viewing-zone direction and viewing angle of the 3D image reproduced by each display panel must be made almost the same. Similarly, in methods using time-division techniques, the viewing-zone direction and viewing angle of the 3D image reproduced in each frame must be made almost the same. In summary, in integral 3D display methods using direct-view display panels, the viewing zones of the displayed 3D images need to be designed considering the side lobe.

In contrast, when a projector is used as the display device and elemental images are projected directly onto a lens array, an integral 3D image can be reproduced without the side lobe because each projected elemental image is incident only on the corresponding elemental lens. Taking advantage of this characteristic, methods of enhancing the display characteristics of 3D images using multiple projectors have been proposed [15–18]. In these methods, each projector projects elemental images onto a lens array at different angles, and the reproduced 3D images with different viewing zones are superimposed or combined. For example, Fig. 1(a) shows a method of enhancing the pixel density of a 3D image. In this method, multiple projectors placed close to each other generate multiple condensed light spots, which become the pixels of a 3D image, in each elemental lens. A display system that generates 12 condensed light spots in each square elemental lens with a side length of 2.54 mm has been reported [15]. However, with this conventional method, the pixel density of a 3D image cannot be enhanced efficiently because the position of each condensed light spot is not controlled and the intervals between the condensed light spots are uneven. In addition, the size of each condensed light spot differs because the focal plane of each elemental image is tilted with respect to the lens array. Furthermore, the system adjustment and image processing operations are complicated because the elemental images need to be generated based on ray tracing after actually measuring the positions of the condensed light spots. On the other hand, with the method shown in Fig. 1(b), the viewing angle of a 3D image can be enlarged by combining the viewing zones continuously [17]. However, in this method, the pixel density of a 3D image is not enhanced and a coarse 3D image is obtained because the number of condensed light spots in each elemental lens cannot be increased. As mentioned above, conventional methods using multiple projectors aim at enhancing either the pixel density or the viewing angle of a 3D image and cannot enhance both characteristics efficiently. In addition, the 3D image display quality is not high because no technique existed for projecting multiple elemental images with resolutions exceeding full 4K.


Fig. 1. Conventional integral 3D display with multiple projectors for enhancing (a) pixel density and (b) viewing angle.


Therefore, we propose a novel integral 3D display method that enhances both the pixel density and the viewing angle efficiently. In the proposed method, multiple condensed light spots are generated at equal intervals in each elemental lens by projecting elemental images onto a lens array as collimated light rays at predetermined angular intervals and by optimizing the installation positions of the projectors and the design of the display optical system. The display optical system consists of a collimator lens and a lens array, as in the conventional method [17] described above. In the proposed method, the projectors are placed close to each other and their positional relationship with the collimator lens is adjusted so that the viewing zones of the reproduced 3D images partially overlap. Therefore, although the effect of enlarging the total viewing angle is smaller than in the conventional method [17], the pixel density is enhanced in the overlapping viewing zones. Consequently, a 3D image can be reproduced with the highest pixel density at the central viewpoint and an enlarged viewing zone at the peripheral viewpoints.

To enhance the display characteristics of 3D images using the proposed method, parallel projection of high-definition elemental images onto a lens array at narrow angular intervals is important. In addition, a highly accurate technique for correcting the projection position errors of elemental images caused by projection distortions and installation errors is needed. We first set the target values of the 3D image display characteristics, and then developed both dedicated compact full 4K projector units with a lens-shift function and a highly accurate automatic elemental image correction method using 3D markers to achieve these values. We constructed a prototype of the display system using six projector units based on the proposed method and obtained a 3D image with a horizontal pixel density of 63.5 ppi, approximately 97,000 pixels, and viewing angles of 32.8° and 26.5° in the horizontal and vertical directions, respectively. This pixel density is higher than those achievable with previously developed integral 3D display systems. This paper presents in detail the principle of the proposed method, the design policy of the display system, the elemental technology developments, and the experimental evaluation results of the display characteristics of the prototype display system.

2. Proposed integral 3D display with parallel projection of multiple elemental images

2.1 Basic principle

Figure 2 shows the basic configuration of the proposed system. The display system consists of multiple projectors, a collimator lens, and a lens array. The projectors are placed at the focal distance F of the collimator lens. Elemental images emitted from each projector are incident on the corresponding elemental lenses as collimated light rays at different angles and are condensed at the focal distance f of the lens array. Each condensed point is observed as a light spot and becomes a pixel constituting a 3D image. Each condensed light spot is generated at a different position according to the projection angle of the elemental image. With the proposed method, the pixel density of a displayed 3D image is enhanced efficiently by generating multiple condensed light spots at equal intervals in each elemental lens. The interval d between condensed light spots is determined by the installation interval D between projectors and the focal lengths of the lenses, as follows:

$$d = \frac{f}{F}D.$$
For example, to generate three condensed light spots using three projectors in the horizontal direction, as shown in Fig. 2, the installation interval between projectors needs to be set as D = pF/(3f), where p is the lens pitch of the lens array. In the proposed method, the pixel density of a 3D image is enhanced when the 3D image is observed from regions in which the viewing zones overlap. For example, in viewing zone (C) in Fig. 2, the 3D images reproduced by all three projectors are superimposed and the pixel density of the displayed 3D image is tripled. In viewing zones (B) and (D), the 3D images reproduced by two projectors are superimposed and the pixel density of the displayed 3D image is doubled.
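
As a quick numeric illustration of Eq. (1), the sketch below computes the projector interval needed for three spots per lens, using the lens-array and collimator focal lengths quoted later for the prototype (f = 3.4 mm, F = 1000 mm); the variable names are ours, not the paper's.

```python
# Sketch of Eq. (1): relation between projector spacing and condensed-spot
# interval, using values quoted for the prototype in Sections 3 and 4.
f = 3.4        # focal length of the lens array [mm]
F = 1000.0     # focal length of the collimator lens [mm]
p = 1.2        # lens pitch of the lens array [mm]
n_spots = 3    # condensed light spots per elemental lens (horizontal)

d = p / n_spots    # target interval between condensed light spots [mm]
D = d * F / f      # required installation interval between projectors (Eq. (1) rearranged)
print(f"spot interval d = {d:.2f} mm, projector interval D = {D:.1f} mm")
# d = 0.40 mm, D = 117.6 mm (rounded to 118 mm in Section 3.1)
```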


Fig. 2. Basic principle of proposed integral 3D display with parallel projection of multiple elemental images.


In addition, the viewing angle of a 3D image can be enlarged with the proposed method. For example, in the configuration shown in Fig.  2, viewing zones (A) and (E) are enlarged by the left and right projectors compared to when only the central projector is used. The viewing zone of a 3D image reproduced by a projector is as follows:

$$\tan^{-1}\left( \frac{g}{f} - \frac{p}{2f} \right) \le \theta \le \tan^{-1}\left( \frac{g}{f} + \frac{p}{2f} \right),$$
where $\theta$ is the angle in the polar coordinate system when the position of the condensed light spot is translated to the origin of the x–z plane in Fig. 2, and g is the position of the condensed light spot in the x direction when the optical axis of the elemental lens is translated so that it overlaps the z axis. The union of the viewing zones of all of the projectors gives the overall viewing angle φ of the 3D image, as shown in Fig. 2.
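
The following sketch evaluates Eq. (2) for the three horizontal projectors of the prototype described later (p = 1.2 mm, f = 3.4 mm, spot interval 0.4 mm) and takes the union of the per-projector zones; it is an illustration with our own variable names, not the authors' code.

```python
# Sketch of Eq. (2): per-projector viewing zones and their union.
import math

p, f = 1.2, 3.4              # lens pitch and lens-array focal length [mm]
spots = [-0.4, 0.0, 0.4]     # condensed-spot position g for each projector [mm]

zones = [(math.degrees(math.atan((g - p / 2) / f)),
          math.degrees(math.atan((g + p / 2) / f))) for g in spots]
phi = max(hi for _, hi in zones) - min(lo for lo, _ in zones)
for g, (lo, hi) in zip(spots, zones):
    print(f"g = {g:+.1f} mm: {lo:+.1f} deg to {hi:+.1f} deg")
print(f"overall viewing angle phi = {phi:.1f} deg")   # about 32.8 deg
```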

With the proposed method, a 3D image with an enhanced pixel density at the central viewpoint and an enlarged viewing zone at the peripheral viewpoints can be reproduced by generating condensed light spots at equal intervals in each elemental lens. In addition, the elemental images corresponding to the proposed method can be generated easily because the generation positions of the condensed light spots and the viewing zones are uniquely determined. For example, elemental images for reproducing 3D images of 3D computer graphics (CG) models can be generated by obtaining the light rays passing through the positions of the condensed light spots collectively using virtual oblique projection cameras and by reordering the pixels of the captured multi-viewpoint images [19], as sketched below.
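
The reordering step can be summarized as follows. This is a minimal sketch, assuming one orthographic view per reproduced ray direction with one pixel per elemental lens, for the elemental images fed to a single projector; it is not the authors' rendering pipeline, and depending on the optics the view order inside each lens may need to be mirrored.

```python
# Minimal sketch: reorder multi-viewpoint (oblique-projection) images into
# an elemental-image plane for one projector.
import numpy as np

def views_to_elemental_images(views):
    """views: array of shape (Vy, Vx, Ly, Lx, 3), one orthographic view per
    ray direction, with one pixel per elemental lens (Ly x Lx lenses).
    Returns an elemental-image plane of shape (Ly*Vy, Lx*Vx, 3), where pixel
    (vy, vx) inside lens (ly, lx) is taken from view (vy, vx)."""
    Vy, Vx, Ly, Lx, C = views.shape
    # Bring the lens indices in front of the view indices, then flatten.
    return views.transpose(2, 0, 3, 1, 4).reshape(Ly * Vy, Lx * Vx, C)

# Example: 3 x 3 ray directions, 20 x 20 lenses.
ei = views_to_elemental_images(np.zeros((3, 3, 20, 20, 3)))
print(ei.shape)   # (60, 60, 3)
```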

2.2 Resolution limitation

We consider the resolution limitation of a 3D image in the proposed method. The Nyquist frequency and viewing spatial frequency are the indices that represent the resolution limitation when a viewer observes an integral 3D image [8]. Figures  3(a) and 3(b) show the Nyquist frequency and viewing spatial frequency when three condensed light spots are generated using three projectors, respectively. The Nyquist frequency ${\beta _n}$ [cycles per radian (cpr)] is a spatial frequency based on the light rays sampled with the condensed light spots, and is given by

$$\beta_n = \frac{NL}{2p},$$
where N is the number of projectors, p is the lens pitch of the lens array, and L is the viewing distance. In the proposed method, L is the distance between the condensed light spot generation plane and viewer. The viewing spatial frequency $\beta $ [cpr] is mainly based on the light rays sampled with the pixels of the elemental image, and is given by
$$\beta = \frac{\alpha (L - z)}{|z|},$$
where α is the projectable frequency [8] determined by the pixel pitch of the elemental images and characteristics of the lens array. z is the depth distance of a 3D image when the condensed light spot generation plane is set as the origin, and the viewer side direction is positive. When using only one projector, as shown by the black line in Fig.  3(c), the maximum spatial frequency of a 3D image ${\beta _{\textrm{max}}}$ [cpr] is equal to the Nyquist frequency within a predetermined depth range in front of and behind the light spot generation plane ze. When a 3D image is reproduced at a distance exceeding ze, the influence of the viewing spatial frequency becomes dominant, and the maximum spatial frequency decreases as the depth of the 3D image increases. In summary, the maximum spatial frequency of the 3D image is the minimum of the Nyquist frequency and viewing spatial frequency, and is given by
$$\beta_{\max} = \min(\beta_n, \beta).$$
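
As a rough numeric sketch of Eqs. (3)-(5), the snippet below evaluates the maximum spatial frequency at a few depths for one and for three horizontal projectors; the projectable frequency alpha is a placeholder value chosen only for illustration and is not taken from the paper.

```python
# Sketch of Eqs. (3)-(5): maximum spatial frequency versus depth.
def beta_max(z, N, p, L, alpha):
    """Maximum spatial frequency [cpr] at depth z [mm] from the spot plane."""
    beta_n = N * L / (2 * p)            # Eq. (3), Nyquist frequency
    if z == 0:
        return beta_n                   # viewing spatial frequency diverges at z = 0
    beta = alpha * (L - z) / abs(z)     # Eq. (4), viewing spatial frequency
    return min(beta_n, beta)            # Eq. (5)

p, L, alpha = 1.2, 1500.0, 29.0         # lens pitch [mm], viewing distance [mm], assumed alpha [cpr]
for z in (0.0, 10.0, 50.0):
    print(f"z = {z:4.0f} mm: N=3 -> {beta_max(z, 3, p, L, alpha):7.1f} cpr, "
          f"N=1 -> {beta_max(z, 1, p, L, alpha):7.1f} cpr")
```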


Fig. 3. Resolution characteristics: (a) explanation of Nyquist frequency ${\beta _n}$, (b) explanation of viewing spatial frequency $\beta $ and projectable frequency $\alpha $, and (c) maximum spatial frequency ${\beta _{\textrm{max}}}$ of 3D image reproduced by proposed integral 3D display method using N projectors.


When using multiple projectors, the Nyquist frequency ${\beta _n}$ expressed as Eq.  (3) increases. The red line in Fig.  3(c) shows the maximum spatial frequency when the density of condensed light spots is enhanced N times using N projectors. The maximum spatial frequency is enhanced N times within a predetermined depth range in front of and behind the light spot generation plane because the Nyquist frequency is enhanced N times. However, the maximum spatial frequency is not enhanced outside the depth range ze because the viewing spatial frequency $\beta $ expressed as Eq.  (4) is invariant regardless of the density of the condensed light spots. The depth range ze in which the pixel density of a 3D image can be enhanced is given by

$$\frac{2\alpha p L}{2\alpha p - L} < z_e < \frac{2\alpha p L}{2\alpha p + L}.$$
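
To illustrate Eq. (6) numerically, the short sketch below evaluates the depth bounds; the projectable frequency alpha is again an assumed placeholder (roughly matching the ~0.5 cpd ray angular density targeted in Section 3.1), so the resulting range is indicative only.

```python
# Sketch of Eq. (6): depth range over which the Nyquist frequency limits
# the resolution and the pixel-density enhancement is effective.
alpha = 29.0    # assumed projectable frequency [cpr] (placeholder)
p = 1.2         # lens pitch [mm]
L = 1500.0      # viewing distance [mm]

z_behind = 2 * alpha * p * L / (2 * alpha * p - L)   # behind the spot plane (negative)
z_front = 2 * alpha * p * L / (2 * alpha * p + L)    # in front of the spot plane
print(f"{z_behind:.0f} mm < z_e < {z_front:.0f} mm")  # roughly -73 mm to +67 mm
```
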
In the proposed method, based on these principles, it is important to consider the balance between the pixel density and viewing angle and to design the display system so that the display characteristics of 3D images can be effectively enhanced. Section 3 explains the design policy of the display optical system and elemental technology developments.

3. Display system design policy and elemental technology development

3.1 Optical design for reproducing high-pixel-density 3D image

We set the target values of the 3D image display characteristics with the objective of prototyping an integral 3D display system that can be observed by multiple viewers at a viewing distance of 1.5 m, and we then considered the display specifications and the design of the display optical system. First, considering viewer eyesight, we aimed to make the Nyquist frequency in Eq. (3) 30 cycles per degree (cpd) or more. To make the Nyquist frequency 30 cpd, the pixel pitch and pixel density of a 3D image need to be 0.4 mm and 63.5 ppi, respectively. However, when a lens array with a lens pitch of several hundred micrometers is used, the resolution characteristics of 3D images degrade significantly [8,9]. Therefore, we used a lens array with a lens pitch of 1.2 mm and a square arrangement. When this lens array is used, a 3D image with a pixel pitch of 0.4 mm can be reproduced by generating three condensed light spots at equal intervals in each elemental lens using three projectors based on the proposed method.

Next, considering simultaneous viewing by multiple viewers, we set the target values of the display size and viewing angle to approximately 10 inches diagonally and approximately 30°, respectively. In addition, we aimed to make the Nyquist frequency of the pixel pitch of the elemental images [8] approximately 0.5 cpd, which is equivalent to that of conventional integral 3D display systems. This Nyquist frequency indicates the angular density of the reproduced light rays. Considering a configuration that satisfies these conditions, the focal length of the lens array, the number of pixels per projector, and the projection size of the elemental images were set to 3.4 mm, full 4K resolution, and approximately 10 inches diagonally, respectively. In this configuration, both the horizontal and vertical viewing angles become approximately 20° when a 3D image is reproduced by a single projector. The viewing angle can be enlarged to approximately 30° using three projectors in the proposed method.

Based on these designs, we finally considered the focal length of the collimator lens and the installation interval between projectors. In this step, we aimed to keep the depth of the display system to no more than 2 m. To generate condensed light spots at 0.4 mm intervals in each elemental lens with the proposed method, using three projectors and a lens array with a lens pitch of 1.2 mm and a focal length of 3.4 mm, the elemental images need to be projected onto the lens array at a narrow angular interval of 6.7°. Lengthening the projection distance widens the installation interval between projectors but enlarges the display system. Here, we allocated half the depth of the display system to the projector casings and wiring and the remaining space to the projection distance and display optical system. Based on these considerations, we set the focal length of the collimator lens to 1,000 mm. To realize this display system, very compact UHD projectors are required because the installation interval between projectors was calculated from Eq. (1) to be 118 mm. Therefore, we designed and developed dedicated projector units.
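
As a consistency check of these design numbers, the short sketch below evaluates them using the formulas of Section 2; the values are those quoted in the text, and the variable names are ours.

```python
# Consistency check of the Section 3.1 design numbers.
import math

p = 1.2      # lens pitch [mm]
f = 3.4      # lens-array focal length [mm]
F = 1000.0   # collimator focal length [mm]
d = 0.4      # target condensed-spot interval [mm]

angle_step = math.degrees(math.atan(d / f))              # angular interval between projections
D = d * F / f                                            # projector installation interval, Eq. (1)
va_single = 2 * math.degrees(math.atan(p / (2 * f)))     # viewing angle with one projector
va_three = 2 * math.degrees(math.atan((p / 2 + d) / f))  # union of three overlapping viewing zones

print(f"angular interval = {angle_step:.1f} deg, D = {D:.0f} mm")   # ~6.7 deg, ~118 mm
print(f"viewing angle: 1 projector {va_single:.1f} deg, 3 projectors {va_three:.1f} deg")  # ~20 deg, ~32.8 deg
```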

3.2 Development of compact UHD projector unit

In the previous section, we explained that a 3D image with a pixel density of 63.5 ppi and a viewing angle of approximately 30° can be realized by placing three projectors close to each other at 118 mm intervals. Therefore, we newly developed compact UHD projector units to realize this configuration. The appearance and specifications of the projector unit are shown in Fig. 4 and Table 1, respectively. Each projector unit incorporates three liquid crystal on silicon panels with full 4K resolution and a diagonal size of 0.7 inches. To realize a compact casing, red, green, and blue light-emitting diodes are used as the light sources and a small color-synthesizing optical block is applied in the image reading part. Furthermore, by using a high-performance projection lens, a full 4K image with a minimum size of 5.5 inches diagonally (800 ppi) can be projected. The projection direction can be adjusted using the built-in lens-shift function. The width of the projector unit is 115 mm, and multiple projector units can be placed close to each other horizontally. In addition, two projector units can be placed close to each other vertically because a suspended ceiling mount is possible. Because simultaneous viewing by multiple viewers is assumed in the prototype display system, the viewing angle should be wide in the horizontal direction. Therefore, we installed three projector units in the horizontal direction to achieve a horizontal pixel density of 63.5 ppi and a horizontal viewing angle of approximately 30°. Furthermore, we placed two projector units in the vertical direction to enhance the vertical pixel density and viewing angle of the 3D image. Section 4 explains the details of the prototype display system.


Fig. 4. Compact UHD projector unit.



Table 1. Specifications of compact UHD projector unit.

3.3 Development of highly accurate elemental image correction method

This section explains the elemental image correction technique used in the proposed method. In projection-type integral 3D display systems, distortions occur in the displayed 3D images due to errors in the positional relationship between the elemental images and the elemental lenses caused by projection distortions [20]. For example, when the projection position error of each elemental image with respect to the elemental lens is $\Delta X$, as shown in Fig. 5, the reproduction position error $\Delta x$ of the 3D image is given by

$$\Delta x = -\frac{z}{f}\Delta X,$$
where z is the depth of the 3D image and f is the focal length of the lens array. Regarding the required accuracy of the 3D image reproduction position, because the maximum spatial frequency of the 3D image is determined by the Nyquist frequency in Eq. (3), the reproduction position error should not exceed the sampling interval corresponding to this Nyquist frequency. This sampling interval is equal to the interval between condensed light spots. When the sampling interval, 3D image depth, and focal length of the lens array are 0.4 mm, 50 mm, and 3.4 mm, respectively, and these parameters are applied to Eq. (7), the required accuracy of the projection position of each elemental image becomes 24.3 µm. Because the pixel pitch of the elemental images is 57.6 µm when 4K resolution elemental images are projected at a size of 10 inches diagonally, the required accuracy is approximately 0.5 pixels. Hence, each elemental image needs to be aligned with the corresponding elemental lens with subpixel accuracy.
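
The subpixel requirement can be checked with the short sketch below; it takes the 24.3 µm tolerance from the text as given and only converts it into elemental-image pixels (a sketch; the variable names are ours).

```python
# Express the required projection-position accuracy in elemental-image pixels.
import math

diag_in = 10.0                  # projection size of the elemental images [inch]
h_px, aspect = 3840, 16 / 9     # full 4K horizontal pixels, 16:9 aspect ratio
width_mm = diag_in * 25.4 * aspect / math.hypot(aspect, 1)
pixel_pitch_um = width_mm / h_px * 1000          # ~57.6 um, as stated in the text
tolerance_um = 24.3                              # required accuracy from Eq. (7)

print(f"pixel pitch = {pixel_pitch_um:.1f} um")
print(f"required accuracy = {tolerance_um / pixel_pitch_um:.2f} px")   # ~0.4 px, i.e., subpixel
```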


Fig. 5. Reproduction position error of 3D image due to projection position error of elemental images.


We developed an automatic correction method using 3D markers to correct the projection position errors of elemental images with high accuracy. The 3D markers consist of multiple circular planar patterns arranged in a 2D array. The configuration of the correction system and the flowchart of the correction process are shown in Figs. 6(a) and 6(b), respectively. In this method, a digital camera is placed in front of the display screen and calibrated. Next, elemental images are projected so that 3D markers of one of the RGB colors are displayed at a predetermined depth, and the reproduction position error of each marker is detected. Because each 3D marker has a mountain-like luminance distribution, its central position can be detected by fitting a curved surface to the captured image. Reverse geometrical correction is then performed using the obtained reproduction position errors so that each 3D marker is reproduced at the correct position. By repeating this process a predetermined number of times while increasing the depth of the 3D markers in steps, the elemental image correction accuracy is improved according to Eq. (7). The same process is performed for all RGB colors, and finally a geometrical correction table of the elemental images is generated.
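
One possible implementation of the surface-fitting step is sketched below: a quadratic surface is fitted around the brightest pixel of a marker and its vertex gives a subpixel center estimate. This is our own illustrative formulation, not the authors' exact procedure, and it assumes the marker peak is not at the image border.

```python
# Sketch: subpixel localization of a marker with a mountain-like luminance
# distribution by fitting a quadratic surface around its peak.
import numpy as np

def marker_center(img, half=5):
    """Return (x, y) of the marker peak in img with subpixel accuracy."""
    py, px = np.unravel_index(np.argmax(img), img.shape)
    win = img[py - half:py + half + 1, px - half:px + half + 1].astype(float)
    ys, xs = np.mgrid[-half:half + 1, -half:half + 1]
    # Least-squares fit of z = a + b*x + c*y + d*x^2 + e*x*y + g*y^2.
    A = np.stack([np.ones_like(xs), xs, ys, xs**2, xs * ys, ys**2], axis=-1).reshape(-1, 6)
    a, b, c, d, e, g = np.linalg.lstsq(A, win.ravel(), rcond=None)[0]
    # Vertex of the fitted surface: solve grad = 0 for the subpixel offset.
    dx, dy = np.linalg.solve([[2 * d, e], [e, 2 * g]], [-b, -c])
    return px + dx, py + dy
```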


Fig. 6. Highly accurate elemental image distortion correction method: (a) basic configuration, (b) flowchart of correction process, and (c) experimental results of distortion correction.


In this method, the depth of the 3D markers is very important. First, the 3D markers are displayed at $z_1$ in Fig. 6(a), which is the lens array plane, and coarse adjustment is performed. Through this process, the projection position of each elemental image is corrected with an accuracy of several pixels. Subsequently, correction with subpixel accuracy is achieved by repeating the fine adjustment while increasing the depth of the 3D markers in steps, as $z_2$ and $z_3$. By performing the correction process separately for each of the RGB colors, the projection position errors of the elemental images due to both chromatic aberration in the optical system and the geometric errors between the RGB images of the projector unit can be corrected together.

Figure 6(c) shows the correction accuracy measured experimentally in the verification display system. In this experiment, elemental images were projected at a size of 7.2 inches diagonally onto the lens array with a lens pitch of 1.2 mm and a focal length of 3.4 mm. There were 11 3D markers in the horizontal direction and six in the vertical direction. Three steps were used for the depth of the 3D markers: −3.4 mm, 20.0 mm, and 50.0 mm. The average projection position error of the elemental images was 7.51 pixels before correction and 0.16 pixels after correction. This result indicates that the projection position errors of the elemental images can be corrected with subpixel accuracy and that a correction accuracy sufficient for the proposed display method can be obtained. This correction process needs to be performed separately for each projector. By placing the digital camera at the central viewpoint, at which the viewing zones of all projectors overlap, the correction can be performed for all projectors automatically without moving the camera.

4. Prototype display system


Fig. 7. Prototype display system with six UHD projector units: (a) appearance of prototype display system and (b) condensed light spots on display screen.



Table 2. Specifications of prototype display system.

Based on the design policy of the optical system and the elemental technologies explained in Section 3, we fabricated a prototype of the display system consisting of six compact UHD projector units, with three in the horizontal direction and two in the vertical direction. The appearance and specifications of the prototype display system are shown in Fig. 7(a) and Table 2, respectively. Each projector is placed so that the interval between projection lenses is 118 mm in both the horizontal and vertical directions, and the elemental images are projected onto the center of the collimator lens at a projection distance of 1,000 mm. The divergence angle of the light rays on the imaging plane 1,000 mm away from the projection lens is 1.5°, and the light rays with this narrow divergence angle are projected onto the collimator lens to reproduce parallel light rays. We accurately adjusted the distances between the projector units and the collimator lens while confirming the spread of the light rays, so that the light rays after passing through the collimator lens would be as close to parallel as possible. In addition, the projection angle was adjusted using the lens-shift function implemented in the projector so that the elemental images would illuminate the collimator lens uniformly without distortion. In the projector unit, deterioration of the resolution in the peripheral area of the projected image is suppressed as much as possible by using an exclusively designed and manufactured high-resolution projection lens. However, slight defocus occurs when the projected image passes through the collimator lens. Therefore, we adjusted the focus of each projector so that the resolution of the 3D image would be highest in the center of the display screen, while confirming the resolution characteristics of the displayed 3D image. We also designed and fabricated an aspheric Fresnel lens as the collimator lens to generate parallel light rays with reduced influence of aberrations. The groove pitch of the Fresnel lens was set to 0.3 mm, which is about half the pixel pitch of the projected elemental images, to prevent the 3D image resolution from decreasing. Each elemental image converted into parallel light rays by the collimator lens was incident on the lens array placed 10 mm from the collimator lens, and a condensed light spot was generated at a position according to the projection angle. Figure 7(b) shows a photograph of the condensed light spots on the display screen. We confirmed that the condensed light spots were generated at equal intervals of 0.4 mm in the horizontal direction and at 0.4 mm or 0.8 mm intervals in the vertical direction. The 3D image display size was 9.2 inches diagonally, and there were approximately 97,000 pixels at the central viewpoint. The viewing angles were 31.5° and 25.8° in the horizontal and vertical directions, respectively.

We developed a dedicated video signal source for the prototype display system. This video signal source consists of three workstations equipped with 3G-SDI frame memory boards. Each workstation can output two channels of full 4K video with a frame rate of 59.94p/60p and a color space of YPbPr 4:2:2, 10 bit. Each 4K video is output as four 3G-SDI signals. By inputting sync signals into each workstation, multiple 4K videos can be played synchronously. The video data are recorded on large-capacity solid state drives built into the workstations, and up to approximately 30 min of video can be played. The video signal source also supports remote control for switching the displayed video from an external personal computer via a local area network. We input full 4K video signals into the six compact UHD projector units using this video signal source.

5. Experimental results

5.1 Resolution characteristics

This section presents the evaluation results of the resolution characteristics of a 3D image in the prototype display system. We displayed 3D images of sine-wave charts with periodic structures in the horizontal direction and measured the horizontal resolution characteristics. In the experiments, a digital camera (Nikon D810) was placed 1,375 mm from the condensed light spot generation plane on the display screen. At this distance, the Nyquist frequency in Eq. (3) becomes 30 cpd when the pixel pitch of the 3D image is 0.4 mm. Raw images were obtained using dcraw ver. 9.2.7 with a linear characteristic (γ = 1). We generated 1,260 types of 3D images of resolution charts in total: 60 types in which the viewing spatial frequency changed from 1 cpd to 30 cpd in 0.5 cpd steps, and 21 types in which the depth distance of the resolution chart changed from −50 mm to +50 mm in 5 mm steps. The resolution charts were captured in two cases, using six projectors and using one projector, and a total of 2,520 images were obtained. For each captured image, we calculated the resolvable spatial frequency, defined as the spatial frequency at which the modulation factor of the resolution chart was 50% or more. Figure 8 shows the maximum resolvable spatial frequency for each depth. The spatial frequency was highest when the 3D image depth was 0 mm, which is the condensed light spot generation plane. The resolvable spatial frequency was enhanced three times when a 3D image was displayed using three projectors in the horizontal direction, compared to when only one projector was used. This enhancement occurred because the pixel density of the 3D image was increased three times and thus the Nyquist frequency in Eq. (3) was improved three times in the horizontal direction. In contrast, when the depth was ±20 mm or more, there was almost no difference between the resolvable spatial frequencies obtained using six projectors and one projector. This lack of difference probably occurred because, as the depth of the 3D image increases, the effect of the projectable frequency [8], mainly determined by the pixel pitch of the elemental images, becomes larger than the pixel density improvement provided by the proposed method. To improve the projectable spatial frequency, it is fundamentally necessary to increase the pixel densities of the elemental images. However, enhancement of the projectable spatial frequency can also be expected by reducing the aberrations and diffraction in the display optical system. For example, in the prototype display system, a Fresnel lens was used as the collimator lens; by replacing it with a convex lens, the projectable frequency could probably be enhanced and the 3D image pixel density improvement provided by the proposed method could be obtained over a wider depth range. We intend to improve the display optical system to display higher quality 3D images.
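
For reference, the modulation factor used in this evaluation can be computed as a Michelson contrast from a linear luminance profile across a chart; the sketch below shows one straightforward way to do so (our own illustration, with percentile-based extrema as an assumption, not the authors' exact evaluation code).

```python
# Sketch: modulation (Michelson contrast) factor of a captured sine-wave chart.
import numpy as np

def modulation_factor(profile):
    """profile: 1-D linear luminance values sampled across the chart."""
    i_max = np.percentile(profile, 99)   # robust maximum
    i_min = np.percentile(profile, 1)    # robust minimum
    return (i_max - i_min) / (i_max + i_min)

def is_resolved(profile, threshold=0.5):
    """A chart is counted as resolved when the modulation factor is >= 50%."""
    return modulation_factor(profile) >= threshold
```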


Fig. 8. Horizontal resolution characteristics of the prototype display system.


5.2 Viewing-zone angle characteristics

Figure 9 shows the viewing zone of the prototype display system at a viewing distance of 1.5 m. The viewing zone has dimensions of 0.9 m horizontally and 0.7 m vertically. The viewing zone was enlarged both horizontally and vertically by using three projectors in the horizontal direction and two in the vertical direction. This approach allows simultaneous viewing by multiple viewers. At the central viewpoint (c), the 3D images reproduced by all of the projector units are superimposed and the pixel density of the perceived 3D image is the highest. This viewing zone has dimensions of 0.2 m horizontally and 0.4 m vertically. At the upper and lower viewpoints, (a) and (e) respectively, the 3D images reproduced by the three horizontal projector units are superimposed and the horizontal pixel density is enhanced. At the left and right viewpoints, (b) and (d) respectively, the 3D images reproduced by the two vertical projector units are superimposed and the vertical pixel density is enhanced. At the four corner viewpoints, the 3D image from a single projector is reproduced. At the other viewpoints, the pixel density in the horizontal or vertical direction, or both, is improved according to the number of projectors whose viewing zones overlap. In this manner, both the pixel density and the viewing angle are enhanced, with the pixel density increasing at viewpoints closer to the center and the viewing angle enlarged at peripheral viewpoints.


Fig. 9. Viewing zone of the prototype display system at a viewing distance of 1.5 m.



Fig. 10. Displayed 3D image from different viewpoints: (a) upper view, (b) left view, (c) center view, (d) right view, and (e) lower view.


Figure 10 shows the results of displaying a 3D image of real objects and observing it from the different viewpoints corresponding to (a)-(e) in Fig. 9. The 3D image could be reproduced with wide viewing angles of 32.8° and 26.5° in the horizontal and vertical directions, respectively. Large motion parallax could be confirmed, for example by the changes in the positional relations between the 3D logo and the mascot character and between the white flower and the lamp. When observing the 3D image while moving the viewpoint, smooth motion parallax could be confirmed without the occurrence of multiple images. Additionally, fine shapes such as flower leaves could be reproduced with high definition. As an issue in the prototype display system, a slight moiré pattern occurred in the displayed 3D image depending on the viewpoint, which is thought to be caused by interference between the periodic structures of the lens array and the Fresnel lens used as the collimator lens. This issue can be solved by using a convex lens instead of a Fresnel lens.

6. Conclusion

We developed an integral 3D display system that can enhance both the pixel densities and the viewing angles of 3D images with parallel projection of multiple UHD elemental images. In this method, condensed light spots that become the pixels of a 3D image are generated at equal intervals in each elemental lens by projecting elemental images onto the lens array as parallel light rays, in a superimposed manner, from multiple projectors placed at optimized positions. We set the target values of the 3D image display characteristics, then designed the display optical system and developed dedicated compact UHD projector units and an elemental image correction technique to achieve these values. The prototype display system consisted of six UHD projector units and could reproduce a 3D image with a horizontal pixel pitch of 0.4 mm, approximately 97,000 pixels, and viewing angles of 32.8° and 26.5° in the horizontal and vertical directions, respectively. The pixel density in the horizontal direction is higher than those achievable with previously developed integral 3D display systems. Additionally, we measured the horizontal resolution characteristics of the prototype display system and confirmed that the resolvable spatial frequency of a 3D image was enhanced three times compared to when it was displayed using a single projector. Furthermore, we displayed 3D images of real objects and confirmed that natural 3D images with smooth and wide motion parallax could be reproduced. With the proposed method, the display characteristics of 3D images, such as the pixel density, viewing angle, and display size, can be adjusted flexibly based on the display optical system design and the projector positions. This advantage is useful for the development and evaluation of 3D television systems. We will improve the display device and the light ray reproduction method to realize higher quality reproduction of 3D images in the future.

Disclosures

The authors declare no conflicts of interest.

References

1. G. Lippmann, “Épreuves réversibles donnant la sensation du relief,” J. Phys. Theor. Appl. 7(1), 821–825 (1908).

2. F. Okano, H. Hoshino, J. Arai, and I. Yuyama, “Real-time pickup method for a three-dimensional image based on integral photography,” Appl. Opt. 36(7), 1598–1603 (1997).

3. H. Arimoto and B. Javidi, “Integral three-dimensional imaging with digital reconstruction,” Opt. Lett. 26(3), 157–159 (2001).

4. F. Okano, J. Arai, K. Mitani, and M. Okui, “Real-time integral imaging based on extremely high resolution video system,” Proc. IEEE 94(3), 490–501 (2006).

5. J. Arai, F. Okano, M. Kawakita, M. Okui, Y. Haino, M. Yoshimura, M. Furuya, and M. Sato, “Integral three-dimensional television using a 33-megapixel imaging system,” J. Disp. Technol. 6(10), 422–430 (2010).

6. J. Park, K. Hong, and B. Lee, “Recent progress in three-dimensional information processing based on integral imaging,” Appl. Opt. 48(34), H77–H94 (2009).

7. X. Xiao, B. Javidi, M. Martinez-Corral, and A. Stern, “Advances in three-dimensional integral imaging: sensing, display, and applications,” Appl. Opt. 52(4), 546–560 (2013).

8. H. Hoshino, F. Okano, H. Isono, and I. Yuyama, “Analysis of resolution limitation of integral photography,” J. Opt. Soc. Am. A 15(8), 2059–2065 (1998).

9. T. Okoshi, “Optimum design and depth resolution of lens-sheet and projection-type three-dimensional displays,” Appl. Opt. 10(10), 2284–2291 (1971).

10. N. Okaichi, M. Miura, J. Arai, M. Kawakita, and T. Mishina, “Integral 3D display using multiple LCD panels and multi-image combining optical system,” Opt. Express 25(3), 2805–2817 (2017).

11. N. Okaichi, M. Kawakita, H. Sasaki, H. Watanabe, and T. Mishina, “High-quality direct-view display combining multiple integral 3D images,” J. Soc. Inf. Disp. 27(1), 41–52 (2019).

12. Y. Kim, J. Jung, J. Kang, Y. Kim, B. Lee, and B. Javidi, “Resolution-enhanced three-dimensional integral imaging using double display devices,” in Proceedings of IEEE Lasers and Electro-Optics Society Annual Meeting Conference (IEEE, 2007), pp. 356–357.

13. Y. Oh, D. Shin, B. Lee, S. Jeong, and H. Choi, “Resolution-enhanced integral imaging in focal mode with a time-multiplexed electrical mask array,” Opt. Express 22(15), 17620–17629 (2014).

14. Y. Kim, J. Kim, J. Kang, J. Jung, H. Choi, and B. Lee, “Point light source integral imaging with improved resolution and viewing angle by the use of electrically movable pinhole array,” Opt. Express 15(26), 18253–18267 (2007).

15. M. Yamasaki, H. Sakai, K. Utsugi, and T. Koike, “High-density light field reproduction using overlaid multiple projection images,” Proc. SPIE 7237, 723709 (2009).

16. M. D. Alam, G. Baasantseren, M. Erdenebat, N. Kim, and J. Park, “Resolution enhancement of integral-imaging three-dimensional display using directional elemental image projection,” J. Soc. Inf. Disp. 20(4), 221–227 (2012).

17. N. Okaichi, M. Miura, H. Sasaki, H. Watanabe, J. Arai, M. Kawakita, and T. Mishina, “Continuous combination of viewing zones in integral three-dimensional display using multiple projectors,” Opt. Eng. 57(6), 061611 (2018).

18. M. A. Alam, K. Kwon, Y. Piao, Y. Kim, and N. Kim, “Viewing-angle-enhanced integral imaging display system using a time-multiplexed two-directional sequential projection scheme and a DEIGR algorithm,” IEEE Photonics J. 7(1), 1–14 (2015).

19. Y. Iwadate and M. Katayama, “Generating integral image from 3D object by using oblique projection,” in Proceedings of IDW (2011), pp. 269–272.

20. M. Kawakita, H. Sasaki, J. Arai, M. Okui, F. Okano, Y. Haino, M. Yoshimura, and M. Sato, “Projection-type integral 3-D display with distortion compensation,” J. Soc. Inf. Disp. 18(9), 668–677 (2010).
