
Dynamic depth-of-field projection mapping method based on a variable focus lens and visual feedback


Abstract

Dynamic projection mapping is an interactive display technology that allows multiple viewers to experience augmented reality with the naked eye. However, the fixed and shallow depth-of-field of conventional projector optics limits its potential applications. In this work, a high-speed projection mapping method with dynamic focal tracking based on a variable focus lens is presented. The proposed system comprises a high-speed variable focus lens, a high-speed camera, and a high-speed projector, so that the depth and rotation of the target can be detected and used as feedback to correct the focal length and update the projected content in real time. As a result, the content is projected in sharp focus even on a dynamically moving 3D object. The response time of the high-speed prototype reached around 5 ms, and the dynamic projection range covered 0.5 to 2.0 m.

© 2023 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

Since the last century, projection technology has been regarded as a display method that projects information onto a large screen at a fixed distance. Projection displays are used mainly in classroom education, conferences, movie theaters, and similar scenarios. Projectors are conventionally assumed to project images onto a flat surface in a static and relatively dark environment, and technical innovation has been directed toward higher resolution, higher brightness, wider angles of view, and so on.

Nowadays, projection is no longer limited to flat surfaces; images can also be projected onto irregular surfaces ranging from concave to convex [1]. By changing the projected content, the appearance of an object can take on a new visual form [2]. This technique of dynamically changing an object's visual appearance through projection while preserving its physical shape in the real world is known as projection mapping. Projection mapping achieves an augmentation effect visible to multiple people without glasses or any other worn device, making it one of the typical immersive augmented reality technologies.

Traditional projection mapping projects information onto the surface of a static object. In augmented-environment applications, however, the projection display must interact with dynamic objects. To meet this demand, researchers have developed computational algorithms that project geometrically and photometrically correct images using projector-camera systems [1,3]. The cameras measure the state of the deforming surface, and the algorithms warp the prepared images so that the projected image adapts to the irregular surface. Furthermore, with the help of high-speed vision technology, real-time deformation and movement can be detected, so that latency-free dynamic projection mapping can be realized on a fast-moving object [2,4].

However, the projection focus is conventionally matched to the target distance only during the initial setup. The depth-of-field (DOF) is narrow, and the dynamic response of the projector optics is slow owing to its mechanical focusing mechanism [5]. In the real world, an object may have a depth extent that exceeds the DOF tolerance. Moreover, if projection mapping is performed on a moving object, the target may leave the focal plane of the projection entirely. The narrow DOF and slow response of the projector therefore limit the potential applications of projection mapping, and extending the DOF of projectors is in high demand, especially in dynamic projection mapping applications where the projection objects and/or projectors move within large spaces.

The purpose of this work is to improve the DOF and the dynamic performance of projection mapping for a high-speed moving object. The proposed system combines machine vision, a variable focus unit, and a high-speed projector. Two prototypes are introduced below: version 1 verifies the feasibility of the proposal, and version 2 realizes a millisecond-order response. The dynamic projection range covers 0.5 to 2.0 m. The depth variation and rotation of the moving object are recognized by high-speed visual feedback, the focal plane is adjusted accordingly, and the projected content is updated. As a result, even when the object moves fast, the projected information appears as if printed on the object, without any blur. This improves human interactivity in augmented dynamic projection mapping.

2. Design principle

2.1 Concept

A conventional projector has a fixed focal length that can be tuned by adjusting the lens. However, the limited response speed of traditional lens mechanisms cannot meet the needs of a dynamic, interactive environment. Studies of the human flicker threshold indicate that a refresh rate above 60 Hz is needed for display flicker to remain unnoticeable [6,7], a speed that traditional mechanical focusing cannot reach. Moreover, refocusing on the target requires depth sensing and a feedback loop. The focusing method therefore had to be reconsidered.

A variable focus lens was employed in the optics unit of the projector. A traditional solid lens has a fixed refractive curvature, so its focal length cannot be changed after manufacturing; the typical approach is therefore to move lenses back and forth to focus or zoom. In this paper, a membrane-based liquid-filled variable focus lens is used instead. A transparent liquid fills a chamber sealed by a transparent elastomer membrane, whose surface acts as the refractive surface. Because the membrane is fixed with a circular boundary condition, its deflection profile forms a paraboloid-like surface when a uniform pressure is applied [8,9]. The variable focus lens controls the bending curvature of this refractive surface, so its focal length can be adjusted. Since the response speed of such a lens can reach millisecond order, it was adopted in the following prototypes.
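As a rough illustration of how the membrane curvature sets the focal length, the sketch below applies the lensmaker's equation for a thin plano-convex lens, treating the deflected membrane as a single spherical refracting surface. The refractive index and curvature values are illustrative assumptions, not parameters of the lens used in this work.

```python
# Minimal sketch: thin plano-convex approximation of a membrane-based
# liquid lens, where the deflected membrane is the only curved surface.

def focal_length_mm(radius_mm: float, n_liquid: float = 1.30) -> float:
    """Lensmaker's equation for a thin plano-convex lens: 1/f = (n - 1)/R."""
    return radius_mm / (n_liquid - 1.0)

# Raising the chamber pressure bends the membrane more (smaller R) and
# pulls the focus in; relaxing it (larger R) pushes the focus out.
for R in (30.0, 60.0, 120.0):  # assumed radii of curvature in mm
    print(f"R = {R:6.1f} mm  ->  f = {focal_length_mm(R):7.1f} mm")
```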

When a variable focus lens is driven in a continuous focal-sweep state, an extended-DOF projection can be achieved with edge-enhanced image processing [6]. A volumetric display has also been realized by synchronizing the oscillation of a liquid lens with a DMD-based projector, although that projection was static and non-interactive [10,11]. These results show that the response speed of a variable focus liquid lens is high enough for the control requirements of our proposal.

A variable focus liquid lens can be integrated into an optical system either as an add-on or as an add-in element [9,12-14]. In the add-on configuration, the liquid lens is placed in front of the optical system. This solution is easy to implement, but the exiting light is vignetted by the small aperture of the liquid lens [9,15]. In the following work, the variable focus lens is therefore integrated within the optical unit, close to the aperture plane. This method is more complex, but it yields better projection quality.

A common projector has a refresh rate of about 30-120 Hz, which means the time from image readiness until the image appears on the target is 8.33-33.33 ms [5]. Latency on this scale is noticeable to humans during projection interaction. In the following work, a specialized commercial high-speed projector whose optical unit cannot be removed is employed. To adapt this projector, an add-on variable focus unit was designed, with the liquid lens built into that unit.

Because the object moves at high speed in 3D space, a high-speed camera is used to measure the target's motion. The depth variation of the target must be measured, and a control function built from it. The rotation of the target is also detected, so that the projected content can be adjusted and warped to map onto the target as it varies.

In contrast to a traditional projector, which has a low refresh rate and noticeable latency in projection mapping, the proposed system is built as a low-latency loop from sensing to display. A sketch of the proposed projection system is shown in Fig. 1.

Fig. 1. Sketch of the proposed projection system. A variable focus lens is employed in the optics unit, and a depth sensor detects the target motion, so that the system can control the projection focal length and update the projected content in real time.

2.2 System composition

As shown in Fig. 1, the system comprises four units: a randomly moving target, a depth sensor, a signal-processing unit, and a high-speed projector. The target moves back and forth randomly, and the feedback updates both the focal length of the projection and the projected content. A camera sensor detects the motion of the target, specifically its depth variation. The depth information is sent to the signal-processing unit, which computes an appropriate control command to drive the optics unit of the projector. The measured depth and rotation are also used to update the projected content, which is transferred to the high-speed projector. Two quantities are thus updated by the processing unit: the projection focal length, changed by modulating the variable focus lens, and the projected content. Figure 2 illustrates the information-processing flow of the system.
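To make the loop concrete, a minimal Python sketch of this sense-compute-actuate cycle follows. The hardware interfaces (`read_depth_m`, `set_lens_voltage`, `project_frame`, `render`) are hypothetical placeholders for the depth sensor, the lens controller driven through the D/A board, and the high-speed projector; the voltage mapping simply inverts the calibration of Section 3.2.

```python
import time

def voltage_for_distance(distance_m: float) -> float:
    """Invert Eq. (1), Distance = 8.89 / (Voltage + 4.258) - 0.02958,
    to get the lens command voltage for a target distance."""
    v = 8.89 / (distance_m + 0.02958) - 4.258
    return min(max(v, 0.0), 10.0)  # clamp to the controller's 0-10 V range

def control_loop(read_depth_m, set_lens_voltage, project_frame, render):
    """One low-latency cycle: sense depth -> refocus lens -> update content."""
    while True:
        d = read_depth_m()                         # 1. sense the target depth
        set_lens_voltage(voltage_for_distance(d))  # 2. refocus the liquid lens
        project_frame(render(d))                   # 3. update projected content
        time.sleep(0.002)                          # pace the loop (~500 Hz)
```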

Fig. 2. Process flow of the dynamic focal-tracking projection system. The system comprises four units: a randomly moving target, a depth sensor, a signal-processing unit, and a high-speed projector.

3. Results

3.1 Prototype setup

A photo of experimental prototype version 1 is shown in Fig. 3(a). A high-speed projector (TI DLP LightCrafter 4500, 120 fps) was employed and mounted with an upward tilt to align with the optical axis. As sketched in Fig. 1, the light beam was collected by the first solid lens (focal length = 50 mm) and converged by the second solid lens (focal length = 60 mm). The beam then passed through the variable focus lens (Optotune EL-10-30-Ci-VIS-LD-MV [16]) and was expanded by the last solid lens (focal length = 100 mm). The variable focus lens was driven by a controller (Gardasoft TR-CL180) commanded by an external voltage. The control voltage was computed on a PC and output via a D/A board (Interface LPC-361316) to the lens controller. A stereo camera (Intel RealSense D435), providing a sampling rate of 90 fps, was employed as the depth sensor and placed side by side with the last lens. Because prototype version 1 was built only to verify the feasibility of the proposal, high-speed devices were not yet employed for the projector and the sensor.

Fig. 3. Photos of the proposed system, prototype versions 1 (a) and 2 (b). A high-speed projector, three solid lenses, and a variable focus lens were placed along the optical axis. In version 1 (a), a stereo camera was placed side by side with the last lens and a 120 fps projector was used. In version 2 (b), a high-speed camera was employed with a dichroic mirror and a high-speed projector was used.

3.2 Calibration of the variable focus lens and the projection focal length

A liquid-filled variable focus lens was employed in the proposed system; the curvature of its refractive surface can be controlled, changing its focal length [9,16]. Its optical power is a function of the command voltage. The projection distance covers 2.0 m down to 0.5 m as the control voltage varies from 0.0 V to 10.0 V, which changes the optical power of the variable focus lens. A calibration between the focal distance and the command voltage was conducted: the voltage was increased from 0.0 V to 10.0 V in steps of 0.5 V, and a 2 × 2 monochrome checkerboard was projected so that its edge boundaries could be used to analyze the focusing status. The measured data are plotted as dots in Fig. 4. The Curve Fitting Toolbox in MATLAB was used to find appropriate coefficients, and the resulting function is shown below.

$$\mathrm{Distance} = 8.89 \times \frac{1}{\mathrm{Voltage} + 4.258} - 0.02958$$
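As an aside, the same fit can be reproduced with SciPy instead of the MATLAB Curve Fitting Toolbox; the sketch below fits the rational form of Eq. (1) to synthetic stand-in data, since the measured values are not reproduced here.

```python
import numpy as np
from scipy.optimize import curve_fit

def model(voltage, a, b, c):
    # Rational form of Eq. (1): Distance = a / (Voltage + b) + c
    return a / (voltage + b) + c

voltages = np.arange(0.0, 10.5, 0.5)              # 0.0-10.0 V in 0.5 V steps
distances = 8.89 / (voltages + 4.258) - 0.02958   # stand-in for measurements
distances += np.random.normal(0.0, 0.005, distances.size)  # mock sensor noise

(a, b, c), _ = curve_fit(model, voltages, distances, p0=(9.0, 4.0, 0.0))
print(f"Distance = {a:.3f} / (Voltage + {b:.3f}) + {c:.5f}")
```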

Fig. 4. Curve fit illustrating the relationship between the projection focal length and the voltage applied to the variable focus lens.

3.3 Feasibility confirmation experiment

The target's movement was detected by the stereo camera. In prototype version 1, the target moved mainly along the optical axis and within the center of the camera's field of view. The stereo camera returned a raw image and a depth map, and the average depth value around the target center was used. The dynamic projection range was 0.5 m to 2.0 m. The following experiments were conducted to confirm the focusing performance.
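A minimal sketch of this depth measurement using the pyrealsense2 library is shown below. The stream mode and the size of the averaging window are assumptions for illustration, not the exact settings of the prototype.

```python
import numpy as np
import pyrealsense2 as rs

pipeline = rs.pipeline()
config = rs.config()
# Request a 90 fps depth stream (the mode must be supported by the device).
config.enable_stream(rs.stream.depth, 848, 480, rs.format.z16, 90)
pipeline.start(config)

try:
    frames = pipeline.wait_for_frames()
    depth_frame = frames.get_depth_frame()
    # Convert raw units to meters and average a small central window.
    depth_m = np.asanyarray(depth_frame.get_data()) * depth_frame.get_units()
    h, w = depth_m.shape
    window = depth_m[h // 2 - 5 : h // 2 + 5, w // 2 - 5 : w // 2 + 5]
    target_m = float(np.mean(window[window > 0]))  # ignore invalid zero pixels
    print(f"target depth: {target_m:.3f} m")
finally:
    pipeline.stop()
```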

A white board was used as the target and moved back and forth along the projection depth. Because the focal tracker dynamically recognized the depth variation, the projection focal length was controlled at high speed, within about 10 ms. A 2 × 2 monochrome checkerboard was projected so that its edge boundaries could be used to analyze the focusing status. At the same time, as an example of animated content, the measured distance from the projector to the target was displayed in the middle of the projected image and updated in real time.

In the confirmation experiment, the whiteboard was moved back and forth. Three conditions were compared: two projections with the focus fixed at the far and near ends of the range, and one projection with the proposed system.

The results show that as the target moved from 0.5 m to 2.0 m, the proposed system kept the projected information well focused on the screen, with the varying distance readout updating in real time as animated content (right column of Fig. 5). For comparison, when the focus was fixed at 2.0 m, the projection was sharp with the target at 2.0 m, acceptable at 1.5 m and 1.0 m, but blurred at 0.5 m (left column of Fig. 5). Likewise, with the focus fixed at 0.5 m, the projection blurred when the target was at 1.0 m, 1.5 m, and 2.0 m (middle column of Fig. 5). The experiment was recorded on video, and a series of captured frames is shown in Fig. 5. A video of the experiment is included in Visualization 1.

Fig. 5. Comparison between fixed-focus projection and the proposed system. The left column shows the projected images with the focus fixed at 2.0 m, and the middle column shows them with the focus fixed at 0.5 m. The photos in the right column show that the system tracked the depth variation of the target and projected well-focused information between 0.5 m and 2.0 m (Visualization 1).

3.4 High-speed prototype

The second version of the high-speed focal-tracking projection system is shown in Fig. 3(b). This version incorporates high-speed vision and a high-speed projector to speed up the feedback and the updating of the projected images. The optics used the same setup as the first prototype, described in Section 3.1. A sketch of prototype version 2 is shown in Fig. 6. In detail, a high-speed projector (DynaFlash, Tokyo Electron Device) was employed [2,3], which can update its projected image at 1000 fps. The depth and rotation of the whiteboard were tracked by a high-speed camera (Basler acA720-520um) capturing at 500 fps, based on markers consisting of infrared LEDs embedded behind the whiteboard. A dichroic mirror was placed at 45° to the optical axis, so that the visible projection beam passed through the mirror while the IR light was reflected toward the high-speed camera for detection. The layout of the markers is shown in Fig. 6(a). The object's distance and rotation can be detected within 2 ms by this high-speed visual feedback, and the results serve to control the focal length and update the projected image. As a result, well-focused dynamic projection mapping was achieved even when the motion involved large depth changes. Videos of the experimental results are included in Visualization 2 (projection on a rotating object) and Visualization 3 (projection of 3D slicing data).
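A plausible sketch of the marker localization step follows: the IR LEDs appear to the high-speed camera as bright blobs, so simple thresholding and contour moments suffice to find their centroids. OpenCV is used here, and the threshold value is an assumption.

```python
import cv2
import numpy as np

def detect_markers(ir_image: np.ndarray, threshold: int = 200):
    """Return centroids (x, y) of bright IR-LED blobs in a grayscale image."""
    _, binary = cv2.threshold(ir_image, threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for contour in contours:
        m = cv2.moments(contour)
        if m["m00"] > 0:  # skip degenerate blobs
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids
```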

Fig. 6. (a) Sketch of high-speed prototype version 2. A high-speed projector and a high-speed camera were employed. (b) The series of content to be projected. (c) The experimental results confirm that the proposed system always projects a well-focused, well-mapped image onto the target. (d) Volumetric CT data projected without blur in a virtual 3D space (Visualization 2 and Visualization 3).

Because both the rotation and the depth of the target can be detected, this function is well suited to AR (augmented reality) projection. For example, while a person jogs on a treadmill, an AR projection could place a beating heart on their T-shirt. An animated beating-heart image was used in this demonstration, shown in Visualization 2, and a series of screenshots is shown in Fig. 6(c). The proposed system detects the homography transformation from the camera and computes the projection image based on that transformation; in turn, a well-mapped warped image is projected onto the object.
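The warp step can be sketched with OpenCV as follows: a homography from the content corners to the tracked marker positions is estimated, and the content is pre-warped into the projector frame. The marker coordinates and image sizes below are hypothetical values for illustration.

```python
import cv2
import numpy as np

# Synthetic stand-in for one frame of the animated content.
content = np.full((400, 600, 3), 255, dtype=np.uint8)
h, w = content.shape[:2]

# Content corners mapped to the tracked marker positions (pixels in the
# projector image; hypothetical values).
src = np.float32([[0, 0], [w, 0], [w, h], [0, h]])
dst = np.float32([[112, 90], [830, 120], [805, 610], [95, 585]])

H, _ = cv2.findHomography(src, dst)                    # estimate the homography
warped = cv2.warpPerspective(content, H, (1280, 800))  # projector resolution
```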

Slice images from 3D CT data were used in a second demonstration. The traditional way to view 3D data is on a 2D monitor, scrolling through slices with the mouse wheel. In this experiment, the 3D data were projected into a virtual 3D space, and the viewer could see each cross-sectional image by moving the board; the proposed system thus offers another way to view 3D data. A demonstration video is included in Visualization 3. The series of content to be projected is shown in Fig. 6(b), and the experimental result in Fig. 6(d). In detail, three screenshots of the demonstration show that the proposed projection remained well focused on the screen, whereas the images of a traditional projection became blurry when the whiteboard moved out of the DOF.
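A minimal sketch of the slice selection is shown below: the board's measured depth is mapped linearly onto a slice index of the volume, so that sweeping the board through the working range sweeps through the data. The volume here is a random placeholder array.

```python
import numpy as np

volume = np.random.rand(256, 512, 512)  # placeholder CT volume (slices, H, W)
NEAR_M, FAR_M = 0.5, 2.0                # working range of the prototype

def slice_for_depth(depth_m: float) -> np.ndarray:
    """Map the board's depth linearly onto a slice index of the volume."""
    t = (np.clip(depth_m, NEAR_M, FAR_M) - NEAR_M) / (FAR_M - NEAR_M)
    return volume[int(round(t * (volume.shape[0] - 1)))]

cross_section = slice_for_depth(1.2)    # cross-section with the board at 1.2 m
```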

4. Discussion and conclusion

In the proposed system, a projection target moving randomly in 3D space is detected by a high-speed camera, whose visual feedback controls the projection focal length by changing the focal length of the liquid lens, while the projected content is updated on a high-speed projector. The high-speed projector ran at 1000 fps, the high-speed camera at 500 fps, and the liquid lens's response time was about 2.5 ms according to its technical report [16]. The overall response time of the high-speed prototype reached around 5 ms, and the dynamic projection range covered 0.5 to 2.0 m.

The system is also robust to a whiteboard that suddenly enters the projection area at a random distance. Compared with projecting a volumetric image from data pre-stored in the memory of a DMD-based projector while oscillating a liquid lens, the proposed system captures the object's variation at high speed and correspondingly changes the focal length and updates the projected content. This is a significant improvement in interactivity for projection mapping.

In the prototype, the depth-of-field of the projection is controllable, and the projection field of view is 18° × 12°. Although the projector has a resolution of 1280 × 800, the target occupies only part of the projection range, so the user sees only part of that resolution. This reduced effective resolution degrades the viewing experience; it could be improved with a steerable or zoomable optical system so that the full resolution lands on the target. Likewise, if a pan-tilt mechanism controlling the optical gaze direction were implemented, the projection field of view could be enlarged [17,18].

The present setup was built to confirm the feasibility of the proposal. In particular, although the variable focus liquid lens was integrated into the optical system as an add-in element, the exiting light was still vignetted by its small aperture. Aberrations, such as spherical and chromatic aberration, were noticeable near the image boundary, but we believe they could be reduced with an appropriate optical design and a large-aperture liquid lens [8,19,20].

Funding

Accelerated Innovation Research Initiative Turning Top Science and Ideas into High-Impact Values (JPMJAC15F1); Guangdong Academy of Sciences (2021GDASYL-20210102006); Natural Science Foundation of Guangdong Province (2021A1515012596, 2021B1515120064).

Acknowledgments

The authors give thanks for the support of Digital Content Association of Japan (DCAJ) and the Special Interest Group on Computer Graphics of Association for Computing Machinery (ACM). The authors also acknowledge the partial support from Konica Minolta Inc.

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. A. Grundhöfer and D. Iwai, “Recent advances in projection mapping algorithms, hardware and applications,” Comput. Graph. Forum 37(2), 654–675 (2018). [CrossRef]

2. L. Miyashita, Y. Watanabe, and M. Ishikawa, “MIDAS projection,” ACM Trans. Graph. 37(6), 1–12 (2018). [CrossRef]  

3. S. Tabata, S. Noguchi, Y. Watanabe, and M. Ishikawa, “High-speed 3D sensing with three-view geometry using a segmented pattern,” in 2015 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (IEEE, 2015), pp. 3900–3907.

4. Y. Mikawa, T. Sueishi, Y. Watanabe, and M. Ishikawa, “VarioLight,” in SIGGRAPH Asia 2018 Emerging Technologies (ACM, 2018), pp. 1–2.

5. R. Fischer, B. Tadic-Galeb, P. Yoder, and R. Galeb, Optical System Design Second Edition (McGraw-Hill, 2008).

6. D. Iwai, S. Mihara, and K. Sato, “Extended Depth-of-Field Projector by Fast Focal Sweep Projection,” IEEE Trans. Visual. Comput. Graphics 21(4), 462–470 (2015). [CrossRef]  

7. Y.-P. Wang, S.-W. Xie, L.-H. Wang, H. Xu, S. Tabata, and M. Ishikawa, “ARSlice: Head-Mounted Display Augmented with Dynamic Tracking and Projection,” J. Comput. Sci. Technol. 37(3), 666–679 (2022). [CrossRef]  

8. L. Wang, H. Oku, and M. Ishikawa, “Variable-focus lens with 30 mm optical aperture based on liquid–membrane–liquid structure,” Appl. Phys. Lett. 102(13), 131111 (2013). [CrossRef]  

9. H. Ren and S.-T. Wu, Introduction to Adaptive Lenses (Wiley, 2012).

10. J.-H. R. Chang, B. V. K. V. Kumar, and A. C. Sankaranarayanan, “Towards multifocal displays with dense focal stacks,” ACM Trans. Graph. 37(6), 1–13 (2018). [CrossRef]  

11. K. Rathinavel, H. Wang, A. Blate, and H. Fuchs, “An Extended Depth-at-Field Volumetric Near-Eye Augmented Reality Display,” IEEE Trans. Visual. Comput. Graphics 24(11), 2857–2866 (2018). [CrossRef]  

12. H. Ren, H. Xianyu, S. Xu, and S.-T. Wu, “Adaptive dielectric liquid lens,” Opt. Express 16(19), 14954 (2008). [CrossRef]  

13. G. Li, D. L. Mathine, P. Valley, P. Äyräs, J. N. Haddock, M. S. Giridhar, G. Williby, J. Schwiegerling, G. R. Meredith, B. Kippelen, S. Honkanen, and N. Peyghambarian, “Switchable electro-optic diffractive lens with high efficiency for ophthalmic applications,” Proc. Natl. Acad. Sci. 103(16), 6100–6104 (2006). [CrossRef]  

14. L. Li, L. Xiao, J.-H. Wang, and Q.-H. Wang, “Movable electrowetting optofluidic lens for optical axial scanning in microscopy,” Opto-Electron. Adv. 2(2), 18002501 (2019). [CrossRef]  

15. H. Zappe, Tunable Micro-Optics (Cambridge University, 2016).

16. “Optotune Homepage,” http://www.optotune.com/.

17. Y. Watanabe, H. Oku, and M. Ishikawa, “Architectures and applications of high-speed vision,” Opt. Rev. 21(6), 875–882 (2014). [CrossRef]  

18. M. Ishikawa, “High-Speed Vision and its Applications Toward High-Speed Intelligent Systems,” J. Robot. Mechatronics 34(5), 912–935 (2022). [CrossRef]  

19. L. Wang, H. Oku, and M. Ishikawa, “An improved low-optical-power variable focus lens with a large aperture,” Opt. Express 22(16), 19448–19456 (2014). [CrossRef]  

20. L. Wang and M. Ishikawa, “Dynamic response of elastomer-based liquid-filled variable focus lens,” Sensors 19(21), 4624 (2019). [CrossRef]  

Supplementary Material (3)

Name       Description
Visualization 1       A comparison experiment between fixed-focus projection and the proposed system. The left column shows the projected images with the focus fixed at 2.0 m, and the middle column shows them with the focus fixed at 0.5 m.
Visualization 2       Projection mapping on a rotating object. A high-speed projector and a high-speed camera were employed. The result confirms that the proposed system always projects a well-focused image.
Visualization 3       3D data projection mapping. A high-speed projector and a high-speed camera were employed. The result confirms that the proposed system always projects a well-focused image.
