
On-axis near-eye display system based on directional scattering holographic waveguide and curved goggle

Open Access

Abstract

The trade-off among the field of view (FOV), luminance uniformity (LU), and weight has long restricted the development of augmented reality display systems. An on-axis near-eye display (NED) system based on a directional scattering holographic waveguide (DSHW) and a curved goggle is proposed to realize a large FOV with high LU, light weight, and conformal design capability. The DSHW, which consists of a linear volume holographic grating and a holographic diffuser, delivers the virtual image and constructs a transparent directional-emission display screen with high LU. The curved goggle projects the image on the display screen into the human eye and forms a large FOV, with a suitable exit pupil diameter (EPD) and eye relief distance (ERF), while keeping the external scene visible. Our proposed NED readily achieves an FOV of 44° horizontal (H) × 12° vertical (V), which is almost consistent with the theoretical design. The EPD is 6 mm, the ERF is 18.6 mm, and the LU is about 88.09% over the full viewing angle. The system is lightweight and flexible and can be further applied to next-generation integrated protection-display helmet systems through conformal optical design.

© 2019 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Augmented reality (AR), as a promising technology, has a wide range of applications in the defense, education, medical, industrial, and commercial fields [1,2]. In an AR NED system, the optical imaging system is one of the core hardware components. Many research institutions and companies [3–10], such as Canon, Epson, Meta, ODG, Sony, DigiLens, and Microsoft, are dedicated to developing next-generation AR optical imaging lenses or systems, with approaches including geometric optical elements [11,12], planar optical waveguides [13–17], electronically controlled optical components [18–21], computational holography [22–24], retinal scanning displays (RSD) [25], etc. Among these technologies, NEDs based on holographic waveguides or geometric optical elements are the most attractive.

Holographic waveguides feature light weight, flexibility, and high transparency, which are very attractive to consumers. However, traditional volume holographic waveguides (Sony) have serious drawbacks; the main challenge is that it is difficult to balance the field of view and luminance uniformity of a linear holographic grating according to coupled wave theory [26], which undermines the advantages listed above. In recent years, researchers have tried to solve this thorny problem through various designs. For instance, Yu et al. [27] employed three linear volume holographic gratings with different grating vectors as the out-coupler of a holographic waveguide. However, owing to the discreteness of the multiple grating directions, the exit pupil diameter of their proposed system decreases as the number of gratings increases. Mukawa et al. [28] used an oblique-incidence waveguide method to improve the FOV to some extent, but the FOV is still limited by the refractive index modulation of the material. DigiLens [9] adopted H-PDLC [29] as the holographic medium, which greatly increases the refractive index modulation and angular bandwidth of the holographic grating, but the resolution of H-PDLC is not very high and the displayed image is slightly blurry. Geometrical optical elements, such as convex/concave lenses (ODG), off-axis curved goggles (Meta), and free-form optics (Canon), can achieve a large FOV compared with holographic waveguides. For Meta, the FOV easily reaches 90 degrees by using an off-axis goggle. However, the shortcomings of geometrical optical elements are obvious: they require additional relay lenses to deliver the virtual image and correct off-axis aberrations, which makes the overall system relatively bulky and heavy and the assembly more difficult.

The combination of a holographic waveguide with geometrical optical elements has the potential to integrate the advantages of both. For example, Han et al. [30] used free-form optics and a multiplexed linear volume holographic grating as the in-coupler and out-coupler of a holographic waveguide, respectively, achieving an integrated AR lens design with a large FOV and light weight, but the monolithic manufacture is full of challenges and the multiplexing technique leads to many unwanted stray diffraction orders. Park et al. [31] proposed a Maxwellian near-to-eye display that combines a planar waveguide and a convex lens element recorded in a volume hologram to achieve an RSD-like display, but its exit pupil is a set of discrete points. Maimone et al. [32] also integrated a convex lens recorded in a volume hologram with free-form optics to develop a lightweight AR display prototype with a viewing angle of 80°, but the optical path essentially restricts the exit pupil to a single point, which cannot be used practically.

In this paper, we propose a novel combined on-axis near-eye display system based on a directional scattering holographic waveguide and a curved goggle, featuring a large FOV, high LU, light weight, independently manufacturable components, and conformal design capability. The directional scattering holographic waveguide, designed and fabricated holographically, delivers the virtual image and constructs a transparent directional-emission display screen in front of the human eye. The curved goggle is an optimized dual-aspheric shell placed coaxially in front of the waveguide for projection without off-axis aberrations. Optical experiments demonstrate that our proposed NED readily achieves an FOV of 44° (H) × 12° (V), with a 6 mm EPD, an 18.6 mm ERF, and about 88.09% LU over the full viewing angle, which greatly eases the trade-off among the FOV, LU, and weight of NED systems.

2. System and principles

2.1 System specification

The proposed NED system is illustrated in Fig. 1. It consists of a collimated illumination source, an amplitude-type spatial light modulator (SLM, such as an LCD or LCOS panel) loaded with the virtual image to be displayed, a 4f optical system (L1, L2, and a stop), a DSHW with a linear volume holographic grating H1 and a holographic diffuser H2, and an on-axis curved see-through goggle (STG) functioning as both a protective goggle and a magnifying mirror. The collimated illumination source is spatially modulated by the SLM to form the desired virtual image light. The 4f system blocks the higher diffraction orders caused by the discrete pixel structure of the SLM. The filtered virtual image light is coupled into the DSHW by the in-coupler H1 and propagates along the waveguide under total internal reflection (TIR). When the image light hits H2, each pixel of the image is directionally scattered toward the STG by the out-coupler H2 with a certain diffusion angle to meet the requirements of the EPD and ERF. Finally, the STG projects the image on H2 into the human eye.

Fig. 1 The illustration of our proposed NED.

H1 acts as a relay grating and the waveguide in-coupler; its diffraction function is illustrated in Fig. 2(a). It is recorded by two plane waves. The probe light (i.e., the virtual image light) illuminates H1 at the same incident angle as the recording light, which means that H1 works under Bragg diffraction conditions and each pixel of the virtual image has the same diffraction efficiency. The out-coupler H2 acts as an intermediate image plane and a transparent directional-emission display screen; it is recorded by a plane wave and scattered light. The image transmitted from H1 is mapped onto the surface of H2, and each pixel, which arrives with essentially a single transmission direction, is scattered by H2 into a bundle of divergent light with a direction angle θs and a diffusion angle φs, where θs and φs can be designed holographically, as shown in Fig. 2(b). The STG receives the light emitted from H2, partially reflects the divergent light, and magnifies the image on H2. Figure 3 shows the optical path diagram and some key geometric parameters, where R is the best-fit inner-surface radius of the STG, L is the distance between the vertices of the STG and H2, and W is the height of the image. The EPD and ERF, which determine the eye box of the display system, are calculated from the surface type of the STG, L, θs, φs, and W.

Fig. 2 Schematics of (a) the in-coupler H1 and (b) out-coupler H2.

Fig. 3 The optical path diagram of STG.

2.2 Principles and material

The internal grating structure of H1 is determined by the geometry of its recording beams. Since H1 is recorded by two plane waves, the grating vector is uniform across the hologram. H1 operates under Bragg diffraction conditions, so its diffraction behavior can be described by Kogelnik theory [26] through the following formulas:

$$\eta = \tanh^{2}(\nu) \tag{1}$$
$$\eta = \sin^{2}(\nu) \tag{2}$$
$$\nu = \frac{\pi \Delta n\, d}{\lambda \cos\theta_{p} \cos\theta_{r}} \tag{3}$$
where Eqs. (1) and (2) give the diffraction efficiency η of reflection and transmission gratings, respectively, ν is the coupling variable, d and Δn are the thickness and refractive index modulation of the holographic medium, λ is the wavelength in vacuum, and θp and θr denote the incident angle and diffraction angle of the probe light, respectively. The Bragg mismatch variable of Kogelnik theory is zero in Eqs. (1) and (2) because the gratings are probed at the Bragg condition.
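
As a minimal numerical sketch of Eqs. (1)–(3) in Python, the thickness, index modulation, and angles below are illustrative assumptions rather than the parameters of the fabricated holograms:

import numpy as np

def coupling_constant(delta_n, d, wavelength, theta_p, theta_r):
    """Coupling variable nu of Eq. (3); angles in radians, lengths in metres."""
    return np.pi * delta_n * d / (wavelength * np.cos(theta_p) * np.cos(theta_r))

def efficiency_reflection(nu):
    """Bragg-matched reflection-grating efficiency, Eq. (1)."""
    return np.tanh(nu) ** 2

def efficiency_transmission(nu):
    """Bragg-matched transmission-grating efficiency, Eq. (2)."""
    return np.sin(nu) ** 2

# Illustrative (assumed) parameters only:
nu = coupling_constant(delta_n=0.02, d=15e-6, wavelength=532e-9,
                       theta_p=np.deg2rad(0.0), theta_r=np.deg2rad(45.0))
print(efficiency_transmission(nu), efficiency_reflection(nu))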

The recording beams of H2 are a plane wave and scattered light consisting of a set of plane waves propagating at different angles, each with a random phase, so its internal micro-nano structure is more complex than that of H1 and generally requires a complex mathematical model to be approximated [33]. Although its internal physical properties are difficult to model and calculate accurately, its external optical path can be designed holographically, which gives researchers more design freedom and application areas [34–36] than traditional diffuser components. In our proposed NED system, it is necessary to increase the diffraction efficiency as much as possible, so that more of the image light is scattered toward the STG, while reducing the backscattering [33] of H2 to improve the image contrast.

The EPD and ERF should be designed carefully to meet the requirements of human eye observation. The schematic for calculating the EPD and ERF is shown in Fig. 4, where the entrance pupil is located at infinity, which means the EPD is determined by the edge ray IA of the edge pixel I and the ERF. We assume that the STG is rotationally symmetric; therefore, Fig. 4 depicts only the upper part, where IA is the upper edge ray of I, A is the intersection of IA and the STG, AD is the normal line N at the intersection A, AE is the reflected ray, B is the projection point of A on the optical axis OE, K is the height from A to B, and C is the virtual intersection of IA and OE. The ERF is calculated approximately from H2 to the human eye. The asphericity of the STG, such as a conic or even aspheric surface, does not affect its paraxial characteristics. According to the Gaussian formulas, the position and size of the paraxial virtual image formed by the STG are described as follows:

$$\frac{1}{L'} = \frac{2}{R} + \frac{1}{L} \tag{4}$$
$$Y = \frac{L'}{L}\,W \tag{5}$$
where L′ is the distance of the virtual image from the principal point of the STG and Y is the virtual image size. The vertical field of view (VFOV) can be expressed as:
$$\mathrm{VFOV} = 2\arctan\!\left(\frac{Y}{2\,(L+L')}\right) \tag{6}$$
When L is equal to the focal length of the inner surface of the STG, the vertical field of view simplifies to:
$$\mathrm{VFOV} = 2\arctan\!\left(\frac{W}{R}\right) \tag{7}$$
The expression for the horizontal field of view is analogous to Eq. (7) and is limited by the image size in the other direction.
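
As a quick paraxial check of Eq. (7), the short Python sketch below evaluates the simplified FOV formula in each direction, taking the spherical-shell radius R = 45 mm and the 18 mm × 4.8 mm image extent on H2 from Sections 2.2 and 3.2; the results are close to the measured 44° (H) × 12° (V):

import math

def fov_paraxial(image_extent_mm, R_mm):
    """Eq. (7): full FOV when H2 sits at the focal plane of the STG inner surface
    (L = R/2), applied to the image extent in one direction."""
    return 2.0 * math.degrees(math.atan(image_extent_mm / R_mm))

print(f"HFOV ~ {fov_paraxial(18.0, 45.0):.1f} deg")   # ~43.6 deg
print(f"VFOV ~ {fov_paraxial(4.8, 45.0):.1f} deg")    # ~12.2 deg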

Fig. 4 Calculation model of FOV, EPD and ERF.

The scattering characteristics of H2 are described by the direction angle θs and the diffusion angle φs. Here, we set θs to 90 degrees. According to the geometric relationships in Fig. 4, the following formulas are obtained:

$$\sqrt{\overline{AD}^{2}-K^{2}} - \frac{K - W/2}{\tan(\varphi_{s}/2)} + L = \overline{OD} \tag{8}$$
$$\sin\alpha = K/\overline{AD} \tag{9}$$
$$\gamma = \varphi_{s} - 2\alpha \tag{10}$$
$$\beta = \varphi_{s}/2 - \gamma \tag{11}$$
where AD̄ and OD̄ represent the lengths of the line segments AD and OD, respectively. If γ is positive, the reflected light lies on the right side of the normal line; if it is negative, the reflected light lies on the left side. Furthermore, according to the law of sines in ΔACE, the relationship between the ERF and EPD can be estimated as:
$$\mathrm{ERF} = \frac{K\sin\gamma}{\sin\beta\,\sin(\varphi_{s}/2)} - \frac{\mathrm{EPD}}{2\tan\beta} + \frac{W}{2\tan(\varphi_{s}/2)} \tag{12}$$
This relationship shows that the ERF and EPD are negatively and linearly related when the other system parameters are fixed. In practice, appropriate values of the ERF and EPD must be chosen to meet the actual observation requirements. When the STG is a spherical shell, AD̄ and OD̄ in Eqs. (8) and (9) are equal to R. Here, we set R to 45 mm, L to 22.5 mm, W to 18 mm, θs to 90 degrees, and φs to 50 degrees. According to Eqs. (8)–(12), when the EPD is 6 mm, the ERF is about 18.6 mm, which satisfies the human eye viewing requirements. The height K is about 17.78 mm, which means the size of the STG should be larger than 36 mm. Furthermore, based on the above initial structure and key system parameters, we optimized the two surfaces of the STG using Synopsys' CODE V, with an 18.6 mm ERF calculated from the rear surface to the eye pupil and a 1.5 mm STG thickness, to ensure low distortion of the external scene and to improve the virtual image quality. Since the STG is coaxial, no additional off-axis aberrations need to be corrected, and the volume and weight of the STG can be compressed by using aspheric surfaces. The surface local heights of the STG and the imaging simulations are shown in Fig. 5, where both surfaces are 8th-order even aspheric surfaces.
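
The paraxial eye-box numbers above can be reproduced with the short Python sketch below, which assumes the spherical-shell case (AD = OD = R) and the parameter values just listed, solves Eq. (8) numerically for K, and then evaluates Eqs. (9)–(12); it returns the quoted K ≈ 17.78 mm and ERF ≈ 18.6 mm for a 6 mm EPD:

import math
from scipy.optimize import brentq

R, L, W = 45.0, 22.5, 18.0          # spherical-shell STG, distances in mm
phi_s = math.radians(50.0)          # diffusion angle of H2
EPD = 6.0                           # chosen exit pupil diameter in mm

def eq8_residual(K):
    """Residual of Eq. (8) with AD = OD = R; its root is the edge-ray height K."""
    return math.sqrt(R**2 - K**2) - (K - W/2) / math.tan(phi_s/2) + L - R

K = brentq(eq8_residual, W/2, R)                    # ~17.78 mm
alpha = math.asin(K / R)                            # Eq. (9)
gamma = phi_s - 2.0 * alpha                         # Eq. (10)
beta = phi_s/2 - gamma                              # Eq. (11)
ERF = (K * math.sin(gamma) / (math.sin(beta) * math.sin(phi_s/2))
       - EPD / (2.0 * math.tan(beta))
       + W / (2.0 * math.tan(phi_s/2)))             # Eq. (12)
print(f"K = {K:.2f} mm, ERF = {ERF:.1f} mm")        # ~17.78 mm and ~18.6 mm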

Fig. 5 The surface local height of (a) front aspheric surface and (b) rear aspheric surface; (c) the simulation results of virtual image and (d) external real scene.

3. Fabrication and optical experiments

3.1 Fabrication and test

In the proposed NED system, the most important component is the DSHW. On the one hand, it needs to couple in and transmit the virtual image; on the other hand, it has to divergently expand each pixel of the two-dimensional image to form a transparent directional-emission display screen with little backscattering. The manufacturing schematic of H2 is shown in Fig. 6(a). The 532 nm laser passes through a collimating and filtering (CF) device to form a plane wavefront. P1 is a half-wave plate that, combined with the PBS, adjusts the intensity ratio of the two exposure beams, and P2 is another half-wave plate. M1 and M2 are silver-coated mirrors, and the scattering element (SE) transforms the input light into the scattered object wave, with a 50° diffusion angle and a 90° direction angle. The holographic medium used for exposure is a self-developed photopolymer. The manufacturing setup of H1 is the same as that of H2 except for the SE component. Note that the recording angle of H1 and H2 (θd), the maximum size of the image W in the transmission direction, and the thickness of the waveguide Dw need to satisfy the following constraint:

$$W < 2\,D_{w}\cot\theta_{d} \tag{13}$$
Here, we set θd to 18° and Dw to 3 mm, so the size W should be less than 18.5 mm. The STG is manufactured by a compression molding process from PMMA, which is lightweight and highly transparent. The fabricated DSHW and STG are shown in Fig. 6(b), and other specific parameters are listed in Table 1.
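
A short Python check of this bound, using cot(θd) = 1/tan(θd):

import math

def max_image_extent_mm(D_w_mm, theta_d_deg):
    """Eq. (13): upper bound on the image size W along the propagation direction
    for waveguide thickness D_w and recording angle theta_d."""
    return 2.0 * D_w_mm / math.tan(math.radians(theta_d_deg))

print(f"W_max ~ {max_image_extent_mm(3.0, 18.0):.1f} mm")   # ~18.5 mm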

Fig. 6 (a) Schematic diagram of holographic exposure; (b) STG and DSHW components; (c) actual ray tracing of DSHW (the black curtain makes the light visible and measurable).

Table 1. Optical parameters of DSHW and STG.

The actual ray tracing of the DSHW is shown in Fig. 6(c). The incident green light is modulated by H1 and enters the waveguide. After three TIR bounces, the light hits H2 and is coupled out at a scattering angle of about 50 degrees. The diffraction efficiencies of H1 and H2 are about 75% and 78%, respectively, and the energy utilization efficiency of the DSHW is about 58.5%.
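
The quoted waveguide efficiency is consistent with treating H1 and H2 as a simple cascade, assuming negligible TIR and absorption losses:

$$\eta_{\mathrm{DSHW}} \approx \eta_{H1}\,\eta_{H2} = 0.75 \times 0.78 \approx 0.585$$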

3.2 Optical experiments

The experimental setup for imaging is shown in Fig. 7. The incident polarized laser illuminates the SLM, on which the characters "HOLOGRAPHIC TECHNOLOGY" are loaded. Considering that the shrinkage unevenness of H1 in the vertical direction (i.e., the cross section in Fig. 1) is in practice more pronounced than in the horizontal direction, we balanced image quality against field of view and chose the size of the characters on H1 to be about 18 mm × 4.8 mm (the minimum clear aperture required for the STG is 36 mm × 24.5 mm). The polarization directions of the polarizer and the incident polarized laser are orthogonal. Holder 1 is used to adjust and fix the STG, and the XY translation mount is used to clamp the DSHW, adjust the incidence angle of the laser to compensate for the shrinkage of the holographic material, and select the appropriate input area on H1, which keeps H1 operating under Bragg diffraction conditions.

Fig. 7 Experimental setup.

A micro camera is used to simulate the human eye and capture the displayed virtual image at an 18.6 mm ERF. Figure 8(a) shows the captured virtual image, which is clearly superimposed on the real scene with almost no backscattering, demonstrating the feasibility of our proposed system. The virtual image shows a little blurring noise, caused by laser speckle and the uneven shrinkage of the material in the vertical direction; improving the uniformity and surface quality of the holographic media and using high-power collimated LD or LED sources can further alleviate or even eliminate this noise. Owing to reflection from the back surface of the STG, two images can be seen in Fig. 8(a). To attenuate this effect, an anti-reflection coating can be applied to the back surface. We used a black-film method to verify that this approach is effective, and the result is shown in Fig. 8(b): only one sharp image is displayed. As described in Section 2.1, the STG directly projects the image on H2 into the human eye, so surface flaws of H2 are also discernible and may cause some pixels to fail to escape the waveguide. During processing, the best exposure conditions, including exposure and baking times, were repeatedly explored to minimize stains. The results in Fig. 8 show that there are only a few stains in some characters, such as the "C" and "E" in "TECHNOLOGY". Improving the material quality and processing techniques can further mitigate this problem. Moreover, we measured the actual viewing angle with the micro camera; it is approximately 44° (H) × 12° (V), which is almost identical to the theoretical result (46.7° (H) × 12.6° (V)).

Fig. 8 Experimental results.

The luminance uniformity at different viewing angles was tested using a 4 × 8 grid loaded on the SLM. Figures 9(a) and 9(b) show the original grid and the captured image with some surface defects, respectively. To ensure that the uniformity test values are valid and convincing, we evaluated all points in the upper-right 2 × 4 sub-grid, owing to the rotational symmetry and the small number of flaws there. The definition of uniformity is the same as that in Refs. [27,30]. The test results are given in Table 2 and show that the uniformity is about 88.09% over the full viewing angle. The LU loss is mainly due to the angular scattering inhomogeneity of H2, which can be pre-compensated by adjusting the spatial intensity distribution of the illumination source and improving the light distribution curve of H2. Furthermore, high-efficiency color display can be realized by using full-color-sensitive holographic material and wavelength multiplexing.
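
For readers who wish to reproduce the metric, the Python sketch below uses one common luminance-uniformity definition, 1 − (Lmax − Lmin)/(Lmax + Lmin); the exact definition applied in this work is that of Refs. [27,30], so this form is an assumption for illustration, and the grid-cell luminance values are hypothetical rather than measured:

import numpy as np

def luminance_uniformity(samples):
    """Assumed uniformity metric: 1 - (Lmax - Lmin)/(Lmax + Lmin)."""
    L = np.asarray(samples, dtype=float)
    return 1.0 - (L.max() - L.min()) / (L.max() + L.min())

# Hypothetical grid-cell luminance readings (arbitrary units):
print(f"LU ~ {luminance_uniformity([0.95, 1.00, 0.88, 0.92, 0.90, 0.97, 0.86, 0.99]):.2%}")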

Fig. 9 Test of luminance uniformity.

Table 2. Luminance uniformity at different FOVs.

4. Conclusion

We have proposed a novel on-axis near-eye display system based on a directional scattering holographic waveguide and a curved goggle. The display principles were analyzed, and the key parameters of the system and its elements were designed theoretically and optimized. Furthermore, we manufactured the directional scattering holographic waveguide and the curved goggle by holographic exposure and compression molding, respectively. Actual ray tracing and optical experiments were performed to demonstrate the feasibility of the proposed NED system, which readily achieves an FOV of 44° (H) × 12° (V), with a 6 mm exit pupil diameter, an 18.6 mm eye relief, and 88.09% luminance uniformity over the full viewing angle. The system can further expand the vertical FOV and achieve color display by reducing the uneven vertical shrinkage and using wavelength multiplexing, and it can be developed for the warrior integrated protection-display helmet system through conformal optical design.

Funding

National Natural Science Foundation of China (NSFC) (61575024, 61420106014); UK Government’s Newton Fund.

References

1. J. Carmigniani, B. Furht, M. Anisetti, P. Ceravolo, E. Damiani, and M. Ivkovic, "Augmented reality technologies, systems and applications," Multimedia Tools Appl. 51(1), 341–377 (2011).
2. J. P. Rolland and K. Thompson, "See-Through Head Worn Displays for Mobile Augmented Reality," Commun. China Comp. Fed. 7(8), 28–37 (2011).
3. H. Hoshi, N. Taniguchi, H. Morishima, T. Akiyama, S. Yamazaki, and A. Okuyama, "Off-axial HMD optical system consisting of aspherical surfaces without rotational symmetry," Proc. SPIE 2653, 234–242 (1996).
4. http://www.epson.com/cgi-bin/Store/jsp/Moverio/Home.do?BV_UseBVCookie=yes
5. https://developer.sony.com/develop/smarteyeglass-sed-e1/
6. http://optinvent.com/wp-content/uploads/2016/03/IDW2010_PRJ2_1.pdf
7. http://www.osterhoutgroup.com/introduction
8. https://buy.metavision.com/#specifications
9. J. D. Waldern, A. J. Grant, and M. M. Popovich, "DigiLens switchable Bragg grating waveguide optics for augmented reality applications," Proc. SPIE Digital Optics for Immersive Displays, 106760G (2018).
10. T. Levola and P. Laakkonen, "Replicated slanted gratings with a high refractive index material for in and outcoupling of light," Opt. Express 15(5), 2067–2074 (2007).
11. D. Cheng, Y. Wang, H. Hua, and J. Sasian, "Design of a wide-angle, lightweight head-mounted display using free-form optics tiling," Opt. Lett. 36(11), 2098–2100 (2011).
12. H. Huang and H. Hua, "High-performance integral-imaging-based light field augmented reality display using freeform optics," Opt. Express 26(13), 17578–17590 (2018).
13. C. M. Bigler, P. A. Blanche, and K. Sarma, "Holographic waveguide heads-up display for longitudinal image magnification and pupil expansion," Appl. Opt. 57(9), 2007–2013 (2018).
14. J. Yang, P. Twardowski, P. Gérard, and J. Fontaine, "Design of a large field-of-view see-through near to eye display with two geometrical waveguides," Opt. Lett. 41(23), 5426–5429 (2016).
15. R. Shechter, Y. Amitai, and A. A. Friesem, "Compact beam expander with linear gratings," Appl. Opt. 41(7), 1236–1240 (2002).
16. R. Shi, J. Liu, H. Zhao, Z. Wu, Y. Liu, Y. Hu, Y. Chen, J. Xie, and Y. Wang, "Chromatic dispersion correction in planar waveguide using one-layer volume holograms based on three-step exposure," Appl. Opt. 51(20), 4703–4708 (2012).
17. A. Cameron, "The application of holographic optical waveguide technology to Q-Sight family of helmet-mounted displays," Proc. SPIE 7326, 73260H (2009).
18. M. D. Simmonds and M. S. Valera, "Display comprising an optical waveguide and switchable diffraction gratings and method of producing the same," US patent US9664824B2 (2017).
19. M. Popovich, J. D. Waldern, and A. J. Grant, "DigiLens switchable Bragg grating waveguide optics for augmented reality applications," Proc. SPIE Digital Optics for Immersive Displays, 15 (2018).
20. X. Wang, Y. Qin, H. Hua, Y.-H. Lee, and S.-T. Wu, "Digitally switchable multi-focal lens using freeform optics," Opt. Express 26(8), 11007–11017 (2018).
21. S. Jolly, N. Savidis, B. Datta, D. Smalley, and V. M. Bove, "Near-to-eye electroholography via guided-wave acousto-optics for augmented reality," Proc. SPIE OPTO, 101270J (2017).
22. Q. Gao, J. Liu, J. Han, and X. Li, "Monocular 3D see-through head-mounted display via complex amplitude modulation," Opt. Express 24(15), 17372–17383 (2016).
23. Q. Gao, J. Liu, X. Duan, T. Zhao, X. Li, and P. Liu, "Compact see-through 3D head-mounted display based on wavefront modulation with holographic grating filter," Opt. Express 25(7), 8412–8424 (2017).
24. H.-J. Yeom, H.-J. Kim, S.-B. Kim, H. Zhang, B. Li, Y.-M. Ji, S.-H. Kim, and J.-H. Park, "3D holographic head mounted display using holographic optical elements with astigmatism aberration compensation," Opt. Express 23(25), 32025–32034 (2015).
25. M. Sugawara, M. Suzuki, and N. Miyauchi, "14-5L: Late-News Paper: Retinal Imaging Laser Eyewear with Focus-Free and Augmented Reality," SID Symposium Digest of Technical Papers 47(1), 164–167 (2016).
26. H. Kogelnik, "Coupled wave theory for thick hologram gratings," Bell Syst. Tech. J. 48(9), 2909–2947 (1969).
27. C. Yu, Y. Peng, Q. Zhao, H. Li, and X. Liu, "Highly efficient waveguide display with space-variant volume holographic gratings," Appl. Opt. 56(34), 9390–9397 (2017).
28. H. Mukawa, K. Akutsu, I. Matsumura, S. Nakano, T. Yoshida, and M. Kuwahara, "8.4: Distinguished Paper: A full color eyewear display using holographic planar waveguides," SID Symposium Digest of Technical Papers 39(1), 89–92 (2012).
29. L. H. Domash, G. P. Crawford, A. C. Ashmead, R. T. Smith, M. M. Popovich, and J. Storey, "Holographic PDLC for photonic applications," Proc. SPIE Liquid Crystals IV, 4107 (2000).
30. J. Han, J. Liu, X. Yao, and Y. Wang, "Portable waveguide display system with a large field of view by integrating freeform elements and volume holograms," Opt. Express 23(3), 3534–3549 (2015).
31. S.-B. Kim and J.-H. Park, "Optical see-through Maxwellian near-to-eye display with an enlarged eyebox," Opt. Lett. 43(4), 767–770 (2018).
32. A. Maimone, A. Georgiou, and J. S. Kollin, "Holographic near-eye displays for virtual and augmented reality," ACM Trans. Graph. 36(4), 1–16 (2017).
33. C. Gu, J. Hong, J.-R. Lien, and F. Dai, "Diffraction properties of volume holographic diffusers," J. Opt. Soc. Am. A 13(8), 1704–1711 (1996).
34. K. Murphy, V. Toal, I. Naydenova, and S. Martin, "Holographic beam-shaping diffractive diffusers fabricated by using controlled laser speckle," Opt. Express 26(7), 8916–8922 (2018).
35. T. Utsugi and M. Yamaguchi, "Reduction of the recorded speckle noise in holographic 3D printer," Opt. Express 21(1), 662–674 (2013).
36. J. Yeom, J. Jeong, C. Jang, K. Hong, S. G. Park, and B. Lee, "Reflection-type integral imaging system using a diffuser holographic optical element," Opt. Express 22(24), 29617–29626 (2014).
