Depth reconstruction with coaxial multi-wavelength aperture telecentric optical system


Abstract

An optical system to measure depth information is proposed. The proposed optical system has double coaxial multi-wavelength apertures, which make it possible to simultaneously take an orthogonal projection image and a perspective projection image, with the two images separated by wavelength. The three-dimensional physical position of an object can be derived from the ratio of the radial distances of these separated images, measured from their common center on the optical axis. Validation of the system by a ray-tracing simulation and an experiment shows that the proposed optical system can be used for depth reconstruction.

© 2018 Optical Society of America under the terms of the OSA Open Access Publishing Agreement

1. Introduction

Optical sensing systems are widely used in machine vision applications and offer the advantage of non-contact measurement [1–3]. A conventional entocentric lens system is often used for these applications and produces a perspective projection image. With such a camera, however, it is difficult to measure the physical lengths of objects independently of their depths because of the foreshortening effect. A telecentric lens system, in contrast, produces an orthogonal projection image that makes it possible to measure the physical lengths of objects, but accurate depth information is difficult to obtain from an individual image. Thus, several optical systems, such as stereo camera systems [4,5] and micro-lens array systems [6,7], have been developed. Optical systems using structured light have also been widely studied [8–10]. Multi-aperture optical systems allow capturing of depth information [11,12]. Depth information can also be extracted with an aperture divided into different color-filter regions via a parallax-based method [13]. There is also a novel multi-aperture telecentric lens system, which enables 3D reconstruction of an object [14].

Although the stereo depth camera is one of the most successful devices for measuring depth information, it requires multiple imaging lenses and image sensors. In particular, when several special cameras such as high-speed cameras or hyperspectral imaging cameras are needed, cost and size become critical. Thus, another optical system, simple enough to be compact and cost-effective, is proposed here for measuring depth information. The optical system can be manufactured by simply inserting two multi-wavelength apertures into conventional imaging optics.

The proposed optical system makes use of the advantages of both a conventional entocentric lens system and a telecentric lens system. To achieve this, the optical system has double coaxial multi-wavelength apertures, which make it possible to simultaneously take an orthogonal projection image and a perspective projection image on one image plane, with the two images separated by wavelength. The two separated images of an object point deviate in radial distance from the optical axis but share the same azimuth. One of the two separated image points can therefore be considered to flow from the other in the radial direction; this flow is here called the radial optical flow. The three-dimensional position of the object point can be derived from the ratio of the radial distances of its two separated images. As the price of taking the two types of images, one of the RGB color channels is sacrificed for each projection type. Nevertheless, these position-aligned images, captured at the same time by a single optical system, are useful beyond depth measurement: physical length can be measured directly in the orthogonal projection image, while the perspective projection image can detect objects overlapping in the depth direction and provides a wide-angle view. Thus, the two types of projection images work in a complementary manner.

In the following, the structure of the proposed optical system is firstly described. Then, the depth reconstruction method with the optical system is derived using the radial optical flow. A feasibility study for the reconstruction method is performed for a dot object by means of both a ray-tracing simulation and an experiment. The results of the simulation and the experiment are presented to validate the depth reconstruction method with the proposed optical system.

2. Coaxial multi-wavelength aperture telecentric optical system

Figure 1 shows a perspective view of the proposed optical system, which is composed of an imaging lens and double coaxial multi-wavelength apertures along the optical axis of the lens.

Fig. 1 Perspective view of proposed optical system with coaxial multi-wavelength aperture.

Each coaxial multi-wavelength aperture has two different coaxial color filters: the center of the filter transmits light rays in a certain spectral band, and the region outside the center transmits the remainder of the spectrum. One of the coaxial multi-wavelength apertures is located between the principal planes of the imaging lens such that the center (resp., outside of the center) of the color filter transmits blue (resp., red) light rays. The other coaxial multi-wavelength aperture is set on the focal plane of the imaging lens such that the center (resp., outside of the center) of the color filter transmits red (resp., blue) light rays.

Light rays emitted from an object point are projected to the image plane and captured by an image sensor located on the image plane. Blue light rays pass through the central blue color filter of the coaxial multi-wavelength aperture attached to the imaging lens and the outer blue filter of the other coaxial multi-wavelength aperture, which means that the blue light rays behave as in a conventional entocentric lens system. Similarly, red light rays pass through the outer red color filter of one coaxial multi-wavelength aperture and the central red color filter of the other coaxial multi-wavelength aperture, which means that the red light rays behave as in a telecentric lens system. In this way, the optical system makes it possible to simultaneously take an orthogonal projection image and a perspective projection image with these two images separated by wavelengths (e.g., the red light rays and the blue light rays).

Assuming the image plane is parallel to the x-y plane, the projections of an object point onto the image plane for the red and blue light rays can be represented as (p, q) and (P, Q), respectively. The coaxial multi-wavelength aperture with the central red color filter of radius r0 is located on the focal plane of the imaging lens, which has a focal length f. The coaxial multi-wavelength aperture with the central blue color filter of radius r1 is placed between the two principal planes of the imaging lens. The centers of both coaxial color filters are on the optical axis. The distance between the image plane and the imaging lens is L, and the object point is located at coordinates (x, y, z), with the origin placed at the intersection of the optical axis and the principal plane closer to the object. The following equations can be derived from geometrical optics for the principal rays:

\[
\begin{bmatrix} p \\ q \end{bmatrix} = \left( \frac{L}{f} - 1 \right) \begin{bmatrix} x \\ y \end{bmatrix}, \tag{1}
\]

and

\[
\begin{bmatrix} P \\ Q \end{bmatrix} = \left( \frac{L}{z} \right) \begin{bmatrix} x \\ y \end{bmatrix}. \tag{2}
\]

It can be seen from Eq. (1) that p and q are independent of z; thus, the physical lengths x and y can always be derived from the red-light image. On the other hand, P and Q depend on z, so information about z can be derived from the blue-light image.
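As a minimal illustration of Eqs. (1) and (2), the following Python sketch computes the two projections of an object point; the function and variable names are illustrative and not part of the original work:

```python
def project_red(x, y, f, L):
    """Telecentric (red-light) projection of Eq. (1); independent of z."""
    s = L / f - 1.0
    return s * x, s * y

def project_blue(x, y, z, L):
    """Entocentric (blue-light) projection of Eq. (2); scales as L / z."""
    s = L / z
    return s * x, s * y

# Example with the parameters used later in the paper (all lengths in mm):
f, L = 100.0, 151.0
p, q = project_red(2.0, 0.0, f, L)        # red image point, same for any z
P, Q = project_blue(2.0, 0.0, 300.0, L)   # blue image point at z = 300 mm
```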

The radii and azimuth angles in cylindrical coordinates (r, ϕ) on the image plane for the red and blue light rays, with the origin of the coordinate system on the optical axis, satisfy

\[
\frac{q}{p} = \tan\phi, \tag{3}
\]
\[
\frac{Q}{P} = \tan\phi. \tag{4}
\]

From Eqs. (3) and (4), the azimuth angles of the image points for the red and blue light rays are the same. Thus, one of the two image points of an object point can be considered to flow from the other in the radial direction; this flow is here called the radial optical flow. From Eqs. (1) and (2), the following equations can be derived:

\[
\sqrt{p^2 + q^2} = \left( \frac{L}{f} - 1 \right) r, \tag{5}
\]
\[
\sqrt{P^2 + Q^2} = \frac{L}{z}\, r. \tag{6}
\]

Dividing Eq. (5) by Eq. (6) yields

\[
z = \frac{Lf}{L - f}\, \frac{\sqrt{p^2 + q^2}}{\sqrt{P^2 + Q^2}}. \tag{7}
\]

Assuming that the radii r0 and r1 of the central color filters of the coaxial multi-wavelength apertures are small enough to be neglected, the three-dimensional position of the object point can be derived from Eqs. (1) and (7) as

\[
\begin{bmatrix} x \\ y \\ z \end{bmatrix}
=
\begin{bmatrix}
\frac{f}{L-f}\, p \\
\frac{f}{L-f}\, q \\
\frac{Lf}{L-f}\, \frac{\sqrt{p^2+q^2}}{\sqrt{P^2+Q^2}}
\end{bmatrix}. \tag{8}
\]

From Eq. (8), the depth distance z, together with the physical lengths x and y of the object, can be reconstructed from the images separated by wavelength (e.g., the blue and red light rays).
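A minimal sketch of this inversion, Eqs. (7) and (8), is given below, assuming the red and blue image points of the same object point have already been matched; the helper name reconstruct_position is illustrative:

```python
import math

def reconstruct_position(p, q, P, Q, f, L):
    """Recover (x, y, z) of an object point from its red image point (p, q)
    and the matching blue image point (P, Q), following Eqs. (7) and (8)."""
    r_red = math.hypot(p, q)     # sqrt(p^2 + q^2)
    r_blue = math.hypot(P, Q)    # sqrt(P^2 + Q^2)
    z = (L * f / (L - f)) * r_red / r_blue   # Eq. (7)
    x = f / (L - f) * p                      # Eq. (8)
    y = f / (L - f) * q
    return x, y, z

# Round-tripping the values from the previous sketch recovers
# x = 2.0 mm, y = 0.0 mm, z = 300.0 mm (up to floating-point error).
```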

3. Depth reconstruction method with radial optical flow

Many kinds of objects have feature points. With the proposed optical system, wavelength-separated images of a feature point of an object can be captured for the blue and red light rays. The centers of these separated feature images are assumed to obey Eqs. (1) and (2); under this assumption, the depth of the feature point can be reconstructed from the separated images. The radial distance of the center of a feature image occupying an area D on the image plane can be defined by weighting with the light intensity. For the blue light ray, the radial distance rB can be written in cylindrical coordinates (r, ϕ) on the image plane as

\[
r_B = \frac{\iint_D r\, I_B(r,\phi)\, r\,dr\,d\phi}{\iint_D I_B(r,\phi)\, r\,dr\,d\phi}, \tag{9}
\]

where the integration is taken over the whole area of the feature image. For the red light ray, the radial distance rR can also be written as,

\[
r_R = \frac{\iint_D r\, I_R(r,\phi)\, r\,dr\,d\phi}{\iint_D I_R(r,\phi)\, r\,dr\,d\phi}. \tag{10}
\]

The azimuth angles of the centers of the images for the blue and red light rays should be the same according to Eqs. (3) and (4). Thus, a common azimuth angle for both light rays can be defined, assuming local orthogonality of the coordinates, as

\[
\phi_{BR} = \frac{1}{2}\left(
\frac{\iint_D r\,\phi\, I_B(r,\phi)\, r\,dr\,d\phi}{r_B \iint_D I_B(r,\phi)\, r\,dr\,d\phi}
+
\frac{\iint_D r\,\phi\, I_R(r,\phi)\, r\,dr\,d\phi}{r_R \iint_D I_R(r,\phi)\, r\,dr\,d\phi}
\right). \tag{11}
\]
The ratio of the radial distance for the red light ray to that for the blue light ray can be written using Eqs. (9) and (10) as
\[
\Gamma = \frac{r_R}{r_B}. \tag{12}
\]
Using Eq. (12) together with the quantities of Eqs. (9)-(11), the three-dimensional position of the feature point can be written as

\[
\begin{bmatrix} x \\ y \\ z \end{bmatrix}
=
\begin{bmatrix}
\frac{f}{L-f}\, r_R \cos\phi_{BR} \\
\frac{f}{L-f}\, r_R \sin\phi_{BR} \\
\frac{Lf}{L-f}\, \Gamma
\end{bmatrix}. \tag{13}
\]

Although it is necessary to match the images of the same feature point across the two wavelengths, several methods for such point matching have been developed [15,16]. Thus, the three-dimensional position of an object can be reconstructed from its feature points using the radial optical flow algorithm described above.
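One possible discrete implementation of Eqs. (9)-(13) for a single matched feature is sketched below. The pixel sums approximate the area integrals: the Cartesian pixel area plays the role of r dr dϕ and cancels between numerator and denominator. All names and the background-masking convention are assumptions of this sketch, not the authors' implementation:

```python
import numpy as np

def radial_flow_position(I_B, I_R, cx, cy, f, L, pitch):
    """Discrete radial-optical-flow reconstruction, Eqs. (9)-(13).

    I_B, I_R : intensity images of ONE matched feature for the blue and
               red light rays, with the background masked to zero.
    cx, cy   : pixel coordinates of the optical axis on the image plane.
    pitch    : pixel pitch in mm.
    """
    ys, xs = np.indices(I_B.shape)
    dx, dy = (xs - cx) * pitch, (ys - cy) * pitch
    r = np.hypot(dx, dy)
    phi = np.arctan2(dy, dx)  # assumes the feature does not straddle phi = +/- pi

    r_B = np.sum(r * I_B) / np.sum(I_B)                        # Eq. (9)
    r_R = np.sum(r * I_R) / np.sum(I_R)                        # Eq. (10)
    phi_BR = 0.5 * (np.sum(r * phi * I_B) / (r_B * np.sum(I_B))
                    + np.sum(r * phi * I_R) / (r_R * np.sum(I_R)))  # Eq. (11)
    gamma = r_R / r_B                                          # Eq. (12)

    k = f / (L - f)                                            # Eq. (13)
    return k * r_R * np.cos(phi_BR), k * r_R * np.sin(phi_BR), L * k * gamma
```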

4. Feasibility study of depth reconstruction method

4.1 Ray-tracing simulation

An image captured by the proposed optical system is calculated by ray-tracing simulation. Figure 2 shows a perspective view of the simulation setup. A thin Fresnel lens (Edmund Optics, Fresnel lens 77x77x100) is selected as the imaging lens of the optical system so that, for this feasibility study, the optical path lengths from the lens surfaces are easy to measure. It is also easy to attach a coaxial multi-wavelength aperture to the Fresnel lens with their centers aligned. The thicknesses of the lens and the coaxial multi-wavelength apertures are assumed to be negligible in comparison with the focal length of the lens. Thus, the aperture attached to the lens can be approximated as located on the principal plane of the lens.
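For intuition, a minimal two-dimensional paraxial (thin-lens) sketch of such a trace is given below. It is an approximation under the stated assumptions, not the actual simulator used for the results in this paper:

```python
import numpy as np

def trace_color_2d(x0, z0, f=100.0, L=151.0, r0=0.5, r1=1.0,
                   lens_radius=8.0, n_rays=20000, seed=0):
    """Minimal 2D paraxial trace through the double-aperture system.

    An object point at height x0, a distance z0 in front of the lens,
    emits rays toward the lens plane.  A ray survives as 'blue'
    (entocentric path) if it passes the central blue filter at the lens
    (|h| < r1) and the outer blue region at the focal plane (|h_fp| >= r0);
    it survives as 'red' (telecentric path) in the complementary case.
    """
    rng = np.random.default_rng(seed)
    h = rng.uniform(-lens_radius, lens_radius, n_rays)  # ray height at lens
    u = (h - x0) / z0          # incoming slope (paraxial)
    u2 = u - h / f             # thin-lens refraction
    h_fp = h + f * u2          # height at the focal-plane aperture
    h_im = h + L * u2          # height at the image plane

    blue = (np.abs(h) < r1) & (np.abs(h_fp) >= r0)
    red = (np.abs(h) >= r1) & (np.abs(h_fp) < r0)
    return h_im[blue], h_im[red]

# The mean |red| image height stays near (L/f - 1)*x0 for any z0
# (telecentric), while the mean |blue| height tracks (L/z0)*x0
# (entocentric); signs flip because of image inversion, which
# Eqs. (1)-(2) absorb into the image-plane coordinates.
blue_im, red_im = trace_color_2d(x0=2.0, z0=300.0)
```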

Fig. 2 Perspective view of the setup of the ray-tracing simulation.

The focal length f of the Fresnel lens is 100 mm. The optical path length L is set to 151 mm, so the magnification of the optical system amounts to about 0.5. The radius r0 of the central red color filter of the coaxial multi-wavelength aperture located on the focal plane is set to 0.5 mm. The radius r1 of the central blue color filter of the aperture placed on the principal plane of the lens is set to 1.0 mm. These parameters are summarized in Table 1. An additional aperture with a diameter of 16 mm is placed 40 mm from the Fresnel lens.

Table 1. Specifications for optical system
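For reference, a short sketch collecting the Table 1 parameters and the resulting per-channel magnifications; this is a consistency check added here, not part of the original analysis:

```python
# Optical parameters from Table 1 (all lengths in mm).
f = 100.0   # focal length of the Fresnel lens
L = 151.0   # lens-to-image-plane distance
r0 = 0.5    # radius of the central red filter (focal-plane aperture)
r1 = 1.0    # radius of the central blue filter (lens-plane aperture)

telecentric_mag = L / f - 1.0       # red channel: 0.51, independent of z
entocentric_mag = lambda z: L / z   # blue channel: 151/300 ~ 0.50 at z = 300
```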

The object is chosen to be a small dot with a diameter of 1 mm placed at 22 different positions. The depth distances z of the dots are varied in the range of 250 mm to 350 mm. Eleven dots are placed 2 mm from the optical axis and the other eleven at 7.5 mm. Each dot is numbered, and their positions in cylindrical coordinates (r, ϕ, z) are summarized in Table 2.

Table 2. Positions for the 22 dots

The image calculated by the ray-tracing simulation for the blue and red light rays is shown in Fig. 3. The red and blue images of each dot can be seen to separate at both far ends of the depth range of z.

Fig. 3 Calculated image for the 22 dots using ray-tracing simulation for the blue light ray and the red light ray.

The original image can be decomposed by wavelength into the individual images shown in Fig. 4. The radial distances of the dots in the red image are almost independent of the distance z, whereas those in the blue image depend on z.

Fig. 4 Decomposed images by wavelengths from the original image. The images are for (a) red light rays and for (b) blue light rays.

4.2 Experiment

An experiment is performed with the same optical setup as in the ray-tracing simulation described above. Figure 5 shows a perspective view of the experimental setup on the left side and the coaxial multi-wavelength aperture with the central blue color filter attached to the Fresnel lens on the right side. The object is set on a rotation stage and illuminated by an LED lamp. The CCD image sensor of a camera (BFLY-U3-23S6C-C) is located on the image plane of the optical system. The object is a plate on which randomly distributed dots, each with a diameter of 1 mm, are printed, as shown in Fig. 6. The random patterns are used to demonstrate that depth reconstruction can be performed with unintentionally distributed patterns. The rotation stage on which the plate is mounted rotates by a tilt angle θ, defined as the angle between the z-axis and the normal vector of the object plate. The radial distances of the random dots are within the range of 2.0 mm to 7.5 mm.

Fig. 5 (a) Perspective view of the experimental setup and (b) coaxial multi-wavelength aperture on Fresnel lens.

Fig. 6 Plate object on which randomly distributed dots are printed.

The transmittances of the blue-pass and red-pass filters with respect to wavelength [nm] are shown in the bottom panel of Fig. 7, and the quantum efficiencies of the RGB sensors of the camera are shown in the upper panel. These plots show that the blue light ray is not detected by the red sensor of the camera and, likewise, that the red light ray is not detected by the blue sensor. Thus, the red and blue light can be separated by the filters and the RGB sensors.
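Under this separation condition, decomposing a captured frame reduces to reading out the red and blue sensor channels directly. A minimal sketch, assuming an array with RGB channel ordering, follows:

```python
import numpy as np

def separate_by_wavelength(rgb):
    """Split a captured frame into the telecentric (red) and entocentric
    (blue) images.  This works because, per Fig. 7, the red-pass filter
    leaves no signal on the blue sensor and vice versa."""
    rgb = np.asarray(rgb, dtype=float)
    red_image = rgb[..., 0]    # telecentric projection (red light)
    blue_image = rgb[..., 2]   # entocentric projection (blue light)
    return red_image, blue_image
```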

Fig. 7 Quantum efficiencies of the RGB sensors plotted with respect to the wavelength (upper). Transmittances of the red-pass filter and blue-pass filter plotted with respect to the wavelength (bottom).

The original images captured by the image sensor, and the images decomposed from them by wavelength, are shown in Fig. 8 for various distances z (250, 300, and 350 mm). The mean wavelength of the blue light ray is about 450 nm and that of the red light ray is about 680 nm. These images show that the radial distances of the dots for the blue light rays depend on the distance z, whereas the radial distances of the dots for the red light rays are almost the same.

Fig. 8 Original images captured by the CCD camera and decomposed images from these original images by wavelengths for several depth distances z (250, 300, and 350 mm) of the object.

The images decomposed by wavelength from the original images for various tilt angles (θ = 30, 45, and 60 degrees) with z = 300 mm are shown in Fig. 9.

Fig. 9 Decomposed images for several tilt angles (θ = 30, 45, and 60 degrees) with z = 300 mm.

5. Results

The three-dimensional positions of the random dots on the plate object can be reconstructed by applying the radial optical flow algorithm of Eq. (13). The reconstructed depth distances z are plotted against the actual distances in Fig. 10 with circle markers; the horizontal axis indicates the actual depth distance and the vertical axis the reconstructed depth distance. The solid line is fitted to the reconstruction by the least-squares method. The broken line is calculated from the results of the ray-tracing simulation for the 22 dots, averaged over the dots at radial distances of 2.0 and 7.5 mm. The dotted line shows the ideal relationship. The error bars indicate the standard deviations from the mean values over all the random dots. The reconstruction agrees well with the ideal relationship: the maximum deviation of the fitted line from the actual distance is 3.6 mm at z = 350 mm. The reconstruction also agrees well with the simulation, with a maximum deviation of the fitted line from the simulation of 2.9 mm at z = 340 mm. Although the standard deviation of the reconstruction from the ideal relationship is 2.2 mm, that from the simulation is 1.2 mm, which implies that the reconstruction errors are largely predictable and thus correctable. The deviations are considered to be caused by aberrations of the Fresnel lens and misalignment of the optical elements. These results indicate that the depth distances of an object can be well reconstructed over a wide range.

Fig. 10 Reconstructed distances against actual distances with circle markers. The solid line is the fitted line for the reconstruction and the broken line is the simulation, with the dotted line showing the ideal relationship.

Figure 11 shows the reconstructed three-dimensional positions (x, y, z) of the random dots for various tilt angles θ (30, 45, and 60 degrees) of the plate object with z = 300 mm. The plus, circle, and cross markers indicate the reconstructed positions of the dots for actual tilt angles of 30, 45, and 60 degrees, respectively. Planes fitted to the reconstructed positions of the dots are also plotted, where each plane is written with fitting parameters θfit and zfit as

\[
\tan\theta_{\mathrm{fit}}\, x - (z - z_{\mathrm{fit}}) = 0. \tag{14}
\]
The fitting parameters are determined by the least-squares method. The reconstructed tilt angles are 28.1, 44.0, and 55.1 degrees for the actual tilt angles of 30, 45, and 60 degrees, respectively. The coefficients of determination R² of the fitted planes for the actual tilt angles of 30, 45, and 60 degrees are 0.62, 0.87, and 0.89, respectively.
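A minimal sketch of this least-squares plane fit, rewriting Eq. (14) as z = tan(θfit)·x + zfit and also reporting R², might look as follows; x and z are arrays of reconstructed dot coordinates, and all names are illustrative:

```python
import numpy as np

def fit_tilted_plane(x, z):
    """Least-squares fit of Eq. (14), tan(theta_fit)*x - (z - z_fit) = 0,
    rewritten as z = tan(theta_fit)*x + z_fit, to reconstructed dots."""
    x, z = np.asarray(x, float), np.asarray(z, float)
    A = np.column_stack([x, np.ones_like(x)])
    (slope, z_fit), *_ = np.linalg.lstsq(A, z, rcond=None)
    theta_fit = np.degrees(np.arctan(slope))          # tilt angle in degrees
    z_pred = A @ np.array([slope, z_fit])
    r2 = 1.0 - np.sum((z - z_pred) ** 2) / np.sum((z - z.mean()) ** 2)
    return theta_fit, z_fit, r2
```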

Fig. 11 Reconstructed three-dimensional positions (x, y, z) of the random dots for various tilt angles θ (30, 45, and 60 degrees) of the plate object with z = 300 mm.

6. Discussion

Two types of position-aligned projection images, telecentric and entocentric, can be taken at the same time by the proposed optical system. Figure 12 shows the color-separated images of two M4 plastic bolts of the same size, captured by the same optical system as described in the experiment. The two bolts are located at z = 250 mm and 300 mm, respectively. The images are for (a) the red light ray and (b) the blue light ray. The red light ray behaves as in a telecentric optical system and the blue light ray as in an entocentric optical system. The sizes of the two bolts are the same in the red image, while they differ in the blue image. Physical length can be directly measured with the telecentric image; the entocentric image, on the other hand, can detect objects overlapping in the z-direction and provides a wide viewing angle. The proposed optical system thus allows selection between the two image types depending on the purpose.

Fig. 12 Color-separated images captured by the optical system for two M4 plastic bolts of the same size, located at z = 250 mm and 300 mm respectively. The images are for (a) the red light ray and (b) the blue light ray. The red light ray behaves as in a telecentric optical system while the blue light ray behaves as in an entocentric optical system.

Feature detection and matching become difficult, especially for an object without any pattern on its surface. The outline shape of the object, however, can be used as a feature. Because of the radial optical flow, the feature matching can be restricted to the radial direction. Thus, a cross point between the outline shape and a radial coordinate axis can be treated as a small feature point. Such a cross point should be detectable in both the blue and red images and can therefore be utilized for feature detection and matching, as sketched below.
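A possible sketch of this radial search over a binary edge map follows; the sampling step, bounds handling, and names are assumptions of this illustration rather than a method described in the paper:

```python
import numpy as np

def outline_crossings(edge_img, cx, cy, phi, r_max, dr=0.5):
    """Sample a binary edge map along the radial ray at azimuth phi and
    return the radii (in pixels) where the object outline is crossed.
    Because matched blue/red image points share the same azimuth
    (Eqs. (3) and (4)), crossings found at the same phi in the two
    channels can be paired directly as feature points."""
    radii = np.arange(1.0, r_max, dr)
    xs = np.round(cx + radii * np.cos(phi)).astype(int)
    ys = np.round(cy + radii * np.sin(phi)).astype(int)
    inside = (xs >= 0) & (xs < edge_img.shape[1]) & \
             (ys >= 0) & (ys < edge_img.shape[0])
    radii, xs, ys = radii[inside], xs[inside], ys[inside]
    hits = edge_img[ys, xs] > 0
    onsets = np.flatnonzero(hits[1:] & ~hits[:-1]) + 1  # off -> on transitions
    return radii[onsets]
```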

Hyperspectral imaging cameras, which capture the spectrum at every pixel, are developing rapidly. With such a camera, feature detection should be attained even more easily by using multi-color pass filters for the coaxial apertures.

The color-pass filters can be replaced by orthogonal polarization filters. In this case, for example, rays with s-polarization behave as in the telecentric optical system while rays with p-polarization behave as in the entocentric optical system. Using a beam splitter, the rays can be separated onto different image sensors. Full color information can then be utilized for each image, at the cost of an additional image sensor.

If an image point lies on the optical axis, both the blue and red light are blocked. This is not critical, because the blind point can be placed outside the image or made negligible by using small central filters in the apertures.

7. Conclusions

An optical system and an algorithm for reconstructing the three-dimensional positions of objects have been proposed. The proposed optical system has double coaxial multi-wavelength apertures, which make it possible to simultaneously take an orthogonal projection image and a perspective projection image, with the two images separated by wavelength. The system was experimentally validated by reconstructing the depth distances and tilt angles of an object. The authors believe that the proposed optical system is a promising system for measuring the three-dimensional positions of objects.

References

1. M. Bass, C. DeCusatis, J. Enoch, V. Lakshminarayanan, G. Li, C. Macdonald, V. Mahajan, and E. Van Stryland, Handbook of Optics (McGraw-Hill, 2010).

2. E. H. Adelson and J. R. Bergen, "The plenoptic function and the elements of early vision," in Computational Models of Visual Processing (MIT, 1991).

3. J. H. Park, K. Hong, and B. Lee, "Recent progress in three-dimensional information processing based on integral imaging," Appl. Opt. 48(34), H77–H94 (2009).

4. X. Ding, L. Xu, H. Wang, X. Wang, and G. Lv, "Stereo depth estimation under different camera calibration and alignment errors," Appl. Opt. 50(10), 1289–1301 (2011).

5. L. B. Wolff and E. Angelopoulou, "Three-dimensional stereo by photometric ratios," J. Opt. Soc. Am. A 11(11), 3069–3078 (1994).

6. H. Arimoto and B. Javidi, "Integral three-dimensional imaging with digital reconstruction," Opt. Lett. 26(3), 157–159 (2001).

7. H. Yoo, "Axially moving a lenslet array for high-resolution 3D images in computational integral imaging," Opt. Express 21(7), 8873–8878 (2013).

8. Z. Cai, X. Liu, X. Peng, and B. Z. Gao, "Ray calibration and phase mapping for structured-light-field 3D reconstruction," Opt. Express 26(6), 7598–7613 (2018).

9. X. Huang, J. Bai, K. Wang, Q. Liu, Y. Luo, K. Yang, and X. Zhang, "Target enhanced 3D reconstruction based on polarization-coded structured light," Opt. Express 25(2), 1173–1184 (2017).

10. T. Bell and S. Zhang, "Multiwavelength depth encoding method for 3D range geometry compression," Appl. Opt. 54(36), 10684–10691 (2015).

11. D. C. Hwang, D. H. Shin, S. C. Kim, and E. S. Kim, "Depth extraction of three-dimensional objects in space by the computational integral imaging reconstruction technique," Appl. Opt. 47(19), D128–D135 (2008).

12. M. De, J. W. Y. Lit, and R. Tremblay, "Multiaperture focusing technique," Appl. Opt. 7(3), 483–488 (1968).

13. Y. Bando, B. Chen, and T. Nishita, "Extracting depth and matte using a color-filtered aperture," ACM Trans. Graph. 27(5), 134 (2008).

14. J. S. Kim and T. Kanade, "Multiaperture telecentric lens for 3D reconstruction," Opt. Lett. 36(7), 1050–1052 (2011).

15. Y. Geng, Y. Zhao, and H. Chen, "Stereo matching based on adaptive support-weight approach in RGB vector space," Appl. Opt. 51(16), 3538–3545 (2012).

16. G. Bradski and A. Kaehler, Learning OpenCV: Computer Vision with the OpenCV Library (O'Reilly Media, 2008).

