Multi-view fringe projection profilometry for surfaces with intricate structures and high dynamic range

Abstract

Fringe projection profilometry plays an important role in quality control on production lines. However, it faces challenges in measuring objects with intricate structures and high dynamic range, which are common in precision manufacturing and semiconductor packaging. In this paper, a multi-view fringe projection profilometry system, which deploys a vertical telecentric projector and four oblique tilt-shift cameras, is presented to address the “blind spots” caused by shadowing, occlusion and local specular reflection. A flexible and accurate system calibration method is proposed, in which the corrected pinhole imaging model is used to calibrate the telecentric projection, and the unified calibration is performed by bundle adjustment. Experimental results show that the repeated 3D measurement error and standard deviation are no more than 10 μm within a measurable volume of 70 × 40 × 20 mm3. Furthermore, a group of experiments proves that the developed system can achieve complete and accurate 3D measurement for high dynamic range surfaces with complex structures.

© 2024 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

Recent advances in precision manufacturing and semiconductor packaging have driven a growing demand for online 3D automated optical inspection [1]. Fringe projection profilometry (FPP) [2] is an important optical 3D measurement technique for quality control on production lines, owing to its high speed, full-field data acquisition, independence from the measured features, and adaptability to complex industrial environments. As the parts and devices involved in these industries become increasingly precise and miniaturized, FPP faces challenges in small field-of-view measurement, especially when measuring surfaces with intricate structures and high dynamic range. Therefore, how to exploit the online measurement advantages of FPP to achieve higher accuracy and complete surface topography acquisition is a topic worthy of extensive research.

To meet the requirement of sub-10 μm measurement accuracy for such small objects, a family of feasible but not yet well-studied methods, known as microscopic FPP (MFPP) [3], has emerged. By using a microscope or a long-working-distance (LWD) lens to shrink the projection and imaging field of view (FOV), micron-level measurement accuracy is achievable within a centimeter-level FOV. Early MFPP systems utilized the two sub-optical paths of a stereomicroscope for projection and imaging, respectively [4,5], by which sub-micron measurement accuracy could be achieved with a millimeter-level FOV and sub-millimeter depth of field (DOF). Such systems offer flexibly adjustable magnification, but their structure is bulky, complicated and inflexible. In contrast, LWD-lens-based MFPP is simpler and more flexible in system design, and is also amenable to multi-view system configurations [6]. LWD lenses can be further divided into non-telecentric perspective imaging lenses and telecentric affine imaging lenses. Recently, telecentric lenses have been extensively employed for high-accuracy measurement due to their remarkable advantages of low distortion and constant magnification. However, accurately calibrating the telecentric model is a challenging task, as the model is insensitive to depth changes along the optical axis. Liu et al. [7] proposed an analytical calibration method for the telecentric FPP system, which estimates the initial parameters using an affine camera factorization method and optimizes all parameters by bundle adjustment. In addition, to make full use of the limited DOF and expand the effective measurement FOV, the Scheimpflug principle has been adopted to make the focal planes of both light paths coincide at the object plane [8-10]. Hu et al. [11] applied a Scheimpflug telecentric lens in a stereo vision system and derived a concise imaging model, offering a valuable reference for system design in accurate microscopic 3D measurement applications.

Measuring surfaces with high dynamic range (HDR) and intricate structures is another challenge for FPP. In the captured images, overexposed pixels in high-reflection regions lead to data loss in the 3D reconstruction, while contrast deterioration of the projected fringes in low-reflection regions results in reduced accuracy. Methods such as multiple exposure and adaptive fringe projection have been proposed to deal with HDR surfaces [12-14]. However, these methods require collecting a large number of images, resulting in low measurement efficiency; they also work somewhat blindly, generally relying on empirical design. Zhu et al. [15] proposed a coding strategy that replaces the traditional Gray-code patterns with polarization-code patterns to assist the phase-shift patterns for phase unwrapping, effectively suppressing the phase edge-jump error caused by highlights. Wang et al. [16] proposed a double phase-shifting method, which reduces the saturation-induced phase error by fusing the wrapped phases of two sets of opposite phase-shifting fringe sequences. A more efficient approach is to use multi-view technology. If one camera is saturated, the others can compensate for the saturation error, since cameras viewing from different directions are unlikely to be saturated in the same area. The complementarity of multi-view information is thus exploited to achieve a more comprehensive measurement. He et al. [17] developed a binocular structured light system to obtain point clouds from different viewing angles to fill the missing data in occlusion and reflection regions. Our previous work [10,18] deployed one vertical telecentric imaging branch and four projection branches, reaching a measurement accuracy of better than 7 µm within a calibrated volume of 50 × 40 × 10 mm3. The use of telecentric imaging allows for higher measurement accuracy. However, multi-view projection must be done in a time-sharing manner, resulting in a time-consuming image acquisition process.

In this paper, the multi-view FPP system is architected with a vertical telecentric fringe projector and four oblique-viewing tilt-shift cameras. The entire FOV is completely illuminated by the top-down telecentric projector with no shadowed areas, and the use of tilt-shift cameras maximizes the usable measurement FOV. All cameras capture the fringe patterns simultaneously, providing a speed benefit over the sequential image acquisition required in designs with a single camera and obliquely mounted projectors. To calibrate this system accurately and flexibly, the pinhole imaging model is used to calibrate the telecentric projection, and the unified calibration is performed by bundle adjustment. The measurement principle is detailed in Section 2. System configuration and calibration are presented in Section 3. Experiments are analyzed in Section 4, and the conclusion is given in Section 5.

2. Principle

2.1 Phase shift profilometry

In standard N-step phase shift profilometry, fringe patterns are sequentially projected onto the measured surface, and the intensity distribution of the captured patterns can be represented as:

$${I_i}(x,y) = A(x,y) + B(x,y) \times \cos ({\varphi (x,y) + {{2\mathrm{\pi }i} / N}} )$$
where i is the phase-shift index, i = 0, 1, …, N−1; (x, y) represents the pixel coordinate; A(x, y) is the background intensity; B(x, y) is the modulation intensity, which is related to the fringe contrast and surface reflectivity; and φ(x, y) represents the wrapped phase to be solved. It can be calculated by Eq. (2), and at least three phase-shifted images are required for the phase calculation.
$$\varphi (x,y) = \arctan \left( {\frac{{\sum\limits_{i = 0}^{N - 1} {{I_i}({x,y} )\times \sin ({{{2\mathrm{\pi }i} / N}} )} }}{{\sum\limits_{i = 0}^{N - 1} {{I_i}({x,y} )\times \cos ({{{2\mathrm{\pi }i} / N}} )} }}} \right)$$
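As a concrete illustration, a minimal NumPy sketch of Eqs. (1)-(2) might look as follows; the function and variable names are ours, not from the paper.

```python
import numpy as np

def wrapped_phase(images):
    """Wrapped phase from N phase-shifted fringe images, per Eqs. (1)-(2).

    images: (N, H, W) array, frame i carrying a shift of 2*pi*i/N.
    A minimal sketch; variable names are illustrative.
    """
    N = images.shape[0]
    i = np.arange(N).reshape(-1, 1, 1)
    num = np.sum(images * np.sin(2 * np.pi * i / N), axis=0)
    den = np.sum(images * np.cos(2 * np.pi * i / N), axis=0)
    # Expanding Eq. (1) shows the sine sum equals -B*sin(phi)*N/2, hence the
    # minus sign; arctan2 extends the arctan of Eq. (2) to (-pi, pi].
    return np.arctan2(-num, den)
```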

To address the problem of phase ambiguity, a variety of phase unwrapping methods have been proposed, among which the multi-frequency heterodyne method has proven to be one of the most reliable [19]. Usually, three phase functions of different frequencies (f1 > f2 > f3) are used for heterodyne synthesis: f12 is synthesized from f1 and f2, f23 from f2 and f3, and finally f123 from f12 and f23. By selecting appropriate fringe numbers, the wrapped phases of f12 and f1 are sequentially unwrapped using the following equations.

$${\phi _{12}}(x) = {\varphi _{12}}(x) + 2\mathrm{\pi } \times \textrm{Round}\left( {\frac{{{{{\varphi_{123}}(x) \times {f_{12}}} / {{f_{123}}}} - {\varphi_{12}}(x)}}{{2\mathrm{\pi }}}} \right)$$
$${\phi _1}(x) = {\varphi _1}(x) + 2\mathrm{\pi } \times \textrm{Round}\left( {\frac{{{{{\phi_{12}}(x) \times {f_1}} / {{f_{12}}}} - {\varphi_1}(x)}}{{2\mathrm{\pi }}}} \right)$$
where φ1(x), φ2(x) and φ3(x) are the wrapped phase functions of frequencies f1, f2 and f3, respectively; φ12(x) and φ123(x) are the phase functions of the synthesized frequencies f12 and f123; ϕ12(x) and ϕ1(x) are the unwrapped phases of φ12(x) and φ1(x), respectively.
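A minimal sketch of this hierarchical unwrapping, under the assumption that the phase of the lowest synthetic frequency f123 is already absolute (a single fringe across the field); names are illustrative.

```python
import numpy as np

def unwrap_step(phi_high, phi_low_abs, freq_ratio):
    """One heterodyne unwrapping step in the form of Eqs. (3)-(4).

    phi_high: wrapped phase at the higher frequency.
    phi_low_abs: absolute phase at the lower (synthetic) frequency.
    freq_ratio: ratio of the two frequencies, e.g. f1 / f12.
    """
    k = np.round((phi_low_abs * freq_ratio - phi_high) / (2 * np.pi))
    return phi_high + 2 * np.pi * k

# Illustrative three-frequency cascade (our variable names):
# phi12_abs = unwrap_step(phi12, phi123, f12 / f123)   # Eq. (3)
# phi1_abs  = unwrap_step(phi1, phi12_abs, f1 / f12)   # Eq. (4)
```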

2.2 Scheimpflug imaging model

Due to the limitation of DOF, large areas of the image acquired by an oblique-viewing camera are defocused, and as the magnification increases, the area of clear imaging shrinks significantly. A tilt-shift camera introduces an angle between the optical principal plane of the lens and the imaging plane; according to the Scheimpflug principle, this effectively extends the usable DOF. As shown in Fig. 1, the image plane is tilted towards the optical axis, and the clearly focused object plane conjugate to the image plane is tilted as well. The conjugation relationship can be described as:

$${p_o} \cdot \tan \theta = {p_i} \cdot \tan \beta $$
where po is the object distance and pi is the image distance; θ is the inclination angle between the imaging plane and the principal plane of the lens, and β is the inclination angle between the object plane and the principal plane of the lens.

Fig. 1. Scheimpflug imaging model.

When the tilt angle between the image plane and the lens plane is small (usually less than 6 deg [20]), it can be incorporated into the pinhole imaging model as an additional distortion term to compensate for the tilt effect. To describe the Scheimpflug imaging model, we established a coordinate system based on the traditional pinhole imaging model. Considering the additional projection and rotation transformations introduced by the tilt angles (τx, τy) of the optical axis around the Xc and Yc axes, the model can be described as follows:

$${z^c}{\left[ {\begin{array}{{ccc}} {{u^c}}&{{v^c}}&1 \end{array}} \right]^\textrm{T}} = {{\boldsymbol K}^c} \times {{\boldsymbol K}^\tau } \times {{\boldsymbol R}^\tau } \times [{{\boldsymbol R}_w^c|{\boldsymbol t}_w^c} ]\times {\left[ {\begin{array}{{cccc}} {{x^w}}&{{y^w}}&{{z^w}}&1 \end{array}} \right]^\textrm{T}}$$
$${{\boldsymbol K}^c} = \left[ {\begin{array}{{ccc}} {{f_x}}&0&{{u_{c0}}}\\ 0&{{f_y}}&{{v_{c0}}}\\ 0&0&1 \end{array}} \right]$$
$${{\boldsymbol K}^\tau } = \left[ {\begin{array}{{ccc}} {\cos ({\tau_y})\cos ({\tau_x})}&0&{\sin ({\tau_y})\cos ({\tau_x})}\\ 0&{\cos ({\tau_y})\cos ({\tau_x})}&{ - \sin ({\tau_x})}\\ 0&0&1 \end{array}} \right]$$
$${{\boldsymbol R}^\tau } = \left[ {\begin{array}{{ccc}} {\cos ({\tau_y})}&{\sin ({\tau_y})\sin ({\tau_x})}&{ - \sin ({\tau_y})\cos ({\tau_x})}\\ 0&{\cos ({\tau_x})}&{\sin ({\tau_x})}\\ {\sin ({\tau_y})}&{ - \cos ({\tau_y})\sin ({\tau_x})}&{\cos ({\tau_y})\cos ({\tau_x})} \end{array}} \right]$$
where Kc is the projection matrix of the camera, fx and fy represent the image distance in pixels along the two axes, and (uc0, vc0) is the origin of the ideal image coordinate system; Kτ and Rτ are the projection and rotation matrices introduced by the tilt angles (τx, τy); ${\boldsymbol R}_w^c,{\boldsymbol t}_w^c$ are respectively the rotation matrix and translation vector from the world coordinate system to the camera coordinate system. Since the sines of the Scheimpflug angles are approximately zero, the matrix Kτ degenerates toward an identity matrix, which helps improve the calibration accuracy of the Scheimpflug distortion model.

In addition, the camera also exhibits imaging distortion, which can be represented by a Taylor series expansion about the distortion center.

$$\left\{ {\begin{array}{{c}} {{u^c} = {x_c}({1 + k_1^cr_c^2 + k_2^cr_c^4 + k_3^cr_c^6} )}\\ {{v^c} = {y_c}({1 + k_1^cr_c^2 + k_2^cr_c^4 + k_3^cr_c^6} )} \end{array}} \right.$$
where (xc, yc) is the ideal coordinate, (uc, vc) is the actual coordinate with distortion, $r_c^2 = x_c^2 + y_c^2$, $k_1^c,k_2^c,k_3^c$ are the distortion parameters to be solved. At this point, an imaging model for a single tilt-shift camera has been established, and the parameters that need to be calibrated mainly include: ${\boldsymbol R}_w^c,{\boldsymbol t}_w^c,{{\boldsymbol K}^c},{\tau _x},{\tau _y},k_1^c,k_2^c,k_3^c$.
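For illustration, a hedged sketch of this forward model follows; we assume the radial distortion of Eq. (10) acts on the ideal coordinates before the intrinsic mapping, one common convention that the paper leaves implicit.

```python
import numpy as np

def tilt_shift_project(X_w, K, R, t, tau_x, tau_y, k_radial):
    """Forward projection of the tilt-shift camera, Eqs. (6)-(10). A sketch."""
    cx, sx = np.cos(tau_x), np.sin(tau_x)
    cy, sy = np.cos(tau_y), np.sin(tau_y)
    K_tau = np.array([[cy * cx, 0.0,      sy * cx],
                      [0.0,     cy * cx, -sx],
                      [0.0,     0.0,      1.0]])        # Eq. (8)
    R_tau = np.array([[cy,  sy * sx, -sy * cx],
                      [0.0, cx,       sx],
                      [sy, -cy * sx,  cy * cx]])        # Eq. (9)
    x = K_tau @ R_tau @ (R @ X_w + t)   # tilt applied after [R|t], Eq. (6)
    x = x[:2] / x[2]                    # perspective division
    r2 = x @ x                          # radial distortion, Eq. (10)
    d = 1 + k_radial[0] * r2 + k_radial[1] * r2**2 + k_radial[2] * r2**3
    fx, fy, u0, v0 = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    return np.array([fx * x[0] * d + u0, fy * x[1] * d + v0])
```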

2.3 Telecentric projection model

The telecentric lens is used for parallel projection in order to eliminate shadows. It also helps to shrink the fringe patterns, achieving high-resolution, low-distortion projection. However, the telecentric lens only magnifies in the X and Y directions and is insensitive to depth in the Z direction, which means it has no projection principal point. Assuming a pixel (up, vp) on the DMD plane is projected into 3D space at the coordinate (xw, yw, zw), as shown in Fig. 2, the mathematical expression for the telecentric projection model is as follows:

$${\left[ {\begin{array}{{ccc}} {{u^p}}&{{v^p}}&1 \end{array}} \right]^\textrm{T}} = {{\boldsymbol M}^p} \times {\left[ {\begin{array}{{cccc}} {{x^w}}&{{y^w}}&{{z^w}}&1 \end{array}} \right]^\textrm{T}}$$
$${{\boldsymbol M}^p} = {{\boldsymbol K}^p} \times [{{\boldsymbol R}_w^p|{\boldsymbol t}_w^p} ]= \left[ {\begin{array}{{cccc}} {{m_{11}}}&{{m_{12}}}&{{m_{13}}}&{{m_{14}}}\\ {{m_{21}}}&{{m_{22}}}&{{m_{23}}}&{{m_{24}}}\\ 0&0&0&1 \end{array}} \right]$$
$${{\boldsymbol K}^p} = \left[ {\begin{array}{{cccc}} {{m_x}}&0&0&{{u_{p0}}}\\ 0&{{m_y}}&0&{{v_{p0}}}\\ 0&0&0&1 \end{array}} \right]$$
where Mp is the telecentric projection matrix, which combines the intrinsic matrix Kp and the extrinsic parameters $[{\boldsymbol R}_w^p|{\boldsymbol t}_w^p]$ of the telecentric projector, and mij are its elements; mx and my are the magnifications in pixels in the horizontal and vertical directions of the image, respectively.

Fig. 2. The projection model with bilateral telecentric lens.

Similarly, distortion correction is also necessary, although a telecentric lens has much smaller distortion than an ordinary lens. Since the distortion of telecentric lenses is mainly radial, ignoring the tiny higher-order terms, the relationship between the distorted image coordinate (up, vp) and the undistorted image coordinate (xp, yp) is expressed as follows:

$$\left\{ {\begin{array}{{c}} {{u^p} = {x_p}({1 + k_1^pr_p^2 + k_2^pr_p^4} )}\\ {{v^p} = {y_p}({1 + k_1^pr_p^2 + k_2^pr_p^4} )} \end{array}} \right.$$
where $r_p^2 = x_p^2 + y_p^2$ is the radial value, $k_1^p,k_2^p$ are the distortion parameters to be solved.
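A corresponding sketch of the affine projector model, applying the radial correction exactly as Eq. (14) is written (in practice the coordinates are usually centered on the principal point first):

```python
import numpy as np

def telecentric_project(X_w, M_p, k1, k2):
    """Affine projection of the telecentric projector, Eqs. (11)-(14).

    M_p is 3x4 with last row [0, 0, 0, 1], so the depth z_w drops out and
    no perspective division occurs.
    """
    x = M_p @ np.append(X_w, 1.0)    # x[2] == 1 by construction
    r2 = x[0]**2 + x[1]**2           # radial term of Eq. (14)
    d = 1 + k1 * r2 + k2 * r2**2
    return np.array([x[0] * d, x[1] * d])
```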

2.4 Multi-view FPP model

The traditional FPP model uses the 3D world coordinate system, which is built on the calibration target, as the measurement coordinate system. During system calibration, the world coordinate system changes with the position of the calibration target, and the transformation matrix changes accordingly. Choosing the transformation matrix of the calibration target at a single position to solve the equations introduces random errors. Here we instead use the camera coordinate system as the measurement coordinate system. The multi-view FPP model (see Fig. 3) can then be expressed as follows:

$${\left[ {\begin{array}{{ccc}} {{u^{cm}}}&{{v^{cm}}}&1 \end{array}} \right]^\textrm{T}} = {\boldsymbol N}_{3 \times 4}^{cm} \times {{\boldsymbol P}^{cm}}$$
$${\left[ {\begin{array}{{ccc}} {{u^p}}&{{v^p}}&1 \end{array}} \right]^\textrm{T}} = {\boldsymbol M}_{3 \times 4}^{pm} \times {{\boldsymbol P}^{cm}}$$
$${\boldsymbol N}_{3 \times 4}^{cm} = [{{{\boldsymbol K}^{cm}}|{\boldsymbol 0}} ]$$
where ${\boldsymbol N}_{3 \times 4}^{cm}$ is the projection matrix of the m-th camera, and ${\boldsymbol M}_{3 \times 4}^{pm}$ is the projection matrix of the projector in the m-th camera coordinate system. Kcm and Mpm are obtained from system calibration, and the 3D point Pcm = (xcm, ycm, zcm, 1)T in each camera coordinate system can then be obtained by solving Eqs. (15)-(17) simultaneously. It should be noted that, due to the use of a telecentric lens, the projector cannot be calibrated separately as in a traditional FPP system. This research proposes a method of using the cameras to calibrate the projector, which is described in detail in Section 3.2.
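To make the per-pixel reconstruction concrete, the following sketch solves Eqs. (15)-(17) for one camera pixel, assuming lens distortion has already been corrected; the closed form exploits the affine last row of Mpm.

```python
import numpy as np

def reconstruct_point(u, v, u_p, K_cm, M_pm):
    """Pixel-wise 3D reconstruction from Eqs. (15)-(17), as a sketch.

    The camera pixel (u, v) fixes a ray in the camera frame; the absolute
    phase supplies one projector coordinate u_p, and the first row of the
    affine matrix M_pm turns it into a linear constraint that fixes z.
    """
    xn = (u - K_cm[0, 2]) / K_cm[0, 0]   # normalized ray direction
    yn = (v - K_cm[1, 2]) / K_cm[1, 1]
    m = M_pm[0]                          # u_p = m11*x + m12*y + m13*z + m14
    z = (u_p - m[3]) / (m[0] * xn + m[1] * yn + m[2])
    return np.array([xn * z, yn * z, z])
```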

Fig. 3. Relationship between camera coordinate systems and projector coordinate system.

To register the point clouds from the four camera coordinate systems, the extrinsic matrices $[{\boldsymbol R}_i^{cm}|{\boldsymbol t}_i^{cm}]$ are used to calculate the relative position and orientation of the four cameras. Assuming the point clouds are aligned with the first camera coordinate system, the rotation matrix Rcm-to-c1 and translation vector tcm-to-c1 of the m-th camera coordinate system relative to the first camera coordinate system can be expressed as follows:

$$\left\{ {\begin{array}{{c}} {{{\boldsymbol R}^{cm - to - c1}} = {\boldsymbol R}_i^{c1} \times {{({{\boldsymbol R}_i^{cm}} )}^\textrm{T}}}\\ {{{\boldsymbol t}^{cm - to - c1}} = {\boldsymbol t}_i^{c1} - {{\boldsymbol R}^{cm - to - c1}} \times {\boldsymbol t}_i^{cm}} \end{array}} \right.,m = [{1,2,3,4} ]$$
where the subscript i denotes the i-th pose of the calibration target. Theoretically, the above equation can be solved by substituting the extrinsic matrix at any single calibration position. Here we instead substitute all the $[{\boldsymbol R}_i^{cm}|{\boldsymbol t}_i^{cm}]$, making full use of all calibration feature points, and compute the least-squares solution to obtain more robust estimates of Rcm-to-c1 and tcm-to-c1.
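The paper solves this least-squares problem over all feature points; the sketch below shows a simpler pose-averaging variant (one transform estimate per pose, then a chordal rotation mean), which conveys the same idea under Eq. (18).

```python
import numpy as np
from scipy.spatial.transform import Rotation

def register_to_cam1(R1_list, t1_list, Rm_list, tm_list):
    """Camera-m to camera-1 transform from per-pose extrinsics, Eq. (18)."""
    Rs, ts = [], []
    for R1, t1, Rm, tm in zip(R1_list, t1_list, Rm_list, tm_list):
        R = R1 @ Rm.T                 # Eq. (18), first line
        Rs.append(R)
        ts.append(t1 - R @ tm)        # Eq. (18), second line
    R_avg = Rotation.from_matrix(np.stack(Rs)).mean().as_matrix()
    t_avg = np.mean(ts, axis=0)
    return R_avg, t_avg
```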

3. System configuration and calibration

3.1 System design and configuration

The multi-view FPP system consists of a top-down fringe projector and four oblique-viewing cameras, as shown in Fig. 4(a). Since the projector is placed vertically and a bilateral telecentric lens is used for parallel projection, no region is shadowed from the fringe pattern (see Fig. 4(b)), and the spatial density of the projected fringes remains consistent at different depths. To make full use of the limited DOF and FOV, the cameras are equipped with tilt-shift lenses, which yields a larger common focus area (Fig. 4(c)) based on the Scheimpflug principle.

Fig. 4. Multi-view FPP system design. (a) System configuration diagram. (b) Eliminate shadow areas by parallel projection. (c) Extend common focus area by tilt-shift lens.

The four oblique cameras not only allow fringe images to be collected in all areas, including those near the bottom of tall features, but also perform complementary measurements of HDR surfaces by providing different viewpoints. Besides, compared with a system of one vertical imaging branch and four projection branches, which projects and acquires images from four directions in sequence, a system of one vertical projection branch and four imaging branches can collect fringe images simultaneously, reducing image acquisition time by at least a factor of four. This is a key advantage for an automated optical inspection (AOI) system running at production speed. Another advantage is that the oblique cameras can capture sidewall images of the structures on the measured surface, which is conducive to performing 2D and 3D inspection simultaneously.

Basically, two conditions need to be met for this system design: the imaging resolution must be higher than the projection resolution, and the imaging DOF must be larger than the projection DOF. The design steps of the multi-view FPP system are given as follows.

  • (i) Determine the lens parameters (i.e. magnification, focal length, DOF) of the oblique imaging and vertical projection according to the required FOV and working distance.
  • (ii) Adjust the tilt angle β between the object plane and the principal plane of the lens to achieve clear imaging in the full FOV.
  • (iii) Calculate the tilt angle θ between the imaging plane and the principal plane of the lens according to Eq. (5).
  • (iv) Estimate the 3D imaging DOF using Eq. (19), considering the FOV of the telecentric lens of the projector (a numerical sketch follows this list). As shown in Fig. 5, the distance e is determined by the acceptable circle of confusion, and the vertical offset limit Δ can be estimated as:
    $$\left\{ {\begin{array}{{c}} {\Delta \approx ({{p_1} - {p_2}} )\cos \beta }\\ {{p_1} = {{{p_o}} / {({1 - {s / {2aM}}} )}}}\\ {{p_2} = {{{p_o}} / {({1 + {s / {2aM}}} )}}} \end{array}} \right.$$
    where p1 and p2 are respectively the object distances at the far and near limits of the DOF, s is the pixel size of the camera, a is the radius of the entrance pupil, and M = pi/po is the vertical magnification of the lens. To maximize the measurement DOF, Δ is generally designed to be greater than the DOF of the projection.
  • (v) Select the cameras and projector with appropriate optical parameters.
  • (vi) Design and configure the direction and location of multi-view imaging branches to maximize the overlap of common focus area.
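As referenced in step (iv), a small sketch of Eq. (19) for the vertical offset limit; any values plugged in would be design choices, not numbers taken from the paper.

```python
import numpy as np

def vertical_offset_limit(p_o, s, a, M, beta_deg):
    """Vertical offset limit Delta of the tilted focal plane, Eq. (19).

    p_o: object distance; s: camera pixel size, taken as the acceptable
    circle of confusion; a: entrance-pupil radius; M = p_i / p_o;
    beta_deg: object-plane tilt angle in degrees.
    """
    p1 = p_o / (1 - s / (2 * a * M))   # far DOF limit
    p2 = p_o / (1 + s / (2 * a * M))   # near DOF limit
    return (p1 - p2) * np.cos(np.deg2rad(beta_deg))
```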

Fig. 5. Estimation of 3D imaging DOF under Scheimpflug condition.

A prototype of the multi-view system was built, as shown in Fig. 6(a). A projection module (DLP4710, resolution 1920 × 1080) combined with a bilateral telecentric lens (magnification 0.153, working distance 273 mm, DOF ±10.7 mm) serves as the projection branch, while four CMOS cameras (China Daheng (Group) Co., Ltd, ME2P-900-13GM-P-VF8, resolution 4200 × 2160) with tilt-shift lenses (vertical magnification 0.135) serve as the imaging branches. Each camera is installed on a platform capable of fine adjustment in three degrees of freedom to match the FOV and DOF of the others. The tilt angle β is designed as 30°, and the corresponding tilt angle θ is about 3.67°. The final projection FOV is about 70 × 40 mm2, and the imaging FOV of each camera is 90 × 50 mm2, which completely covers the projection FOV so as to make full use of the resolution of the projector.

Fig. 6. Multi-view FPP prototype system. (a) The photograph of the system. (b) A comparative experiment between perspective and parallel projection. (c) A comparative experiment between forward and tilt-shift imaging.

Figure 6(b) shows that a tall component is completely illuminated across the entire FOV with no shadowed areas under parallel projection, whereas perspective projection casts a clear shadow on the nearby low features. Figure 6(c) shows the imaging results for a binary fringe pattern. With forward imaging, only part of the FOV is imaged clearly and the rest is rather blurry. By contrast, with the tilt-shift imaging lens the fringe is imaged clearly throughout the whole FOV, with sharp, high-contrast edges, as shown in the profile plot of the selected fringe period.

3.2 System calibration

The introduction of tilt-shift imaging lenses and a telecentric projection lens significantly improves the performance of the FPP system, but it also complicates system calibration. Firstly, because the optical axis is not perpendicular to the image plane, the distortion model needs to be redefined, and there is currently no unified and complete descriptive model. Secondly, due to the insensitivity of the telecentric lens to axial depth changes, some extrinsic parameters are missing. Thirdly, the intrinsic and extrinsic parameters of a telecentric lens are naturally coupled and difficult to separate directly. To address these problems, a joint calibration method is proposed that makes full use of the established pinhole imaging model and Scheimpflug distortion model to calibrate the telecentric projection. The calibration steps are described in detail as follows.

Step 1: Image acquisition and preprocessing.

An asymmetric circle calibration plate is used as the calibrator. The calibrator is placed at no fewer than three different poses within the measurement volume, and the cameras collect a set of images at each pose. Each set includes a white-light projection image, used to extract the circle centers as feature points, and horizontal and vertical fringe patterns, used to solve the absolute phase at the feature points. It is necessary to denoise the images before solving the phase; a Gaussian filter is usually used to remove high-frequency and spurious noise.

Step 2: Tilt-shift camera calibration.

Extract the sub-pixel coordinates ${\boldsymbol p}_{ij}^{cm} = {(u_{ij}^{cm},v_{ij}^{cm})^T}$ of each circle center of the calibrator from the white-light projection images collected in Step 1 using a center-extraction algorithm, where i, j and m indicate the i-th pose and the j-th circle center of the calibrator in the m-th camera. With the calculated ${\boldsymbol p}_{ij}^{cm}$ at different poses, the closed-form solution of the m-th camera’s intrinsic matrix Kcm can be estimated using the calibration method given by Zhang [21]. The Scheimpflug distortion model described in Section 2.2 is used to calibrate the distortion coefficients ($k_1^c,k_2^c,k_3^c$) and Scheimpflug angles (τx, τy) via the Levenberg-Marquardt algorithm. Setting the initial guess of the distortion coefficients to zero and the Scheimpflug angles to their theoretical design values, the initial values of the extrinsic matrix $[{\boldsymbol R}_i^{cm}|{\boldsymbol t}_i^{cm}]$ are obtained.

Step 3: Distortion correction and extrinsic parameters optimization of the cameras.

As the distortion and manufacturing error of the imaging lens cannot be ignored, it is necessary to correct the imaging model before using it to calibrate the telecentric projector. With the obtained distortion coefficients, the captured circle centers are corrected by incorporating them into Eqs. (6)-(10). For the extrinsic matrix, we apply the perspective-n-point method to optimize the extrinsic parameters of the feature points at different poses.

Step 4: 3D coordinate estimation of the feature points.

The world coordinates ${(x_{ij}^w,y_{ij}^w,z_{ij}^w)^T}$ of the feature points can be transformed into the camera coordinate system ${(x_{ij}^{cm},y_{ij}^{cm},z_{ij}^{cm})^T}$ using the extrinsic matrix obtained in the previous step. The extrinsic matrix $[{\boldsymbol R}_i^{cm}|{\boldsymbol t}_i^{cm}]$ then becomes [I3 × 3 | 03 × 1], where I3 × 3 is the identity matrix and 03 × 1 is the zero vector. Here, the 3D feature points are converted into the camera coordinate system instead of the world coordinate system, which is difficult to determine in practice.

Step 5: Coordinate calculation of the feature points on DMD plane.

Since projection can be regarded as the inverse process of imaging, the feature points captured by the camera can be mapped to the projector DMD plane. The multi-frequency phase-shifting method described in Section 2.1 is used to solve the horizontal and vertical absolute phase values of the integer pixels near the circle centers of the feature points. The unique DMD coordinate ${\boldsymbol q}_{ij}^p = (u_{ij}^p,v_{ij}^p)$ corresponding to ${\boldsymbol p}_{ij}^{cm}$ can be solved as follows:

$$\left\{ {\begin{array}{{c}} {u_{ij}^p = \frac{{\phi_v^p({u_{ij}^{cm},v_{ij}^{cm}} )\cdot W}}{{2\pi {f_u}}}}\\ {v_{ij}^p = \frac{{\phi_h^p({u_{ij}^{cm},v_{ij}^{cm}} )\cdot H}}{{2\pi {f_h}}}} \end{array}} \right.$$
where $\phi _v^p(u_{ij}^{cm},v_{ij}^{cm})$ and $\phi _h^p(u_{ij}^{cm},v_{ij}^{cm})$ are respectively the vertical and horizontal absolute phase values at the circle center, fu and fh are respectively the horizontal and vertical fringe numbers, and W × H is the resolution of the projector. It is worth noting that the phase decoding accuracy is only at the pixel level; the local homography method [22] can be used to obtain sub-pixel coordinates of the DMD feature points.
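The phase-to-DMD mapping of Eq. (20) is a one-liner per axis; a sketch with our variable names:

```python
import numpy as np

def dmd_coordinate(phi_v, phi_h, W, H, f_u, f_h):
    """Map absolute phases at a feature point to DMD pixels, Eq. (20).

    phi_v / phi_h: vertical- and horizontal-fringe absolute phases sampled
    at the camera-side circle center; f_u, f_h: fringe counts across the
    projector width W and height H.
    """
    u_p = phi_v * W / (2 * np.pi * f_u)
    v_p = phi_h * H / (2 * np.pi * f_h)
    return u_p, v_p
```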

Step 6: Projection matrix calculation of the telecentric projector.

After determining the 3D coordinates Pcm of the feature points at each pose of the calibrator in Step 4, and the corresponding coordinates ${\boldsymbol q}_{ij}^{pm}$ on the DMD plane, the orthographic projection model of the telecentric projector (Eq. (11)) in the m-th camera coordinate system can be rewritten using the direct linear transform (DLT) as follows.

$$\left[ {\begin{array}{{c}} {u_{ij}^p}\\ {v_{ij}^p} \end{array}} \right] = {{\boldsymbol P}^{cm}} \times {{\boldsymbol M}^{pm}} = {\left[ {\begin{array}{{cccccccc}} {x_{ij}^{cm}}&{y_{ij}^{cm}}&{z_{ij}^{cm}}&1&0&0&0&0\\ 0&0&0&0&{x_{ij}^{cm}}&{y_{ij}^{cm}}&{z_{ij}^{cm}}&1 \end{array}} \right]_{2 \times 8}}{\left[ {\begin{array}{{c}} {m_{11}^{pm}}\\ {m_{12}^{pm}}\\ {m_{13}^{pm}}\\ {m_{14}^{pm}}\\ {m_{21}^{pm}}\\ {m_{22}^{pm}}\\ {m_{23}^{pm}}\\ {m_{24}^{pm}} \end{array}} \right]_{8 \times 1}}$$
where Mpm is the projection matrix of the telecentric projector in the m-th camera coordinate system. Substituting all the feature points into the above equation and using the singular value decomposition (SVD) method, the least-squares solution of Mpm can be obtained.
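A sketch of this DLT step; numpy.linalg.lstsq performs an SVD-based least-squares solve, playing the role of the SVD described here.

```python
import numpy as np

def solve_projection_matrix(P_cm, q_p):
    """Least-squares fit of the eight unknowns of M^pm, Eq. (21).

    P_cm: (n, 3) feature points in the camera frame; q_p: (n, 2) matched
    DMD coordinates.
    """
    n = P_cm.shape[0]
    Xh = np.hstack([P_cm, np.ones((n, 1))])   # homogeneous 3D points
    A = np.zeros((2 * n, 8))
    A[0::2, 0:4] = Xh                         # rows producing u^p
    A[1::2, 4:8] = Xh                         # rows producing v^p
    m, *_ = np.linalg.lstsq(A, q_p.reshape(-1), rcond=None)
    return np.vstack([m.reshape(2, 4), [0.0, 0.0, 0.0, 1.0]])
```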

Step 7: Camera-projector joint calibration.

The aim of camera-projector joint calibration is to minimize the re-projection errors of the telecentric projection and the tilt-shift imaging, so as to improve the calibration accuracy. The calibration is performed as follows.

$$\arg \min \sum\limits_{i = 1}^n {\sum\limits_{j = 1}^m {({{{\|{{\boldsymbol p}_{ij}^{cm} - \tilde{{\boldsymbol p}}_{ij}^{cm}({{{\boldsymbol K}^{cm}},{\boldsymbol R}_i^{cm},{\boldsymbol t}_i^{cm},{{\boldsymbol k}^{cm}}} )} \|}^2} + {{\|{{\boldsymbol q}_{ij}^p - \tilde{{\boldsymbol q}}_{ij}^p({{{\boldsymbol M}^{pm}},k_1^p,k_2^p} )} \|}^2}} )} }$$
where $\tilde{{\boldsymbol p}}_{ij}^{cm}({{\boldsymbol K}^{cm}},{\boldsymbol R}_i^{cm},{\boldsymbol t}_i^{cm},{{\boldsymbol k}^{cm}})$ are the reprojected coordinates of the feature points in the m-th image plane calculated according to Eqs. (6)-(10), ${{\boldsymbol k}^{cm}} = (k_1^c,k_2^c,k_3^c,\tau _x^{cm},\tau _y^{cm})$ is the distortion vector, and $\tilde{{\boldsymbol q}}_{ij}^p({{\boldsymbol M}^{pm}},k_1^p,k_2^p)$ is the projection of the feature point in the m-th camera coordinate system according to Eq. (11). Equation (22) can be solved by Levenberg-Marquardt nonlinear optimization, with all initial values taken from the previous steps.
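A skeletal sketch of how Eq. (22) could be minimized with an off-the-shelf Levenberg-Marquardt solver; pack, unpack, project_camera and project_projector are hypothetical helpers standing in for the models of Sections 2.2-2.3.

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(theta, pts_3d, pts_cam, pts_dmd):
    """Stacked residuals of Eq. (22); helper functions are hypothetical."""
    params = unpack(theta)                               # hypothetical
    r_cam = project_camera(params, pts_3d) - pts_cam     # Scheimpflug model
    r_prj = project_projector(params, pts_3d) - pts_dmd  # telecentric model
    return np.concatenate([r_cam.ravel(), r_prj.ravel()])

# res = least_squares(residuals, pack(init_params), method="lm",
#                     args=(pts_3d, pts_cam, pts_dmd))
```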

Following the above steps, the calibration of the established multi-view FPP system was conducted by capturing 10 sets of images of a 5 × 11 asymmetric circle calibration plate (see Fig. 7(a)) at different positions. Figures 7(b) and (c) show the reprojection errors of all feature points for the cameras and the projector, respectively. The reprojection errors exhibit circular, radial distributions, with error points becoming denser toward the center. According to the 3σ criterion, the reprojection errors in both the X and Y directions lie within about ±0.15 pixel. The root mean square (RMS) errors are 0.113 pixel and 0.141 pixel, respectively.

Fig. 7. Calibration results. (a) Calibration target (5 × 11 asymmetric circle array). (b) Reprojection error of the camera. (c) Reprojection error of the projector.

In addition, the calibration errors of the proposed method are compared with those of related works [10,23], as shown in Table 1. The proposed method greatly reduces the RMS and range of the reprojection error compared with Ref. [23], in which the extrinsic parameters of the telecentric model are taken from the pinhole model. In fact, the intrinsic and extrinsic parameters of the telecentric model are naturally coupled, which increases the uncertainty of system calibration. In our previous work [10], which used a Scheimpflug projection and telecentric imaging architecture, the DMD mapping error is amplified in camera calibration, and even after nonlinear optimization, the distortion of the projector has a significant impact on 3D reconstruction. In contrast, the proposed calibration method for a telecentric projection and Scheimpflug imaging system effectively avoids the impact of camera-projector point-matching errors on projector calibration, and does not require multiple optimization passes, making it more flexible and simple.

Table 1. Comparison of reprojection error of the proposed method and related works.

4. Experiment and discussion

4.1 Verification of the measurement volume

The measurement volume is determined by the intersection of the projection and imaging FOVs and the overlap of the projection and imaging DOFs. The measurable FOV is relatively easy to evaluate, as it is simply the size of the projection area in our system design (70 × 40 mm2, see Section 3.1). To evaluate the measurable DOF of the system, binary fringes projected at different depths are captured and their contrast is calculated. A white ceramic plate no smaller than the measurement FOV is placed parallel to the X-Y plane of the projector coordinate system and then translated along the Z-axis in 30 steps of 1 mm each. Since the image contrast decreases sharply when the plate moves out of the DOF, we use the Brenner function as the image clarity metric and plot the contrast curves of the images captured by the four cameras, as shown in Fig. 8, where the black dashed line denotes half of the maximum contrast.
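For reference, a minimal sketch of the Brenner clarity function used here (the two-pixel squared-difference form; details such as gradient direction and normalization vary in the literature):

```python
import numpy as np

def brenner(img, step=2):
    """Brenner clarity measure: sum of squared intensity differences over a
    two-pixel horizontal step. It drops sharply once the plate leaves the
    DOF, so the FWHM of its curve versus depth gives the measurable DOF.
    """
    img = img.astype(np.float64)
    return np.sum((img[:, step:] - img[:, :-step]) ** 2)
```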

Fig. 8. Contrast curves of the captured images from the four perspectives.

It can be seen that the DOFs of the four perspectives almost overlap, each spanning around 20 mm. Taking the full width at half maximum (FWHM) of the contrast curve as the measurable DOF, the measurable volume of the established multi-view FPP system is 70 × 40 × 20 mm3.

4.2 Measurement accuracy and repeatability analysis

Referring to the VDI/VDE 2634 standard, the accuracy and repeatability of the established multi-view FPP system are evaluated based on spatial measurement errors of spheres. Two standard ceramic spheres with a diameter of 5.9995 mm and a center distance of 15.0445 mm, as shown in Fig. 9(a), are measured 10 times at different positions in the measurement volume. Figure 9(b) illustrates the 3D point clouds obtained from one of the viewpoints and merged from all four viewpoints. The multi-view fusion point cloud presents a much more complete spherical shape, while the single-view point cloud shows a large amount of missing data.

Fig. 9. 3D point cloud of the standard ceramic spheres. (a) Standard ceramic spheres. (b) The obtained 3D point clouds from one of the viewpoints and merging from four viewpoints.

The diameters and center distance of the two spheres are calculated by fitting the 3D point clouds, as shown in Fig. 10. Table 2 records the mean absolute error (MAE) and standard deviation (STD) of the measured values: the MAE of the distance measurement is about 1.8 μm with an STD of 3.8 μm. The values measured from the multi-view fusion point clouds are closer to the reference values and vary less than each single-view result, indicating that the multi-view FPP system achieves higher accuracy and precision.

Fig. 10. Measured values of the standard ceramic spheres. (a) Diameter of sphere A. (b) Diameter of sphere B. (c) Center distance.

Table 2. Statistical values of the standard sphere measurements (units: μm).

To further verify the accuracy of height measurement, five standard blocks (with reference height values of 1.002 mm, 3.000 mm, 6.004 mm and 10.000 mm, and a deviation of ±0.001 mm) are measured (see Fig. 11), and the relative heights are calculated after plane fitting. Table 3 summarizes 10 sets of measurement data for each step height; the MAE and STD are less than 10 μm.

Fig. 11. 3D measurement of the standard blocks. (a) One of the fringe images modulated by five standard blocks. (b) Point cloud of five standard blocks.

Table 3. Statistical values of the step height measurements (units: μm).

4.3 Measurement of HDR surfaces with intricate structures

To evaluate the feasibility of the proposed multi-view FPP system for HDR surfaces with intricate structures, a shiny aluminum alloy part is measured, as shown in Fig. 12. The DLP projector projects multi-frequency phase-shifting fringe patterns onto the measured surface and synchronously triggers the four CMOS cameras to capture images of the object. Figures 12(a)-(d) show the first fringe image captured by each of the four cameras. Local saturation areas appear in all four images, but their positions differ because of the different viewpoints. For the fringe images captured by each camera, the three-frequency, four-step phase-shifting algorithm is used to calculate the absolute phase distribution of the measured surface. By thresholding the background intensity, the erroneous phase in the saturated areas of each phase map is removed, as shown in Fig. 12(e)-(h), and compensated with the phase of the corresponding area from another viewpoint that is free of saturation. Based on the multi-view FPP model described in Section 2.4, the 3D point coordinates are calculated pixel by pixel to obtain the point cloud of the measured object, as shown in Fig. 12(i); Fig. 12(j) shows the 3D reconstruction result. The experimental results show that the overexposure-induced data loss under a single view is effectively compensated by fusing multi-view information. The complete 3D contour of the measured object is obtained, with high reconstruction quality and good detail resolution.
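A hedged sketch of the per-view validity masking that precedes fusion; the thresholds and the modulation test are illustrative stand-ins for the paper's background-threshold rule.

```python
import numpy as np

def valid_mask(images, sat_level=250, min_modulation=5.0):
    """Mask out saturated or weakly modulated pixels before multi-view fusion.

    images: (N, H, W) phase-shifted frames from one camera. Threshold
    values are illustrative, not taken from the paper.
    """
    saturated = np.any(images >= sat_level, axis=0)
    # Modulation B recovered from the phase-shift sums (same sums as Eq. (2)).
    N = images.shape[0]
    i = np.arange(N).reshape(-1, 1, 1)
    s = np.sum(images * np.sin(2 * np.pi * i / N), axis=0)
    c = np.sum(images * np.cos(2 * np.pi * i / N), axis=0)
    B = 2.0 / N * np.sqrt(s**2 + c**2)
    return (~saturated) & (B > min_modulation)
```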

Fig. 12. Measurement result of a shiny aluminum alloy part. (a)-(d) Fringe images captured by four cameras separately. (e)-(h) Phase maps from four camera perspectives. (i) 3D point cloud of the part. (j) 3D reconstruction result.

To further verify the feasibility of the established system for 3D AOI, a printed circuit board assembly (PCBA) with many electronic components is tested, as shown in Fig. 13(a). It is a typical HDR surface: saturated pixels in the high-reflection areas lead to data loss in the 3D reconstruction, while contrast deterioration of the projected fringes in dark areas reduces the accuracy of the phase measurement. With the multi-view FPP system, light reflected from a shiny area may saturate pixels in one camera, but it is much weaker from the other viewpoints; the same holds for measurements in dark areas. The depth map and 3D reconstruction result in Fig. 13(b) show that almost all components are fully reconstructed with fine detail. Figures 13(c) and (d) are respectively the reconstructed point clouds from a single view and from the fusion of four viewpoints, corresponding to the area marked with a yellow rectangle in Fig. 13(a). A large amount of useful data is missing in Fig. 13(c) due to occlusion, overexposure and dark surfaces; this is significantly improved by multi-view data fusion. Moreover, a ball grid array (BGA) chip is measured, and the 3D reconstruction result is shown in Fig. 14. The details show that the highly reflective solder bumps are clearly and completely profiled. These experiments prove that the established multi-view FPP system can provide complete 3D data for HDR surfaces with intricate structures.

Fig. 13. Measurement result of a PCBA. (a) Image of the PCBA. (b) 3D reconstruction result. (c) 3D point cloud from single view. (d) Data fusion of four viewpoints.

Fig. 14. Measurement result of a BGA chip.

5. Conclusion

In this paper, a multi-view FPP system is presented to address the shortcomings in the measurement of HDR surfaces with intricate structures. The projector is equipped with a telecentric lens for top-down parallel projection, so as to eliminate the shadowing that occurs near the bottom of tall surface features. Meanwhile, four oblique tilt-shift cameras are used to enlarge the measurable volume and perform complementary measurements of complex HDR surfaces. Since the saturated or dark areas vary in position across viewpoints, the resulting erroneous and missing data can be compensated by the corresponding data from the other views. To achieve flexible and accurate calibration of an FPP system involving a telecentric lens, we use the established pinhole model to calibrate the telecentric projection, fully accounting for the correction and error optimization of the tilt-shift cameras. The calibration results show that the reprojection errors of the cameras and projector lie within about ±0.15 pixel. The accuracy and precision verification experiments demonstrate that the maximum measurement bias and standard deviation do not exceed 10 μm within a measurement volume of 70 × 40 × 20 mm3. The 3D reconstruction experiments on a shiny aluminum alloy part, a PCBA and a BGA chip prove that the developed multi-view FPP system can achieve complete, accurate and fast 3D measurement of HDR surfaces with complex structures. Further improvements can be made by setting different exposure times for each camera to enhance the HDR capability of the system.

Funding

National Natural Science Foundation of China (52305584, 52225507, U2341275, U23B6005).

Acknowledgment

The authors would like to thank the National Natural Science Foundation of China for its support.

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. M. Abd Al Rahman and A. Mousavi, “A review and analysis of automatic optical inspection and quality monitoring methods in electronics industry,” IEEE Access 8, 183192–183271 (2020). [CrossRef]  

2. G. Zhang, S. Yang, P. Hu, et al., “Advances and prospects of vision-based 3D shape measurement methods,” Machines 10(2), 124 (2022). [CrossRef]  

3. Y. Hu, Q. Chen, S. Feng, et al., “Microscopic fringe projection profilometry: A review,” Opt. Lasers Eng. 135, 106192 (2020). [CrossRef]  

4. R. Windecker, M. Fleischer, and H. J. Tiziani, “Three-dimensional topometry with stereo microscopes,” Opt. Eng. 36(12), 3372–3377 (1997). [CrossRef]  

5. D. Wang, N. Yan, H. Liu, et al., “Study on microscopic fringe structured light measurement method for tiny and complex structural components,” Appl. Phys. B 128(11), 207 (2022). [CrossRef]  

6. M. Wang, Y. Yin, D. Deng, et al., “Improved performance of multi-view fringe projection 3D microscopy,” Opt. Express 25(16), 19408–19421 (2017). [CrossRef]  

7. H. Liu, H. Lin, and L. Yao, “Calibration method for projector-camera-based telecentric fringe projection profilometry system,” Opt. Express 25(25), 31492–31508 (2017). [CrossRef]  

8. Y. Yin, M. Wang, B. Z. Gao, et al., “Fringe projection 3D microscopy with the general imaging model,” Opt. Express 23(5), 6846–6857 (2015). [CrossRef]  

9. C. Steger, “A comprehensive and versatile camera model for cameras with tilt lenses,” Int. J. Comput. Vis. 123(2), 121–159 (2017). [CrossRef]  

10. H. Deng, P. Hu, G. Zhang, et al., “Accurate and flexible calibration method for a 3D microscopic structured light system with telecentric imaging and Scheimpflug projection,” Opt. Express 31(2), 3092–3113 (2023). [CrossRef]  

11. Y. Hu, Z. Liang, S. Feng, et al., “Calibration and rectification of bi-telecentric lenses in Scheimpflug condition,” Opt. Lasers Eng. 149, 106793 (2022). [CrossRef]  

12. Z. Zhang, J. Yu, N. Gao, et al., “Three-dimensional shape measurement techniques of shiny surfaces,” Infrared and Laser Engineering 49(3), 303006 (2020). [CrossRef]  

13. L. Zhang, Q. Chen, C. Zuo, et al., “Real-time high dynamic range 3D measurement using fringe projection,” Opt. Express 28(17), 24363–24378 (2020). [CrossRef]  

14. J. Wang and Y. Yang, “A new method for high dynamic range 3D measurement combining adaptive fringe projection and original-inverse fringe projection,” Opt. Lasers Eng. 163, 107490 (2023). [CrossRef]  

15. Z. Zhu, M. Li, F. Zhou, et al., “Stable 3D measurement method for high dynamic range surfaces based on fringe projection profilometry,” Opt. Lasers Eng. 166, 107542 (2023). [CrossRef]  

16. J. Wang and Y. Yang, “An efficient high dynamic range 3D shape reconstruction method based on double phase-shifting profilometry,” Meas. Sci. Technol. 35(2), 025028 (2024). [CrossRef]  

17. K. He, C. Sui, C. Lyu, et al., “3D reconstruction of objects with occlusion and surface reflection using a dual monocular structured light system,” Appl. Opt. 59(29), 9259–9271 (2020). [CrossRef]  

18. H. Deng, Y. Liu, G. Zhang, et al., “Multi-angle Scheimpflug projection 3D microscope: Design, calibration, and three-dimensional reconstruction,” Measurement 222, 113609 (2023). [CrossRef]  

19. C. Zuo, S. Feng, L. Huang, et al., “Phase shifting algorithms for fringe projection profilometry: A review,” Opt. Lasers Eng. 109, 23–59 (2018). [CrossRef]  

20. A. Legarda, A. Izaguirre, N. Arana, et al., “Comparison and error analysis of the standard pin-hole and Scheimpflug camera calibration models,” in Proceedings of IEEE Conference on Electronics, Control, Measurement, Signals & Their Application to Mechatronics (IEEE, 2013), pp. 1–6.

21. Z. Zhang, “A flexible new technique for camera calibration,” IEEE Trans. Pattern Anal. Machine Intell. 22(11), 1330–1334 (2000). [CrossRef]  

22. D. Moreno and G. Taubin, “Simple, accurate, and robust projector-camera calibration,” in Second International Conference on 3D Imaging, Modeling, Processing, Visualization & Transmission (3DIMPVT) (IEEE, 2012), pp. 464–471.

23. T. Cheng, X. Liu, L. Qin, et al., “A practical micro fringe projection profilometry for 3-D automated optical inspection,” IEEE Trans. Instrum. Meas. 71, 1–13 (2022). [CrossRef]  
