Optica Publishing Group

Accurate and flexible calibration method for a 3D microscopic structured light system with telecentric imaging and Scheimpflug projection

Open Access

Abstract

3D imaging and metrology of complex micro-structures is a critical task for precision manufacturing and inspection. In this paper, an accurate and flexible calibration method for a 3D microscopic structured light system with telecentric imaging and a Scheimpflug projector is proposed. Firstly, a fringe projection 3D microscopy (FP-3DM) system consisting of a telecentric camera and a Scheimpflug projector is developed, which takes full advantage of the depth of field (DOF) and increases the measurement depth range. Secondly, an accurate and flexible joint calibration method is proposed for the developed system, which utilizes the established pinhole imaging model and Scheimpflug distortion model to calibrate the telecentric imaging, and fully considers the correction and error optimization of the Scheimpflug projection model. Meanwhile, an optimized local homography is calculated to obtain more accurate sub-pixel correspondence between the camera and the projector, and the perspective-n-point (PnP) method makes the 3D coordinate estimation of the feature points more accurate. Finally, a prototype and a dedicated calibration program are developed to realize high-resolution and high-precision 3D imaging. The experimental results demonstrate that the re-projection error is less than 1 µm, and the 3D repeated measurement error based on feature fitting is less than 4 µm, within the calibrated volume of 10(H) mm × 50(W) mm × 40(D) mm.

© 2023 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

With recent advances in precision manufacturing and semiconductor manufacturing, 3D imaging and metrology of complex micro-structures has become increasingly important [1]. Many non-contact optical methods have been studied to measure the morphology of tiny objects, such as confocal microscopy [2,3], white-light interference microscopy [4,5] and fringe projection 3D microscopy (FP-3DM) [6,7]. Each technique has its own advantages and limitations in different applications. Confocal microscopy and white-light interference microscopy offer high 3D measurement accuracy, but they require expensive optical equipment and complex data processing algorithms, and are especially unsuitable for harsh industrial environments. In contrast, fringe projection profilometry (FPP) can be widely used in complex industrial environments, owing to its insensitivity to changes in background, contrast and noise [8].

Traditional FP-3DM prototypes normally follow one of two system frameworks. One improves one channel of a stereo microscope through different projection technologies. Due to the limitations of the objective lens aperture and the imaging principle, the field of view (FOV) and depth of field (DOF) are both severely limited to the sub-millimeter order [9,10]. The other uses a non-telecentric lens with long working distance (LWD) [6,7,11]. However, in this type of FP-3DM system, the limited FOV usually results in a small DOF, which is insufficient to measure 3D objects with height variations of several millimeters. To improve both the FOV and DOF of microscopic 3D measurements, telecentric lenses have been applied to FP-3DM in recent years due to their favorable properties, such as orthographic projection, low distortion, and constant magnification over a specific distance range [12]. Liu et al. [13] and Li et al. [14] used telecentric lenses for imaging and projection, respectively, to measure the 3D surface of micro-components. Hu et al. proposed a microscopic telecentric stereo vision system, which uses the principle of stereo structured light to acquire 3D data of micro targets through two telecentric cameras [15]. Rao et al. [16] and Li et al. [17] achieved complete 3D reconstruction using a telecentric lens and a small-FOV projector. Almost all FP-3DM systems using telecentric lenses consist of a projection branch and an imaging branch with an angle between them. As a result, only a small portion of the projected fringes is in focus while the rest of the area remains defocused. In addition, the limited DOF of the telecentric lens further reduces the common focus area between the projector and the camera.

This research developed a novel Scheimpflug projection FP-3DM system, consisting of a camera with a telecentric lens and a small FOV pinhole projector using Scheimpflug conditions. Based on the Scheimpflug principle, a special angle is introduced between the DMD chip and the projection lens (the projection lens is tilted relative to the DMD plane), which will result in a larger common focus area [18].

However, system calibration for Scheimpflug projection FP-3DM is not straightforward. The orthographic imaging of the telecentric lens makes it insensitive to depth changes along the optical axis, and there is currently no unified, mature calibration method. The Scheimpflug projection also lacks a unified description model. Zhu et al. proposed using a camera with a telecentric lens and a speckle projector for digital image correlation (DIC) deformation measurement [19]. Their system uses a high-precision displacement device and a polynomial fitting method to calibrate the Z direction. However, system calibration using a high-precision displacement device is often difficult (e.g. moving exactly perpendicular to the Z-axis) and expensive in practice. To improve the flexibility and accuracy of calibration, Li et al. [16] and Li et al. [14,20] formulated the orthographic projection of the telecentric lens as an intrinsic and an extrinsic matrix, and successfully applied the model to a structured light (SL) system with double telecentric lenses. However, due to strong constraints (such as the orthogonality of the rotation matrix), it is difficult for this method to achieve high-precision calibration of the external parameters. In addition, the magnification and external parameters are naturally coupled and difficult to separate, which further complicates the calibration process and increases its uncertainty. Mei et al. used the commercial software Halcon to calibrate an SL system with telecentric stereo vision based on the Scheimpflug condition, but the closed-source nature of commercial software limits the further application of this method [21]. Yin et al. proposed using a camera and projector with long working distance (LWD) lenses for 3D reconstruction of micro targets, with the camera adopting the Scheimpflug principle to increase the common focus area [22]. However, the FOV and DOF of this system are relatively small (calibration volume 0.5(H) mm × 5(W) mm × 4(D) mm) and the magnification changes greatly within the DOF, which is not sufficient to measure 3D targets with height variations of several millimeters. In addition, they also introduced a general imaging model to calibrate the FP-3DM system under Scheimpflug conditions. However, because there are only three calibration target poses with strong geometric constraints, and the parameter calibration process is complex, it is difficult for this method to achieve high-precision calibration of the system.

This research proposes a joint calibration method for the Scheimpflug projection FP-3DM system, which makes full use of the established pinhole imaging model and Scheimpflug distortion model to calibrate the telecentric imaging. Firstly, the Scheimpflug angle is added to the pinhole imaging model as an additional distortion coefficient to compensate for the tilt effect, so that the projector's intrinsic parameters and distortion parameters are accurately solved. Then, using the calibrated projector, the 3D coordinates of the feature points for telecentric imaging calibration are optimally estimated by the perspective-n-point (PnP) method. Finally, the Levenberg-Marquardt algorithm is used to simultaneously minimize the re-projection errors of the telecentric imaging and the Scheimpflug projector to improve the calibration accuracy. In addition, a sub-pixel mapping strategy based on local homography is adopted to reduce the mapping error between the camera and the projector, which further reduces the calibration error of the projector and the error caused by camera distortion. Since both the projector and the camera use the same set of feature points, the entire system calibration can be performed simultaneously. Since the calibration target poses can be placed arbitrarily to calibrate the whole system, the proposed joint calibration method is highly flexible.

2. Principle

2.1 Telecentric imaging model

This research chooses a telecentric lens for imaging to realize high-precision, low-distortion microscopic 3D measurement. The camera imaging model with a bilateral telecentric lens is shown in Fig. 1. Basically, the telecentric lens only magnifies in the X and Y directions, and is insensitive to depth in the Z direction. By analyzing the light transmission matrix of such an optical system, for an object point q(xc, yc, zc)T in camera coordinates, the homogeneous image coordinates qc(uc, vc)T can be described as follows:

$$\left[ {\begin{array}{c} {{q^c}}\\ 1 \end{array}} \right] = \left[ {\begin{array}{ccc} {mx}&0&{u_0^c}\\ 0&{my}&{v_0^c}\\ 0&0&1 \end{array}} \right] \cdot \left[ {\begin{array}{c} {{x^c}}\\ {{y^c}}\\ 1 \end{array}} \right] = {K^c} \cdot \left[ {\begin{array}{c} {{x^c}}\\ {{y^c}}\\ 1 \end{array}} \right]$$
where Kc is the intrinsic matrix of the telecentric camera, and mx and my are the effective magnifications in the X and Y directions of the telecentric lens, respectively. For an ideal telecentric camera, $u_0^c$ and $v_0^c$ can be set as zeros. The relationship between the world coordinate system ${q^w}$ (xw, yw, zw)T and the camera coordinate system is as follows:
$$q = R_{3 \times 3}^c \cdot {q^w} + T_{3 \times 1}^c$$
where Rc3 × 3 = [rcij] is the rotation matrix and Tc3 × 1= [tcx tcy tcz]T is the translation vector. Combining Eqs. (1) and (2), the orthogonal projection imaging formula of the telecentric lens from the 3D object point to the 2D camera image point is:
$$\left[ \begin{array}{l} {q^c}\\ 1 \end{array} \right] = {K^c} \cdot \left[ {\begin{array}{cc} {R_{2 \times 3}^c}&{T_{2 \times 1}^c}\\ {{0_{1 \times 3}}}&1 \end{array}} \right]\left[ {\begin{array}{c} {{q^w}}\\ 1 \end{array}} \right] = {M^c} \cdot \left[ {\begin{array}{c} {{q^w}}\\ 1 \end{array}} \right]$$
where $M_{3 \times 4}^c$ = [mcij] is the telecentric imaging projection matrix and mcij is the projection matrix parameter.
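As a numerical illustration of Eqs. (1)–(3), the following minimal NumPy sketch (function and variable names and values are illustrative, not from the paper) projects a world point through the orthographic telecentric model; translating the point along the optical axis leaves its image coordinates unchanged.

```python
import numpy as np

def telecentric_project(q_w, K_c, R_c, T_c):
    """Project a 3D world point with a bilateral telecentric camera
    (Eq. (3)): only the first two rows of [R|T] enter the model, so
    depth along the optical axis does not change the image point."""
    # Orthographic extrinsics: 2x3 rotation block and 2x1 translation.
    RT = np.vstack([np.hstack([R_c[:2, :], T_c[:2].reshape(2, 1)]),
                    [0.0, 0.0, 0.0, 1.0]])
    x_c, y_c, _ = RT @ np.append(np.asarray(q_w, float), 1.0)
    u, v, _ = K_c @ np.array([x_c, y_c, 1.0])
    return float(u), float(v)

# Toy intrinsics: mx = my = 20, principal point at the origin.
K_c = np.array([[20.0, 0.0, 0.0],
                [0.0, 20.0, 0.0],
                [0.0, 0.0, 1.0]])
R_c, T_c = np.eye(3), np.zeros(3)
print(telecentric_project([1.0, 2.0, 0.0], K_c, R_c, T_c))  # (20.0, 40.0)
print(telecentric_project([1.0, 2.0, 5.0], K_c, R_c, T_c))  # same point
```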


Fig. 1. The camera imaging model with bilateral telecentric lens: the telecentric lens performs parallel projection and there is no projection center.


2.2 Scheimpflug projection model

Almost all FP-3DM systems using telecentric lenses consist of a projection branch and an imaging branch with an angle between them. However, in such FP-3DM systems, the limited FOV usually results in a small DOF. As shown in Fig. 2, the imaging and projection areas are perpendicular to the optical axes of the telecentric lens and the projector respectively, which means that there is an angle between the planar focus areas of telecentric imaging and projection; only a small portion of the projected fringes is in focus while the rest of the area remains defocused. Therefore, the common focus area is severely restricted. A special angle is introduced between the DMD chip and the projection lens (i.e. the projection lens is tilted relative to the DMD plane), which results in a larger common focus area [18], as shown in Fig. 3. The orientation and position of the DMD chip and projection lens are carefully assembled so that the DMD plane, the principal plane of the projection lens, and the object plane form a Scheimpflug intersection. Meanwhile, the imaging branch is set perpendicular to the projection plane. Therefore, the imaging DOF and the projected DOF can be adjusted to overlap to obtain a larger common focus area. The tilt angle β of the object plane and the tilt angle θ of the projection lens satisfy the following relationship [23].

$$\theta \textrm{ = }\arctan (M \cdot \tan (\beta ))$$
where M is the nominal vertical magnification along the optical axis, which conforms to the Gaussian imaging formula. Equation (4) and Fig. 3 also show that the angle β of the projection and imaging branches can be optimally designed according to the FP sensitivity, and the tilt angle θ can then be determined according to the FOV range.
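Equation (4) is a one-liner; the values below are purely illustrative, not the system's design parameters.

```python
import math

def scheimpflug_tilt(M, beta_deg):
    """Lens tilt angle theta (degrees) from Eq. (4), given the nominal
    vertical magnification M and the object-plane tilt beta (degrees)."""
    return math.degrees(math.atan(M * math.tan(math.radians(beta_deg))))

# At unit magnification the lens tilt equals the object-plane tilt;
# a larger M calls for a larger tilt.
print(round(scheimpflug_tilt(1.0, 10.0), 6))  # 10.0
print(scheimpflug_tilt(2.0, 10.0) > 10.0)     # True
```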


Fig. 2. Traditional FP-3DM system configuration diagram with telecentric lens. (The common focus area between the imaging DOF and the projected DOF of the traditional FP-3DM system is severely limited).



Fig. 3. Scheimpflug projection FP-3DM system configuration diagram. (The DMD plane, the principal plane of the projection lens, and the object plane form a Scheimpflug intersection. The imaging DOF and projected DOF can be adjusted to overlap to obtain a larger common focus area under the Scheimpflug projection condition).


The projector model can be described by the well-known pinhole imaging model, which is illustrated in Fig. 4. The projection from 3D object points ${p^w}$ (xw, yw, zw)T to 2D projector DMD chip pp(up, vp)T can be expressed as follows:

$$s\left[ \begin{array}{l} {p^p}\\ 1 \end{array} \right] = {K^p}[{R_{3 \times 3}^p|T_{3 \times 1}^p} ]\cdot \left[ {\begin{array}{c} {{p^w}}\\ 1 \end{array}} \right] = {M^p} \cdot \left[ {\begin{array}{c} {{p^w}}\\ 1 \end{array}} \right]{\kern 1pt}$$
$${K^p} = \left[ {\begin{array}{ccc} {fx}&0&{u_0^p}\\ 0&{fy}&{v_0^p}\\ 0&0&1 \end{array}} \right]$$
where s is the scale factor, Kp is the intrinsic matrix of the projector, fx and fy are the effective focal lengths of the projector in the u and v directions respectively, ${(u_0^p,v_0^p)^T}$ are the coordinates of the principal point; Rp3 × 3 =[rpij] and Tp3 × 1= [tpx tpy tpz]T are the rotation matrix and translation vector of the projector coordinate system (Xp, Yp, Zp)T relative to the world coordinate system (Xw, Yw, Zw)T, $M_{3 \times 4}^p$=[mpij] is the projector projection matrix and mpij is a projection matrix parameter.


Fig. 4. Model of pinhole projector imaging. (Under the Scheimpflug condition, the relative rotation of the projection lens and the DMD chip results in perspective distortion, which can be expressed as a homography transformation between the ideal DMD plane and the Scheimpflug DMD plane).


The pinhole model represented by Eq. (5) describes a linear imaging process. In practice, however, the imaging process is nonlinear, because lens distortion changes the direction of the incident light. Ideally, the projection of (xp, yp, zp)T onto the normalized plane is:

$$\left[ {\begin{array}{c} {{x_{norm}}}\\ {{y_{norm}}} \end{array}} \right]\textrm{ = }\left[ {\begin{array}{c} {{x^p}/{z^p}}\\ {{y^p}/{z^p}} \end{array}} \right]$$

Lens distortion shifts the ideal undistorted image coordinates ${\overrightarrow x _{norm}}$= (xnorm, ynorm)T, thus affecting the imaging process. This paper introduces second-order radial and tangential distortion, which can be expressed as:

$$\left[ {\begin{array}{c} {x_{dist}^{u1}}\\ {y_{dist}^{v1}} \end{array}} \right] = \left[ {\begin{array}{c} {{x_{norm}}(1 + {k_1}{r^2} + {k_2}{r^4}) + 2{p_1}{x_{norm}}{y_{norm}} + {p_2}({r^2} + 2x_{norm}^2)}\\ {{y_{norm}}(1 + {k_1}{r^2} + {k_2}{r^4}) + {p_1}({r^2} + 2y_{norm}^2) + 2{p_2}{x_{norm}}{y_{norm}}} \end{array}} \right]$$
where radial distance $r = \sqrt {{x_{norm}}^2 + {y_{norm}}^2}$, $p_{dist}^{u1} = {(x_{dist}^{u1},y_{dist}^{v1})^T}$ are the real distorted image coordinates, k1, k2 and p1, p2 are radial and tangential distortion coefficients respectively.
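Equation (8) applies directly to normalized coordinates; a minimal sketch (the function name and test values are illustrative):

```python
def distort(x_norm, y_norm, k1, k2, p1, p2):
    """Second-order radial plus tangential distortion of Eq. (8),
    applied to normalized image coordinates."""
    r2 = x_norm**2 + y_norm**2
    radial = 1.0 + k1 * r2 + k2 * r2**2
    x_d = x_norm * radial + 2*p1*x_norm*y_norm + p2*(r2 + 2*x_norm**2)
    y_d = y_norm * radial + p1*(r2 + 2*y_norm**2) + 2*p2*x_norm*y_norm
    return x_d, y_d

# With all coefficients zero the mapping reduces to the identity.
print(distort(0.1, -0.2, 0.0, 0.0, 0.0, 0.0))  # (0.1, -0.2)
```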

Under the Scheimpflug condition, the Scheimpflug adapter tilts the projection lens by a rotation, resulting in perspective distortion, see Fig. 4. The rotation can be decomposed into two rotations around the axes xp and yp, by angles θx and θy. When the tilt angle between the projection plane and the lens plane is small (usually less than 6 degrees), the Scheimpflug angle can be added to the pinhole imaging model as an additional distortion coefficient to compensate for the tilt effect [24,25]. This tilt distortion can be modeled as:

$$\left[ \begin{array}{l} {p^p}\\ 1 \end{array} \right] = {K^p} \cdot \left[ {\begin{array}{c} {p_{dist}^{u2}}\\ 1 \end{array}} \right]$$
$$\left[ {\begin{array}{c} {p_{dist}^{u2}}\\ 1 \end{array}} \right] = H_p^s \cdot \left[ {\begin{array}{c} {p_{dist}^{u1}}\\ 1 \end{array}} \right]$$
where $p_{dist}^{u2} = {(x_{dist}^{u2},y_{dist}^{v2})^T}$ is the image coordinate after Scheimpflug condition, $H_p^s$ is the homography transformation between ideal projection DMD plane and Scheimpflug projection DMD plane. The corresponding projection point of the same object point on the two DMD planes is on the same line, there is a homography between the two DMD planes according to the projection geometry. The homography $H_p^s$ has the following QR decomposition:
$$\begin{array}{l} H_p^s = {K_h}(\theta_x,\theta_y) \cdot {R_h}(\theta_x,\theta_y)\\ {K_h}(\theta_x,\theta_y) = \left[ {\begin{array}{ccc} {\cos (\theta_y)\cos (\theta_x)}&0&{\sin (\theta_y)\cos (\theta_x)}\\ 0&{\cos (\theta_y)\cos (\theta_x)}&{ - \sin (\theta_x)}\\ 0&0&1 \end{array}} \right]\\ {R_h}(\theta_x,\theta_y)\textrm{ = }\left[ {\begin{array}{ccc} {\cos (\theta_y)}&{\sin (\theta_x)\sin (\theta_y)}&{ - \sin (\theta_y)\cos (\theta_x)}\\ 0&{\cos (\theta_x)}&{\sin (\theta_x)}\\ {\sin (\theta_y)}&{ - \cos (\theta_y)\sin (\theta_x)}&{\cos (\theta_y)\cos (\theta_x)} \end{array}} \right] \end{array}$$
where Rh(θx, θy) represents the rotation matrix between the two DMD planes, and Kh(θx, θy) represents the coordinate scale transformation caused by perspective projection between the two DMD planes. Equation (11) indicates that the homography from the ideal DMD plane to the Scheimpflug DMD plane can be decomposed into the product of two matrices: an upper triangular matrix and an orthogonal matrix.
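The decomposition in Eq. (11) can be assembled numerically. The sketch below (NumPy, illustrative) builds $H_p^s = K_h R_h$ from the two tilt angles; zero tilt reduces it to the identity.

```python
import numpy as np

def tilt_homography(theta_x, theta_y):
    """Scheimpflug homography H = K_h(tx, ty) @ R_h(tx, ty) of Eq. (11);
    angles are in radians."""
    cx, sx = np.cos(theta_x), np.sin(theta_x)
    cy, sy = np.cos(theta_y), np.sin(theta_y)
    K_h = np.array([[cy*cx, 0.0,   sy*cx],
                    [0.0,   cy*cx, -sx],
                    [0.0,   0.0,   1.0]])    # upper triangular scale part
    R_h = np.array([[cy,    sx*sy, -sy*cx],
                    [0.0,   cx,     sx],
                    [sy,   -cy*sx,  cy*cx]])  # orthogonal rotation part
    return K_h @ R_h

print(np.allclose(tilt_homography(0.0, 0.0), np.eye(3)))  # True
```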

The goal of projector calibration under Scheimpflug condition is to estimate the intrinsic matrix Kp, the extrinsic matrix Rp3 × 3 and Tp3 × 1, and the distortion vector (k1, k2, p1, p2, θx, θy). These coefficients can be obtained by minimizing the sum of the re-projection errors, using a nonlinear algorithm such as the Levenberg-Marquardt algorithm.

2.3 Multi-frequency phase-shifting method

In order to calibrate a projector, a “captured” image needs to be generated for the projector, since the projector cannot capture an image. This is achieved by mapping camera points to projected points using absolute phase. In Scheimpflug projection FP-3DM system, fringe distortion is quantified by its phase distribution, which is modulated by object topography. In this research, phase-shift profilometry is used to extract the phase distribution from the distorted fringes, as it provides the highest measurement resolution and accuracy and can eliminate the interference of ambient light and surface reflectivity. In phase-shift profilometry, N equally spaced phase-shifting sinusoidal intensity fringe patterns are sequentially projected onto the surface of the measured object. The distorted fringe distribution captured by telecentric imaging can be expressed as:

$${I_n}(x,y) = A(x,y) + B(x,y) \cdot \cos [\varphi (x,y) + 2\pi n/N]$$
where A(x, y) represents the average intensity related to fringe pattern brightness and background intensity, B(x, y) represents the modulated intensity related to pattern contrast and surface reflectivity, n is the phase-shifting index, and n = 0,1,2,…,N-1, and φ(x, y) is the corresponding wrapped phase map, which can be extracted by the following equation.
$$\varphi (x,y) = - \arctan \left[ {\frac{{\sum\nolimits_{n = 0}^{N - 1} {{I_n}(x,y) \cdot \sin (2\pi n/N)} }}{{\sum\nolimits_{n = 0}^{N - 1} {{I_n}(x,y) \cdot \cos (2\pi n/N)} }}} \right]$$
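Equation (13) takes only a few lines of NumPy; the check below uses a synthetic 4-step fringe stack with a known phase (all values illustrative).

```python
import numpy as np

def wrapped_phase(frames):
    """Wrapped phase map from N equally shifted fringe images (Eq. (13));
    frames has shape (N, H, W)."""
    N = frames.shape[0]
    n = np.arange(N).reshape(-1, 1, 1)
    num = np.sum(frames * np.sin(2*np.pi*n/N), axis=0)
    den = np.sum(frames * np.cos(2*np.pi*n/N), axis=0)
    return -np.arctan2(num, den)

# Build 4-step fringes with a known phase map and recover it.
phi = np.linspace(-1.0, 1.0, 8).reshape(1, 8)
frames = np.stack([0.5 + 0.4*np.cos(phi + 2*np.pi*n/4) for n in range(4)])
print(np.allclose(wrapped_phase(frames), phi))  # True
```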

Equation (13) produces a wrapped phase map in the range of [−π, +π]. In this paper, a multi-frequency (heterodyne) technique is used for phase unwrapping to eliminate the 2π discontinuity of the phase value. This method extends the discontinuous phase range to a synthetic wavelength at the beat frequency of two-frequency (fh and fl), the equivalent phase of which is generated by the difference of the two wrapped phase functions.

$${\varphi _{eq}}(x,y) = [{{\varphi_h}(x,y) - {\varphi_l}(x,y)} ]\ (\bmod\ 2\pi )$$
where φh(x,y) and φl(x,y) are the wrapped phase corresponding to high and low frequencies, respectively, and φeq(x,y) is the equivalent phase with the corresponding equivalent wavelength λeq.
$$\lambda_{eq} = \frac{{\lambda_l \cdot \lambda_h}}{{\lambda_l - \lambda_h}}$$

By taking the equivalent phase as the reference phase, the fringe order at the high-frequency wavelength can be determined, and then phase unwrapping can be performed:

$${k_h}(x,y) = Round\left[ {\frac{{({{{\lambda_{eq}} / {\lambda_h}}} )\cdot {\varphi_{eq}}(x,y) - {\varphi_h}(x,y)}}{{2\pi }}} \right]$$
$${\Phi _h}(x,y) = {\varphi _h}(x,y) + 2\pi \cdot {k_h}(x,y)$$
where Φh(x,y) is the unwrapped phase and Round[·] is the rounding operation. It should be noted that the two-frequency unwrapping method can be extended to three or more wavelengths, which allows the equivalent wavelength to be further increased. To ensure that the captured image contains only one phase period over the full horizontal and vertical field, and to reduce the phase calculation error, this research adopts the three-frequency heterodyne method for phase unwrapping.
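Equations (14)–(17) combine into a short two-frequency unwrapping routine; the sketch below verifies it on a synthetic phase ramp (the wavelengths are illustrative, not the fringe pitches used in the paper).

```python
import numpy as np

def heterodyne_unwrap(phi_h, phi_l, lam_h, lam_l):
    """Two-frequency heterodyne unwrapping, Eqs. (14)-(17): the beat
    phase of the two wrapped maps fixes the fringe order k_h."""
    phi_eq = np.mod(phi_h - phi_l, 2*np.pi)                       # Eq. (14)
    lam_eq = lam_l * lam_h / (lam_l - lam_h)                      # Eq. (15)
    k_h = np.round(((lam_eq/lam_h)*phi_eq - phi_h) / (2*np.pi))   # Eq. (16)
    return phi_h + 2*np.pi*k_h                                    # Eq. (17)

lam_h, lam_l = 18.0, 21.0                 # beat wavelength lam_eq = 126
x = np.linspace(0.0, 125.0, 60)
Phi_true = 2*np.pi*x/lam_h                # absolute phase to recover
wrap = lambda p: np.mod(p + np.pi, 2*np.pi) - np.pi
Phi_rec = heterodyne_unwrap(wrap(Phi_true), wrap(2*np.pi*x/lam_l),
                            lam_h, lam_l)
print(np.allclose(Phi_rec, Phi_true))  # True
```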

2.4 3D reconstruction algorithm

When the camera and projector are calibrated for the Scheimpflug condition, the 3D topography of the object can be reconstructed using a stereo vision algorithm. After the projection lens distortion and the Scheimpflug angle are calibrated, the microscopic 3D image can be reconstructed through the following equations (see Eqs. (3) and (5) for specific description).

$${\left[ {\begin{array}{ccc} {{u^c}}&{{v^c}}&1 \end{array}} \right]^T} = M_{3 \times 4}^c \cdot X$$
$${\left[ {\begin{array}{ccc} {{u^p}}&{{v^p}}&1 \end{array}} \right]^T} = M_{3 \times 4}^p \cdot X$$
where Mc3 × 4 and Mp3 × 4 are the projection matrix of telecentric imaging and projector respectively. Equations (18) and (19) can be rearranged as Q·X = 0, and Q is derived as:
$$Q{\kern 1pt} {\kern 1pt} {\kern 1pt} \textrm{ = }\left[ {\begin{array}{c} {{u^c}M_3^c - M_1^c}\\ {{v^c}M_3^c - M_2^c}\\ {{u^p}M_3^p - M_1^p}\\ {{v^p}M_3^p - M_2^p} \end{array}} \right]$$
where $M_i^c$ and $M_i^p$ are the ith row of $M_{3 \times 4}^c$ and $M_{3 \times 4}^p$ respectively.

The homonymous point pairs of telecentric imaging and projector are solved by the phase of projection fringe. Given the camera image coordinates (uc, vc) and the projector horizontal coordinates up, the 3D coordinates (Xw, Yw, Zw)T of the object can be calculated by:

$$\left[ {\begin{array}{c} {X_{w}}\\ {Y_{w}}\\ {Z_{w}} \end{array}} \right]\textrm{ = }{\left[ {\begin{array}{ccc} {m_{11}^c - {u^c}m_{31}^c}&{m_{12}^c - {u^c}m_{32}^c}&{m_{13}^c - {u^c}m_{33}^c}\\ {m_{21}^c - {v^c}m_{31}^c}&{m_{22}^c - {v^c}m_{32}^c}&{m_{23}^c - {v^c}m_{33}^c}\\ {m_{11}^p - {u^p}m_{31}^p}&{m_{12}^p - {u^p}m_{32}^p}&{m_{13}^p - {u^p}m_{33}^p} \end{array}} \right]^{ - 1}} \cdot \left[ {\begin{array}{c} {{u^c}m_{34}^c - m_{14}^c}\\ {{v^c}m_{34}^c - m_{24}^c}\\ {{u^p}m_{34}^p - m_{14}^p} \end{array}} \right]$$
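Equation (21) is a 3×3 linear solve per pixel; a minimal sketch with toy projection matrices (not calibrated values):

```python
import numpy as np

def triangulate(u_c, v_c, u_p, M_c, M_p):
    """Recover (Xw, Yw, Zw) from a camera pixel (u_c, v_c) and the
    matching projector column u_p, per Eq. (21)."""
    A = np.array([M_c[0, :3] - u_c*M_c[2, :3],
                  M_c[1, :3] - v_c*M_c[2, :3],
                  M_p[0, :3] - u_p*M_p[2, :3]])
    b = np.array([u_c*M_c[2, 3] - M_c[0, 3],
                  v_c*M_c[2, 3] - M_c[1, 3],
                  u_p*M_p[2, 3] - M_p[0, 3]])
    return np.linalg.solve(A, b)

M_c = np.array([[1.0, 0, 0, 0], [0, 1, 0, 0], [0, 0, 0, 1]])  # toy telecentric
M_p = np.array([[1.0, 0, 1, 0], [0, 1, 0, 0], [0, 0, 1, 1]])  # toy projector
# The point (2, 3, 4) images at (u_c, v_c) = (2, 3) and u_p = 6/5.
print(triangulate(2.0, 3.0, 1.2, M_c, M_p))  # ≈ [2. 3. 4.]
```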

3. Calibration procedures

The traditional calibration method for a monocular SL system is to calibrate the camera and projector separately, and then take an arbitrary point as the coordinate origin for 3D reconstruction [25]. However, it is difficult to calibrate the Scheimpflug projection FP-3DM system with the traditional monocular SL calibration method. Firstly, it is difficult to calibrate the telecentric camera directly and accurately; in particular, inaccuracy of the extrinsic matrix parameters leads to large 3D reconstruction errors. Secondly, due to the limited DOF of the telecentric camera, the intrinsic and extrinsic parameters are naturally coupled and difficult to separate directly. In order to achieve high-precision 3D measurement and overcome the limitations of state-of-the-art system calibration, this research proposes a joint calibration method for the Scheimpflug projection FP-3DM system, which makes full use of the established pinhole imaging model and Scheimpflug distortion model to calibrate the telecentric imaging. The calibration method rests on two basic assumptions: first, the distortion of the telecentric lens is small (less than 0.1%); second, the Scheimpflug angles (θx, θy) are small (less than 5 degrees). These two assumptions are easy to satisfy in practice. The joint calibration framework consists of nine main steps, which are described in detail next.

Step 1: Image acquisition and preprocessing.

In this research, an asymmetric black-and-white circle calibration plate, see Fig. 5(a), is used as the calibration target. The target is placed in different spatial orientations. A set of images for each target pose is captured through the telecentric imaging branch, consisting of a white-light projection, horizontal fringe patterns and vertical fringe patterns, see Figs. 5(b)-(d). The white-light projection is used to extract the circle centers $q_{ij}^c{(u_{ij}^c,v_{ij}^c)^T}$ captured by the camera as feature points, see Fig. 5(e); i and j denote the ith spatial pose and the jth feature point of the calibration target, respectively. The horizontal and vertical fringe patterns are used to solve for the horizontal and vertical absolute phases at the circle centers. The captured images are often noisy, so it is necessary to preprocess them before solving the phase. In this research, a Gaussian filter is used to remove high-frequency and spurious noise.


Fig. 5. Illustration of calibration process. (a) calibration target; (b) captured image with circle centers; (c) captured image with horizontal pattern projection; (d) captured image with vertical pattern projection; (e) captured circle centers with telecentric imaging; (f) mapped circle center image for projector.


Step 2: Circle center unwrapped phase retrieval.

In step 1, the sub-pixel position coordinates of the circle centers $q_{ij}^c{(u_{ij}^c,v_{ij}^c)^T}$ are obtained. Using the multi-frequency phase-shifting method described in Section 2.3, the horizontal and vertical absolute phase values of the integer pixels in the neighborhood of each circle center can be solved (k × k, where k is the pixel range of the selected neighborhood).

Step 3: Projector circle center measurement.

Once the absolute phase values at the sub-pixel coordinates of a circle center are known, the unique projector circle center $p_{ij}^p{(u_{ij}^p,v_{ij}^p)^T}$ corresponding to the camera circle center can be obtained.

$$u_{ij}^p = \frac{{\Phi v(u_{ij}^c,v_{ij}^c) \cdot W_p}}{{2\pi \cdot n_v}}$$
$$v_{ij}^p = \frac{{\Phi h(u_{ij}^c,v_{ij}^c) \cdot H_p}}{{2\pi \cdot n_h}}$$
where $\Phi v(u_{ij}^c,v_{ij}^c)$ and $\Phi h(u_{ij}^c,v_{ij}^c)$ represent the vertical and horizontal absolute phase value of the circle center $q_{ij}^c{(u_{ij}^c,v_{ij}^c)^T}$ respectively, nv and nh represent the fringe number of the projected horizontal and vertical patterns respectively, and Hp × Wp is the projector resolution.

However, phase decoding is only accurate at the pixel level, and decoding a single pixel may introduce errors. Inspired by [26], the local homography method is adopted in this paper, which is valid only in the neighborhood of the circle center. Specifically, the absolute phase values of the integer pixels within the (k × k) neighborhood of the circle center obtained in step 2 are first converted to projector coordinates through Eqs. (22) and (23), and a local mapping from camera coordinates to projector coordinates is then constructed. As shown in Fig. 6, let $q_{ijk}^c{(u_{ijk}^c,v_{ijk}^c)^T}$ be the integer-pixel image coordinates near the circle center, and $p_{ijk}^p{(u_{ijk}^p,v_{ijk}^p)^T}$ be the decoded projector pixel at that point; find the homography ${\overline H _{ij}}$ that minimizes:

$${\overline H _{ij}} = \mathop {\arg \min }\limits_{{H_{ij}}} \sum\limits_{\forall k} {{{||{p_{ijk}^p - {H_{ij}} \cdot q_{ijk}^c} ||}^2}}$$

Applying a local homography matrix ${\overline H _{ij}}$, the camera circle center coordinates $q_{ij}^c{(u_{ij}^c,v_{ij}^c)^T}$ is converted to projector coordinates $p_{ij}^p{(u_{ij}^p,v_{ij}^p)^T}$.

$$\left[ {\begin{array}{c} {p_{ij}^p}\\ 1 \end{array}} \right] = {\overline H _{ij}} \cdot \left[ {\begin{array}{c} {q_{ij}^c}\\ 1 \end{array}} \right]$$
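The fit of Eq. (24) and the transfer of Eq. (25) amount to a small direct-linear-transform problem per circle center; the sketch below checks it on a synthetic (k × k) neighborhood whose projector decodings follow a known map (all values illustrative).

```python
import numpy as np

def fit_local_homography(cam_pts, proj_pts):
    """Least-squares homography taking camera pixels to decoded
    projector pixels (Eq. (24)), via the standard DLT system."""
    A = []
    for (u, v), (up, vp) in zip(cam_pts, proj_pts):
        A.append([u, v, 1, 0, 0, 0, -up*u, -up*v, -up])
        A.append([0, 0, 0, u, v, 1, -vp*u, -vp*v, -vp])
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    return Vt[-1].reshape(3, 3)        # null vector holds the homography

def map_point(H, q):
    """Transfer the sub-pixel circle centre through H (Eq. (25))."""
    p = H @ np.array([q[0], q[1], 1.0])
    return p[:2] / p[2]

# 3x3 neighbourhood whose projector pixels follow a known affine map.
cam = [(float(u), float(v)) for u in range(3) for v in range(3)]
proj = [(2.0*u + 5.0, 2.0*v - 1.0) for u, v in cam]
H = fit_local_homography(cam, proj)
print(np.round(map_point(H, (1.5, 1.5)), 6))  # ≈ [8. 2.]
```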


Fig. 6. Schematic diagram of local homography transformation. (First, the local mapping from camera coordinates to projector coordinates is constructed using the absolute phases of the integer pixels near the circle center; the camera circle center coordinates are then converted to projector coordinates).


Figure 5(f) shows the mapped circle center image for the projector. As shown in Fig. 5(f), through the local homography method, the feature points of the calibration target captured by the camera can be accurately mapped to the corresponding projector circle center image, reducing the calibration error of the projector. In addition, the local homography method allows modeling of the nonlinear errors caused by camera distortion.

Step 4: Projector intrinsic matrix calibration.

Using the projector circle centers $p_{ij}^p{(u_{ij}^p,v_{ij}^p)^T}$ in different poses extracted in the previous step, a closed-form solution of the projector's intrinsic matrix Kp (i.e. fx, fy, up0, vp0) can be estimated using the camera calibration method of Zhang [27], and then the distortion coefficients and Scheimpflug angles (k1, k2, p1, p2, θx, θy) are iteratively solved by the Levenberg-Marquardt algorithm.

$$\mathop {\min }\limits_{{K^p},R_i^p,t_i^p,\overrightarrow k } {\sum\limits_{i = 1}^n {\sum\limits_{j = 1}^m {{\bigg \|}{p_{ij}^p - \widehat p_{ij}^p({K^p},R_i^p,t_i^p,\overrightarrow k )} {\bigg \|}} } ^2}$$
where $\widehat p_{ij}^p({K^p},R_i^p,t_i^p,\overrightarrow k )$ is the projection of feature point in image i according to Eqs. (5)–(11), $\overrightarrow k = {[{{k_1},{k_2},{p_1},{p_2},{\theta_x},{\theta_y}} ]^T}$ is the vector of projector distortion coefficients. n represents the number of different poses, and m represents the number of feature points on the calibration target. The initial guess of k1, k2, p1 and p2 can be simply set to zero, and the initial guess of θx and θy can be set to the theoretical design value. So far, we have obtained the precise intrinsic matrix Kp and all distortion coefficients of the projector, as well as the initial values of the extrinsic matrix.

Step 5: Projector distortion and Scheimpflug angle correction.

Because the distortion and manufacturing error of the projection lens are larger than those of the telecentric lens, and the system configuration adopts Scheimpflug projection, it is necessary to correct the distortion and Scheimpflug angle of the projector. Applying the distortion coefficients and Scheimpflug angles obtained in step 4 to Eqs. (5)–(11), the projector circle centers $p_{ij}^p{(u_{ij}^p,v_{ij}^p)^T}$ can be easily corrected and recorded as $p_{ij}^{p.corr}{(u_{ij}^{p.corr},v_{ij}^{p.corr})^T}$.

Step 6: Projector extrinsic matrix calibration.

In order to obtain a more accurate extrinsic matrix and complete the high-precision 3D coordinate estimation of the feature points in the key Step 7, this research adopts the perspective-n-point (PnP) method to optimize the extrinsic parameters $[{R_i^p|t_i^p} ]$ for each pose. Essentially, this optimization iteratively minimizes the difference between the observed and the calculated projection points, which can be expressed as the following function:

$$\mathop {\min }\limits_{[{R_i^p|t_i^p} ]} {\sum\limits_{i = 1}^n {\sum\limits_{j = 1}^m {{\bigg \|}{{{(u_{ij}^{p.corr},v_{ij}^{p.corr},1)}^T} - {K^p}[{R_i^p|t_i^p} ]\cdot {{(x_{ij}^w,y_{ij}^w,z_{ij}^w,1)}^T}} {\bigg \|}} } ^2}$$
where ||·|| denotes the Euclidean distance, and $p_{ij}^{p.corr}{(u_{ij}^{p.corr},v_{ij}^{p.corr})^T}$ is the projector circle-center coordinate after distortion correction. The initial value of $[{R_i^p|t_i^p} ]$ is the result obtained in Step 4.
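For a planar target such as the circle grid used here, the closed-form pose that typically initializes an iterative PnP solver can be obtained from a homography (H ~ K[r1 r2 t] for a Z = 0 plane). The sketch below is this standard textbook construction under noise-free data, not the authors' code; the iterative minimization of Eq. (28) would start from such an estimate.

```python
import numpy as np

def homography(src, dst):
    """DLT homography: dst ~ H @ src (both given as (N, 2) arrays)."""
    rows = []
    for (x, y), (u, v) in zip(np.asarray(src, float), np.asarray(dst, float)):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, Vt = np.linalg.svd(np.asarray(rows))
    return Vt[-1].reshape(3, 3)

def planar_pnp(K, obj_xy, img_uv):
    """Closed-form [R|t] for a Z=0 planar target from H ~ K [r1 r2 t]."""
    B = np.linalg.inv(K) @ homography(obj_xy, img_uv)
    lam = 1.0 / np.linalg.norm(B[:, 0])
    if lam * B[2, 2] < 0:          # choose the sign that puts the target
        lam = -lam                 # in front of the camera (t_z > 0)
    r1, r2, t = lam * B[:, 0], lam * B[:, 1], lam * B[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    U, _, Vt = np.linalg.svd(R)    # project onto the nearest rotation matrix
    return U @ Vt, t
```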

Step 7: 3D coordinate estimation of feature points.

Using the projector extrinsic matrices obtained in the previous step for the different poses, the world coordinates of the calibration-target circle centers ${(x_{ij}^w,y_{ij}^w,z_{ij}^w)^T}$ are aligned with the projector coordinate system as ${(x_{ij}^p,y_{ij}^p,z_{ij}^p)^T}$, after which the projector extrinsic matrix $[{R_i^p|t_i^p} ]$ becomes [I3 × 3 | 03 × 1] (I3 × 3 is the identity matrix, and 03 × 1 is the zero vector). In other words, the measured data are expressed in the projector coordinate system rather than the world coordinate system, which is difficult to determine in practice.

$${\textrm{(}x_{ij}^p\textrm{,}y_{ij}^p\textrm{,}z_{ij}^p\textrm{,}1\textrm{)}^T}\textrm{ = }[{R_i^p|t_i^p} ]\cdot {\textrm{(}x_{ij}^w\textrm{,}y_{ij}^w\textrm{,}z_{ij}^w\textrm{,}1\textrm{)}^T}$$
Step 8: Calculate telecentric projection matrix.

Once the 3D coordinates of the circle centers on each target pose are determined, the orthographic projection model of the telecentric camera (Eq. (3)) can be rewritten via the direct linear transform (DLT) as

$$A \cdot L = \left[ {\begin{array}{c} {u_{ij}^c}\\ {v_{ij}^c} \end{array}} \right],\qquad A = \left[ {\begin{array}{ccccccccccc} {x_{ij}^p}&{y_{ij}^p}&{z_{ij}^p}&1&0&0&0&0&{ - u_{ij}^cx_{ij}^p}&{ - u_{ij}^cy_{ij}^p}&{ - u_{ij}^cz_{ij}^p}\\ 0&0&0&0&{x_{ij}^p}&{y_{ij}^p}&{z_{ij}^p}&1&{ - v_{ij}^cx_{ij}^p}&{ - v_{ij}^cy_{ij}^p}&{ - v_{ij}^cz_{ij}^p} \end{array}} \right]$$
$$L = {\left[ {\begin{array}{ccccccccccc} {m_{11}^c}&{m_{12}^c}&{m_{13}^c}&{m_{14}^c}&{m_{21}^c}&{m_{22}^c}&{m_{23}^c}&{m_{24}^c}&{m_{31}^c}&{m_{32}^c}&{m_{33}^c} \end{array}} \right]^T}_{1 \times 11}$$
where $q_{ij}^c{(u_{ij}^c,v_{ij}^c)^T}$ is the circle center captured by the camera in Step 1. Substituting all the circle centers into Eq. (29), the least-squares solution of L1 × 11 can be obtained with singular value decomposition (SVD). Thus, high-precision calibration of the telecentric imaging projection matrix is realized by using the precisely calibrated Scheimpflug projector. It is worth noting that the telecentric imaging projection matrix solved here contains the precise positional relationship between the Scheimpflug projector and the camera.
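The DLT of Eq. (29) can be sketched directly in NumPy: two rows per circle center, 11 unknowns with m34 fixed to 1, solved in the least-squares sense. `lstsq` is used below, which is numerically equivalent to the SVD-based solution described in the text; the function name is illustrative.

```python
import numpy as np

def fit_projection_matrix(pts3d, pts2d):
    """Least-squares solve of the DLT system A @ L = [u; v] (Eq. (29)).

    pts3d: (N, 3) points in the projector frame; pts2d: (N, 2) camera pixels.
    Returns the 3x4 projection matrix M with m34 normalized to 1.
    """
    rows, rhs = [], []
    for (x, y, z), (u, v) in zip(pts3d, pts2d):
        rows.append([x, y, z, 1, 0, 0, 0, 0, -u * x, -u * y, -u * z])
        rows.append([0, 0, 0, 0, x, y, z, 1, -v * x, -v * y, -v * z])
        rhs += [u, v]
    L, *_ = np.linalg.lstsq(np.asarray(rows), np.asarray(rhs), rcond=None)
    return np.vstack([L[0:4], L[4:8], np.append(L[8:11], 1.0)])
```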

Step 9: Camera-projector joint calibration.

The purpose of camera-projector joint calibration is to improve the calibration accuracy by simultaneously minimizing the re-projection errors of the telecentric imaging and the Scheimpflug projection. Joint calibration is carried out by:

$$\mathop {\min }\limits_{{K^p},\,R_i^p,\,t_i^p,\,\overrightarrow k ,\,{M^c}} \sum\limits_{i = 1}^n {\sum\limits_{j = 1}^m {\left( {{{\bigg \|}{p_{ij}^p - \widehat p_{ij}^p({K^p},R_i^p,t_i^p,\overrightarrow k )} {\bigg \|}}^2 + {{\bigg \|}{q_{ij}^c - \widehat q_{ij}^c({M^c})} {\bigg \|}}^2} \right)} }$$
where $\widehat q_{ij}^c({M^c})$ is the projection of the feature point in image i according to Eq. (3). Equation (30) can be solved with the Levenberg-Marquardt nonlinear optimization algorithm, with all initial values obtained from the previous steps.

After the above system calibration process, combining the imaging model and calibration results, Eq. (21) can be further deduced as

$$\left[ {\begin{array}{c} {{x^p}}\\ {{y^p}}\\ {{z^p}} \end{array}} \right]\textrm{ = }{\left[ {\begin{array}{ccc} {m_{11}^c}&{m_{12}^c}&{m_{13}^c}\\ {m_{21}^c}&{m_{22}^c}&{m_{23}^c}\\ {{f_x}}&0&{u_0^p - {u^p}} \end{array}} \right]^{ - 1}} \cdot \left[ {\begin{array}{c} {{u^c} - m_{14}^c}\\ {{v^c} - m_{24}^c}\\ 0 \end{array}} \right]$$
where ${(x^p,y^p,z^p)^T}$ is the reconstructed 3D coordinate in the projector coordinate system, ${(u^c,v^c)^T}$ is the camera coordinate of the measured object, and ${(u^p,v^p)^T}$ is the projector coordinate corresponding to ${(u^c,v^c)^T}$. It can be seen from Eq. (31) that almost all of the telecentric projection matrix parameters $[m_{ij}^c] = M_{3 \times 4}^c$ are used in the reconstruction, indicating that they are critical to the reconstruction accuracy. Compared with the traditional 3D reconstruction (Eq. (21)), the calculation of Eq. (31) is more concise and efficient.
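Equation (31) reduces per-pixel reconstruction to a single 3×3 linear solve. A minimal sketch, assuming M holds the first two rows of the calibrated telecentric projection matrix and (fx, u0p) are the projector intrinsics from Table 1-style results:

```python
import numpy as np

def reconstruct_point(M, fx, u0p, uc, vc, up):
    """Recover (x, y, z) in the projector frame from one camera pixel
    (uc, vc) and its phase-matched projector column up, per Eq. (31).

    M: (2, 4) array with rows (m11..m14) and (m21..m24).
    """
    A = np.array([[M[0, 0], M[0, 1], M[0, 2]],
                  [M[1, 0], M[1, 1], M[1, 2]],
                  [fx,      0.0,     u0p - up]])
    b = np.array([uc - M[0, 3], vc - M[1, 3], 0.0])
    return np.linalg.solve(A, b)
```

The third row encodes the projector's pinhole constraint fx·x = (up − u0p)·z, which is what supplies depth to the otherwise depth-blind orthographic camera rows.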

4. System configuration and experiment

4.1 System configuration

To validate the proposed system model and calibration method, a Scheimpflug projection FP-3DM prototype was built on a specially designed framework, as shown in Fig. 7. The test system adopts a high-resolution camera (BC-SM24M6X4, resolution 5312 × 4608) with a telecentric lens (MML0275-SR130VI-18C, magnification 0.275, working distance 130 mm, depth of field 15 mm) as the imaging branch, and a projection module (DLP4710, resolution 1920 × 1280) with a lens of vertical magnification 0.057× as the projection branch. According to Section 2.2, the tilt angle β of the object plane is designed as 30°, and the corresponding tilt angle θy of the projection lens is about 1.88°. The final imaging FOV is about 50 mm × 40 mm, and the projected FOV is 80 mm × 130 mm.


Fig. 7. Scheimpflug projection FP-3DM prototype test system: the test system consists of a high-resolution camera, a telecentric lens, a Scheimpflug projector and a precision vertical translation stage.


The pattern of the calibration target is a white 9 × 11 asymmetric circle array evenly distributed on a black background (the circle diameter is 2 mm, and the distance between adjacent circle centers is 5 mm), as shown in Fig. 8(a). A precision vertical translation stage (STANDA 8MVT40-13, full-step resolution 0.08 µm, travel range 13 mm) is used for quantitative and precise translation. Plastic printed fonts and a 50-cent coin are chosen as test samples, shown in Figs. 8(b) and 8(c). The standard ceramic planes of different heights shown in Fig. 8(d) are used to evaluate the accuracy of 3D imaging. Considering the camera parameters and experimental conditions of this test, normal Gaussian filtering with a kernel size of 5 × 5 is selected for image preprocessing (the standard deviation in both the horizontal and vertical directions is set to 1). In the test, three-frequency heterodyne patterns (Tv = [128 123 119], Th = [72 67 63]) and six-step phase shifting are chosen.
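The wrapped phase recovered from the six-step patterns follows the standard N-step least-squares formula; the heterodyne unwrapping across the three frequencies is omitted here. This is a generic implementation of N-step phase shifting, not code from the paper.

```python
import numpy as np

def wrapped_phase(images):
    """Wrapped phase from N equally spaced phase-shifting images
    I_n = A + B*cos(phi + 2*pi*n/N), using the least-squares formula
    phi = atan2(-sum I_n sin(d_n), sum I_n cos(d_n))."""
    I = np.asarray(images, float)
    d = 2 * np.pi * np.arange(I.shape[0]) / I.shape[0]
    s = np.tensordot(np.sin(d), I, axes=1)   # sum over the N frames
    c = np.tensordot(np.cos(d), I, axes=1)
    return np.arctan2(-s, c)
```

With the fringe periods above, each frequency pair yields a longer beat period (e.g. 128·123/(128−123) for the first vertical pair), which is how the heterodyne step extends the unambiguous range.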


Fig. 8. Components for experiments. (a) Calibration target; (b) Plastic printed fonts; (c) 50 cent-coins; (d) Standard ceramic plate.


4.2 Calibration experiment

Firstly, in order to verify the focused-projection characteristics of the proposed Scheimpflug projection FP-3DM prototype, the fringe patterns of Scheimpflug projection (see Fig. 9(a)) and forward projection (see Fig. 9(b)) are compared under the same experimental conditions. To observe the difference between the two schemes more directly, the experiment adopts binary fringe projection. The comparison of Figs. 9(a) and 9(b) shows that the fringe edges of the captured forward projection image become increasingly blurred, because the projection focus area is not perpendicular to the optical axis of the camera. In contrast, the captured Scheimpflug projection image has clear fringe edges, and the overall contrast and quality of the image are better than those of the forward projection. Figures 9(c), 9(d) and 9(e) are profile line plots of a binary fringe period selected from the same three positions (A, B, C) in Figs. 9(a) and 9(b) respectively. It can be seen that as the vertical coordinate of the captured image increases, the deformation of the forward projection fringe profile becomes more severe, indicating that the projected fringes become increasingly defocused.


Fig. 9. Comparison of image quality between Scheimpflug projection and forward projection. (a) captured Scheimpflug projection image; (b) captured forward projection image; (c)-(e) profile line comparison plots of a fringe period selected from the same three positions (A, B, C) in Figs. 9(a) and 9(b) respectively.


It is worth noting that the projected binary fringes are not standard rectangles after being captured by the camera. This is because the binary fringes projected by the DMD are smoothed and filtered by the optical system composed of the projection lens and the imaging lens, as well as by the camera's high resolution.

To further verify the common focus area and the reconstructed DOF range of the proposed Scheimpflug projection FP-3DM prototype, a white ceramic plane parallel to the X-Y plane was translated 50 times along the Z-axis, by 200 µm each time, as shown in Fig. 10(a). A sinusoidal fringe image was projected onto the common focus plane and captured by the camera, as shown in Fig. 10(b). Since the projected sinusoidal fringes pass through the projection lens and the telecentric lens sequentially, the contrast of the captured image is affected by the degree of focus of both the projection branch and the imaging branch, and can therefore serve as a marker of the common focus area. Because sinusoidal grating fringes are projected, RMS contrast is used in this paper. Within the FOV, five typical location areas (A–E) were selected as contrast calculation areas, as shown in Fig. 10(c). To reduce the influence of the uneven illuminance distribution of oblique projection on the RMS contrast calculation, the location areas are limited to one fringe period within a 100 × 100 pixel range. Figure 11 compares the RMS contrast curves of the five location areas. The RMS contrast of the five areas (A–E) varies sinusoidally with translation distance, because the projected sinusoidal fringes move relative to the common focus plane as it is translated. However, across the translation distances, the contrast curves of the five areas fluctuate within an essentially constant range. The contrast averages of areas A–E are 0.3075, 0.3114, 0.3077, 0.3082 and 0.3109, and the standard deviations are 0.0246, 0.0247, 0.0251, 0.0246 and 0.0245, respectively. This means that the DOFs of the imaging branch and the projection branch overlap, and that the common focus area is a planar volume parallel to the X-Y plane. Meanwhile, it also indicates that the measurement system can obtain a reconstruction DOF range greater than 10 mm, so the telecentric lens DOF can be effectively utilized. This is consistent with the design target of the system.
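The RMS contrast metric used above can be sketched as follows, taking one common definition (standard deviation of intensity normalized by the mean over the patch); the paper does not state its exact formula, so this is an assumption.

```python
import numpy as np

def rms_contrast(region):
    """RMS contrast of an image patch: std of intensity over its mean,
    computed over a window of about one fringe period."""
    region = np.asarray(region, float)
    return region.std() / region.mean()
```

For a sinusoidal patch I = A + B·cos(·) sampled over whole periods, this evaluates to B/(A·√2), so the reported averages near 0.31 correspond to a fixed modulation-to-background ratio across the translation range.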


Fig. 10. Schematic diagram of the experiment in the common focal area. (a) The common focus plane is translated 50 times along the Z-axis (200µm/time); (b) Projected stripes and captured fringe images; (c) Five selected location areas within the field of view.



Fig. 11. RMS Contrast Curve. (To reduce the influence of uneven illuminance distribution of oblique projection, the local area of RMS contrast calculation is limited to one fringe period).


It is worth noting that the telecentricity of the telecentric imaging is limited (the incident light is not completely parallel to the optical axis within the DOF) due to manufacturing errors, and since the imaging DOF and the projected DOF are also limited, the imaging and projection resolution degrade outside the perfect focus plane. In addition, the exact Scheimpflug condition theoretically cannot be satisfied in the common focal area outside the perfect focus plane. Based on the above analysis, in order to overlap the telecentric imaging DOF with the Scheimpflug projection DOF and ensure accurate 3D imaging, this paper gives two reference suggestions for the design of a Scheimpflug projection FP-3DM system: 1. the telecentric imaging resolution should be greater than the projector resolution; 2. the projector DOF ${\Delta ^p}$ should be greater than the telecentric imaging DOF ${\Delta ^c}$ and satisfy the following empirical equation.

$${k_1} \cdot {\Delta ^p} \cdot \cos (\beta ) = {k_2} \cdot {\Delta ^c}$$
where k1 and k2 are empirical parameters; k1 is recommended to be 0.3–0.5, and k2 to be 0.5–0.8.
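As a quick numeric check of Eq. (32), the sketch below solves it for the projector DOF required by a given telecentric DOF; the default k1 = 0.4 and k2 = 0.65 are mid-range values picked from the recommended intervals, not values stated in the paper.

```python
import math

def required_projector_dof(dof_cam, beta_deg, k1=0.4, k2=0.65):
    """Projector DOF needed so that k1 * Dp * cos(beta) = k2 * Dc."""
    return k2 * dof_cam / (k1 * math.cos(math.radians(beta_deg)))
```

With the telecentric DOF of 15 mm and β = 30° used in this system, the rule suggests a projector DOF of roughly 28 mm, comfortably larger than the imaging DOF as suggestion 2 requires.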

In order to verify the accuracy of the joint calibration method, the calibration target shown in Fig. 8(a) was used to collect images in 12 different poses, and the system was calibrated following the procedure in Section 3. Figure 12 shows the re-projection errors at all circle centers for the camera and the projector. Figure 13 shows the overall re-projection error of the calibration target images for the 12 poses. Table 1 lists the intrinsic parameters and distortion coefficients of the projector calibration, and Table 2 lists the projection matrix parameters of the telecentric imaging. As seen in Fig. 12, the re-projection errors of all circle centers in the X and Y directions are approximately within ±1.2 µm for the telecentric imaging and ±0.2 µm for the projector calibration. The re-projection errors of both are distributed in ring-shaped rays, with the error points denser toward the center. Meanwhile, the standard deviations (STDs) of the re-projection error over all circle centers are 0.328 µm and 0.035 µm respectively, and the STDs in the X and Y directions are (0.554 µm, 0.406 µm) and (0.044 µm, 0.054 µm) respectively. The average re-projection errors in the X and Y directions are close to zero. From Fig. 13, the overall re-projection errors of the telecentric imaging and the projector are within 0.9 µm and 0.15 µm respectively, and the overall mean errors over the 12 calibration poses are 0.684 µm and 0.138 µm respectively. The calibrated Scheimpflug angles θx and θy are -0.401° and 1.626° respectively (see Table 1). Compared with the design values (θx = 0°, θy = 1.88°), the deviations δθx and δθy are -0.401° and 0.254° respectively, which include the assembly error and the numerical iterative error. Considering the resolution of the CMOS and DMD chips, the joint calibration method provides accurate calibration results, and it also shows that the model adequately describes the imaging of the camera and the projector.


Fig. 12. Re-projection errors of all circle center points. (a) Camera re-projection error; (b) Projector re-projection error. (The STD values of all points are 0.328µm and 0.035µm respectively, and the STD values in the X and Y directions of the telecentric imaging and projector calibration are (0.554µm, 0.406µm) and (0.044µm, 0.054µm), respectively).



Fig. 13. Overall re-projection error of the calibration target image under different poses. (a) Camera re-projection error; (b) Projector re-projection error. (The overall re-projection errors of telecentric imaging and projector are in the range of 0.9µm and 0.15µm, respectively, and the overall mean errors of the 12 different pose calibrations are 0.684µm and 0.138µm, respectively).



Table 1. The intrinsic parameters and distortion coefficients for projector calibration.


Table 2. The projection matrix parameters for telecentric imaging.

It is worth noting that the projector calibration is more accurate than the telecentric imaging calibration. This may be explained by the following: 1. since the pixel size of the camera is smaller than that of the projector, the camera provides accurate feature coordinates for the projector calibration; 2. mapping coupling errors that lie outside the optimization; 3. telecentricity errors caused by lens distortion; 4. manufacturing errors of the target and extraction errors of the center feature points.

4.3 3D reconstruction

In 3D reconstruction, the calibration target is first treated as a precise spatial plane to verify the measurement accuracy. The lengths of the two diagonals AC and BD of the calibration target, formed by the circle centers at the corners (see Fig. 8(a)), were measured for the 12 different poses. The circles of the calibration target are precisely made, and the distance between adjacent circle centers d is 5 ± 0.0010 mm. Therefore, the actual length of AC and BD can be expressed as 9d/√2, i.e. 31.8198 mm. Firstly, by reconstructing the 3D geometry of each target pose, the 3D point clouds of the four white circles at the corners were extracted (see Fig. 14(a)). Then, since the calibration target has known circular features and the 3D coordinates of the circle centers cannot be read directly from the 3D point cloud, the normal vectors of the four circular planes and the 3D coordinates of the four center points (i.e. A, B, C and D) were obtained through least-squares circle fitting (see Fig. 14(b)). Finally, the Euclidean distances AC and BD were calculated and compared with the actual value. The results are shown in Table 3: over the different target poses, the average lengths of AC and BD are 31.8199 mm and 31.8191 mm respectively, the average error is less than 4 µm, and the maximum error does not exceed 8 µm. Considering the length of the measured diagonal, the percentage error is quite small (about 0.026%), which is comparable to the manufacturing uncertainty. The major error source is likely center extraction. In addition, the angles between the normal vector of circle plane A and those of circle planes B, C and D are 178.9°, 178.7° and 179.1° respectively, which indicates that the calibration method provides low uncertainty in 3D plane measurement.


Fig. 14. Examples of two diagonal measurements of a calibration target. (a) 3D point cloud of the four circles at the corners; (b) Normal vectors of four measuring circles and 3D coordinates of the circle center.



Table 3. Measurement result of two diagonals on calibration target (in mm).

To further check the step-height accuracy and the uncertainty of 3D plane measurement, different standard ceramic planes were measured (the standard values from a coordinate measuring machine (CMM) are 0 mm, 0.492 mm, 0.948 mm, 1.363 mm and 1.848 mm, with the 0 mm ceramic plane used as the reference plane). Figure 15(a) shows the 3D point clouds of the different ceramic planes, and Fig. 15(b) shows their 3D reconstruction and fitting results. First, the 3D geometry of each ceramic plane is obtained by reconstruction (see Fig. 15(a)). Then, rectangular regions of the same size are selected on each 3D plane (see Fig. 15(b)), and the normal vectors of the five planes (A–E) are obtained through least-squares plane fitting. Finally, the average distance between all points in each rectangular area and the reference plane is calculated and compared with the standard value. From Fig. 15(b), the normal vectors of the fitted planes A–E are essentially parallel to the Z-axis of the local coordinate system, which further indicates that the calibration method reduces the uncertainty of 3D plane measurement. Table 4 summarizes the step-height results. The maximum error does not exceed 7 µm, which is small given the random noise level. The major error sources may be the roughness of the tested ceramic surface and random camera noise. The results show that the calibration method provides high precision for step-height measurement.
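The plane fitting and step-height evaluation described above can be sketched with an SVD-based least-squares plane fit; the function names and the use of the centroid as the plane anchor are illustrative choices, not the authors' code.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane through (N, 3) points: returns the unit normal
    and a point on the plane (the centroid), via SVD of the centered cloud."""
    P = np.asarray(points, float)
    c = P.mean(axis=0)
    _, _, Vt = np.linalg.svd(P - c)
    return Vt[-1], c  # right-singular vector of the smallest singular value

def step_height(plane_points, ref_points):
    """Mean unsigned distance of one patch to the fitted reference plane."""
    n, c = fit_plane(ref_points)
    d = (np.asarray(plane_points, float) - c) @ n
    return abs(d.mean())
```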


Fig. 15. 3D reconstruction of ceramic planes with different step height. (a) The 3D point cloud of different standard ceramic planes; (b) The 3D reconstruction and plane fitting results of different ceramic planes. (In order to display the measurement plane more intuitively, the 3D point cloud shown in the figure is converted to the local coordinate system.)



Table 4. Measurement results of ceramic planes with different step height (in mm).

In the previous 3D reconstruction experiments, samples with known features (feature circles and standard planes) were used to verify the accuracy of the calibration method. For samples with known features, measuring relative spatial distances between the features by fitting is a simple way to improve accuracy. However, the fitting approach is not applicable to samples with unknown features. For such samples, if the relative spatial position of any two regions of interest is to be obtained, the camera coordinates of each region can be marked and recorded in the sample images, and their absolute phases decoded and mapped to the corresponding projector coordinates. The 3D coordinates of the regions of interest in the projector coordinate system are then calculated through Eq. (31), and the relative spatial position is measured. In this case, the 3D measurement resolution is limited by the resolution of the camera and the telecentric lens. The actual resolution of the calibrated imaging system can be verified with a USAF resolution target.

To visually demonstrate the Scheimpflug projection FP-3DM prototype system and the calibration method, objects of complex geometry in two different materials were measured: first a 50-cent coin (see Fig. 8(c)) and then the plastic printed fonts (see Fig. 8(b)). The reconstructed 3D point clouds and local details are shown in Fig. 16. It can be seen that the calibration method yields clear edge contours in the 3D reconstruction of complex micro-structures, has a strong ability to resolve details, and clearly displays the surface morphology of the raised characters. The experimental results show that the system configuration and calibration method can reconstruct different materials (e.g. metal, plastic) and different geometries (e.g. slopes, planes, spheres) well. It can be noticed that the reconstructed plane is wrinkled in the partially enlarged views shown in Figs. 16(b) and 16(d); this is mainly caused by the roughness of the tested samples.


Fig. 16. Experimental result of measuring complex surface geometry. (a) 50 cent coin 3D measurement point cloud; (b) plastic printed fonts 3D measurement point cloud; (c) Partial enlargement of Fig. 16 (a); (d) Partial enlargement of Fig. 16 (c). (In order to display the measured height more intuitively, the 3D point cloud shown in the figure is false colour coded and converted to the local coordinate system).


5. Conclusion

In this research, a FP-3DM system was proposed, consisting of a camera with a telecentric lens and a small-FOV pinhole projector arranged under the Scheimpflug condition, so as to make full use of the limited DOF of the telecentric lens and to make the telecentric imaging and projection focus areas coincide, thereby increasing the measurement range. Meanwhile, a flexible joint calibration method was explored, which fully considers the correction and error optimization of the Scheimpflug projection model. During calibration of the whole system, only a standard plate with a circular pattern is used, and the calibration target can be placed freely. The effectiveness of the proposed system and method was verified by experiments on the common focal area, system calibration, accuracy evaluation and 3D topography measurement. Experimental results demonstrate that the re-projection error of the system calibration is less than 1 µm, and the 3D repeated measurement error based on feature fitting is less than 4 µm, within the calibrated volume of 10 (H) mm × 50 (W) mm × 40 (D) mm.

The maximum error of 3D measurement for unknown feature samples is limited by the telecentric imaging resolution.

Funding

National Science Fund for Distinguished Young Scholars (52225507); Key Research and Development Projects of Shaanxi Province (2020ZDLGY04-02); Open Project of Key Laboratory for Micro/Nano Technology and System, Dalian University of Technology, Liaoning Province.

Acknowledgments

The authors would like to thank the support by the National Science Fund for Distinguished Young Scholars; Key Research and Development Program of Shaanxi Province; and the Open Project of Key Laboratory for Micro/Nano Technology and System, Dalian University of Technology, Liaoning Province.

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. Y. Hu, Q. Chen, T.Y. Tao, H. Li, and C. Zuo, “Absolute three-dimensional micro surface profile measurement based on a Greenough-type stereomicroscope,” Meas. Sci. Technol. 28(4), 045004 (2017). [CrossRef]  

2. A. Syring, Z. Wang, J. Molle, H. Keese, S. Wundrack, R. Stosch, and T. Voss, “Confocal Raman Microscopy for the Analysis of the Three-Dimensional Shape of Polymeric Microsphere Layers,” Appl. Spectrosc. 76(6), 678–688 (2022). [CrossRef]

3. T. Liu, J. Wang, Q. Liu, J. Hu, Z. Wang, C. Wan, and S. Yang, “Chromatic confocal measurement method using a phase Fresnel zone plate,” Opt. Express 30(2), 2390–2401 (2022). [CrossRef]  

4. V. Srivastava, M. Inam, R. Kumar, and D. S. Mehta, “Single Shot White Light Interference Microscopy for 3D Surface Profilometry Using Single Chip Color Camera,” J. Opt. Soc. Korea 20(6), 784–793 (2016). [CrossRef]  

5. G. Zhang, S. Yang, J. Fluegge, and H. Bosse, “Fiber optic white light interferometer for areal surface measurement,” Meas. Sci. Technol. 31(2), 025005 (2019). [CrossRef]  

6. D. S. Mehta, M. Inam, J. Prakash, and A. M. Biradar, “Liquid-crystal phase-shifting lateral shearing interferometer with improved fringe contrast for 3D surface profilometry,” Appl. Opt. 52(25), 6119–6125 (2013). [CrossRef]  

7. M. Wang, Y. Yin, D. N. Deng, X. F. Meng, X. L. Liu, and X. Peng, “Improved performance of multi-view fringe projection 3D microscopy,” Opt. Express 25(16), 19408–19421 (2017). [CrossRef]  

8. S. V. D. Jeught and J. J. J. Dirckx, “Real-time structured light profilometry: a review,” Opt. Laser Eng. 87, 18–31 (2016). [CrossRef]  

9. R. Rodriguez-Vera, K. Genovese, J. A. Rayas, and F. Mendoza-Santoyo, “Vibration Analysis at Microscale by Talbot Fringe Projection Method,” Strain 45(3), 249–258 (2009). [CrossRef]  

10. A Li, X. Peng, Y. K. Yin, X. L. Liu, Q. P. Zhao, K. Körner, and W. Osten, “Fringe projection based quantitative 3D microscopy,” Optik 124(21), 5052–5056 (2013). [CrossRef]  

11. C. Quan, C. J. Tay, X. Y. He, X. Kang, and H. M. Shang, “Microscopic surface contouring by fringe projection method,” Opt. Laser Technol. 34(7), 547–552 (2002). [CrossRef]  

12. B. Li and S. Zhang, “Microscopic structured light 3D profilometry: Binary defocusing technique vs. sinusoidal fringe projection,” Opt. Laser Eng. 96, 117–123 (2017). [CrossRef]  

13. H. Liu, H. Lin, and L. Yao, “Calibration method for projector-camera-based telecentric fringe projection profilometry system,” Opt. Express 25(25), 31492–31508 (2017). [CrossRef]  

14. D. Li, C. Liu, and J. Tian, “Telecentric 3D profilometry based on phase-shifting fringe projection,” Opt. Express 22(26), 31826–31835 (2014). [CrossRef]  

15. Y. Hu, Q. Chen, S. Feng, T. Tao, A. Asundi, and C. Zuo, “A new microscopic telecentric stereo vision system: Calibration, rectification, and three-dimensional reconstruction,” Opt. Laser Eng. 113, 14–22 (2019). [CrossRef]

16. L. Rao, F. Da, W. Kong, and H. Huang, “Flexible calibration method for telecentric fringe projection profilometry systems,” Opt. Express 24(2), 1222–1237 (2016). [CrossRef]

17. B. Li and S. Zhang, “Flexible calibration method for microscopic structured light system using telecentric lens,” Opt. Express 23(20), 25795–25803 (2015). [CrossRef]  

18. P. Cornic, C. Illoul, A. Cheminet, and G. Besnerais, “Another look at volume self-calibration: calibration and self-calibration within a pinhole model of Scheimpflug cameras,” Meas. Sci. Technol. 27(9), 094004 (2016). [CrossRef]

19. F. Zhu, W. Liu, H. Shi, and X. He, “Accurate 3D measurement system and calibration for speckle projection method,” Opt. Laser Eng. 48(11), 1132–1139 (2010). [CrossRef]

20. D. Li and J. Tian, “An accurate calibration method for a camera with telecentric lenses,” Opt. Laser Eng. 51(5), 538–541 (2013). [CrossRef]  

21. Q. Mei, J. Gao, H. Lin, and Y. Chen, “Structure light telecentric stereoscopic vision 3D measurement system based on Scheimpflug condition,” Opt. Laser Eng. 86, 83–91 (2016). [CrossRef]  

22. Y. Yin, M. Wang, B. Gao, X. Liu, and X. Peng, “Fringe projection 3D microscopy with the general imaging model,” Opt. Express 23(5), 6846–6857 (2015). [CrossRef]  

23. H. Louhichi, T. Fournel, J. Lavest, and H. Aissia, “Self-calibration of Scheimpflug cameras: an easy protocol,” Meas. Sci. Technol. 18(8), 2616–2622 (2007). [CrossRef]  

24. A. Legarda, A. Izaguirre, N. Arana, and A. Iturrospe, “Comparison and error analysis of the standard pin-hole and Scheimpflug camera calibration models,” in Proceedings of IEEE Conference on Electronics, Control, Measurement, Signals & Their Application to Mechatronics (IEEE, 2013), pp. 1–6.

25. S. Feng, C. Zuo, L. Zhang, and T. Tao, “Calibration of fringe projection profilometry: A comparative review,” Opt. Laser Eng. 143, 106622 (2021). [CrossRef]

26. D. Moreno and G. Taubin, “Simple, Accurate, and Robust Projector-Camera Calibration,” in Proceedings of IEEE Second International Conference on 3D Imaging (IEEE, 2012), pp. 464–471.

27. Z. Zhang, “A Flexible New Technique for Camera Calibration,” IEEE Trans. Pattern Anal. Machine Intell. 22(11), 1330–1334 (2000). [CrossRef]  




Figures (16)

Fig. 1. The camera imaging model with bilateral telecentric lens: the telecentric lens performs parallel projection and there is no projection center.
Fig. 2. Traditional FP-3DM system configuration diagram. (The common focus area between the imaging DOF and the projected DOF of the traditional FP-3DM system is severely limited).
Fig. 3. Scheimpflug projection FP-3DM system configuration diagram. (The DMD plane, the principal plane of the projection lens, and the object plane form a Scheimpflug intersection. The imaging DOF and projected DOF can be adjusted to overlap to obtain a larger common focus area under the Scheimpflug projection condition).
Fig. 4. Model of pinhole projector imaging. (Under the Scheimpflug condition, the relative rotation of the projection lens and the DMD chip results in perspective distortion, which can be expressed as a homography transformation between the ideal DMD plane and the Scheimpflug DMD plane).
Fig. 5. Illustration of the calibration process. (a) calibration target; (b) captured image with circle centers; (c) captured image with horizontal pattern projection; (d) captured image with vertical pattern projection; (e) captured circle centers with telecentric imaging; (f) mapped circle center image for projector.
Fig. 6. Schematic diagram of local homography transformation. (Firstly, the local mapping relationship from camera coordinates to projector coordinates is constructed using the absolute phase of the integer pixel points near the circle center, and then the camera circle center coordinates are converted to projector coordinates).
Fig. 7.
Fig. 7. Scheimpflug projection FP-3DM prototype test system: the test system consists of a high-resolution camera, a telecentric lens, a Scheimpflug projector and a precision vertical translation stage.
Fig. 8.
Fig. 8. Components for experiments. (a) Calibration target; (b) Plastic printed fonts; (c) 50 cent-coins; (d) Standard ceramic plate.
Fig. 9.
Fig. 9. Comparison of image quality between Scheimpflug projection and forward projection. (a) captured Scheimpflug projection image; (b) captured forward projected image; (c)-(e) profile line comparison plot of fringe period selected from the same three positions (A, B, C) in the Figs. 9 (a) and (b) respectively.
Fig. 10.
Fig. 10. Schematic diagram of the experiment in the common focal area. (a) The common focus flat is translated 50 times along the Z-axis (200µm/time); (b) Projected stripes and captured fringe images; (c) Five selection location areas within the field of view.
Fig. 11.
Fig. 11. RMS Contrast Curve. (To reduce the influence of uneven illuminance distribution of oblique projection, the local area of RMS contrast calculation is limited to one fringe period).
Fig. 12.
Fig. 12. Re-projection errors of the entire circle center points. (a) Camera re-projection error; (b) Projector re-projection error. (The STD value of all points are 0.328µm and 0.035µm respectively, and the STD values in the X and Y directions of the telecentric imaging and projector calibration are (0.554µm, 0.406µm) and (0.044µm, 0.054µm), respectively).
Fig. 13.
Fig. 13. Overall re-projection error of the calibration target image under different poses. (a) Camera re-projection error; (b) Projector re-projection error. (The overall re-projection errors of telecentric imaging and projector are in the range of 0.9µm and 0.15µm, respectively, and the overall mean error of 12 different pose calibration are 0.684µm and 0.138µm, respectively).
Fig. 14.
Fig. 14. Examples of two diagonal measurements of a calibration target. (a) 3D point cloud of the four circles at the corners; (b) Normal vectors of four measuring circles and 3D coordinates of the circle center.
Fig. 15.
Fig. 15. 3D reconstruction of ceramic planes with different step height. (a) The 3D point cloud of different standard ceramic planes; (b) The 3D reconstruction and plane fitting results of different ceramic planes. (In order to display the measurement plane more intuitively, the 3D point cloud shown in the figure is converted to the local coordinate system.)
Fig. 16.
Fig. 16. Experimental result of measuring complex surface geometry. (a) 50 cent coin 3D measurement point cloud; (b) plastic printed fonts 3D measurement point cloud; (c) Partial enlargement of Fig. 16 (a); (d) Partial enlargement of Fig. 16 (c). (In order to display the measured height more intuitively, the 3D point cloud shown in the figure is false colour coded and converted to the local coordinate system).

Tables (4)

Table 1. The intrinsic parameters and distortion coefficients for projector calibration.

Table 2. The projection matrix parameters for telecentric imaging.

Table 3. Measurement result of two diagonals on calibration target (in mm).

Table 4. Measurement results of ceramic planes with different step heights (in mm).

Equations (32)


$$\begin{bmatrix} q^c \\ 1 \end{bmatrix} = \begin{bmatrix} m_x & 0 & u_0^c \\ 0 & m_y & v_0^c \\ 0 & 0 & 1 \end{bmatrix} \begin{bmatrix} x^c \\ y^c \\ 1 \end{bmatrix} = K^c \begin{bmatrix} x^c \\ y^c \\ 1 \end{bmatrix}$$

$$q = R_{3\times 3}^{c}\, q^{w} + T_{3\times 1}^{c}$$

$$\begin{bmatrix} q^c \\ 1 \end{bmatrix} = K^c \begin{bmatrix} R_{2\times 3}^{c} & T_{2\times 1}^{c} \\ 0_{1\times 3} & 1 \end{bmatrix} \begin{bmatrix} \hat{q} \\ 1 \end{bmatrix} = M^c \begin{bmatrix} q^w \\ 1 \end{bmatrix}$$

$$\theta = \arctan\!\left( M \tan(\beta) \right)$$

$$s \begin{bmatrix} p^p \\ 1 \end{bmatrix} = K^p \left[ R_{3\times 3}^{p} \,\middle|\, T_{3\times 1}^{p} \right] \begin{bmatrix} p^w \\ 1 \end{bmatrix} = M^p \begin{bmatrix} p^w \\ 1 \end{bmatrix}$$

$$K^p = \begin{bmatrix} f_x & 0 & u_0^p \\ 0 & f_y & v_0^p \\ 0 & 0 & 1 \end{bmatrix}$$

$$\begin{bmatrix} x_{norm} \\ y_{norm} \end{bmatrix} = \begin{bmatrix} x^p / z^p \\ y^p / z^p \end{bmatrix}$$

$$\begin{bmatrix} x_{dist}^{u1} \\ y_{dist}^{u1} \end{bmatrix} = \begin{bmatrix} x_{norm}\left(1 + k_1 r^2 + k_2 r^4\right) + 2 p_1 x_{norm} y_{norm} + p_2 \left(r^2 + 2 x_{norm}^2\right) \\ y_{norm}\left(1 + k_1 r^2 + k_2 r^4\right) + p_1 \left(r^2 + 2 y_{norm}^2\right) + 2 p_2 x_{norm} y_{norm} \end{bmatrix}$$

$$\begin{bmatrix} p^p \\ 1 \end{bmatrix} = K^p \begin{bmatrix} p_{dist}^{u2} \\ 1 \end{bmatrix}$$

$$\begin{bmatrix} p_{dist}^{u2} \\ 1 \end{bmatrix} = H_{ps} \begin{bmatrix} p_{dist}^{u1} \\ 1 \end{bmatrix}$$

$$H_{ps} = K_h(\theta_x, \theta_y)\, R_h(\theta_x, \theta_y), \quad K_h(\theta_x, \theta_y) = \begin{bmatrix} \cos(\theta_y)\cos(\theta_x) & 0 & \sin(\theta_y)\cos(\theta_x) \\ 0 & \cos(\theta_y)\cos(\theta_x) & \sin(\theta_x) \\ 0 & 0 & 1 \end{bmatrix}, \quad R_h(\theta_x, \theta_y) = \begin{bmatrix} \cos(\theta_y) & \sin(\theta_x)\sin(\theta_y) & \sin(\theta_y)\cos(\theta_x) \\ 0 & \cos(\theta_x) & -\sin(\theta_x) \\ -\sin(\theta_y) & \cos(\theta_y)\sin(\theta_x) & \cos(\theta_y)\cos(\theta_x) \end{bmatrix}$$
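The radial and tangential distortion step above can be sketched as a direct evaluation of the Brown-Conrady terms on the normalized coordinates; the function name is illustrative, not from the paper:

```python
import numpy as np

def distort(x, y, k1, k2, p1, p2):
    """Apply radial (k1, k2) and tangential (p1, p2) distortion to
    normalized projector coordinates (x, y) = (x_p/z_p, y_p/z_p)."""
    r2 = x * x + y * y
    radial = 1.0 + k1 * r2 + k2 * r2 * r2
    x_dist = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    y_dist = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return x_dist, y_dist
```

With all coefficients zero the mapping is the identity, which is a quick sanity check on a fitted distortion model.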
$$I_n = A(x, y) + B(x, y) \cos\!\left[ \varphi(x, y) + 2\pi n / N \right]$$

$$\varphi(x, y) = \arctan\!\left[ \frac{-\sum_{n=0}^{N-1} I_n(x, y) \sin(2\pi n / N)}{\sum_{n=0}^{N-1} I_n(x, y) \cos(2\pi n / N)} \right]$$

$$\varphi_{eq}(x, y) = \left[ \varphi_h(x, y) - \varphi_l(x, y) \right] \;(\mathrm{mod}\; 2\pi)$$

$$\lambda_{eq} = \frac{\lambda_l \lambda_h}{\lambda_l - \lambda_h}$$

$$k_h(x, y) = \mathrm{Round}\!\left[ \frac{(\lambda_{eq} / \lambda_h)\, \varphi_{eq}(x, y) - \varphi_h(x, y)}{2\pi} \right]$$

$$\Phi_h(x, y) = \varphi_h(x, y) + 2\pi k_h(x, y)$$
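A minimal sketch of the N-step phase retrieval and two-wavelength (heterodyne) unwrapping steps above, assuming fringe images stacked along the first axis and fringe pitches λ_h < λ_l in projector pixels; function and variable names are illustrative:

```python
import numpy as np

def phase_from_steps(images):
    """Wrapped phase from N equally shifted fringe images.
    images: array (N, H, W), the n-th image phase-shifted by 2*pi*n/N."""
    N = len(images)
    n = np.arange(N).reshape(-1, 1, 1)
    num = np.sum(images * np.sin(2 * np.pi * n / N), axis=0)
    den = np.sum(images * np.cos(2 * np.pi * n / N), axis=0)
    return np.arctan2(-num, den)  # wrapped to (-pi, pi]

def unwrap_two_wavelength(phi_h, phi_l, lam_h, lam_l):
    """Absolute phase of the high-frequency fringes via the
    equivalent-wavelength (heterodyne) method."""
    phi_eq = np.mod(phi_h - phi_l, 2 * np.pi)
    lam_eq = lam_l * lam_h / (lam_l - lam_h)
    k_h = np.round(((lam_eq / lam_h) * phi_eq - phi_h) / (2 * np.pi))
    return phi_h + 2 * np.pi * k_h
```

The equivalent wavelength λ_eq must cover the whole unambiguous range for the rounding in k_h to be reliable, which is why the two fringe pitches are chosen close together.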
$$\begin{bmatrix} u^c & v^c & 1 \end{bmatrix}^T = M_{3\times 4}^{c}\, X$$

$$\begin{bmatrix} u^p & v^p & 1 \end{bmatrix}^T = M_{3\times 4}^{p}\, X$$

$$Q = \begin{bmatrix} u^c M_3^c - M_1^c \\ v^c M_3^c - M_2^c \\ u^p M_3^p - M_1^p \\ v^p M_3^p - M_2^p \end{bmatrix}$$

$$\begin{bmatrix} X_w \\ Y_w \\ Z_w \end{bmatrix} = \begin{bmatrix} m_{11}^c - u^c m_{31}^c & m_{12}^c - u^c m_{32}^c & m_{13}^c - u^c m_{33}^c \\ m_{21}^c - v^c m_{31}^c & m_{22}^c - v^c m_{32}^c & m_{23}^c - v^c m_{33}^c \\ m_{11}^p - u^p m_{31}^p & m_{12}^p - u^p m_{32}^p & m_{13}^p - u^p m_{33}^p \end{bmatrix}^{-1} \begin{bmatrix} u^c m_{34}^c - m_{14}^c \\ v^c m_{34}^c - m_{24}^c \\ u^p m_{34}^p - m_{14}^p \end{bmatrix}$$
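The linear triangulation above, two camera rows plus one projector row, reduces to a 3×3 solve once the projection matrices are known. A sketch, assuming `Mc` and `Mp` are the 3×4 camera and projector matrices and `triangulate` is an illustrative name:

```python
import numpy as np

def triangulate(Mc, Mp, uc, vc, up):
    """Recover (Xw, Yw, Zw) from the camera pixel (uc, vc) and the
    projector column coordinate up.  Each row of A below is of the
    form M_row1 - u * M_row3, so A @ [Xw, Yw, Zw, 1]^T = 0."""
    A = np.array([
        Mc[0] - uc * Mc[2],
        Mc[1] - vc * Mc[2],
        Mp[0] - up * Mp[2],
    ])
    # Split homogeneous system: A[:, :3] X = -A[:, 3]
    return np.linalg.solve(A[:, :3], -A[:, 3])
```

Only one projector coordinate is needed because a single absolute phase map already constrains the point to a plane of projector columns.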
$$u_{ij}^p = \frac{\Phi_v(u_{ij}^c, v_{ij}^c)\, W^p}{2\pi n_v}$$

$$v_{ij}^p = \frac{\Phi_h(u_{ij}^c, v_{ij}^c)\, H^p}{2\pi n_h}$$

$$\bar{H}_{ij} = \arg\min_{H_{ij}} \sum_{k} \left\| p_{ijk}^p - H_{ij}\, q_{ijk}^c \right\|^2$$

$$\begin{bmatrix} p_{ij}^p \\ 1 \end{bmatrix} = \bar{H}_{ij} \begin{bmatrix} q_{ij}^c \\ 1 \end{bmatrix}$$
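The local homography can be estimated with a standard DLT over the integer pixels around each circle center. Note that this SVD-based sketch minimizes the algebraic rather than the geometric error of the objective above, which is normally adequate over the small neighborhoods used here; names are illustrative:

```python
import numpy as np

def fit_homography(src, dst):
    """DLT fit of a 3x3 homography H mapping src -> dst.
    src, dst: (K, 2) arrays of camera and projector coordinates, K >= 4."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    # The homography is the right null vector of A (smallest singular value).
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]

def map_point(H, q):
    """Apply H to a sub-pixel camera point q = (x, y)."""
    p = H @ np.array([q[0], q[1], 1.0])
    return p[:2] / p[2]
```

Fitting one homography per circle center, rather than one global map, is what makes the camera-to-projector correspondence locally accurate despite lens distortion.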
$$\min \sum_{i=1}^{n} \sum_{j=1}^{m} \left\| p_{ij}^p - \hat{p}_{ij}^p\!\left( K^p, R_i^p, t_i^p, k \right) \right\|^2$$

$$\min_{[R_i^p \mid t_i^p]} \operatorname{avg} \sum_{i=1}^{n} \sum_{j=1}^{m} \left\| \left( u_{ij}^{p.corr}, v_{ij}^{p.corr}, 1 \right)^T - K^p \left[ R_i^p \mid t_i^p \right] \left( x_{ij}^w, y_{ij}^w, z_{ij}^w \right)^T \right\|^2$$

$$\left( x_{ij}^p, y_{ij}^p, z_{ij}^p, 1 \right)^T = \left[ R_i^p \mid t_i^p \right] \left( x_{ij}^w, y_{ij}^w, z_{ij}^w, 1 \right)^T$$
$$A L = \begin{bmatrix} u_{ij}^c \\ v_{ij}^c \end{bmatrix}, \quad A = \begin{bmatrix} x_{ij}^p & y_{ij}^p & z_{ij}^p & 1 & 0 & 0 & 0 & 0 & -u_{ij}^c x_{ij}^p & -u_{ij}^c y_{ij}^p & -u_{ij}^c z_{ij}^p \\ 0 & 0 & 0 & 0 & x_{ij}^p & y_{ij}^p & z_{ij}^p & 1 & -v_{ij}^c x_{ij}^p & -v_{ij}^c y_{ij}^p & -v_{ij}^c z_{ij}^p \end{bmatrix}, \quad L = \begin{bmatrix} m_{11}^c & m_{12}^c & m_{13}^c & m_{14}^c & m_{21}^c & m_{22}^c & m_{23}^c & m_{24}^c & m_{31}^c & m_{32}^c & m_{33}^c \end{bmatrix}^T_{11 \times 1}$$
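Stacking the two rows of A per feature point and solving A L = b in the least-squares sense recovers the eleven entries of the camera projection matrix, with m34 normalized to 1. A sketch with an illustrative function name:

```python
import numpy as np

def estimate_camera_matrix(pts3d, pix):
    """Least-squares solution of A L = b for L = [m11 ... m33], given 3D
    points and their camera pixels; m34 is fixed to 1 by normalization."""
    rows, b = [], []
    for (X, Y, Z), (u, v) in zip(pts3d, pix):
        rows.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z])
        rows.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z])
        b += [u, v]
    L, *_ = np.linalg.lstsq(np.asarray(rows, float),
                            np.asarray(b, float), rcond=None)
    return L  # 11-vector [m11, m12, ..., m33]
```

At least six non-coplanar points are needed for the 11 unknowns; the circle grid of the calibration target provides many more, so the solve is well overdetermined.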
$$\min \operatorname{avg} \sum_{i=1}^{n} \sum_{j=1}^{m} \left( \left\| p_{ij}^p - \hat{p}_{ij}^p\!\left( K^p, R_i^p, t_i^p, k \right) \right\|^2 + \left\| q_{ij}^c - \hat{q}_{ij}^c\!\left( M^c \right) \right\|^2 \right)$$
$$\begin{bmatrix} x^p \\ y^p \\ z^p \end{bmatrix} = \begin{bmatrix} m_{11}^c & m_{12}^c & m_{13}^c \\ m_{21}^c & m_{22}^c & m_{23}^c \\ f_x & 0 & u_0^p - u^p \end{bmatrix}^{-1} \begin{bmatrix} u^c - m_{14}^c \\ v^c - m_{24}^c \\ 0 \end{bmatrix}$$

$$k_1 \Delta p \cos(\beta) = k_2 \Delta c$$