
Parameters estimation-based calibration method for rotational axis of line-structured light 3D reconstruction

Open Access

Abstract

Rotation is a critical component in 3D reconstruction systems, where accurate calibration of rotation axis parameters is essential for 3D stitching. In this study, what we believe to be a novel parameters estimation-based method for calibrating rotation axis parameters using 2D planar targets is proposed. Compared to traditional circle fitting methods, this method takes both orientation and position information into account, resulting in better precision. By leveraging the transmission of spatial pose relationships, the parameters estimation-based calibration method also effectively mitigates the impact of noise, yielding a more accurate calibration of the rotation axis parameters. Error validation and 3D reconstruction experiments confirm the superior performance of the proposed method, and the experimental results demonstrate the effectiveness and applicability of the approach in enhancing the calibration of rotation axis parameters for 3D reconstruction systems.

© 2023 Optica Publishing Group under the terms of the Optica Open Access Publishing Agreement

1. Introduction

Benefiting from its simplicity, strong robustness, and non-contact measurement, line-structured light has found wide application in industrial scenarios such as reverse engineering, defect detection, and workpiece alignment [1–4], making it an important method for high-precision three-dimensional (3D) surface measurement in manufacturing [5–7]. For a line-structured light system, the reconstruction of 3D features relies on the dimensional expansion provided by an external displacement device: a line-structured light 3D sensor can only measure the 3D coordinates along the intersection of the light plane and the object in a single measurement, so measuring the 3D shape of an object requires a scanning device that sweeps the sensor over the surface. In scenarios where only a specific viewpoint of an object's 3D shape needs to be captured, a translation scanning method is typically adequate; the scanning device moves along a linear path to capture the desired perspective. For example, in inspecting circuit board defects, the focus is on specific areas or components, and a single viewpoint is sufficient for analysis. However, there are situations where a more complete view of the object's 3D shape is required, as in digitizing cultural relics or creating detailed models of complex objects. In these cases, a combination with rotational scanning techniques is necessary.

In 3D reconstruction using rotational scanning, the point cloud coordinates output by the sensor need to undergo a rotation transformation to obtain the final measurement results. The rotation angle can be read directly from the rotation device, but the relative pose between the rotation axis and the sensor is unknown. Therefore, the calibration of extrinsic parameters for rotational scanning involves determining the relative pose between the sensor and the rotation axis. Two cases need to be considered: in the first, the line-structured light sensor is fixed and the object rotates; in the second, the object remains stationary while the line-structured light sensor rotates.

For the rotating-object scanning method, the coordinate system of the sensor is fixed. Therefore, a common calibration approach is to determine the equation of the rotation axis in the sensor's coordinate system. The basic idea is to perform a 3D spatial circle fit on the trajectory of a target: by obtaining the plane in which the circle lies and the 3D coordinates of the circle's center, the equation of the rotation axis can be determined. The key lies in accurately obtaining the 3D coordinates of the target. Deng rotates a spherical target and fits the point cloud at each target position to determine the 3D coordinates of the sphere center in the sensor coordinate system [8,9]; the trajectory of the sphere center is then used to calculate the equation of the line representing the rotation axis. Chen, similarly, uses a planar target and estimates the extrinsic parameters at each position during the rotation, fitting the motion trajectory of feature points in 3D space to calculate the equation of the rotation axis [10–12]. However, the circle fitting method based on a spherical target can be affected by factors such as intersection position and shooting angle, which may interfere with the precise estimation of the sphere center's coordinates.

In the rotating-sensor configuration, the sensor is mounted on a rotating device and scans a fixed object. Li, Deshmukh, et al. [13–18] attached the sensor to the end of a robotic arm to measure the object; the relative poses between the arm's joints and its base coordinate system are known, so only the sensor's relative pose needs to be calibrated. Zhu [19] uses a single-axis rotating table to rotate the sensor: by estimating the extrinsic parameters of a planar target, the sensor's pose relative to the rotation axis is calculated, and the camera projection center is then determined through circle fitting, allowing the rotation center to be estimated.

In this paper, the parameter estimation method achieves the calibration of the rotation axis position by deducing the mutual relationship of the calibration target's spatial poses. Unlike methods based on structured-light scanning of calibration objects, the proposed method relies only on the camera to achieve axis calibration, so applications involving axis calibration with camera self-rotation can be addressed using the method presented here. Furthermore, compared to the circle fitting method, the parameter estimation method fully utilizes both the orientation and position information of the target, resulting in higher calibration accuracy. The remainder of this paper is organized as follows: Section 2 establishes the model of line-structured light rotation; Section 3 provides a detailed derivation of the parameter estimation-based axis calibration method; Section 4 explores the influence of the camera-axis position relationship on calibration accuracy through simulations; and Section 5 compares the circle fitting method with the parameter estimation method through experiments, demonstrating the accuracy and practical value of the proposed method through reprojection error and 3D reconstruction results.

2. Mathematical model of line structured light rotation

The research objective of this study is to address the calibration of external parameters for a rotating line structured light sensor, particularly when the sensor is positioned close to the rotation axis. The main components of the system are a line structured light sensor and a turntable, which allow the establishment of a mathematical model for the self-rotation of the line structured light sensor.

Figure 1 illustrates an application example of a rotating structured light sensor. The line-structured light sensor captures one image at each rotation step, which is treated as one frame. The laser line feature points captured at each position are then unified into the same coordinate system through coordinate transformation, enabling 3D reconstruction of the measured object by combining multiple frames. Figure 2 illustrates the schematic diagram of multi-frame point cloud alignment of the line-structured light 3D measurement system during the rotation.

Fig. 1. Application model of self-rotating line structured light.

Fig. 2. Schematic diagram of multi-frame point cloud alignment of the line-structured light 3D measurement system during the rotation.

During the calibration process of the line structured light system, the line laser is not involved. Therefore, the mathematical model can be simplified to describe the relationship between the rotation axis and the camera. In this research, the initial position of the camera is one of the key reference frames. When the line structured light system rotates, let's assume the camera coordinate system at the initial position (before the rotation) is denoted as C1: OC1-XC1YC1ZC1. After the rotation of the turntable by an angle θ, the camera coordinate system becomes C2: OC2-XC2YC2ZC2.

Establish the coordinate system for the rotation axis, as shown in Fig. 3. Define the rotation axis coordinate system as follows: the rotation axis r intersects the OC1-XC1ZC1 plane of the camera coordinate system, and the intersection point is defined as the origin Orot of the rotation axis coordinate system. Let the Xrot, Yrot, and Zrot axes of the rotation axis coordinate system have the same directions as the XC1, YC1, and ZC1 axes. Suppose the coordinates of Orot in the camera coordinate system are (x0, 0, z0).

Fig. 3. Coordinate system definition in the rotation axis calibration process for line-structured light sensor.

For convenience of representation, Fig. 4 depicts a simplified diagram of the model, showing only the defined coordinate systems.

Fig. 4. Simplified model for rotation axis calibration.

According to the geometric relationship, the coordinate systems before and after rotation satisfy:

$$\boldsymbol{O}_{\boldsymbol{rot}}^{\boldsymbol{R}}\boldsymbol{O}_{\boldsymbol{C2}}^{\boldsymbol{R}}\boldsymbol{= }{\boldsymbol{R}_{\boldsymbol{rot}}}\cdot \boldsymbol{O}_{\boldsymbol{rot}}^{\boldsymbol{R}}\boldsymbol{O}_{\boldsymbol{C1}}^{\boldsymbol{R}}$$
where $O_{rot}^R$, $O_{C1}^R$, and $O_{C2}^R$ represent the coordinates of Orot, OC1, and OC2 in the frame of the rotation axis. According to the Rodrigues rotation formula, Rrot represents the rotation matrix and can be obtained by
$${\boldsymbol{R}_{\boldsymbol{rot}}} = \left[ {\begin{array}{ccc} {\cos \theta + (1 - \cos \theta ){k_x}^2}&{(1 - \cos \theta ){k_x}{k_y} - \sin \theta {k_z}}&{(1 - \cos \theta ){k_x}{k_z} + \sin \theta {k_y}}\\ {(1 - \cos \theta ){k_x}{k_y} + \sin \theta {k_z}}&{\cos \theta + (1 - \cos \theta ){k_y}^2}&{(1 - \cos \theta ){k_y}{k_z} - \sin \theta {k_x}}\\ {(1 - \cos \theta ){k_x}{k_z} - \sin \theta {k_y}}&{(1 - \cos \theta ){k_y}{k_z} + \sin \theta {k_x}}&{\cos \theta + (1 - \cos \theta ){k_z}^2} \end{array}} \right]$$
VR = [kx, ky, kz]T represents the unit vector of the rotation axis in the coordinate system of the rotation axis. Since the rotation axis coordinate system and the initial camera coordinate system C1: OC1-XC1YC1ZC1 share the same axis orientations, Eq. (1) still holds in the C1 coordinate system and can be expressed as
$$\boldsymbol{O}_{\boldsymbol{rot}}^{\boldsymbol{C1}}\boldsymbol{O}_{\boldsymbol{C2}}^{\boldsymbol{C1}}\boldsymbol{= }{\boldsymbol{R}_{\boldsymbol{rot}}}\cdot \boldsymbol{O}_{\boldsymbol{rot}}^{\boldsymbol{C1}}\boldsymbol{O}_{\boldsymbol{C1}}^{\boldsymbol{C1}}$$

Here $\boldsymbol{O}_{\boldsymbol{rot}}^{\boldsymbol{C1}}$, $\boldsymbol{O}_{\boldsymbol{C1}}^{\boldsymbol{C1}}$, and $\boldsymbol{O}_{\boldsymbol{C2}}^{\boldsymbol{C1}}$ are the coordinates of Orot, OC1, and OC2 in the C1 coordinate system, and VR = [kx, ky, kz]T remains unchanged. Through this mathematical model, the axis calibration task reduces to calibrating VR and $\boldsymbol{O}_{\boldsymbol{rot}}^{\boldsymbol{C1}}$.
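As a quick numerical illustration of this model, the sketch below builds Rrot from Eq. (2) and applies Eq. (3) to locate the camera origin after rotation. It is a minimal example with made-up values for the axis direction VR, the axis point Orot, and the angle θ; it is not part of the calibration procedure itself.

```python
import numpy as np

def rodrigues_matrix(k, theta):
    """Rotation matrix of Eq. (2) for a unit axis k = [kx, ky, kz] and angle theta (rad)."""
    kx, ky, kz = k / np.linalg.norm(k)
    c, s = np.cos(theta), np.sin(theta)
    return np.array([
        [c + (1 - c) * kx * kx,      (1 - c) * kx * ky - s * kz, (1 - c) * kx * kz + s * ky],
        [(1 - c) * kx * ky + s * kz, c + (1 - c) * ky * ky,      (1 - c) * ky * kz - s * kx],
        [(1 - c) * kx * kz - s * ky, (1 - c) * ky * kz + s * kx, c + (1 - c) * kz * kz],
    ])

# Made-up example values: axis roughly along Y_C1, axis point (x0, 0, z0), small step angle.
V_R = np.array([0.01, 0.999, 0.02])
O_rot = np.array([2.0, 0.0, 80.0])          # O_rot^C1, in mm
theta = np.deg2rad(0.3)

R_rot = rodrigues_matrix(V_R, theta)
O_C1 = np.zeros(3)                           # origin of C1 expressed in C1
# Eq. (3): the vector O_rot -> O_C2 is the rotated vector O_rot -> O_C1
O_C2 = O_rot + R_rot @ (O_C1 - O_rot)
print("Camera origin after rotation, in C1:", O_C2)
```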

3. Rotational axis calibration method based on parameter estimation

As shown in Fig. 2, axis calibration requires the use of a two-dimensional calibration board. The structured light sensor rotates by a certain angle θ, and the camera captures images of the calibration board. The world coordinate system (OW-XWYWZW) is defined with the origin located at the upper left corner point of the checkerboard. The XW axis aligns with the column axis of the checkerboard, extending from left to right. The YW axis corresponds to the row axis of the checkerboard, extending from top to bottom. The ZW axis is determined using the right-hand rule.

Let P be a point on the calibration board. At each rotation position, we have:

$$\left\{ {\begin{array}{l} {P_{}^{C1} = {\boldsymbol{R}_{\boldsymbol{1}}}P_1^W + {\boldsymbol{T}_{\boldsymbol{1}}}}\\ {P_{}^{C2} = {\boldsymbol{R}_{\boldsymbol{2}}}P_2^W + {\boldsymbol{T}_{\boldsymbol{2}}}} \end{array}} \right.$$

R1 and T1 denote the rotation matrix and translation vector between the initial camera coordinate system and the world coordinate system of the checkerboard before rotation; similarly, R2 and T2 denote the rotation matrix and translation vector between the camera coordinate system and the world coordinate system of the checkerboard target after rotation. These quantities are easily obtained through calibration of the target's extrinsic parameters. PC1 and PC2 are the coordinates of point P in the C1 and C2 coordinate systems, respectively, while $P_1^W$ and $P_2^W$ are the coordinates of the checkerboard corners in the world coordinate system. Given that P1 and P2 are actually the same corner point on the checkerboard and both belong to the same world coordinate system, we can rewrite Eq. (4) as

$$P_{}^{C1} = {\boldsymbol{R}_{\boldsymbol{1}}}{\boldsymbol{R}_{\boldsymbol{2}}}^{ - 1}P_{}^{C2} - {\boldsymbol{R}_{\boldsymbol{1}}}{\boldsymbol{R}_{\boldsymbol{2}}}^{ - 1}{\boldsymbol{T}_{\boldsymbol{2}}} + {\boldsymbol{T}_{\boldsymbol{1}}}$$

Expanding Eq. (3), we can obtain:

$$\boldsymbol{O}_{\boldsymbol{C2}}^{\boldsymbol{C1}}\boldsymbol{- O}_{\boldsymbol{rot}}^{\boldsymbol{C1}}\boldsymbol{= }{\boldsymbol{R}_{\boldsymbol{rot}}}\boldsymbol{(O}_{\boldsymbol{C1}}^{\boldsymbol{C1}}\boldsymbol{- O}_{\boldsymbol{rot}}^{\boldsymbol{C1}}\boldsymbol{)}$$

Thus

$$\boldsymbol{O}_{\boldsymbol{C2}}^{\boldsymbol{C1}}\boldsymbol{= }{\boldsymbol{R}_{\boldsymbol{rot}}}\boldsymbol{O}_{\boldsymbol{C1}}^{\boldsymbol{C1}}\boldsymbol{+ (E - }{\boldsymbol{R}_{\boldsymbol{rot}}}\boldsymbol{)O}_{\boldsymbol{rot}}^{\boldsymbol{C1}}$$

$\boldsymbol{O}_{\boldsymbol{C1}}^{\boldsymbol{C1}}$ is the origin of the C1 coordinate system, so its coordinates are [0, 0, 0]T.

Thus Eq. (7) can be simplified as

$$\boldsymbol{O}_{\boldsymbol{C2}}^{\boldsymbol{C1}}\boldsymbol{= }(\boldsymbol{E} - {\boldsymbol{R}_{\boldsymbol{rot}}})\boldsymbol{O}_{\boldsymbol{rot}}^{\boldsymbol{C1}}$$
where E represents the identity matrix. Substituting the origin of coordinate system C2, whose coordinates in C2 are [0, 0, 0]T, into Eq. (5) gives
$$\boldsymbol{O}_{\boldsymbol{C2}}^{\boldsymbol{C1}} = {\boldsymbol{T}_{\boldsymbol{1}}} - {\boldsymbol{R}_{\boldsymbol{1}}}{\boldsymbol{R}_{\boldsymbol{2}}}^{ - 1}{\boldsymbol{T}_{\boldsymbol{2}}}$$

So

$$\boldsymbol{O}_{\boldsymbol{rot}}^{\boldsymbol{C1}} = {(\boldsymbol{E - }{\boldsymbol{R}_{\boldsymbol{rot}}})^{ - 1}}({\boldsymbol{T}_{\boldsymbol{1}}}- {\boldsymbol{R}_{\boldsymbol{1}}}{\boldsymbol{R}_{\boldsymbol{2}}}^{ - 1}{\boldsymbol{T}_{\boldsymbol{2}}})$$
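Equations (9) and (10) give the axis point directly from the two sets of extrinsic parameters. The following sketch is a minimal, illustrative implementation (the function name is ours, not from the paper); since E − Rrot maps the axis direction to zero, the sketch imposes the constraint y0 = 0 from the definition of Orot in Section 2 and solves the remaining two unknowns by linear least squares.

```python
import numpy as np

def axis_point_from_extrinsics(R1, T1, R2, T2, R_rot):
    """Estimate O_rot^C1 = (x0, 0, z0) from Eqs. (8)-(10).

    R1, R2, R_rot are 3x3 rotation matrices; T1, T2 are 3-vectors.
    """
    O_C2_in_C1 = T1 - R1 @ np.linalg.inv(R2) @ T2          # Eq. (9)
    A = np.eye(3) - R_rot                                  # Eq. (8): O_C2^C1 = (E - R_rot) O_rot^C1
    # y0 = 0 by the definition of O_rot, so keep only the x0 and z0 columns
    x0z0, *_ = np.linalg.lstsq(A[:, [0, 2]], O_C2_in_C1, rcond=None)
    return np.array([x0z0[0], 0.0, x0z0[1]])
```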

The rotation angle θ is known, and substituting it into Eq. (2) allows VR = [kx, ky, kz]T to be calculated. Although the rotation angle could also be estimated from Eq. (2), such an estimate carries some error, whereas the angular error of the rotating table is much smaller. Using the measured rotation angle therefore further reduces the estimation error of VR. To improve the calibration accuracy, feature points should be collected from multiple positions by performing multiple rotations and capturing multiple frames. In addition, a bundle-adjustment-style joint optimization is used to minimize the impact of noise. Suppose a direction VRi is estimated from each sampling; these estimates differ slightly because of noise. Assuming each VRi carries the same weight, the optimal solution is the direction that minimizes the sum of the angles φi between VR and each VRi (with φi taken as positive). Since VR does not deviate greatly from any VRi, it can be assumed that ${\varphi _i} \in (0,\pi /2)$, over which φi and sinφi are both monotonically increasing. The objective function is therefore constructed as follows:

$$\min \sum\limits_{i = 1}^N {{{\sin }^2}{\varphi _i}} \Leftrightarrow \min \sum\limits_{i = 1}^N {||{{\boldsymbol{V}_{\boldsymbol{R}}} \times {\boldsymbol{V}_{\boldsymbol{Ri}}}} ||}$$
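The left-hand side of Eq. (11) admits a convenient closed-form solution: for unit vectors, sin²φi = 1 − (VR · VRi)², so minimizing the sum amounts to taking the principal eigenvector of Σi VRiVRiᵀ. The sketch below illustrates this; the function name and sign convention are ours, and it is not necessarily the exact optimization routine used in the experiments.

```python
import numpy as np

def estimate_axis_direction(V_Ri_samples):
    """Minimize sum_i sin^2(phi_i) between V_R and each sample V_Ri (left side of Eq. (11)).

    V_Ri_samples: iterable of 3-vectors, one axis-direction estimate per sampling.
    """
    V = np.array(V_Ri_samples, dtype=float)
    V /= np.linalg.norm(V, axis=1, keepdims=True)      # use unit vectors
    M = V.T @ V                                        # sum of outer products V_Ri V_Ri^T
    _, eigvecs = np.linalg.eigh(M)
    v = eigvecs[:, -1]                                 # principal eigenvector
    return v if v @ V.mean(axis=0) >= 0 else -v        # keep the sign consistent with the samples
```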

During the rotation process, the rotation angle is known, and the orientation of the target is not random but subject to certain constraints. For example, a planar target placed perpendicular to the radial direction of the rotating table remains so throughout the rotation. The pose information of the target therefore also reflects the relative pose between the rotation axis and the camera. Compared to the traditional circle fitting method, which only uses the position information of the target, the model described above simultaneously utilizes the rotation angle information (the known angle θ), the target position information (T1, T2), and the target pose information (R1, R2). It provides a more comprehensive description of the rotation of planar targets and exhibits stronger robustness.

4. Simulation of placement between camera and rotation axis

During the experiments, we observed that the estimation error of the rotation axis position (Orot) varies with the distance between the sensor and the center of rotation. The corresponding simulations are performed as follows. As shown in Fig. 5, the rotation axis is fixed while the projection center of the camera is placed at different positions; the camera position is changed several times. In the simulation, fx = 10000, fy = 10000, u0 = 720, and v0 = 540, where fx, fy, u0, and v0 are the intrinsic parameters of the camera. The array of feature points on the target is 8 × 11, with a chessboard grid size of 1 mm. The lens focal length is 25 mm, and the distance from the focal plane to the projection center is 85 mm. The world coordinate error of the corner points is 0 mm, the standard deviation of the corner pixel coordinates is 0.3 pixels, and the camera intrinsic parameters have no deviation. One image is taken before rotation and one after rotation.

Fig. 5. Simulation diagram: The impact of different distances between the camera projection center and the axis of rotation on the estimation of the axis.

In the simulation, the deviations of the rotation axis position Orot and the reprojection error of the target corner points are calculated using the positional information before and after rotation. Figure 6 illustrates the variation of the camera's position in the simulation. At the initial position, the rotation axis is set at a pre-set distance from the camera's projection center. The simulation begins with the predefined distance as the starting position and increases the distance by a certain step size, repeating the calculation of deviation.

Fig. 6. Simulation schematic diagram of the change in distance between camera projection centers.

The camera is assumed to start with no offset in the Xrot direction; the predefined distance is generated only in the Zrot direction, so the OC1 position is (0, 0, 10) in the Orot-XrotYrotZrot coordinate system. The camera movement step size is set to 100 mm, and after each movement the simulation is repeated 1000 times at the current position.
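The trend reported below can be reproduced with a simple Monte Carlo sketch of the same recipe: project the board corners into the two camera positions, perturb the pixel coordinates with 0.3-pixel Gaussian noise, recover the extrinsics with OpenCV's solvePnP, and evaluate the Orot error. The scene layout, the 10° rotation between the two frames, and all names below are our own illustrative choices (the paper does not list its simulation code); the axis point is recovered under the y0 = 0 convention of Section 2, and only the initial camera position is evaluated here, whereas the paper sweeps the camera outward in 100 mm steps.

```python
import numpy as np
import cv2

rng = np.random.default_rng(0)

# Intrinsics and target geometry as given in the simulation description
fx = fy = 10000.0
u0, v0 = 720.0, 540.0
K = np.array([[fx, 0, u0], [0, fy, v0], [0, 0, 1.0]])
d_cam = 10.0                                   # camera projection centre at (0, 0, d_cam) in the axis frame
theta = np.deg2rad(10.0)                       # illustrative rotation between the two frames (our choice)
axis_dir = np.array([0.0, 1.0, 0.0])           # rotation about Y_rot

# 8 x 11 corner grid, 1 mm pitch, centred on the optical axis, 85 mm in front of the camera
gx, gy = np.meshgrid(np.arange(11) - 5.0, np.arange(8) - 3.5)
board_rot = np.stack([gx.ravel(), gy.ravel(), np.full(gx.size, d_cam + 85.0)], axis=1)
board_obj = np.column_stack([board_rot[:, 0], board_rot[:, 1], np.zeros(len(board_rot))])

def axis_rotation(angle):
    """Rotation matrix for a rotation of `angle` (rad) about the turntable axis."""
    R, _ = cv2.Rodrigues(axis_dir.reshape(3, 1) * angle)
    return R

def project(angle, sigma_px=0.3):
    """Project the board corners into the camera rotated by `angle`, adding pixel noise."""
    R = axis_rotation(angle)
    origin = R @ np.array([0.0, 0.0, d_cam])
    pts_cam = (board_rot - origin) @ R          # P^C = R^T (P^rot - origin)
    uv = pts_cam[:, :2] / pts_cam[:, 2:3] * np.array([fx, fy]) + np.array([u0, v0])
    return uv + rng.normal(0.0, sigma_px, uv.shape)

def extrinsics(uv):
    """Board-to-camera rotation and translation from noisy corner pixels (solvePnP)."""
    _, rvec, tvec = cv2.solvePnP(board_obj, uv, K, np.zeros(5))
    return cv2.Rodrigues(rvec)[0], tvec.ravel()

errors = []
for _ in range(1000):
    R1, T1 = extrinsics(project(0.0))
    R2, T2 = extrinsics(project(theta))
    A = np.eye(3) - R1 @ R2.T                              # E - R_rot, with R_rot = R1 R2^-1
    rhs = T1 - R1 @ R2.T @ T2                              # Eq. (9)
    x0z0, *_ = np.linalg.lstsq(A[:, [0, 2]], rhs, rcond=None)
    O_rot_est = np.array([x0z0[0], 0.0, x0z0[1]])
    errors.append(np.linalg.norm(O_rot_est - np.array([0.0, 0.0, -d_cam])))   # true axis point in C1
print("mean / std of O_rot error (mm):", np.mean(errors), np.std(errors))
```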

As shown in Fig. 7 and Fig. 8, the bar heights represent the mean errors, and the error bars represent the standard deviations. It can be observed that as the distance between the rotation axis and the camera projection center increases, the estimation error of the rotation axis position also increases; the same trend appears in the reprojection error of the target's corner points. The simulation results demonstrate that the proposed method is best suited to measurement scenarios where the rotation axis is close to the camera projection center, specifically the self-rotation of a line-structured light sensor, and it can be widely applied to scenarios where line-structured light sensors undergo rotation. Furthermore, for the same field of view, a longer rotating radius with the same rotation angle yields fewer target feature points, so the rotating radius cannot be increased indefinitely. Therefore, during experiments, it is advisable to position the camera projection center as close as possible to the rotation axis in order to achieve better accuracy.

Fig. 7. Simulation results: Error of position (Orot).

Fig. 8. Simulation results: Reprojection error of the corner points.

5. Experiment and discussion

5.1 Intrinsic parameters and line structure light parameters calibration

In this section, extensive experimentation was conducted to validate the performance of the algorithm. First, calibration of the camera intrinsic parameters and the line-structured light parameters is required. The camera used in this paper is an acA1440-73gm from BASLER, with a resolution of 1440 × 1080. The line laser projector is a PGL-L-405nm-30 mW from New Industries.

Before conducting the validation and comparative experiments, some additional calibration work is required, including camera intrinsic parameter calibration and line-structured light parameter calibration. The camera intrinsic parameters were calibrated using Zhang's method [19]. Because that calibration process is not the main focus of this paper, the final calibration results are provided as prior information for the experimental section. The camera intrinsic parameters and line-structured light calibration parameters are shown in Table 1.


Table 1. The intrinsic parameters of camera and line-structured light parameters
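The intrinsic calibration itself is standard. For completeness, a typical OpenCV-based workflow for Zhang's method is sketched below; the image folder, board dimensions, and square size are placeholders rather than the exact settings used for Table 1, and this is not the authors' calibration code.

```python
import glob
import numpy as np
import cv2

pattern = (11, 8)          # inner corners per row and column (placeholder)
square = 1.0               # square size in mm (placeholder)

objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_points, img_points = [], []
for path in glob.glob("calib_images/*.png"):               # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3)
        corners = cv2.cornerSubPix(gray, corners, (11, 11), (-1, -1), criteria)
        obj_points.append(objp)
        img_points.append(corners)

rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_points, img_points,
                                                 gray.shape[::-1], None, None)
print("RMS reprojection error (px):", rms)
print("Intrinsic matrix:\n", K)
```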

5.2 Rotation axis calibration and comparative experiment

To verify the performance of the proposed method, a validation of measurement errors is conducted in this section, including the corner point reprojection error, the length measurement error of gauge blocks, and the radius error of standard spheres. As shown in Fig. 9, a ZOLIX-TBRK200 electric rotation stage with a rotation angle error of less than 0.003° was used for the experiment, together with a calibration board with a 12 × 9 pattern and a corner point accuracy of 1 μm. The camera position was adjusted so that the projection center was close to the rotation axis. As depicted in Fig. 10, the calibration process began when the calibration board fully appeared at the left edge of the field of view and ended at the right edge, ensuring complete capture of the checkerboard pattern. The rotation stage was rotated in steps of 0.3°, totaling 43 rotations and a total rotation angle of 12.9°.

Fig. 9. Rotation axis calibration and comparative experiment.

Fig. 10. Changes of checkerboard in camera field of view during rotation axis calibration process.

The calibration results of the rotation axis parameters using the proposed parameter estimation method and the circle fitting method with corner points as feature points are presented in Table 2.


Table 2. Calibration results of different axis calibration methods

The next step involves predicting the pixel coordinates of the chessboard target corner points at different rotation angles using the calibration results mentioned above. The procedure is as follows:

  • (1) Place the chessboard target at the initial position and capture an image.
  • (2) Rotate the target by a certain angle and capture another image.
  • (3) Using the calibration results of the rotation axis, predict the pixel coordinates of the corner points after rotation from the pixel coordinates of the corner points at the initial position.
  • (4) Compare the actual pixel coordinates of the rotated corner points with the predicted pixel coordinates; the resulting error reflects the calibration accuracy of the rotation axis.

The average corner point reprojection error (ereprojection) is defined as the mean of the reprojection errors of all the corner points on the chessboard grid.

$${e_{reprojection}} = \frac{1}{{{N_{CP}}}}\sum\limits_{i,j} {||{{P_{EP,ij}} - {P_{E,ij}}} ||} $$

PE,ij represents the actual pixel coordinates of a corner point, and PEP,ij represents the predicted pixel coordinates of that corner point. The indices i and j correspond to the row and column of the corner point in the chessboard corner matrix, and NCP is the total number of corner points on the chessboard.
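In code form, steps (3) and (4) amount to rotating the initial-position corner coordinates about the calibrated axis and projecting them with the camera intrinsics. The sketch below is illustrative (the function names are ours, and lens distortion is ignored for brevity):

```python
import numpy as np
import cv2

def predict_rotated_corners(pts_c1, axis_dir, O_rot_c1, theta, K):
    """Predict corner pixel coordinates after the turntable rotates by theta.

    pts_c1:    Nx3 corner coordinates in the initial camera frame C1
               (P^C1 = R1 P^W + T1 from the initial-position extrinsics)
    axis_dir:  calibrated unit axis direction V_R, expressed in C1
    O_rot_c1:  calibrated axis point O_rot^C1
    K:         3x3 camera intrinsic matrix (distortion neglected)
    """
    pts_c1 = np.asarray(pts_c1, dtype=float)
    R_rot, _ = cv2.Rodrigues(np.asarray(axis_dir, float).reshape(3, 1) * theta)
    O_C2 = (np.eye(3) - R_rot) @ O_rot_c1          # Eq. (8)
    pts_c2 = (pts_c1 - O_C2) @ R_rot               # P^C2 = R_rot^T (P^C1 - O_C2^C1)
    uvw = pts_c2 @ K.T                             # pinhole projection
    return uvw[:, :2] / uvw[:, 2:3]

def mean_reprojection_error(predicted_uv, measured_uv):
    """Average corner reprojection error of Eq. (12)."""
    return np.mean(np.linalg.norm(predicted_uv - measured_uv, axis=1))
```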

The ereprojection of the two calibration methods under different angles is shown in Fig. 11.

Fig. 11. Rotation axis calibration and comparative experiment.

The following experiments were carried out to further compare the differences in the calibration results. As shown in Fig. 12, we also conducted length measurements of gauge blocks and radius measurements of standard spheres. The standard sphere, made of ceramic, has a reference radius of 10 mm with a maximum radius error of 0.6 μm. The standard plane is also made of ceramic, with a flatness of 2 μm.

Fig. 12. (a) The experiment of length measurement of gauge blocks (b) The experiment of radius measurement of standard spheres.

The length measurement error of gauge blocks is defined as

$${e_{length}} = {l_m} - {l_{ref}}$$
where lm is the measured value of the length of the gauge blocks, lref is the reference value of the length of the gauge blocks.

The radius error is defined as

$${e_{radius}} = {r_m} - {r_{ref}}$$
where rm is the measured value of the sphere radius, rref is the reference value of the sphere radius.
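The paper does not detail how the measured radius rm is extracted from the reconstructed point cloud; one common choice is a linear least-squares sphere fit, sketched below for illustration (the function name and the commented usage are ours, not the authors' processing code).

```python
import numpy as np

def fit_sphere(points):
    """Linear least-squares sphere fit returning (centre, radius).

    Uses |p|^2 = 2 c.p + (r^2 - |c|^2), which is linear in the centre c and the
    constant term, so the fit reduces to a single lstsq call.
    """
    P = np.asarray(points, dtype=float)
    A = np.hstack([2.0 * P, np.ones((len(P), 1))])
    b = np.sum(P * P, axis=1)
    w, *_ = np.linalg.lstsq(A, b, rcond=None)
    centre = w[:3]
    radius = np.sqrt(w[3] + centre @ centre)
    return centre, radius

# Example use with a reconstructed standard-sphere cloud `sphere_points` (hypothetical variable):
# _, r_m = fit_sphere(sphere_points)
# e_radius = r_m - 10.0   # reference radius of the ceramic sphere, in mm
```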

The experimental results for the gauge blocks and the standard sphere are shown in Fig. 13 and Fig. 14. Figure 13 shows the point cloud of the flatness deviation of the gauge blocks, and Fig. 14 shows the point cloud of the standard sphere. Since the sphere has a radius of 10 mm, the differences between the 3D reconstruction results of the two methods are on the scale of tens of micrometers, so no significant differences can be observed in the point cloud images. Therefore, only the point cloud generated by the parameter estimation-based method is presented; the same applies to the length measurements with gauge blocks.

Fig. 13. The 3D point cloud of the standard plane with the system calibrated by parameters estimation-based method.

Fig. 14. The 3D point cloud of the standard sphere with the system calibrated by parameters estimation-based method.

To verify the repeatability of the two methods, 10 repeated calibrations were performed for each method. Based on the calibration results from the 10 repetitions, the standard sphere radius and gauge block length were calculated. The root mean square of the measurements from each method is shown in Table 3.


Table 3. Root mean square of the measurements from different rotation axis calibration methods

According to the calibration results, the circle fitting method exhibits a large estimation error in the center position of the rotation axis. During the experiment described in this paper, the camera's maximal sampling range was utilized to acquire as much angular information as possible; the large estimation error of the circle fitting method is primarily attributed to insufficient sampling of angular information. The circle fitting method suffers from significant accuracy degradation within limited rotation angles, which restricts its application. This finding further demonstrates the superior accuracy of the proposed algorithm within the effective sampling range achievable by the camera: the algorithm can more accurately estimate the position and direction of the rotation axis even with limited angle sampling.

5.3 3D reconstruction experiment

Based on the calibration results in Section 5.2, the following 3D reconstruction experiments were conducted. The experimental results are shown in Fig. 15. Figure 15(a) shows the authentic appearance of the measured object. Figure 15(b) shows the 3D reconstruction model obtained by stitching frames with the rotation axis parameters calibrated using the parameter estimation method, and Fig. 15(c) shows the result obtained with the circle fitting method. Consistent with the reprojection error results, the parameter estimation method performs better in estimating the position and direction of the rotation axis and thus yields more accurate scale measurements. Figure 15(b) and Fig. 15(c) use the same scale, and it is evident that the circle fitting method's inaccurate estimation of the rotation axis position distorts the horizontal scale of the 3D reconstruction, causing the pattern to stretch beyond its actual proportions.

Fig. 15. (a) Authentic appearance of the measured object (b) 3D reconstruction pattern generated from parameters estimation method (c) 3D reconstruction pattern generated from circle fitting method
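For reference, the stitching step that produces these reconstructions follows directly from the model of Sections 2 and 3: the laser-line points of frame k, expressed in that frame's camera coordinates, are mapped back into the initial camera frame C1 with the calibrated axis parameters. A minimal sketch (variable and function names are ours, not the authors' code):

```python
import numpy as np
import cv2

def stitch_frames(frames, axis_dir, O_rot_c1, step_deg):
    """Map per-frame laser-line points into the initial camera frame C1.

    frames:    list of Nx3 arrays, points measured in the camera frame of frame k
    axis_dir:  calibrated unit axis direction V_R, expressed in C1
    O_rot_c1:  calibrated axis point O_rot^C1
    step_deg:  turntable step between consecutive frames (e.g. 0.3 deg)
    """
    k_axis = np.asarray(axis_dir, float).reshape(3, 1)
    merged = []
    for k, pts in enumerate(frames):
        R_rot, _ = cv2.Rodrigues(k_axis * np.deg2rad(k * step_deg))
        O_Ck = (np.eye(3) - R_rot) @ O_rot_c1      # origin of frame k's camera in C1, Eq. (8)
        merged.append(pts @ R_rot.T + O_Ck)        # P^C1 = R_rot P^Ck + O_Ck^C1
    return np.vstack(merged)
```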

6. Conclusion

To achieve the calibration of rotation axis parameters, including but not limited to line-structured light systems, a parameter estimation-based method is proposed in this paper. The method uses a 2D planar target and captures multiple target images by rotating the camera around the rotation axis by certain angles; the mutual relationship of the spatial poses of the calibration targets is then deduced to calibrate the rotation axis parameters. Compared to the circle fitting method, the parameter estimation method fully utilizes the orientation and position information of the target, resulting in higher calibration accuracy. It is noteworthy that the proposed method differs from those relying on structured-light scanning of calibration objects, as it relies solely on the camera to achieve rotation axis calibration; therefore, problems of rotation axis calibration involving camera self-rotation can be addressed by referring to the method proposed in this paper. Additionally, this paper explores the influence of the camera-axis position relationship on calibration accuracy and concludes that the closer the camera is to the rotation axis, the higher the calibration accuracy. Lastly, the proposed method is demonstrated to have excellent accuracy and practical value through various error verification experiments and 3D reconstruction results. The rotation axis parameter calibration method based on parameter estimation will be very helpful in the field of 3D reconstruction.

Funding

National Natural Science Foundation of China (52205573, 61971307, 62231011, U2241265); National Key Research and Development Plan Project (2020YFB2010800); China Postdoctoral Science Foundation (2022M720106); Joint Fund of Ministry of Education for Equipment Pre-research (8091B022144); National Defense Science and Technology Key Laboratory Fund (6142212210304); Guangdong Province Key Research and Development Plan Project (2020B0404030001); the Fok Ying Tung Education Foundation (171055); Young Elite Scientists Sponsorship Program by CAST (2021QNRC001).

Disclosures

The authors declare no conflicts of interest.

Data availability

Data underlying the results presented in this paper are not publicly available at this time but may be obtained from the authors upon reasonable request.

References

1. Z. Liu, J. Sun, H. Wang, et al., “Simple and fast rail wear measurement method based on structured light,” Opt. Lasers Eng. 49(11), 1343–1351 (2011). [CrossRef]  

2. G. Vosselman, “Automated planimetric quality control in high accuracy airborne laser scanning surveys,” ISPRS J. Photogramm. Remote Sens. 74, 90–100 (2012). [CrossRef]  

3. S.C. Park and M. Chang, “Reverse engineering with a structured light system,” Comput. Ind. Eng. 57(4), 1377–1384 (2009). [CrossRef]  

4. G. Yang and Y. Wang, “Three-dimensional measurement of precise shaft parts based on line-structured light and deep learning,” Meas. J. Int. Meas. Confed. 191, 110837 (2022). [CrossRef]  

5. Z. Zhu and H. Liu, “Calibration method of line-structured light sensors based on a hinge-connected target with arbitrary pinch angles,” Appl. Opt. 62(7), 1695–1703 (2023). [CrossRef]  

6. W. Zou and Z. Wei, “High-accuracy calibration of line-structured light vision sensors using a plane mirror,” Opt. Express 27(24), 34681–34704 (2019). [CrossRef]  

7. P. Xiao and L. Zhen, “High-accuracy calibration of line-structured light vision sensor by correction of image deviation,” Opt. Express 27(4), 4364–4385 (2019). [CrossRef]  

8. J. Deng, B. Chen, X. Cao, et al., “3D Reconstruction of rotating objects based on line-structured light scanning,” International Conference on Sensing, Diagnostics, Prognostics, and Control, 2018: 244–247.

9. Y. Zong, J. Liang, W. Pai, et al., “A high-efficiency and high-precision automatic 3D scanning system for industrial parts based on a scanning path planning algorithm,” Opt. Lasers Eng. 158, 107176 (2022). [CrossRef]

10. P. Chen, M. Dai, K. Chen, et al., “Rotation axis calibration of a turntable using constrained global optimization,” Optik 125(17), 4831–4836 (2014). [CrossRef]

11. Z. Shi, T. Wang, and J. Lin, “A simultaneous calibration technique of the extrinsic and turntable for structured-light-sensor-integrated CNC system,” Opt. Lasers Eng. 138, 106451 (2021). [CrossRef]

12. T. Xu, J. Zhang, H. Cui, et al., “Research on 360° 3D point cloud reconstruction technology based on turntable and line-structured light stereo vision,” 4th International Conference on Advanced Electronic Materials, Computers and Software Engineering, 2021: 336–340.

13. R. Zhang and J. Wu, “A hand-eye calibration method for line structured light system,” Proceedings of the 30th Chinese Control and Decision Conference, 2018: 5758–5761.

14. Z. Wang, J. Fan, F. Jing, et al., “An efficient calibration method of line-structured light vision sensor in Robotic Eye-in-Hand System,” IEEE Sens. J. 20(11), 6200–6208 (2020). [CrossRef]

15. J. Li, M. Chen, X. Jin, et al., “Calibration of a multiple axes 3-D laser scanning system consisting of robot, portable laser scanner and turntable,” Optik 122(4), 324–329 (2011). [CrossRef]

16. S. Yin, Y. Ren, Y. Guo, et al., “Development and calibration of an integrated 3D scanning system for high-accuracy large-scale metrology,” Measurement 54, 65–76 (2014). [CrossRef]

17. K. Deshmukh, J. L. Rickli, and A. Djuric, “Kinematic modeling of an automated laser line point cloud scanning System,” Procedia Manufacturing 5, 1075–1091 (2016). [CrossRef]

18. Z. Xie, P. Yu, H. Gong, et al., “Flexible scanning method by integrating laser line sensors with articulated arm coordinate measuring machines,” Chin. J. Mech. Eng. 35(1), 116 (2022). [CrossRef]

19. Z. Zhang, “Flexible camera calibration by viewing a plane from unknown orientations,” Seventh IEEE International Conference on Computer Vision. IEEE, 1999.
