Three Degrees of Freedom Global Calibration Method for Measurement Systems with Binocular Vision

  • Xu, Guan (Department of Vehicle Application Engineering, Traffic and Transportation College, Nanling Campus, Jilin University) ;
  • Zhang, Xinyuan (Department of Vehicle Application Engineering, Traffic and Transportation College, Nanling Campus, Jilin University) ;
  • Li, Xiaotao (Mechanical Science and Engineering College, Nanling Campus, Jilin University) ;
  • Su, Jian (Department of Vehicle Application Engineering, Traffic and Transportation College, Nanling Campus, Jilin University) ;
  • Lu, Xue (Department of Vehicle Application Engineering, Traffic and Transportation College, Nanling Campus, Jilin University) ;
  • Liu, Huanping (Department of Vehicle Application Engineering, Traffic and Transportation College, Nanling Campus, Jilin University) ;
  • Hao, Zhaobing (Department of Vehicle Application Engineering, Traffic and Transportation College, Nanling Campus, Jilin University)
  • Received : 2015.08.12
  • Accepted : 2015.12.16
  • Published : 2016.02.25

Abstract

We develop a new method to globally calibrate the feature points that are derived from binocular systems at different positions. A three-DOF (degree of freedom) global calibration system is established to move and rotate a 3D calibration board to an arbitrary position. A three-DOF global calibration model is constructed for the binocular systems at different positions. The three-DOF calibration model unifies the 3D coordinates of the feature points from different binocular systems into a unique world coordinate system that is determined by the initial position of the calibration board. Experiments are conducted on the binocular systems at the coaxial and diagonal positions. The experimental root-mean-square errors between the true and reconstructed 3D coordinates of the feature points are 0.573 mm, 0.520 mm and 0.528 mm at the coaxial positions, and 0.495 mm, 0.556 mm and 0.627 mm at the diagonal positions. This method provides a global and accurate calibration that unifies the measurement points of different binocular vision systems into the same world coordinate system.


I. INTRODUCTION

The binocular vision system was proposed in the 1870s, with the advantages of non-contact inspection and high measurement precision [1-3]. It has enabled flourishing developments in motion parameter testing, measurement of machine parts, parameter detection of micro-operating systems, 3D profile reconstruction [4-7], etc. It has also gradually permeated the aerospace, robot navigation, industrial measurement and biomedical areas [8-13]. System calibration is the basis for binocular vision to obtain 3D coordinates: it provides the conversion relationship between the 2D coordinates in the images of the cameras and the 3D coordinates of the objects [3, 14, 15]. As the measurement accuracy of a binocular vision system depends on the camera calibration, research on camera calibration methods has great theoretical importance and practical value.

Monocular vision measurement refers to methods in which a single camera takes photos for the measurement [16]. It is difficult to achieve real-time large-scale mapping with a monocular camera, due to the purely projective nature of the sensor [17, 18]. By contrast, the binocular vision system adopts two cameras, which obtain 3D coordinates by matching the corresponding points [19-21]. In this paper, we study the global calibration method of binocular vision systems that are located at different positions. Previous calibration methods can be divided into three categories according to the dimension of the calibration object: 1D, 2D and 3D objects [7, 22].

The 1D-object-based calibration was proposed a few years ago and requires an object consisting of more than three feature points [23]. The 1D calibration object can be conveniently moved, which helps solve the occlusion problems among multiple cameras. However, as the constraints provided by an unrestricted 1D object cannot completely identify the internal and external parameters of the camera, the motion of the 1D object is usually strictly constrained [24].

As for 2D calibration objects, a calibration plate with a 2D gray-modulated sinusoidal fringe pattern can be chosen to extract phase feature points as the 2D calibration data [25-27]. Knowledge of the plane motion is not necessary in this calibration and the setup is easier. However, it cannot deal with occlusion and depth-related problems [28, 29]. A 2D calibration object was reported to calibrate a binocular vision system used for vision measurement [12]. That precise calibration method is devoted to minimizing the metric distance error between the reconstructed point and the real point in a 3D measurement coordinate system, and the measurement accuracy of one binocular vision system is improved compared with the traditional method. For a vision measurement with multiple binocular vision systems, however, the measurement world coordinates of one binocular vision system differ from those of another, considering the different positions of the 2D object. Therefore, the measurement world coordinates of one binocular vision system must be transformed into those of another. Moreover, the front view of a 2D object is difficult for multiple binocular vision systems to observe simultaneously.

As for 3D calibration objects, the calibration object usually consists of two or three planes orthogonal to each other [30, 31]. The highest accuracy is usually obtained by using a 3D calibration object [32], but this approach requires expensive calibration apparatus and an elaborate setup [20, 33]. A global calibration method of a laser plane has been proposed, adopting a 3D calibration board to generate the two horizontal coordinates and a height gauge to generate the height coordinate of a point in the laser plane [34]. The method is performed with one camera and a 3D calibration object. However, a single 3D object is unable to calibrate a measurement system that consists of multiple binocular vision systems: one binocular vision system captures images of the front view of the 3D object, while the binocular vision system opposite to it is blocked by the back view of the 3D object. The blocking problem in the 3D calibration makes simultaneous observation by all involved cameras impossible. Thus, we propose a three-degree-of-freedom (three-DOF) calibration method that translates and rotates the 3D calibration board to solve the view-blocking problem between opposite binocular systems. The measurement coordinates of multiple binocular vision systems are thereby unified in a unique coordinate system.

A three-DOF global calibration method for binocular vision measurement is explored in this paper, which unites the measurements of the binocular vision systems at different positions into a single world coordinate system. The rest of this paper is organized as follows: Section 2 presents the principle of the three-DOF global calibration system that calibrates the binocular systems at the initial, coaxial and diagonal positions. Section 3 proposes the three-DOF calibration model; according to the model, the coordinates of the feature points at the coaxial and diagonal positions are unified into the same world coordinate system. Section 4 outlines the global calibration experiments at the coaxial and diagonal positions. Section 5 summarizes this paper.

 

II. Three-DOF CALIBRATION SYSTEM

The 3D calibration board is chosen as the benchmark of a single binocular vision measurement system owing to its simple and accurate calibration process. However, it is difficult to perform global calibration among several binocular measurement systems in a large view field because of the limited size of the 3D calibration board and the perspective blocking problem.

Considering the complicated process of manufacturing a large 3D calibration board, we present a three-DOF calibration system that moves the 3D calibration board along precise rails by a known distance and rotates it on the turntable by a known angle in the large view field. The three-DOF global calibration system unites the measurement results of the binocular vision systems at different positions into a single world coordinate system.

The structure of the global calibration system for binocular measurement systems is shown in Fig. 1. The three-DOF calibration system is mainly composed of a translational subassembly and a rotational subassembly. The translational subassembly mainly consists of a base frame, a connection plate, an X direction rail, an X direction displacement sensor, a Y direction rail and a Y direction displacement sensor. The rotational subassembly is mainly composed of a 3D calibration board and a turntable. The global calibration objects are the binocular measurement systems at the coaxial and diagonal positions. Three groups of binocular systems are placed at the initial, coaxial and diagonal positions, respectively. Figures 1(a) and 1(b) show the calibration process of the two groups of binocular systems at the initial and coaxial positions. In the calibration, only one group of binocular systems can capture images of the positive side of the 3D calibration board at a time. Therefore, the rails and the turntable are utilized to move the 3D calibration board along the Y direction rail and rotate it from the initial position to the coaxial position on the other side. The binocular system at the coaxial position captures images of the positive side of the 3D calibration board after the board is translated and rotated by 180 degrees. Figure 1(c) shows the calibration process of the binocular system at the diagonal position, for which the 3D calibration board is further moved along the X direction rail to the diagonal position. Then the binocular system at the diagonal position captures images of the positive side of the 3D calibration board. The translation displacements and the rotation angle are measured by the X and Y direction sensors and the turntable, respectively. The measurement results of the binocular systems at different positions are ultimately unified into the world coordinate system, which is determined by the initial position of the calibration board.

FIG. 1. The three-DOF global calibration system. (a) The binocular system at the initial position is calibrated by the three-DOF global calibration system. (b) The binocular system at the coaxial position is calibrated by the three-DOF global calibration system. (c) The binocular system at the diagonal position is calibrated by the three-DOF global calibration system. (d) The actual structure of the three-DOF global calibration system.

 

III. Three-DOF CALIBRATION MODEL

The imaging model of the binocular vision is indicated as [4]

$$ s_1 \begin{bmatrix} u_1 \\ v_1 \\ 1 \end{bmatrix} = N_1 \begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix}, \tag{1} $$

$$ s_2 \begin{bmatrix} u_2 \\ v_2 \\ 1 \end{bmatrix} = N_2 \begin{bmatrix} x \\ y \\ z \\ 1 \end{bmatrix}, \tag{2} $$

where (x, y, z) is the 3D coordinate of a spatial point P; (u1, v1) and (u2, v2) respectively represent the 2D coordinates of the corresponding points in the images captured by the two cameras; s1 and s2 are non-zero scale factors; and N1, N2 are the 3×4 projection matrices of the two cameras. The projection matrix of the camera imaging model is calibrated from the relationship between the 3D world coordinates and the 2D image coordinates.

By eliminating the two scale factors s1 and s2 in Eqs. (1) and (2), the equations are described as

$$ \begin{cases} (u_1 n^1_{31} - n^1_{11})x + (u_1 n^1_{32} - n^1_{12})y + (u_1 n^1_{33} - n^1_{13})z = n^1_{14} - u_1 n^1_{34} \\ (v_1 n^1_{31} - n^1_{21})x + (v_1 n^1_{32} - n^1_{22})y + (v_1 n^1_{33} - n^1_{23})z = n^1_{24} - v_1 n^1_{34} \end{cases} \tag{3} $$

$$ \begin{cases} (u_2 n^2_{31} - n^2_{11})x + (u_2 n^2_{32} - n^2_{12})y + (u_2 n^2_{33} - n^2_{13})z = n^2_{14} - u_2 n^2_{34} \\ (v_2 n^2_{31} - n^2_{21})x + (v_2 n^2_{32} - n^2_{22})y + (v_2 n^2_{33} - n^2_{23})z = n^2_{24} - v_2 n^2_{34} \end{cases} \tag{4} $$

where $n^k_{ij}$ denotes the (i, j) element of $N_k$ (k = 1, 2).

Equations (3) and (4) provide four linear equations in the three unknowns (x, y, z), from which the 3D coordinates of the feature points are reconstructed using the transformation matrices and the image coordinates of the feature points.
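Since Eqs. (3) and (4) form an overdetermined linear system, the reconstruction reduces to a small least-squares problem. The following Python/NumPy sketch illustrates this step; it is a minimal illustration under the standard pinhole model, not the authors' code, and the function and variable names are our own.

```python
import numpy as np

def triangulate(N1, N2, uv1, uv2):
    """Reconstruct (x, y, z) from Eqs. (3) and (4). N1, N2 are the 3x4
    projection matrices of the two cameras; uv1, uv2 are the matched
    image coordinates (u, v) of the same feature point."""
    rows, rhs = [], []
    for N, (u, v) in ((N1, uv1), (N2, uv2)):
        # (u n31 - n11)x + (u n32 - n12)y + (u n33 - n13)z = n14 - u n34
        rows.append(u * N[2, :3] - N[0, :3])
        rhs.append(N[0, 3] - u * N[2, 3])
        # (v n31 - n21)x + (v n32 - n22)y + (v n33 - n23)z = n24 - v n34
        rows.append(v * N[2, :3] - N[1, :3])
        rhs.append(N[1, 3] - v * N[2, 3])
    # Four equations, three unknowns: least-squares solution.
    xyz, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return xyz
```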

When we globally calibrate the binocular system at the coaxial or diagonal position, the 3D calibration board is moved from the initial position to the coaxial or diagonal position of the rectangle constructed by the frames. A laser projector is adopted to project a global horizontal laser plane at the initial position. Two horizontal lines on the calibration plate are made to coincide with the laser plane by adjusting the height of the calibration board. Thus, the z values of the feature points are invariant when the calibration board moves to different positions, and the global calibration model is simplified from 3D to 2D.

The intersection point of the three planes of the 3D calibration board at the initial position is taken as the global origin o of the world coordinate system o-xy. Two images of the 3D calibration board are then collected by the binocular system at the initial position, and the 2D coordinates of the feature points are extracted from the two images. The 3D world coordinates are obtained from the size of the small squares on the calibration board. Since the 2D image coordinates and the 3D world coordinates of the feature points at the initial position are known, the transformation matrices are calibrated by Eqs. (3) and (4). The 3D world coordinates are then reconstructed from the 2D image coordinates and the transformation matrices.
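The paper does not specify the numerical routine used to calibrate the transformation matrices; a common choice for estimating a 3×4 projection matrix from known 3D-2D correspondences is the direct linear transformation (DLT). A minimal sketch under that assumption (at least six non-coplanar points are needed, which the feature points on the two sides of the 3D board provide):

```python
import numpy as np

def calibrate_projection_matrix(world_pts, image_pts):
    """Estimate the 3x4 projection matrix N of one camera from the known
    3D world coordinates of the checkerboard corners and their measured
    2D image coordinates, by homogeneous least squares (DLT)."""
    A = []
    for (x, y, z), (u, v) in zip(world_pts, image_pts):
        X = [x, y, z, 1.0]
        A.append(X + [0.0] * 4 + [-u * c for c in X])
        A.append([0.0] * 4 + X + [-v * c for c in X])
    # The singular vector of the smallest singular value gives the 12
    # entries of N up to scale.
    _, _, Vt = np.linalg.svd(np.asarray(A))
    return Vt[-1].reshape(3, 4)
```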

The three-DOF calibration model is shown in Fig. 2. When the binocular system at the coaxial position is globally calibrated, the 3D calibration board at the initial position rotates around the axis of the turntable, moves along the Y direction rail and finally arrives in the view field of the binocular system at the coaxial position. When we globally calibrate the binocular system at the diagonal position, the 3D calibration board at the coaxial position moves further along the X direction rail and finally arrives in the view field of the binocular system at the diagonal position.

FIG. 2. The three-DOF global calibration model for the binocular systems at the initial, coaxial and diagonal positions.

The feature points of the 3D calibration board are physically rotated around the axis of the turntable, whereas the rotation formula rotates points around the origin of the coordinate system. Therefore, the misalignment between the axis of the turntable and the origin of the coordinate system should be considered when we perform the rotation transformations of the feature points. The o-xy world coordinate system is determined by the initial position of the 3D calibration board. Let o1 be the projection point of the rotation axis of the 3D calibration board in the o-xy coordinate system, with coordinate (h, k). The coordinate of a feature point A is (x0, y0) in the o-xy coordinate system. First, as the feature points are rotated around the o1z1 rotation axis, the o-xy coordinate system is translated to the o1-x1y1 coordinate system; the coordinate of the feature point A is (x0-h, y0-k) in the o1-x1y1 coordinate system. Then the 3D calibration board is rotated around the o1z1 axis by a small angle θ, whose value is obtained from the scale on the hand wheel of the turntable. After the rotation by the angle θ, the feature point A is moved to the point B, whose coordinate is (x1, y1) in the o-xy world coordinate system and therefore (x1-h, y1-k) in the o1-x1y1 coordinate system. According to the formula of the rotation matrices, the relationship between the points A and B in the o1-x1y1 coordinate system is given by

$$ \begin{bmatrix} x_1 - h \\ y_1 - k \end{bmatrix} = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix} \begin{bmatrix} x_0 - h \\ y_0 - k \end{bmatrix}. \tag{5} $$

The result of the above matrix multiplication, rearranged as a linear system in the unknowns h and k, is represented as

$$ \begin{cases} (1-\cos\theta)\,h + (\sin\theta)\,k = x_1 - x_0\cos\theta + y_0\sin\theta \\ (-\sin\theta)\,h + (1-\cos\theta)\,k = y_1 - x_0\sin\theta - y_0\cos\theta \end{cases} \tag{6} $$

The coefficient determinants are represented by

$$ D = \begin{vmatrix} 1-\cos\theta & \sin\theta \\ -\sin\theta & 1-\cos\theta \end{vmatrix} = 2(1-\cos\theta), \tag{7} $$

$$ D_1 = \begin{vmatrix} x_1 - x_0\cos\theta + y_0\sin\theta & \sin\theta \\ y_1 - x_0\sin\theta - y_0\cos\theta & 1-\cos\theta \end{vmatrix}, \tag{8} $$

$$ D_2 = \begin{vmatrix} 1-\cos\theta & x_1 - x_0\cos\theta + y_0\sin\theta \\ -\sin\theta & y_1 - x_0\sin\theta - y_0\cos\theta \end{vmatrix}. \tag{9} $$

The coordinates (x0, y0) and (x1, y1) of the points A and B in the o-xy world coordinate system are reconstructed by Eqs. (3) and (4), so the coefficient determinants D, D1 and D2 are calculated from the coordinates of the points A and B. Note that D = 2(1 - cosθ) is non-zero for any θ ≠ 0. According to Cramer's rule, h and k are given by

$$ h = \frac{D_1}{D}, \tag{10} $$

$$ k = \frac{D_2}{D}. \tag{11} $$

Therefore, the 3D calibration board is rotated by a known angle θ to obtain the coordinate (h, k) of the projection point o1. After (h, k) is calibrated, the coordinate (x1, y1) of the point B in the o-xy world coordinate system is obtained by the coordinate translation and rotation for an arbitrary known angle θ. According to Eq. (5), the coordinate (x1, y1) of the point B in the o-xy world coordinate system is given by

$$ \begin{cases} x_1 = (x_0 - h)\cos\theta - (y_0 - k)\sin\theta + h \\ y_1 = (x_0 - h)\sin\theta + (y_0 - k)\cos\theta + k \end{cases} \tag{12} $$
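A short sketch of Eqs. (6)-(11) for one feature point observed before (A) and after (B) the known rotation; averaging the estimate over all feature points is an implementation choice of this sketch rather than a step stated in the paper.

```python
import numpy as np

def rotation_axis_projection(A, B, theta):
    """Recover (h, k), the projection of the turntable axis in the o-xy
    world frame, from a feature point A = (x0, y0) and its rotated
    position B = (x1, y1) after a known rotation theta (in radians)."""
    (x0, y0), (x1, y1) = A, B
    c, s = np.cos(theta), np.sin(theta)
    b1 = x1 - x0 * c + y0 * s           # right-hand sides of Eq. (6)
    b2 = y1 - x0 * s - y0 * c
    D = 2.0 * (1.0 - c)                 # Eq. (7); nonzero for theta != 0
    D1 = b1 * (1.0 - c) - s * b2        # Eq. (8)
    D2 = (1.0 - c) * b2 + s * b1        # Eq. (9)
    return D1 / D, D2 / D               # Eqs. (10) and (11)
```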

In the calibration of the binocular system at the coaxial position, the 3D calibration board is rotated by 180 degrees and moved along the Y direction to the coaxial position within the view field of the cameras. The point B moves along the Y direction by a distance ΔY to the point C, whose coordinate is (x2, y2) in the o-xy world coordinate system. As the direction of ΔY is not parallel to the y axis, the relationship between the point C (x2, y2) and the point B (x1, y1) is represented by

$$ \begin{cases} x_2 = x_1 + \Delta Y\cos\alpha \\ y_2 = y_1 + \Delta Y\sin\alpha \end{cases} \tag{13} $$

where α is the angle between the ox axis of the world coordinate system and the Y direction rail.

As the points C and A are respectively obtained by the binocular systems at the coaxial position and the initial position, substituting Eq. (12) into Eq. (13) gives the relationship between the point C (x2, y2) and the point A (x0, y0):

$$ \begin{cases} x_2 = (x_0 - h)\cos\theta - (y_0 - k)\sin\theta + h + \Delta Y\cos\alpha \\ y_2 = (x_0 - h)\sin\theta + (y_0 - k)\cos\theta + k + \Delta Y\sin\alpha \end{cases} \tag{14} $$

In the calibration of the binocular system at the diagonal position, the 3D calibration board is rotated by 180 degrees and moved along the Y direction and the X direction to the diagonal position within the view field of the cameras. In other words, the 3D calibration board is moved along the X direction starting from the coaxial position. The point C moves along the X direction by a distance ΔX to the point D, whose coordinate is (x3, y3) in the o-xy world coordinate system. As the X direction rail is perpendicular to the Y direction rail, the relationship between the point D (x3, y3) and the point B (x1, y1) is represented by

$$ \begin{cases} x_3 = x_1 + \Delta Y\cos\alpha + \Delta X\sin\alpha \\ y_3 = y_1 + \Delta Y\sin\alpha - \Delta X\cos\alpha \end{cases} \tag{15} $$

As the point D is obtained by the binocular system at the diagonal position, substituting Eq. (12) into Eq. (15) gives the relationship between the point D (x3, y3) and the point A (x0, y0):

$$ \begin{cases} x_3 = (x_0 - h)\cos\theta - (y_0 - k)\sin\theta + h + \Delta Y\cos\alpha + \Delta X\sin\alpha \\ y_3 = (x_0 - h)\sin\theta + (y_0 - k)\cos\theta + k + \Delta Y\sin\alpha - \Delta X\cos\alpha \end{cases} \tag{16} $$

The coordinate (x0, y0) of a feature point on the 3D calibration board is determined by the location of the corner point on the checkerboard. The coordinate (h, k) of the projection point of the rotation axis is obtained by Eqs. (10) and (11). The values of ΔX and ΔY are read from the displacement sensors installed along the two rails of the three-DOF calibration system. The rotation angle θ and the angle α are read from the scale on the hand wheel of the turntable. In conclusion, the feature point C is captured by the binocular system at the coaxial position, and its coordinate (x2, y2) in the world coordinate system is obtained from Eq. (14). The feature point D is captured by the binocular system at the diagonal position, and its coordinate (x3, y3) in the world coordinate system is obtained from Eq. (16).
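Combining Eqs. (12)-(16), the world coordinates of any feature point after the three-DOF motion can be predicted from its initial coordinate (x0, y0). The sketch below gathers the whole chain; the sign conventions of the rail directions and the numerical feature-point coordinate are assumptions for illustration, while the axis coordinates and distances are taken from the experiments in Section 4.

```python
import numpy as np

def transform_point(p0, axis, theta, alpha, dY=0.0, dX=0.0):
    """Map a feature point A = (x0, y0) at the initial position to its
    world coordinates after rotation by theta about the projected
    turntable axis (h, k) (Eq. (12)) and translations dY along the
    Y rail and dX along the X rail (Eqs. (13)-(16))."""
    (x0, y0), (h, k) = p0, axis
    c, s = np.cos(theta), np.sin(theta)
    # Eq. (12): rotation about the projected turntable axis o1.
    x1 = (x0 - h) * c - (y0 - k) * s + h
    y1 = (x0 - h) * s + (y0 - k) * c + k
    # Eqs. (13)-(16): alpha is the angle between the ox axis and the
    # Y rail; the X rail is taken perpendicular to the Y rail.
    ca, sa = np.cos(alpha), np.sin(alpha)
    return x1 + dY * ca + dX * sa, y1 + dY * sa - dX * ca

# Point C at the coaxial position (Eq. (14)): theta = 180 deg,
# alpha = 45 deg, Delta-Y = 1428 mm, axis from the first experiment;
# the feature point (60, 120) is hypothetical.
C = transform_point((60.0, 120.0), (258.118, 251.362),
                    np.pi, np.radians(45.0), dY=1428.0)
# Point D at the diagonal position (Eq. (16)), first diagonal experiment.
D = transform_point((60.0, 120.0), (260.467, 243.225),
                    np.pi, np.radians(45.0), dY=1121.0, dX=1271.0)
```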

 

IV. RESULTS AND DISCUSSIONS

In the experiments, the size of the selected 3D calibration board is 500 mm × 500 mm × 500 mm. The plane pattern of the calibration board is a checkerboard of 60 mm × 60 mm squares. Eighteen evenly distributed points on the two sides of the calibration board are taken as the feature points. The measurement range and accuracy of the displacement sensors in the X direction and the Y direction are 2000 mm and 0.05%, respectively. The turntable is the equipment that measures the rotation angle of the 3D calibration board; its accuracy is ±40″. The laser level is the device that levels the three-DOF global calibration system; its horizontal accuracy is ±1 mm.

Three global calibration experiments are conducted to unite the measurement results of the binocular vision system at the coaxial position into the world coordinate system at the initial position. The 3D calibration board moves along the Y direction after rotating 180 degrees. The coaxial distances along the Y direction are 1428 mm, 1296 mm and 1358 mm in the three experiments. When the calibration is performed, the 3D calibration board is placed at the position of the initial world coordinate system. The world coordinates of the 18 feature points are calculated from the size of the checkerboard. Then the transformation matrices of the two cameras are calibrated from the world coordinates of the feature points and the image coordinates of the 3D calibration board captured by the binocular system at the initial position. Furthermore, the 3D calibration board is rotated by a known angle, 4 degrees in the test. As the projection point o1 is the rotation axis projected onto the horizontal plane, the coordinate (h, k) of o1 is obtained from Eqs. (10) and (11); the coordinates of the rotation axis in the three experiments are (258.118, 251.362), (277.352, 212.805) and (257.953, 245.333), respectively. Finally, the 3D calibration board is rotated by 180 degrees and moved in front of the binocular system at the coaxial position. The angle α between the ox axis and the Y direction rail is 45 degrees. The world coordinate (x2, y2) of each feature point is obtained from Eq. (14).

According to the calibration processes for the coaxial position, the 3D world coordinates at the different positions are reconstructed in Fig. 3. In the experiments, we compare the world coordinates of the reconstructed feature points with those of the true feature points. The coordinates of the reconstructed feature points are obtained from the images captured by the binocular systems. In Fig. 3, the reconstructed feature points on the 3D calibration board are rotated by 180 degrees from the initial position; the board then moves along the coaxial direction and finally reaches the coaxial position. The reconstructed and the true coordinates of the feature points at the coaxial position closely coincide.

FIG. 3. The true and reconstructed coordinates of the feature points at the initial position, after rotating 180 degrees and moving to the coaxial position with a known distance in the Y direction. (a) The distance in the Y direction is 1428 mm. (b) The distance in the Y direction is 1296 mm. (c) The distance in the Y direction is 1358 mm.

In order to further analyze the accuracy of the reconstructed results in the three experiments, the differences between the true and reconstructed 3D coordinates of the feature points at the initial position are indicated in Figs. 4(a)~4(c), respectively, and the differences at the coaxial position are indicated in Figs. 4(d)~4(f), respectively. Ex, Ey and Ez are the differences between the true and reconstructed 3D coordinates of the feature points in the x, y and z directions, respectively, and E is the root-mean-square of the errors of the feature points.
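The reported statistics follow from the per-axis root-mean-square errors; the total E is the root mean square over all three coordinate components, which reproduces the reported values (for instance, Ex = 0.732 mm, Ey = 0.607 mm and Ez = 0.282 mm give E = 0.573 mm). A minimal sketch with hypothetical array names:

```python
import numpy as np

def rms_errors(true_pts, recon_pts):
    """Per-axis and total root-mean-square reconstruction errors.
    true_pts and recon_pts are (n, 3) arrays of 3D feature-point
    coordinates; returns (Ex, Ey, Ez, E)."""
    diff = np.asarray(recon_pts) - np.asarray(true_pts)
    ex, ey, ez = np.sqrt((diff ** 2).mean(axis=0))  # RMS per axis
    e = np.sqrt((diff ** 2).mean())                 # total RMS
    return ex, ey, ez, e
```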

FIG. 4. The differences between the true and reconstructed 3D coordinates of the feature points at the initial position and the coaxial position in the three experiments. (a) The first group experiments at the initial position. (b) The second group experiments at the initial position. (c) The third group experiments at the initial position. (d) The first group experiments at the coaxial position with the distance 1428 mm in the Y direction. (e) The second group experiments at the coaxial position with the distance 1296 mm in the Y direction. (f) The third group experiments at the coaxial position with the distance 1358 mm in the Y direction.

In Figs. 4(a)~4(c), the maximal errors of the 3D calibration board at the initial position range from −1.5 mm to 1.5 mm in the three experiments. The root-mean-square errors of the first group are 0.725 mm, 0.461 mm and 0.348 mm in the x, y and z directions, respectively; the total root-mean-square error of the group is 0.535 mm. The root-mean-square errors of the second group are 0.733 mm, 0.581 mm and 0.261 mm in the x, y and z directions, respectively; the total root-mean-square error of the group is 0.561 mm. The root-mean-square errors of the third group are 0.574 mm, 0.646 mm and 0.311 mm in the x, y and z directions, respectively; the total root-mean-square error of the group is 0.530 mm. In Figs. 4(d)~4(f), the maximal errors of the 3D calibration board at the coaxial position also range from −1.5 mm to 1.5 mm in the three experiments. The root-mean-square errors of the first group are 0.732 mm, 0.607 mm and 0.282 mm in the x, y and z directions, respectively; the total root-mean-square error of the group is 0.573 mm. The root-mean-square errors of the second group are 0.563 mm, 0.642 mm and 0.289 mm in the x, y and z directions, respectively; the total root-mean-square error of the group is 0.520 mm. The root-mean-square errors of the third group are 0.561 mm, 0.687 mm and 0.222 mm in the x, y and z directions, respectively; the total root-mean-square error of the group is 0.528 mm. In conclusion, the range of the reconstruction errors after the rotation and movement of the 3D calibration board is the same as the range of the reconstruction errors at the initial position.

In Fig. 4(a), the errors range from −1.5 mm to 1.5 mm in the x direction and from −1.5 mm to 1 mm in the y direction; the errors of the feature points are dispersedly distributed in the y direction, while the errors in the z direction mainly lie within −0.5 mm to 0.5 mm. In Fig. 4(d), the errors are dispersedly distributed within −1 mm to 1 mm and −1.5 mm to 1.5 mm in the x and y directions, respectively; however, the errors in the z direction mainly lie within −0.5 mm to 0.5 mm. In Figs. 4(b)~4(f), the error ranges in the x and y directions are both −1.5 mm to 1.5 mm, and the errors in the z direction mainly lie within −0.5 mm to 0.5 mm. Therefore, the errors in the z direction are smaller than the errors in the x and y directions. The direction with the minimal reconstruction errors is perpendicular to both the baseline and the measurement direction.

Three global calibration experiments are performed to unite the measurement results of the binocular vision system at the diagonal position into the world coordinate system at the initial position. The 3D calibration board is first rotated by a small known angle to calibrate the rotation axis, and then moves along the Y direction and the X direction. The distances along the Y direction are 1121 mm, 1262 mm and 1289 mm in the three experiments. The distances along the X direction are 1271 mm, 1339 mm and 1381 mm in the three experiments. The coordinates of the rotation axis in the three experiments are (260.467, 243.225), (270.628, 253.230) and (253.413, 243.256), respectively. Finally, the 3D calibration board is rotated by 180 degrees and moved in front of the binocular system at the diagonal position. The world coordinate (x3, y3) of each feature point is obtained from Eq. (16).

According to the calibration processes for the diagonal position, the 3D world coordinates at the different positions are reconstructed in Fig. 5. In Fig. 5, the reconstructed feature points on the 3D calibration board are rotated by 180 degrees from the initial position; the board then moves along the Y and X directions and finally reaches the diagonal position. The true and reconstructed coordinates of the feature points at the diagonal position essentially coincide.

FIG. 5. The true and reconstructed coordinates of the feature points at the initial position, after rotating 180 degrees and moving to the diagonal position with the known distances in the Y and X directions. (a) The distances in the Y direction and the X direction are 1121 mm and 1271 mm, respectively. (b) The distances in the Y direction and the X direction are 1262 mm and 1339 mm, respectively. (c) The distances in the Y direction and the X direction are 1289 mm and 1381 mm, respectively.

In the three experiments, the differences between the true and reconstructed 3D coordinates of the feature points at the initial position are indicated in Figs. 6(a)~6(c), respectively. The maximal errors of the 3D calibration board at the initial position range from −1.5 mm to 1.5 mm. The root-mean-square errors of the first group are 0.690 mm, 0.571 mm and 0.312 mm in the x, y and z directions, respectively; the total root-mean-square error of the group is 0.548 mm. The root-mean-square errors of the second group are 0.629 mm, 0.550 mm and 0.301 mm in the x, y and z directions, respectively; the total root-mean-square error of the group is 0.513 mm. The root-mean-square errors of the third group are 0.757 mm, 0.528 mm and 0.321 mm in the x, y and z directions, respectively; the total root-mean-square error of the group is 0.564 mm. The differences between the true and reconstructed 3D coordinates of the feature points at the diagonal position are indicated in Figs. 6(d)~6(f), respectively. In the three experiments, the maximal errors of the 3D calibration board at the diagonal position also range from −1.5 mm to 1.5 mm. The root-mean-square errors of the first group are 0.572 mm, 0.571 mm and 0.284 mm in the x, y and z directions, respectively; the total root-mean-square error of the group is 0.495 mm. The root-mean-square errors of the second group are 0.696 mm, 0.604 mm and 0.283 mm in the x, y and z directions, respectively; the total root-mean-square error of the group is 0.556 mm. The root-mean-square errors of the third group are 0.819 mm, 0.489 mm and 0.318 mm in the x, y and z directions, respectively; the total root-mean-square error of the group is 0.627 mm. In conclusion, the range of the reconstruction errors after the rotation and movement of the 3D calibration board is the same as the range of the reconstruction errors at the initial position.

FIG. 6. The differences between the true and reconstructed 3D coordinates of the feature points at the initial position and the diagonal position in the three experiments. (a) The first group experiments at the initial position. (b) The second group experiments at the initial position. (c) The third group experiments at the initial position. (d) The first group experiments at the diagonal position with the distances 1121 mm and 1271 mm in the Y direction and the X direction, respectively. (e) The second group experiments at the diagonal position with the distances 1262 mm and 1339 mm in the Y direction and the X direction, respectively. (f) The third group experiments at the diagonal position with the distances 1289 mm and 1381 mm in the Y direction and the X direction, respectively.

In Fig. 6(a), the errors range from −1.5 mm to 1.5 mm in the x direction and from −1.5 mm to 1 mm in the y direction, while the errors in the z direction mainly lie within −0.5 mm to 0.5 mm. In Fig. 6(d), the errors range from −1 mm to 1 mm in the x direction and from −1.5 mm to 1.5 mm in the y direction, and are dispersedly distributed in both; however, the errors in the z direction mainly lie within −0.5 mm to 0.5 mm. In Figs. 6(b)~6(f), the error ranges in the x and y directions are both −1.5 mm to 1.5 mm, and the errors in the z direction mainly lie within −0.5 mm to 0.5 mm. Therefore, the minimal reconstruction errors again occur in the direction that is perpendicular to the baseline and the measurement direction.

We also perform calibration experiments for a binocular vision system with a 2D calibration board. The projection matrices of the two cameras are determined by Zhang's method [27]. The experimental results are shown in Figs. 7(a)~7(d), and the measurement results are likewise evaluated by reconstruction errors. The total root-mean-square errors between the true and reconstructed 3D coordinates of the feature points are 0.989 mm, 1.048 mm and 0.546 mm in the three experiments. The root-mean-square errors of the three experiments are 0.963 mm, 0.952 mm and 0.421 mm in the x direction; 0.989 mm, 0.850 mm and 0.378 mm in the y direction; and 1.014 mm, 1.292 mm and 0.758 mm in the z direction. The measurement accuracy of the three-DOF calibration method is close to that of the 2D calibration method, while the view-blocking problem of multiple binocular systems is also solved by the three-DOF calibration method.
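Zhang's method [27] is available in common vision libraries; the following OpenCV sketch shows how such a 2D-board calibration is typically carried out. The image file names and the pattern size are illustrative only and are not taken from the paper.

```python
import numpy as np
import cv2

# Corners of the 2D board in its own frame (z = 0); e.g. a 9 x 6 inner-
# corner pattern of 60 mm squares (illustrative values).
pattern, square = (9, 6), 60.0
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * square

obj_pts, img_pts = [], []
for fname in ["view0.png", "view1.png"]:          # hypothetical images
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)

# Intrinsics K, distortion coefficients and per-view extrinsics.
ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_pts, img_pts, gray.shape[::-1], None, None)
```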

FIG. 7. The differences between the true and reconstructed 3D coordinates of the feature points using a 2D board in three experiments. (a) The experimental calibration system. (b) Results of the first group experiments. (c) Results of the second group experiments. (d) Results of the third group experiments.

 

V. CONCLUSIONS

This paper addresses a global calibration method with a three-DOF calibration system. The model of the three-DOF global calibration system is studied, and the 3D calibration board is moved to an arbitrary position in the model. According to the motion law of the three-DOF global calibration system at different positions, we deduce the global calibration model of the binocular systems at different positions. The measurement results of the binocular systems at different positions are unified into a world coordinate system that is determined by the initial position of the calibration board. Three experiments of the binocular systems at the coaxial and diagonal positions are respectively conducted to verify the global calibration method of the vision measurement systems. The root-mean-square errors between the true and reconstructed 3D coordinates of the feature points are 0.573 mm, 0.520 mm and 0.528 mm at the coaxial position, and 0.495 mm, 0.556 mm and 0.627 mm at the diagonal position, respectively. The reconstruction errors of the global calibration system for the binocular measurement systems are minimal in the direction that is perpendicular to the baseline and the measurement direction. The method unifies the measurement results of the binocular vision systems at different positions into the same world coordinate system, which is of great significance in the research field of vision-based measurement.

References

  1. C. Schwarz, P. M. Prieto, E. J. Fernández, and P. Artal, “Binocular adaptive optics vision analyzer with full control over the complex pupil functions,” Opt. Lett. 36, 4779-4781 (2011). https://doi.org/10.1364/OL.36.004779
  2. M. Cho and D. Shin, “Depth resolution analysis of axially distributed stereo camera systems under fixed constrained resources,” J. Opt. Soc. Korea 17, 500-505 (2013). https://doi.org/10.3807/JOSK.2013.17.6.500
  3. J. Wang, X. J. Wang, F. Liu, Y. Gong, H. H. Wang, and Z. Qin, “Modeling of binocular stereo vision for remote coordinate measurement and fast calibration,” Opt. Laser. Eng. 54, 269-274 (2014). https://doi.org/10.1016/j.optlaseng.2013.07.021
  4. G. Xu, X. T. Li, J. Su, H. D. Pan, and G. D. Tian, “Precision evaluation of three-dimensional feature points measurement by binocular vision,” J. Opt. Soc. Korea 15, 30-37 (2011). https://doi.org/10.3807/JOSK.2011.15.1.030
  5. J. M. Ryu, J. H. Oh, and J. H. Jo, “Unified analytic calculation method for zoom loci of zoom lens systems with a finite object distance,” J. Opt. Soc. Korea 18, 134-145 (2014). https://doi.org/10.3807/JOSK.2014.18.2.134
  6. S. Di, H. Lin, and R. X. Du, “Two-dimensional (2D) displacement measurement of moving objects using a new MEMS binocular vision system,” J. Mod. Opt. 58, 694-699 (2011). https://doi.org/10.1080/09500340.2011.566636
  7. X. J. Zou, H. X. Zou, and J. Lu, “Virtual manipulator-based binocular stereo vision positioning system and errors modelling,” Mach. Vision Appl. 23, 43-63 (2012). https://doi.org/10.1007/s00138-010-0291-y
  8. X. L. Wang, “Novel calibration method for the multi-camera measurement system,” J. Opt. Soc. Korea 18, 746-752 (2014). https://doi.org/10.3807/JOSK.2014.18.6.746
  9. J. W. Kim, J. M. Ryu, J. H. Jo, and Y. J. Kim, “Evaluation of a corrected cam for an interchangeable lens with a distance window,” J. Opt. Soc. Korea 18, 23-31 (2014). https://doi.org/10.3807/JOSK.2014.18.1.023
  10. G. Aragon-Camarasa, H. Fattah, and J. P. Siebert, “Towards a unified visual framework in a binocular active robot vision system,” Robot. Auton. Syst. 58, 276-286 (2010). https://doi.org/10.1016/j.robot.2009.08.005
  11. C. Hyun, S. Kim, and H. Pahk, “Methods to measure the critical dimension of the bottoms of through-silicon vias using white-light scanning interferometry,” J. Opt. Soc. Korea 18, 531-537 (2014). https://doi.org/10.3807/JOSK.2014.18.5.531
  12. Y. Cui, F. Q. Zhou, Y. X. Wang, L. Liu, and H. Gao, “Precise calibration of binocular vision system used for vision measurement,” Opt. Express 22, 9134-9149 (2014). https://doi.org/10.1364/OE.22.009134
  13. K. Irsch, B. I. Gramatikov, Y. K. Wu, and D. L. Guyton, “New pediatric vision screener employing polarization-modulated, retinal-birefringence-scanning-based strabismus detection and bull’s eye focus detection with an improved target system: opto-mechanical design and operation,” J. Biomed. Opt. 19, 067004 (2014). https://doi.org/10.1117/1.JBO.19.6.067004
  14. P. Zhao and N. H. Wang, “Precise perimeter measurement for 3D object with a binocular stereo vision measurement system,” Optik 121, 953-957 (2010). https://doi.org/10.1016/j.ijleo.2008.12.008
  15. P. Zhao and G. Q. Ni, “Simultaneous perimeter measurement for 3D object with a binocular stereo vision measurement system,” Opt. Laser. Eng. 48, 505-511 (2010). https://doi.org/10.1016/j.optlaseng.2009.08.007
  16. W. M. Li and Y. Li, “Portable monocular light pen vision measurement system,” J. Opt. Soc. Am. A 32, 238-247 (2015). https://doi.org/10.1364/JOSAA.32.000238
  17. S. Y. Hwang and J. B. Song, “Monocular vision-based SLAM in indoor environment using corner, lamp, and door features from upward-looking camera,” IEEE T. Ind. Electron. 58, 4804-4812 (2011). https://doi.org/10.1109/TIE.2011.2109333
  18. Z. Y. Zhang and L. Yuan, “Building a 3D scanner system based on monocular vision,” Appl. Opt. 51, 1638-1644 (2012). https://doi.org/10.1364/AO.51.001638
  19. K. Sakai, M. Ogiya, and Y. Hirai, “Decoding of depth and motion in ambiguous binocular perception,” J. Opt. Soc. Am. A 28, 1445-1452 (2011). https://doi.org/10.1364/JOSAA.28.001445
  20. Y. Sando, D. Barada, and T. Yatagai, “Holographic 3D display observable for multiple simultaneous viewers from all horizontal directions by using a time division method,” Opt. Lett. 39, 5555-5557 (2014). https://doi.org/10.1364/OL.39.005555
  21. G. D. Love, D. M. Hoffman, P. J. Hands, J. Gao, A. K. Kirby, and M. S. Banks, “High-speed switchable lens enables the development of a volumetric stereoscopic display,” Opt. Express 17, 15716-15725 (2009). https://doi.org/10.1364/OE.17.015716
  22. Y. H. Lin and J. L. Wu, “Quality assessment of stereoscopic 3D image compression by binocular integration behaviors,” IEEE T. Image Process 23, 1527-1542 (2014). https://doi.org/10.1109/TIP.2014.2302686
  23. E. Peng and L. Li, “Camera calibration using one-dimensional information and its applications in both controlled and uncontrolled environments,” Pattern Recogn. 43, 1188-1198 (2010). https://doi.org/10.1016/j.patcog.2009.08.003
  24. J. H. Sun, Q. Z. Liu, Z. Liu, and G. J. Zhang, “A calibration method for stereo vision sensor with large FOV based on 1D targets,” Opt. Laser. Eng. 49, 1245-1250 (2011). https://doi.org/10.1016/j.optlaseng.2011.06.011
  25. M. Xie, Z. Z. Wei, G. J. Zhang, and X. G. Wei, “A flexible technique for calibrating relative position and orientation of two cameras with no-overlapping FOV,” Measurement 46, 34-44 (2013). https://doi.org/10.1016/j.measurement.2012.10.005
  26. D. Samper, J. Santolaria, F. J. Brosed, A. C. Majarena, and J. J. Aguilar, “Analysis of Tsai calibration method using two-and three-dimensional calibration objects,” Mach. Vision Appl. 24, 117-131 (2013). https://doi.org/10.1007/s00138-011-0398-9
  27. Z. Y. Zhang, “A flexible new technique for camera calibration,” IEEE T. Pattern Anal. 22, 1330-1334 (2000). https://doi.org/10.1109/34.888718
  28. M. A. Sutton, J. H. Yan, V. Tiwari, H. W. Schreier, and J. J. Orteu, “The effect of out-of-plane motion on 2D and 3D digital image correlation measurements,” Opt. Laser. Eng. 46, 746-757 (2008). https://doi.org/10.1016/j.optlaseng.2008.05.005
  29. L. Fauch, E. Nippolainen, V. Teplov, and A. A. Kamshilin, “Recovery of reflection spectra in a multispectral imaging system with light emitting diodes,” Opt. Express 18, 23394-23405 (2010). https://doi.org/10.1364/OE.18.023394
  30. M. Vo, Z. Y. Wang, L. Luu, and J. Ma, "Advanced geometric camera calibration for machine vision," Opt. Eng. 50, 110503 (2011). https://doi.org/10.1117/1.3647521
  31. X. A. Peng, X. L. Liu, Y. K. Yin, and A. Li, “Optical measurement network for large-scale and shell-like objects,” Opt. Lett. 36, 157-159 (2011). https://doi.org/10.1364/OL.36.000157
  32. Z. H. Zhang, H. Y. Ma, T. Guo, S. X. Zhang, and J. P. Chen, “Flexible calibration of phase calculation-based three-dimensional imaging system,” Opt. Lett. 36, 1257-1259 (2011). https://doi.org/10.1364/OL.36.001257
  33. B. Pan, D. F. Wu, and L. P. Yu, “Optimization of a three-dimensional digital image correlation system for deformation measurements in extreme environments,” Appl. Opt. 51, 4409-4419 (2012). https://doi.org/10.1364/AO.51.004409
  34. G. Xu, L. Sun, X. Li, J. Su, Z. Hao, and X. Lu, “Global calibration and equation reconstruction methods of a three dimensional curve generated from a laser plane in vision measurement,” Opt. Express 22, 22043-22055 (2014). https://doi.org/10.1364/OE.22.022043
