Fig. 1. A Lidar-Camera System for 3D Map Generation (a) System View (b) Transformation between Lidar and 6 Cameras
Fig. 2. 3D Transformation Relationship between a LiDAR Sensor and a Camera Defined by Three 3D Points
Fig. 3. Acquisition of Lidar and Camera Data. A Ball is Moved in Front of Both Sensors, and 3D Range and 2D Image Data are Acquired at Several Positions
Fig. 4. A Method of Detecting the Center Point of a Sphere using Any Four Points on the Sphere Surface
Fig. 5. Examples of Detecting the Center of a Sphere in a 2D Image (a) Original Image (b) Spherical Object Color Detection (Adaptive Threshold) (c) Spherical Object Detection (Ellipse Fitting)
Fig. 6. (a) Velodyne 16-Channel Lidar Sensor and 6 Vision Cameras (b) Calibration Ball
Fig. 7. Data Acquisition of a Spherical Ball at Distances of (a) 1m (b) 2m (c) 3m
Fig. 8. A Planar Object for Comparison with a Conventional Calibration Method
Fig. 9. Data Acquisition of a Circular Planar Object at Distances of (a) 1m (b) 2m (c) 3m
Fig. 10. Depth Data Obtained from the Lidar Sensor (a) Ball Object (Hand-Held by a Person) (b) Planar Object
Fig. 11. Examples of the 3D Point Clouds of the Ball Surface
Fig. 12. Result of the Projection (a) Depth Data of the Planar Object (b) Projection Result on a Plane
Fig. 13. The Process of Finding the Center Point of a Planar Object (a) the Edge Points (b) a Center Axis (c) the Points Projected on a Plane (d) the Detected Center Point
Fig. 14. Reprojection Error of the First Camera at Different Calibration Distances. Blue Bars Show the Average Error and Red Lines the Standard Deviation
Fig. 15. The Result of Modeling the Building and Basketball Court using a Lidar Sensor and Six Cameras (a) Indoor Data (b) Outdoor Data
Table 1. Pseudocode for Calculating the Ball Center
Table 2. Number of Positions where the Calibration Objects are Placed for 3D and 2D Data Acquisition
Table 3. Reprojection Error of Each Camera View
Table 4. Comparison of Reprojection Errors with Conventional Methods (in Pixels)
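Fig. 4 and Table 1 refer to computing a sphere's center from any four non-coplanar points on its surface. The sketch below is only an illustration of that idea, not the paper's Table 1 pseudocode; the function name and the NumPy-based formulation are this sketch's own assumptions. Each surface point satisfies |p - c|^2 = r^2, so subtracting the first point's equation from the other three cancels r^2 (and |c|^2) and leaves a 3x3 linear system in the center c.

```python
import numpy as np

def sphere_center_from_four_points(p1, p2, p3, p4):
    """Illustrative sketch: estimate a sphere center from four
    non-coplanar surface points by solving a 3x3 linear system.

    For each point p_i, |p_i - c|^2 = r^2. Subtracting the equation
    for p1 from those for p2..p4 cancels r^2 and |c|^2, leaving
    2 (p_i - p1) . c = |p_i|^2 - |p1|^2.
    """
    P = np.asarray([p1, p2, p3, p4], dtype=float)
    A = 2.0 * (P[1:] - P[0])                            # 3x3 coefficient matrix
    b = np.sum(P[1:] ** 2, axis=1) - np.sum(P[0] ** 2)  # right-hand side
    return np.linalg.solve(A, b)                        # center coordinates (x, y, z)

# Example: four points on a unit sphere centered at (1, 2, 3)
center = sphere_center_from_four_points([2, 2, 3], [0, 2, 3], [1, 3, 3], [1, 2, 4])
print(center)  # ~ [1. 2. 3.]
```

In practice, many such four-point estimates could be combined or embedded in a robust fitting loop (e.g., RANSAC) over the LiDAR points measured on the ball surface to reject outliers.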