Intensity and Ambient Enhanced Lidar-Inertial SLAM for Unstructured Construction Environment

Korean title: Enhanced Lidar-Inertial SLAM Using Laser Reflectance Intensity and Ambient Light for Mapping Unstructured Construction Environments

  • Jung, Minwoo (Dept. of Civil and Environmental Engineering, KAIST);
  • Jung, Sangwoo (Dept. of Civil and Environmental Engineering, KAIST);
  • Jang, Hyesu (Dept. of Civil and Environmental Engineering, KAIST);
  • Kim, Ayoung (Dept. of Civil and Environmental Engineering, KAIST)
  • Received : 2021.04.27
  • Accepted : 2021.06.06
  • Published : 2021.08.31

Abstract

Construction monitoring is one of the key modules in smart construction. Unlike structured urban environments, construction sites are challenging to map because of their unstructured characteristics; for example, irregular feature points and unreliable matching hinder building a map usable for site management. To tackle this issue, we propose a data acquisition system for unstructured environments and a framework, Intensity and Ambient Enhanced Lidar Inertial Odometry via Smoothing and Mapping (IA-LIO-SAM), that achieves highly accurate robot trajectories and maps. IA-LIO-SAM uses the same factor graph as Tightly-coupled Lidar Inertial Odometry via Smoothing and Mapping (LIO-SAM). Extending LIO-SAM, IA-LIO-SAM leverages each point's intensity and ambient values to remove unreliable feature points. These additional values also serve as extra dimensions in the K-Nearest Neighbor (KNN) search, enabling more accurate association between stored map points and newly scanned points. The performance was verified in three different environments and compared against LIO-SAM.
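As a concrete illustration of the two roles the abstract assigns to the extra channels (pruning unreliable feature points, and widening the KNN correspondence search from 3-D geometry to geometry plus appearance), here is a minimal Python sketch. It is not the authors' implementation (LIO-SAM itself is written in C++); the thresholds `intensity_min`/`ambient_min`, the channel weights `w_int`/`w_amb`, and the use of SciPy's `cKDTree` are all illustrative assumptions.

```python
# Sketch only: intensity/ambient-aware feature filtering and KNN matching,
# in the spirit of IA-LIO-SAM as described in the abstract. All thresholds
# and weights below are assumed values, not taken from the paper.
import numpy as np
from scipy.spatial import cKDTree

def filter_points(points, intensity, ambient,
                  intensity_min=5.0, ambient_min=2.0):
    """Drop points whose intensity or ambient return is too weak to be a
    reliable feature (thresholds are illustrative assumptions)."""
    mask = (intensity > intensity_min) & (ambient > ambient_min)
    return points[mask], intensity[mask], ambient[mask]

def augmented_knn(map_pts, map_int, map_amb,
                  scan_pts, scan_int, scan_amb,
                  w_int=0.1, w_amb=0.1, k=5):
    """Nearest-neighbor search over (x, y, z, w_int*intensity,
    w_amb*ambient): candidate matches must agree in appearance as well as
    geometry. The weights trade off the two and are assumptions."""
    map_feat = np.hstack([map_pts,
                          w_int * map_int[:, None],
                          w_amb * map_amb[:, None]])
    scan_feat = np.hstack([scan_pts,
                           w_int * scan_int[:, None],
                           w_amb * scan_amb[:, None]])
    tree = cKDTree(map_feat)          # k-d tree over the 5-D map features
    dists, idx = tree.query(scan_feat, k=k)
    return dists, idx

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    map_pts = rng.uniform(-10.0, 10.0, (1000, 3))   # synthetic map cloud
    map_int = rng.uniform(0.0, 255.0, 1000)
    map_amb = rng.uniform(0.0, 255.0, 1000)
    # A noisy rescan of part of the map, with the same appearance channels.
    scan_pts = map_pts[:100] + rng.normal(0.0, 0.05, (100, 3))
    pts, inten, amb = filter_points(scan_pts, map_int[:100], map_amb[:100])
    dists, idx = augmented_knn(map_pts, map_int, map_amb, pts, inten, amb)
    print("neighbors of first filtered scan point:", idx[0])
```

With this augmentation, two points that are geometrically close but differ strongly in intensity or ambient return end up far apart in the 5-D search space and are less likely to be matched; in practice the channel weights would have to be tuned per sensor.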

Acknowledgement

This work was supported by the Korea Agency for Infrastructure Technology Advancement (KAIA) grant funded by the Ministry of Land, Infrastructure and Transport (National Research for Smart Construction Technology: Grant 21SMIP-A158708-02).

References

  1. T. Shan, B. Englot, D. Meyers, W. Wang, C. Ratti, and D. Rus, "LIO-SAM: Tightly-coupled Lidar Inertial Odometry via Smoothing and Mapping," 2020 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Las Vegas, NV, USA, 2020, DOI: 10.1109/IROS45743.2020.9341176.
  2. P. J. Besl and N. D. McKay, "A Method for Registration of 3D Shapes," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 14, no. 2, pp. 239-256, 1992, DOI: 10.1109/34.121791.
  3. A. Segal, D. Haehnel, and S. Thrun, "Generalized-ICP," Robotics: Science and Systems, Seattle, USA, 2009, DOI: 10.15607/RSS.2009.V.021.
  4. D. Chetverikov, D. Svirko, D. Stepanov, and P. Krsek, "The Trimmed Iterative Closest Point Algorithm," 16th International Conference on Pattern Recognition (ICPR), Quebec City, QC, Canada, 2002, DOI: 10.1109/ICPR.2002.1047997.
  5. J.-M. Lee and G.-W. Kim, "A Camera Pose Estimation Method for Rectangle Feature based Visual SLAM," Journal of Korea Robotics Society, vol. 11, no. 1, Mar., 2016, DOI: 10.7746/jkros.2016.11.1.033.
  6. J. H. Lee, G. Zhang, and I. H. Suh, "Motion Estimation Using 3-D Straight Lines," Journal of Korea Robotics Society, vol. 11, no. 4, Dec., 2016, DOI: 10.7746/jkros.2016.11.4.300.
  7. D. G. Lowe, "Distinctive image features from scale-invariant keypoints," International Journal of Computer Vision, vol. 60, no. 2, pp. 91-110, 2004, DOI: 10.1023/B:VISI.0000029664.99615.94.
  8. H. Bay, A. Ess, T. Tuytelaars, and L. Van Gool, "Speeded-up robust features (SURF)," Computer Vision and Image Understanding, vol. 110, no. 3, pp. 346-359, 2008, DOI: 10.1016/j.cviu.2007.09.014.
  9. E. Rublee, V. Rabaud, K. Konolige, and G. Bradski, "ORB: An efficient alternative to SIFT or SURF," 2011 International Conference on Computer Vision, pp. 2564-2571, 2011, DOI: 10.1109/ICCV.2011.6126544.
  10. J. Zhang and S. Singh, "Low-drift and Real-time Lidar Odometry and Mapping," Autonomous Robots, vol. 41, pp. 401-416, 2017, DOI: 10.1007/s10514-016-9548-2.
  11. T. Shan and B. Englot, "LeGO-LOAM: Lightweight and Ground-Optimized Lidar Odometry and Mapping on Variable Terrain," 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 4758-4765, 2018, DOI: 10.1109/IROS.2018.8594299.
  12. C. Qin, H. Ye, C. E. Pranata, J. Han, S. Zhang, and M. Liu, "LINS: A Lidar-Inertial State Estimator for Robust and Efficient Navigation," 2020 IEEE International Conference on Robotics and Automation (ICRA), pp. 8899-8906, 2020, DOI: 10.1109/ICRA40945.2020.9197567.
  13. H. Ye, Y. Chen, and M. Liu, "Tightly Coupled 3D Lidar Inertial Odometry and Mapping," 2019 International Conference on Robotics and Automation (ICRA), pp. 3144-3150, 2019, DOI: 10.1109/ICRA.2019.8793511.
  14. T.-M. Nguyen, M. Cao, S. Yuan, Y. Lyu, T. H. Nguyen, and L. Xie, "LIRO: Tightly Coupled Lidar-Inertia-Ranging Odometry," IEEE International Conference on Robotics and Automation (ICRA), 2020, [Online], https://arxiv.org/abs/2010.13072.
  15. C. Forster, L. Carlone, F. Dellaert, and D. Scaramuzza, "On-Manifold Preintegration for Real-Time Visual-Inertial Odometry," IEEE Transactions on Robotics, vol. 33, no. 1, pp. 1-21, Feb., 2017, DOI: 10.1109/TRO.2016.2597321.
  16. H. Wang, C. Wang, and L. Xie, "Intensity Scan Context: Coding Intensity and Geometry Relations for Loop Closure Detection," 2020 IEEE International Conference on Robotics and Automation (ICRA), pp. 2095-2101, 2020, DOI: 10.1109/ICRA40945.2020.9196764.
  17. T. Shan, B. Englot, F. Duarte, C. Ratti, and D. Rus, "Robust Place Recognition using an Imaging Lidar," IEEE International Conference on Robotics and Automation (ICRA), 2021, [Online], https://arxiv.org/abs/2103.02111v2.
  18. X. Chen, T. Läbe, A. Milioto, T. Röhling, O. Vysotska, A. Haag, J. Behley, and C. Stachniss, "OverlapNet: Loop Closing for LiDAR-based SLAM," Robotics: Science and Systems (RSS), 2020, [Online], https://arxiv.org/abs/2105.11344v1.
  19. Y. S. Park, H. Jang, and A. Kim, "I-LOAM: Intensity Enhanced LiDAR Odometry and Mapping," 2020 17th International Conference on Ubiquitous Robots (UR), Kyoto, Japan, pp. 455-458, 2020, DOI: 10.1109/UR49135.2020.9144987.
  20. J. Jeong, Y. Cho, Y.-S. Shin, H. Roh, and A. Kim, "Complex Urban Dataset with Multi-level Sensors from Highly Diverse Urban Environments," International Journal of Robotics Research, vol. 38, no. 6, pp. 642-657, 2019, DOI: 10.1177/0278364919843996.