Line-Based SLAM Using Vanishing Point Measurements Loss Function

Feature Line-Based SLAM Using a Loss Function for Vanishing Point Information (Korean title)

  • Hyunjun Lim (Electrical Engineering, Korea Advanced Institute of Science and Technology (KAIST))
  • Hyun Myung (Electrical Engineering, KAIST)
  • Received : 2023.05.10
  • Accepted : 2023.06.07
  • Published : 2023.08.31

Abstract

In this paper, a novel line-based simultaneous localization and mapping (SLAM) method using a loss function for vanishing point measurements is proposed. In feature-based SLAM, the Huber norm is generally used as the loss function for point and line features, because their measurements define the reprojection error on the image plane as a residual and linear loss functions such as the Huber norm handle such residuals well. However, these typical loss functions are not suitable for vanishing point measurements, whose residuals are unbounded. To tackle this problem, we propose a loss function for vanishing point measurements based on the unit sphere model. Finally, we demonstrate the validity of the proposed loss function through experiments on a public dataset.
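The core idea in the abstract can be illustrated with a small sketch. The snippet below is not the paper's exact formulation; the function names and the choice of the angular distance as the unit-sphere residual are illustrative assumptions. It contrasts an image-plane vanishing point residual, which can grow without bound as the vanishing point moves toward infinity, with a residual defined on the unit sphere, which is bounded by construction.

```python
import numpy as np

def huber_loss(r, delta=1.0):
    """Standard Huber norm: quadratic near zero, linear for large residuals."""
    a = np.abs(r)
    return np.where(a <= delta, 0.5 * r ** 2, delta * (a - 0.5 * delta))

def image_plane_vp_residual(vp_pred, vp_meas):
    """Euclidean distance between vanishing points on the image plane.

    Unbounded: a vanishing point of nearly parallel image lines can lie
    arbitrarily far from the principal point, so this residual has no
    upper limit and a linear loss like the Huber norm is ill-suited.
    """
    return np.linalg.norm(np.asarray(vp_pred) - np.asarray(vp_meas))

def unit_sphere_vp_residual(d_pred, d_meas):
    """Angle between predicted and measured vanishing directions on the
    unit sphere (a hypothetical stand-in for the paper's loss).

    Both directions are normalized onto the sphere, so the residual is
    bounded; the absolute value of the dot product accounts for the
    sign ambiguity of a vanishing direction.
    """
    u = np.asarray(d_pred) / np.linalg.norm(d_pred)
    v = np.asarray(d_meas) / np.linalg.norm(d_meas)
    return np.arccos(np.clip(np.abs(u @ v), -1.0, 1.0))
```

Under this sketch, the image-plane residual diverges for near-parallel line observations, while the unit-sphere residual stays within [0, π/2], which is what makes a dedicated loss function tractable for vanishing point measurements.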

Acknowledgement

This work was financially supported in part by an Institute of Information & communications Technology Planning & Evaluation (IITP) grant funded by the Korea government (MSIT) (No. 2021-0-00230, development of real·virtual environmental analysis based adaptive interaction technology). The students were supported by the BK21 FOUR program.
