Infrastructure 2D Camera-based Real-time Vehicle Center Estimation Method for Cooperative Driving Support

  • Ik-hyeon Jo (Dept. of Smart ICT Convergence, Seoul National University of Science and Technology; Corporate Research Institute, (주)싸인텔레콤) ;
  • Goo-man Park (Dept. of Smart ICT Convergence, Seoul National University of Science and Technology)
  • Received : 2023.12.06
  • Accepted : 2023.12.26
  • Published : 2024.02.28

Abstract

Existing autonomous driving technology has been developed around sensors mounted on the vehicle to perceive the environment and formulate driving plans. However, it suffers performance degradation in specific situations such as adverse weather, backlighting, and occlusion by obstructions. To address these issues, cooperative autonomous driving technology, which extends the perception range of autonomous vehicles through the support of road infrastructure, has attracted attention. Nevertheless, estimating the 3D center point of objects in real time, as required by international standards, is difficult with a monocular camera. This paper proposes a method that detects objects and estimates each vehicle's center point in real time using the fixed field of view of road infrastructure cameras and pre-measured geometric information. Experiments with GPS positioning equipment confirm that the proposed method estimates object center points effectively, and it is expected to contribute to the deployment and adoption of cooperative autonomous driving infrastructure technology applicable to both vehicles and road infrastructure.
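
The abstract describes the approach only at a high level: a roadside camera with a fixed field of view detects vehicles in 2D, and pre-measured geometric information is used to recover each vehicle's center point on the road in real time. The sketch below is a minimal illustration of that idea, not the paper's implementation; it assumes, hypothetically, that the pre-measured geometry is available as a 3x3 image-to-ground homography, that the camera sits at the origin of a local road-plane frame, and that detections arrive as 2D bounding boxes. The function names, the homography values, and the half-length offset are placeholders.

```python
import numpy as np

# Hypothetical calibration for one fixed roadside camera: a 3x3 image-to-ground
# homography mapping pixel coordinates to a local metric road-plane frame with
# the camera at the origin (an assumption, not taken from the paper).
H = np.array([
    [0.021, 0.003, -15.2],
    [0.001, 0.045, -30.8],
    [0.000, 0.002,   1.0],
])

def pixel_to_ground(u, v, H):
    """Project an image point (u, v) onto the road plane via the homography."""
    x, y, w = H @ np.array([u, v, 1.0])
    return x / w, y / w

def estimate_vehicle_center(box, H, half_length_m=2.2):
    """Roughly estimate a vehicle's ground-plane center from a 2D bounding box.

    The bottom-center of the box is treated as the point where the vehicle
    meets the road; shifting it away from the camera by an assumed half vehicle
    length approximates the geometric center.
    """
    x1, y1, x2, y2 = box
    gx, gy = pixel_to_ground((x1 + x2) / 2.0, y2, H)  # ground-contact point
    r = np.hypot(gx, gy)                              # distance from the camera origin
    cx = gx + half_length_m * gx / r                  # shift along the viewing ray
    cy = gy + half_length_m * gy / r
    return cx, cy

if __name__ == "__main__":
    # Example: a bounding box (x1, y1, x2, y2) in pixels from any 2D detector.
    print(estimate_vehicle_center((840, 310, 1010, 460), H))
```

In practice the homography would come from calibrating the fixed camera against surveyed road points, and the half-length offset would depend on the detected vehicle class; both are simplifying assumptions made only for this sketch.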

Acknowledgement

This work was supported by the Korea Agency for Infrastructure Technology Advancement (KAIA) grant funded by the Ministry of Land, Infrastructure and Transport (Grant RS-2021-KA160501).
