
Development of Human Following Method of Mobile Robot Using TRT Pose

  • Received : 2020.11.05
  • Accepted : 2020.11.24
  • Published : 2020.12.31

Abstract

In this paper, we propose a method for estimating a person's walking direction so that a mobile robot can follow the person, using TRT Pose (TensorRT Pose), a deep-learning-based human pose recognition model. The mobile robot recognizes key points on the person's pelvis, measures the person's movement, and determines the direction in which the person intends to move. Using this information together with the distance between the robot and the person, the mobile robot can follow the person stably while keeping a safe distance. TRT Pose extracts only key point information, which prevents privacy issues even though the camera on the mobile robot records video. To validate the proposed method, experiments were carried out in which a person walks away from or toward the mobile robot in a zigzag pattern, and the robot successfully keeps following the person at the prescribed distance.
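As a rough illustration of the idea described in the abstract, the Python sketch below estimates a heading from the two pelvis (hip) keypoints returned by a pose estimator and produces velocity commands that hold a prescribed robot-to-person distance. The keypoint source, gain values, and prescribed distance are assumptions made for illustration only; this is not the authors' implementation or the trt_pose API.

```python
import numpy as np

PRESCRIBED_DISTANCE = 1.0   # [m] desired robot-to-person gap (assumed value)
LINEAR_GAIN = 0.5           # proportional gain on distance error (assumed)
ANGULAR_GAIN = 1.5          # proportional gain on image offset (assumed)

def walking_direction(left_hip, right_hip):
    """Estimate the person's heading in the image plane from the two pelvis
    (hip) keypoints. The hip-to-hip vector is roughly perpendicular to the
    facing direction, so rotating it by 90 degrees gives a heading estimate."""
    hip_vec = np.asarray(right_hip, dtype=float) - np.asarray(left_hip, dtype=float)
    heading = np.array([-hip_vec[1], hip_vec[0]])
    norm = np.linalg.norm(heading)
    return heading / norm if norm > 1e-6 else np.zeros(2)

def follow_command(left_hip, right_hip, distance_m, image_width):
    """Return (linear, angular) velocity commands that close the gap to the
    prescribed distance while steering toward the pelvis center."""
    center_x = (left_hip[0] + right_hip[0]) / 2.0
    offset = (center_x - image_width / 2.0) / (image_width / 2.0)  # in [-1, 1]
    linear = LINEAR_GAIN * (distance_m - PRESCRIBED_DISTANCE)
    angular = -ANGULAR_GAIN * offset
    return linear, angular

# Example with dummy keypoints: person slightly right of center, 1.8 m away.
if __name__ == "__main__":
    left_hip, right_hip = (360.0, 300.0), (420.0, 305.0)
    print(walking_direction(left_hip, right_hip))
    print(follow_command(left_hip, right_hip, distance_m=1.8, image_width=640))
```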

Keywords
