Pose Estimation of Ground Test Bed using Ceiling Landmark and Optical Flow Based on Single Camera/IMU Fusion

  • Received : 2011.03.04
  • Accepted : 2011.12.11
  • Published : 2012.01.01

Abstract

In this paper, a pose estimation method for a satellite GTB (Ground Test Bed) using a vision/MEMS IMU (Inertial Measurement Unit) integrated system is presented. The GTB, used to verify a satellite system on the ground, is similar to a mobile robot: it has thrusters and a reaction wheel as actuators and floats above the floor on compressed air. An EKF (Extended Kalman Filter) fuses the MEMS IMU with a vision system consisting of a single camera and infrared LEDs used as ceiling landmarks. A fusion filter of this kind generally uses the image positions of the feature points as its measurement. However, if the MEMS IMU bias is not properly estimated by the filter, this approach accumulates position error whenever the camera image is unavailable. Therefore, a fusion method is proposed that uses both the positions of the feature points and the camera velocity determined from the optical flow of those feature points. Experiments verify that the proposed method is more robust to IMU bias than the method that uses feature-point positions alone.
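The fusion idea in the abstract can be sketched with a minimal planar Kalman filter: adding a camera-velocity measurement (as would be derived from optical flow) alongside the feature-point position measurement lets the filter estimate the IMU accelerometer bias, so the bias does not integrate into position error. The 2-D model, state layout, and all noise values below are assumptions for illustration, not the paper's actual filter design.

```python
import numpy as np

dt = 0.01  # IMU sample period [s] (assumed)

# State: [px, py, vx, vy, bax, bay] -- position, velocity, accelerometer bias
F = np.eye(6)
F[0, 2] = F[1, 3] = dt    # position integrates velocity
F[2, 4] = F[3, 5] = -dt   # estimated bias is subtracted from measured accel

B = np.zeros((6, 2))
B[2, 0] = B[3, 1] = dt    # measured IMU acceleration drives velocity

# Measurement A: feature-point position only (ceiling landmarks)
H_pos = np.zeros((2, 6))
H_pos[0, 0] = H_pos[1, 1] = 1.0

# Measurement B: position plus camera velocity (from optical flow)
H_posvel = np.zeros((4, 6))
H_posvel[0, 0] = H_posvel[1, 1] = 1.0
H_posvel[2, 2] = H_posvel[3, 3] = 1.0

def predict(x, P, accel_meas, Q):
    x = F @ x + B @ accel_meas
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z, H, R):
    S = H @ P @ H.T + R            # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)  # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(6) - K @ H) @ P
    return x, P

# Toy run: the test bed is at rest, so the IMU reports only its bias.
b_true = np.array([0.2, -0.1])   # true accelerometer bias [m/s^2] (assumed)
x, P = np.zeros(6), np.eye(6)
Q = np.eye(6) * 1e-8
R = np.eye(4) * 1e-4

for _ in range(20000):
    x, P = predict(x, P, b_true, Q)
    # Camera sees the vehicle at the origin with zero optical-flow velocity.
    x, P = update(x, P, np.zeros(4), H_posvel, R)

bias_err = np.max(np.abs(x[4:6] - b_true))
```

With `H_pos` alone the bias is only weakly observable through double integration, and during an image dropout any unestimated bias integrates directly into position error; the direct velocity measurement tightens the bias estimate, which is the robustness argument the abstract makes.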
