Improved View-Based Navigation for Obstacle Avoidance using Ego-Motion

  • Hagiwara, Yoshinobu (National Institute of Informatics);
  • Suzuki, Akimasa (Department of Software and Information Science, Iwate Prefectural University);
  • Kim, Youngbok (Department of System Mechanical Engineering, Pukyong National University);
  • Choi, Yongwoon (Department of Information System Science, Soka University)
  • Received : 2013.09.22
  • Accepted : 2013.10.06
  • Published : 2013.10.31


In this study, we propose an improved view-based navigation method for obstacle avoidance and evaluate its effectiveness in real environments with real obstacles. The proposed method can estimate the position and rotation of a mobile robot even when the robot strays from the recorded path to avoid an obstacle. To achieve this, ego-motion estimation is incorporated into an existing view-based navigation system: the ego-motion is computed from SURF feature correspondences between the current view and a recorded view captured with a Kinect sensor. In conventional view-based navigation systems, it is difficult to generate alternative paths for obstacle avoidance; the proposed method gives a mobile robot greater flexibility in path planning to avoid the humans and objects expected in real environments. Through experiments with a mobile robot in an indoor environment, we evaluated the measurement accuracy of the proposed method and confirmed its feasibility for robot navigation in museums and shopping malls.
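The core of the ego-motion step described above is recovering the rigid transform (rotation and translation) between two views from matched feature points. The following is a minimal sketch of that step only, assuming SURF matches have already been back-projected to 3D with Kinect depth; it uses the standard Kabsch/SVD least-squares solution, and the function name and array shapes are illustrative, not taken from the paper. A real system would additionally wrap this in RANSAC to reject outlier matches, as the paper's pipeline does.

```python
import numpy as np

def estimate_ego_motion(src_pts, dst_pts):
    """Estimate the rigid transform (R, t) such that dst ~= R @ src + t.

    src_pts, dst_pts: (N, 3) arrays of matched 3D points, e.g. SURF
    correspondences back-projected using Kinect depth (illustrative
    assumption; the inputs here are plain point arrays).
    Uses the Kabsch/SVD least-squares solution.
    """
    # Center both point sets on their centroids.
    src_c = src_pts.mean(axis=0)
    dst_c = dst_pts.mean(axis=0)
    # Cross-covariance matrix of the centered correspondences.
    H = (src_pts - src_c).T @ (dst_pts - dst_c)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the recovered rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t
```

From the recovered (R, t), the robot's displacement from the recorded path can be read off directly: t gives the translation offset and R the heading change relative to the recorded view.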
