• Title/Summary/Keyword: Vision-Based Navigation

A Hybrid Positioning System for Indoor Navigation on Mobile Phones using Panoramic Images

  • Nguyen, Van Vinh; Lee, Jong-Weon
    • KSII Transactions on Internet and Information Systems (TIIS), v.6 no.3, pp.835-854, 2012
  • In this paper, we propose a novel positioning system for indoor navigation that helps a user navigate easily to desired destinations in an unfamiliar indoor environment using a mobile phone. The system requires only the user's mobile phone with its basic built-in sensors, such as a camera and a compass. The system tracks the user's position and orientation using a vision-based approach that utilizes 360° panoramic images captured in the environment. To improve the robustness of the vision-based method, we exploit the digital compass that is widely installed on modern mobile phones. This hybrid solution outperforms existing mobile phone positioning methods, reducing the position estimation error to around 0.7 meters. In addition, to enable the proposed system to work independently on the mobile phone without requiring additional hardware or external infrastructure, we employ a modified version of a fast and robust feature matching scheme based on Histogrammed Intensity Patches. The experiments show that the proposed positioning system achieves good performance while running on a mobile phone, with a response time of around 1 second.
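The abstract does not give the rule used to combine the vision-based orientation with the compass reading, so the snippet below is only a minimal sketch of one plausible blend: both headings are treated as unit vectors and averaged with a fixed weight. The weight value and the function name are assumptions, not taken from the paper.

```python
import math

def fuse_headings(vision_heading_deg, compass_heading_deg, vision_weight=0.7):
    """Blend two heading estimates (degrees) on the unit circle.

    Averaging angles directly breaks near the 0/360 wrap-around, so each
    heading is converted to a unit vector, the vectors are combined with a
    fixed weight (assumed value), and the result is converted back to an angle.
    """
    wv, wc = vision_weight, 1.0 - vision_weight
    x = (wv * math.cos(math.radians(vision_heading_deg)) +
         wc * math.cos(math.radians(compass_heading_deg)))
    y = (wv * math.sin(math.radians(vision_heading_deg)) +
         wc * math.sin(math.radians(compass_heading_deg)))
    return math.degrees(math.atan2(y, x)) % 360.0

# Example: vision says 358 deg, compass says 4 deg -> fused estimate near 0 deg.
print(round(fuse_headings(358.0, 4.0), 1))
```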

Direct Depth and Color-based Environment Modeling and Mobile Robot Navigation (스테레오 비전 센서의 깊이 및 색상 정보를 이용한 환경 모델링 기반의 이동로봇 주행기술)

  • Park, Soon-Yong; Park, Mignon; Park, Sung-Kee
    • The Journal of Korea Robotics Society, v.3 no.3, pp.194-202, 2008
  • This paper describes a new method for indoor environment mapping and localization with a stereo camera. For environment modeling, we directly use the depth and color information of image pixels as visual features. Furthermore, only the depth and color information along the horizontal centerline of the image, through which the optical axis passes, is used. The benefit of this choice is that a matching measure between the model and the sensing data can easily be built on the horizontal centerline alone, because the vertical working volume between the model and the sensing data changes with robot motion. Therefore, we can build a compact and efficient map representation of the indoor environment. Based on such nodes and sensing data, we also suggest a method for estimating the mobile robot's position with a random-sampling stochastic algorithm. Basic real-world experiments show that the proposed method can serve as an effective visual navigation algorithm.
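The abstract only names its ingredients (centerline depth/color features and a random-sampling position estimator), so the following is a minimal sketch under those assumptions: it extracts the centerline feature row and weights randomly sampled pose hypotheses by how well their predicted centerline matches the model. The array shapes, the exponential weighting, and all function names are illustrative, not the paper's implementation.

```python
import numpy as np

def centerline_features(depth_img, color_img):
    """Keep only the horizontal centerline of a depth image (H, W) and a color
    image (H, W, 3); returns a (W, 4) array of [depth, r, g, b] per column."""
    row = depth_img.shape[0] // 2
    return np.hstack([depth_img[row, :, None], color_img[row, :, :]])

def pose_sample_weights(model_feats, predicted_feats_per_sample):
    """Weight random pose samples by centerline similarity: smaller mean squared
    feature difference -> larger (normalized) weight."""
    errs = np.array([np.mean((model_feats - f) ** 2)
                     for f in predicted_feats_per_sample])
    w = np.exp(-errs / (errs.mean() + 1e-9))
    return w / w.sum()

# Toy usage with random data: three pose hypotheses against one model centerline.
depth = np.random.rand(480, 640)
color = np.random.rand(480, 640, 3)
model = centerline_features(depth, color)
samples = [model + np.random.randn(*model.shape) * s for s in (0.01, 0.1, 0.5)]
print(pose_sample_weights(model, samples))  # the least-perturbed sample wins
```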

Relative Navigation for Autonomous Aerial Refueling Using Infra-red based Vision Systems (자동 공중급유를 위한 적외선 영상기반 상대 항법)

  • Yoon, Hyungchul; Yang, Youyoung; Leeghim, Henzeh
    • Journal of the Korean Society for Aeronautical & Space Sciences, v.46 no.7, pp.557-566, 2018
  • In this paper, a vision-based relative navigation system is addressed for autonomous aerial refueling. In air-to-air refueling, it is assumed that the tanker carries the drogue and the receiver carries the probe. To obtain relative information from the drogue, a vision-based imaging technique using an infrared camera is applied. In this process, the relative information is obtained using Gaussian Least Squares Differential Correction (GLSDC) and Levenberg-Marquardt (LM) estimation, where the geometric information of the drogue computed through image processing is used. The two approaches proposed in this paper are analyzed through numerical simulations.
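The abstract names the estimators (GLSDC and LM) but not their formulation; the sketch below shows only the LM half in a stripped-down form, assuming a pinhole camera, a known planar marker layout on the drogue, a known (identity) relative attitude, and SciPy's Levenberg-Marquardt solver. The focal length, marker geometry, and all names are hypothetical.

```python
import numpy as np
from scipy.optimize import least_squares

F = 800.0  # assumed focal length in pixels (hypothetical infrared camera)

# Hypothetical drogue marker layout (meters) in the drogue frame.
MARKERS = np.array([[0.3, 0.0, 0.0], [-0.3, 0.0, 0.0],
                    [0.0, 0.3, 0.0], [0.0, -0.3, 0.0]])

def project(t):
    """Pinhole projection of the markers when the drogue origin sits at
    translation t = [tx, ty, tz] in the camera frame (relative attitude ignored)."""
    p = MARKERS + t
    return np.column_stack([F * p[:, 0] / p[:, 2], F * p[:, 1] / p[:, 2]])

def residuals(t, observed_uv):
    """Reprojection error stacked into a flat vector for the LM solver."""
    return (project(t) - observed_uv).ravel()

# Simulate image observations for a true offset, then recover it with LM.
t_true = np.array([0.5, -0.2, 10.0])
observed = project(t_true)
sol = least_squares(residuals, x0=[0.0, 0.0, 5.0], args=(observed,), method="lm")
print(sol.x)  # converges to roughly [0.5, -0.2, 10.0]
```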

Vision-Based Relative State Estimation Using the Unscented Kalman Filter

  • Lee, Dae-Ro; Pernicka, Henry
    • International Journal of Aeronautical and Space Sciences, v.12 no.1, pp.24-36, 2011
  • A new approach for spacecraft absolute attitude estimation based on the unscented Kalman filter (UKF) is extended to relative attitude estimation and navigation. This approach for nonlinear systems converges faster than the standard extended Kalman filter (EKF), even with inaccurate initial conditions, in attitude estimation and navigation problems. The filter formulation employs measurements obtained from a vision sensor to provide multiple line-of-sight vectors from the spacecraft to another spacecraft. The line-of-sight measurements are coupled with gyro measurements and dynamic models in a UKF to determine relative attitude, position and gyro biases. A vector of generalized Rodrigues parameters is used to represent the local error-quaternion between the two spacecraft. A multiplicative quaternion-error approach derived from the local error-quaternion guarantees that the quaternion unit-norm constraint is maintained in the filter. A bounded relative-motion scenario is selected to verify this extended application of the UKF. Simulation results show that the UKF is more robust than the EKF under realistic initial attitude and navigation error conditions.
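The generalized-Rodrigues-parameter representation of the error quaternion mentioned in the abstract has a standard closed form (the Crassidis-Markley convention with tuning constants a and f); the sketch below shows just that two-way mapping so the multiplicative, norm-preserving error handling is concrete. The constant choices a = 1, f = 4 are a common default, not necessarily the paper's.

```python
import numpy as np

def quat_err_to_grp(dq, a=1.0, f=4.0):
    """Map a multiplicative error quaternion dq = [q1, q2, q3, q4] (scalar last,
    unit norm) to a generalized Rodrigues parameter (GRP) vector."""
    return f * dq[:3] / (a + dq[3])

def grp_to_quat_err(dp, a=1.0, f=4.0):
    """Inverse map: rebuild the unit error quaternion from the GRP vector, which
    is what keeps the quaternion unit-norm constraint intact inside the filter."""
    n2 = dp @ dp
    q4 = (-a * n2 + f * np.sqrt(f**2 + (1.0 - a**2) * n2)) / (f**2 + n2)
    return np.append((a + q4) * dp / f, q4)

# Round trip on a small error rotation (about 2 degrees around the x axis).
dq = np.array([np.sin(0.017), 0.0, 0.0, np.cos(0.017)])
print(grp_to_quat_err(quat_err_to_grp(dq)))  # reproduces dq
```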

Real-time Humanoid Robot Trajectory Estimation and Navigation with Stereo Vision (스테레오 비전을 이용한 실시간 인간형 로봇 궤적 추출 및 네비게이션)

  • Park, Ji-Hwan; Jo, Sung-Ho
    • Journal of KIISE: Software and Applications, v.37 no.8, pp.641-646, 2010
  • This paper presents algorithms for real-time navigation of a humanoid robot with stereo vision but no other sensors. Using the algorithms, the robot can recognize its 3D environment by retrieving SIFT features from images, estimate its position with a Kalman filter, and plan a path to its destination while avoiding obstacles. Our approach focuses on estimating the robot's central walking-path trajectory rather than its actual walking motion by using an approximate model. This strategy makes it possible to apply mobile robot localization approaches to humanoid robot localization. Simple collision-free path planning and motion control enable autonomous robot navigation. Experimental results demonstrate the feasibility of our approach.
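The abstract pairs SIFT-based position fixes with a Kalman filter but does not state the motion model; the fragment below is a minimal constant-velocity filter for planar position, with the time step and noise levels chosen arbitrarily for illustration, fed by a hypothetical fix that would come from the feature-matching stage.

```python
import numpy as np

DT = 0.1  # assumed interval between vision fixes (seconds)

# Constant-velocity model for planar motion: state x = [px, py, vx, vy].
F = np.array([[1, 0, DT, 0], [0, 1, 0, DT], [0, 0, 1, 0], [0, 0, 0, 1]], float)
H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)   # vision measures position only
Q = np.eye(4) * 1e-3                                 # assumed process noise
R = np.eye(2) * 0.05                                 # assumed measurement noise

def kf_step(x, P, z):
    """One predict/update cycle with a vision-derived position fix z = [px, py]."""
    x, P = F @ x, F @ P @ F.T + Q                    # predict
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)     # Kalman gain
    x = x + K @ (z - H @ x)                          # correct with the fix
    P = (np.eye(4) - K @ H) @ P
    return x, P

x, P = np.zeros(4), np.eye(4)
x, P = kf_step(x, P, np.array([0.12, 0.05]))  # example fix from SIFT matching
print(x[:2])  # filtered planar position estimate
```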

Vision-Based Robust Control of Robot Manipulators with Jacobian Uncertainty (자코비안 불확실성을 포함하는 로봇 매니퓰레이터의 영상기반 강인제어)

  • Kim, Chin-Su; Jie, Min-Seok; Lee, Kang-Woong
    • Journal of Advanced Navigation Technology, v.10 no.2, pp.113-120, 2006
  • In this paper, a vision-based robust controller for tracking the desired trajectory of a robot manipulator is proposed. The trajectory is generated to move the feature point to the desired position, and the robot follows it to reach that position. A robust controller is proposed to compensate for the parametric uncertainties of the robot manipulator contained in the control input. In addition, a vision-based robust controller is proposed to compensate for uncertainties in the Jacobian. The stability of the closed-loop system is shown by the Lyapunov method. The performance of the proposed method is demonstrated by simulations and experiments on a two-degree-of-freedom 5-link robot manipulator.
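The abstract describes a robust, Lyapunov-proven controller but gives no equations; as context, the snippet below sketches only the nominal image-based visual-servoing law that such a controller typically wraps (joint rates proportional to the pseudoinverse of the image Jacobian times the feature error). The gain, the Jacobian values, and the omission of the paper's robustifying term are all assumptions.

```python
import numpy as np

LAMBDA = 0.5  # assumed proportional control gain

def ibvs_joint_rates(image_jacobian, feature_error):
    """Nominal image-based visual servoing: q_dot = -lambda * pinv(J) * e drives
    the image-plane feature error e toward zero. The paper's robust term for
    Jacobian uncertainty is intentionally omitted here."""
    return -LAMBDA * np.linalg.pinv(image_jacobian) @ feature_error

# Hypothetical 2x2 image Jacobian for a 2-DOF planar arm and a feature-point
# error of (10, -5) pixels.
J = np.array([[120.0, 40.0], [-30.0, 90.0]])
e = np.array([10.0, -5.0])
print(ibvs_joint_rates(J, e))  # commanded joint rates (rad/s)
```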

Navigation and Localization of Mobile Robot Based on Vision and Sensor Network Using Fuzzy Rules (퍼지 규칙을 이용한 비전 및 무선 센서 네트워크 기반의 이동로봇의 자율 주행 및 위치 인식)

  • Heo, Jun-Young; Kang, Geun-Tack; Lee, Won-Chang
    • Proceedings of the IEEK Conference, 2008.06a, pp.673-674, 2008
  • This paper presents a new navigation algorithm for an autonomous mobile robot that uses vision and IR sensors and a ZigBee sensor network together with fuzzy rules. We also show that the developed mobile robot running the proposed algorithm navigates well in complex, unknown environments.

Intelligent System based on Command Fusion and Fuzzy Logic Approaches - Application to mobile robot navigation (명령융합과 퍼지기반의 지능형 시스템-이동로봇주행적용)

  • Jin, Taeseok; Kim, Hyun-Deok
    • Journal of the Korea Institute of Information and Communication Engineering, v.18 no.5, pp.1034-1041, 2014
  • This paper proposes a fuzzy inference model for obstacle avoidance for a mobile robot with an active camera that intelligently searches for the goal location in unknown environments using command fusion based on situational commands from a vision sensor. Instead of a "physical sensor fusion" method, which generates the robot's trajectory from an environment model and sensory data, a "command fusion" method is used to govern the robot's motions. The navigation strategy is based on a combination of fuzzy rules tuned for both goal approach and obstacle avoidance. We describe experimental results obtained with the proposed method that demonstrate successful navigation using real vision data.
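The abstract does not list its rule base, so the sketch below illustrates command fusion in its simplest form: a goal-approach steering command and an obstacle-avoidance steering command are blended using an obstacle-proximity value as the firing strength of the avoidance rule. The single-rule structure, the unit-vector blending, and the numbers in the example are illustrative assumptions.

```python
import numpy as np

def fuse_commands(goal_heading, avoid_heading, obstacle_proximity):
    """Blend two steering commands (radians). obstacle_proximity in [0, 1] acts
    as the firing strength of "IF obstacle is near THEN steer away"; the
    goal-approach rule receives the complementary weight. Headings are blended
    as unit vectors to stay well-defined across the +/- pi wrap-around."""
    w_avoid = float(np.clip(obstacle_proximity, 0.0, 1.0))
    w_goal = 1.0 - w_avoid
    vec = (w_goal * np.array([np.cos(goal_heading), np.sin(goal_heading)]) +
           w_avoid * np.array([np.cos(avoid_heading), np.sin(avoid_heading)]))
    return float(np.arctan2(vec[1], vec[0]))

# Goal straight ahead (0 rad); the avoidance command points 1.2 rad to the left.
# With the obstacle fairly close, the fused command leans toward avoidance.
print(fuse_commands(0.0, 1.2, obstacle_proximity=0.7))
```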

Vision-based Reduction of Gyro Drift for Intelligent Vehicles (지능형 운행체를 위한 비전 센서 기반 자이로 드리프트 감소)

  • Kyung, MinGi; Nguyen, Dang Khoi; Kang, Taesam; Min, Dugki; Lee, Jeong-Oog
    • Journal of Institute of Control, Robotics and Systems, v.21 no.7, pp.627-633, 2015
  • Accurate heading information is crucial for the navigation of intelligent vehicles. In outdoor environments, GPS is usually used for vehicle navigation. However, in GPS-denied environments such as dense building areas, tunnels, underground areas and indoor environments, non-GPS solutions are required. Yaw rates from a single gyro sensor could be one such solution. In dealing with gyro sensors, the drift problem must be resolved. HDR (Heuristic Drift Reduction) can reduce the average heading error during straight-line movement, but it shows rather large errors in some moving environments, especially along curved paths. This paper presents a method called VDR (Vision-based Drift Reduction), a system that uses a low-cost vision sensor to compensate for HDR errors.
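Neither the HDR update nor the vision correction is spelled out in the abstract; the fragment below sketches a generic heuristic-drift-reduction step (nudge the bias estimate whenever the corrected yaw rate looks like straight-line driving) plus a simple complementary correction that pulls the integrated heading toward a vision-derived heading. Thresholds, gains, and the source of the vision heading are all assumed.

```python
def hdr_step(gyro_rate, bias, straight_threshold=0.02, increment=0.0005):
    """Heuristic drift reduction (generic form): if the bias-corrected yaw rate
    (rad/s) is small, assume straight-line motion and nudge the bias estimate
    toward the measured rate. Threshold and increment are assumed values."""
    corrected = gyro_rate - bias
    if abs(corrected) < straight_threshold:
        bias += increment if corrected > 0 else -increment
    return corrected, bias

def vdr_step(heading, vision_heading, gain=0.1):
    """Vision-based correction: blend the gyro-integrated heading (rad) toward a
    heading inferred from the vision sensor, e.g. a corridor vanishing point."""
    return heading + gain * (vision_heading - heading)

# Toy usage: slightly biased gyro while driving straight, then one vision fix.
rate, bias = 0.010, 0.0
rate, bias = hdr_step(rate, bias)
print(round(rate, 4), round(bias, 4))
print(round(vdr_step(heading=0.30, vision_heading=0.25), 3))
```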

Design for Back-up of Ship's Navigation System using UAV in Radio Frequency Interference Environment (전파간섭환경에서 UAV를 활용한 선박의 백업항법시스템 설계)

  • Park, Sul Gee; Son, Pyo-Woong
    • Journal of Advanced Navigation Technology, v.23 no.4, pp.289-295, 2019
  • A maritime backup navigation system for port approach requires a horizontal accuracy of 10 meters according to IALA (International Association of Lighthouse Authorities) recommendations. eLoran, one of the best backup navigation systems that can satisfy this accuracy requirement, nevertheless shows poor navigation performance in some signal environments. In particular, noise caused by multipath and by electronic devices around the eLoran antenna degrades navigation performance. In this paper, a ship-based backup navigation system using a UAV under radio frequency interference is designed to satisfy the horizontal accuracy requirement. To improve the eLoran signal environment, the UAV is equipped with a camera, an IMU sensor, and an eLoran antenna and receiver. The proposed system is designed to receive the eLoran signal through the UAV-based receiver and to control the UAV's position and attitude within the area around a landmark. Ship-based positioning using the eLoran signal, vision and attitude information received from the UAV satisfies the resilient and robust navigation requirements.
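The abstract describes the architecture but not the data flow; one minimal way to picture the final step is sketched below: an eLoran fix taken on the UAV is transferred to the ship by rotating the camera-measured ship-relative offset into the local frame using the UAV's IMU heading. The planar ENU frame, the 2-D offset, and the function name are assumptions for illustration only.

```python
import numpy as np

def ship_position(uav_eloran_pos_enu, ship_offset_body, uav_yaw):
    """Transfer an eLoran fix taken on the UAV down to the ship.

    uav_eloran_pos_enu: UAV position from its eLoran receiver, local ENU (m, 2-D).
    ship_offset_body:   ship position relative to the UAV from the camera,
                        expressed in the UAV body frame (m, 2-D).
    uav_yaw:            UAV heading from its IMU (rad), rotates body -> ENU.
    """
    c, s = np.cos(uav_yaw), np.sin(uav_yaw)
    rot = np.array([[c, -s], [s, c]])
    return np.asarray(uav_eloran_pos_enu) + rot @ np.asarray(ship_offset_body)

# Toy numbers: UAV fixed at (100, 250) m, ship seen 30 m ahead in the body frame,
# UAV heading 90 degrees -> ship estimate near (100, 280) m.
print(ship_position([100.0, 250.0], [30.0, 0.0], np.pi / 2))
```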