• Title/Summary/Keyword: fused navigation


Fused Navigation of Unmanned Surface Vehicle and Detection of GPS Abnormality (무인 수상정의 융합 항법 및 GPS 이상 검출)

  • Ko, Nak Yong; Jeong, Seokki
    • Journal of Institute of Control, Robotics and Systems, v.22 no.9, pp.723-732, 2016
  • This paper proposes an approach to fused navigation of an unmanned surface vehicle (USV) and to detection of outliers or interference in the global positioning system (GPS). The method fuses the available sensor measurements through an extended Kalman filter (EKF) to find the location and attitude of the USV, and uses the error covariance of the EKF to detect GPS outliers or interference. When an outlier or interference is detected, the method excludes GPS data from the navigation process. The measurements fused for navigation are GPS, acceleration, angular rate, magnetic field, linear velocity, and range and bearing to acoustic beacons. The method is tested on simulated data and on measurement data produced through ground navigation. The results show that the method detects GPS outliers and interference, as well as GPS recovery, which frees the navigation from the problem of GPS abnormality.
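A common way to use the EKF error covariance for GPS outlier detection, in the spirit of this abstract, is to gate each fix on its normalized innovation (squared Mahalanobis distance against the innovation covariance). The sketch below is illustrative, not the authors' implementation; the function names and the chi-square gate value are assumptions.

```python
import numpy as np

CHI2_GATE_2DOF = 5.991  # 95% chi-square gate for a 2-D position measurement

def gps_is_outlier(z, x_pred, P_pred, H, R, gate=CHI2_GATE_2DOF):
    """Flag a GPS fix whose normalized innovation exceeds the gate."""
    nu = z - H @ x_pred                        # innovation
    S = H @ P_pred @ H.T + R                   # innovation covariance
    d2 = float(nu.T @ np.linalg.solve(S, nu))  # squared Mahalanobis distance
    return d2 > gate

def ekf_update(x_pred, P_pred, z, H, R):
    """Standard EKF measurement update; skipped when the fix is gated out."""
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x = x_pred + K @ (z - H @ x_pred)
    P = (np.eye(len(x_pred)) - K @ H) @ P_pred
    return x, P
```

A navigation loop would call `gps_is_outlier` before each update and fall back to the remaining sensors while the GPS is flagged, re-admitting it once fixes pass the gate again (the "GPS recovery" case).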

Dual Foot-PDR System Considering Lateral Position Error Characteristics

  • Lee, Jae Hong; Cho, Seong Yun; Park, Chan Gook
    • Journal of Positioning, Navigation, and Timing, v.11 no.1, pp.35-44, 2022
  • In this paper, a dual foot (DF)-PDR system is proposed that fuses integration (IA)-based PDR systems applied independently to each shoe. The horizontal positions of the two shoes estimated by each PDR system are fused with a particle filter. The proposed method bounds the position error without any additional sensor, even as the walking time increases. The particle distribution is non-Gaussian so as to express the lateral error caused by systematic drift. Assuming that the shoe position is the pedestrian position, the multi-modal position distribution can be fused into one using a Gaussian sum. The fused pedestrian position is then used as a measurement for each particle filter so that the position error is corrected. Experimental results show that the pedestrian position can be estimated effectively using only the inertial sensors attached to both shoes.
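The core fusion step described here — collapsing the two shoes' position distributions into one pedestrian position — can be sketched as follows. This is a minimal illustration under the assumption that each shoe's particle cloud is summarized by a Gaussian and the two are combined by inverse-covariance weighting; all names are hypothetical.

```python
import numpy as np

def particle_mean(particles, weights):
    """Weighted mean of a 2-D particle cloud (one shoe's PDR estimate)."""
    w = np.asarray(weights, dtype=float)
    w = w / np.sum(w)
    return np.asarray(particles, dtype=float).T @ w

def fuse_shoe_positions(mean_l, cov_l, mean_r, cov_r):
    """Collapse a two-component Gaussian sum (left/right shoe) into one
    Gaussian by inverse-covariance (information) weighting."""
    I_l, I_r = np.linalg.inv(cov_l), np.linalg.inv(cov_r)
    cov = np.linalg.inv(I_l + I_r)
    mean = cov @ (I_l @ mean_l + I_r @ mean_r)
    return mean, cov
```

The fused mean would then be fed back to each shoe's particle filter as a pseudo-measurement, which is what bounds the drift of the two otherwise independent PDR chains.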

A Tracking Algorithm for Autonomous Navigation of AGVs: Federated Information Filter

  • Kim, Yong-Shik; Hong, Keum-Shik
    • Journal of Navigation and Port Research, v.28 no.7, pp.635-640, 2004
  • In this paper, a tracking algorithm for the autonomous navigation of automated guided vehicles (AGVs) operating in container terminals is presented. The navigation algorithm takes the form of a federated information filter that detects other AGVs and avoids obstacles using information fused from multiple sensors. Being algebraically equivalent to the Kalman filter (KF), the information filter is extended to N-sensor distributed dynamic systems. In multi-sensor environments, an information-based filter is easier to decentralize, initialize, and fuse than a KF-based filter. It is proved that the information state and information matrix of the suggested filter, which are weighted by an information-sharing factor, are equal to those of a centralized information filter under regularity conditions. Numerical examples using Monte Carlo simulation are provided to compare the centralized information filter with the proposed one.
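The reason information-form filters decentralize so easily is that each sensor's contribution is additive: sensor i contributes an information state y_i = Hᵢᵀ Rᵢ⁻¹ zᵢ and an information matrix Yᵢ = Hᵢᵀ Rᵢ⁻¹ Hᵢ, and fusion is just summation. The sketch below shows that additive structure only; it omits the federated filter's information-sharing factors and time propagation, and all names are illustrative.

```python
import numpy as np

def local_information(z, H, R):
    """One sensor node's contribution in information form."""
    Ri = np.linalg.inv(R)
    return H.T @ Ri @ z, H.T @ Ri @ H

def fuse(prior_y, prior_Y, local_contributions):
    """Fuse by summing prior information with each node's contribution."""
    y, Y = prior_y.copy(), prior_Y.copy()
    for yi, Yi in local_contributions:
        y = y + yi
        Y = Y + Yi
    x = np.linalg.solve(Y, y)  # recover the fused state estimate
    return x, Y
```

With two direct measurements of the same scalar state, the fused estimate reduces to their information-weighted average, which is the algebraic equivalence with the centralized KF that the abstract refers to.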

Command Fusion for Navigation of Mobile Robots in Dynamic Environments with Objects

  • Jin, Taeseok
    • Journal of Information and Communication Convergence Engineering, v.11 no.1, pp.24-29, 2013
  • In this paper, we propose a fuzzy inference model for a navigation algorithm for a mobile robot that intelligently searches for the goal location in unknown dynamic environments. Our model uses sensor fusion based on situational commands from an ultrasonic sensor. Instead of the "physical sensor fusion" method, which generates the trajectory of a robot from the environment model and sensory data, a "command fusion" method is used to govern the robot's motions. The navigation strategy is based on a combination of fuzzy rules tuned for both goal approach and obstacle avoidance within a hierarchical behavior-based control architecture. To identify the environment, a command fusion technique is introduced in which the sensory data of the ultrasonic sensors and a vision sensor are fused into the identification process. The experimental results highlight interesting aspects of the goal-seeking, obstacle-avoiding, and decision-making processes that arise from the navigation interaction.
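Command fusion, as opposed to fusing raw sensor data, means each behavior proposes its own motion command and the commands are blended. A minimal sketch of that idea, assuming a triangular fuzzy membership on obstacle distance and two fixed behaviors (names, shapes, and gains are all invented for illustration):

```python
def tri(x, a, b, c):
    """Triangular fuzzy membership function peaking at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuse_commands(goal_heading_err, obstacle_dist):
    """Blend goal-approach and obstacle-avoidance steering commands.

    goal_heading_err: heading error to the goal (rad)
    obstacle_dist:    range to the nearest obstacle (m)
    """
    # Avoidance activation grows as the obstacle gets nearer than 1 m.
    w_avoid = tri(obstacle_dist, -1.0, 0.0, 1.0)
    w_goal = 1.0 - w_avoid
    cmd_goal = -0.5 * goal_heading_err   # steer toward the goal
    cmd_avoid = 0.8                      # steer away from the obstacle
    return (w_goal * cmd_goal + w_avoid * cmd_avoid) / (w_goal + w_avoid)
```

With the obstacle far away the fused command is pure goal-seeking; as the obstacle closes in, the avoidance command smoothly takes over — the hierarchical behavior arbitration the abstract describes, reduced to a weighted average.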

AGV Navigation Using a Space and Time Sensor Fusion of an Active Camera

  • Jin, Tae-Seok; Lee, Bong-Ki; Lee, Jang-Myung
    • Journal of Navigation and Port Research, v.27 no.3, pp.273-282, 2003
  • This paper proposes a sensor-fusion technique in which the data sets from previous moments are properly transformed and fused into the current data sets to enable accurate measurements, such as the distance to an obstacle and the location of the service robot itself. In conventional fusion schemes, the measurement depends only on the current data sets. As a result, more sensors are required to measure a given physical parameter or to improve the accuracy of the measurement. In this approach, instead of adding more sensors to the system, the temporal sequence of data sets is stored and utilized to improve the measurement. The theoretical basis is illustrated by examples, and the effectiveness is proved through simulation. Finally, the new space and time sensor fusion (STSF) scheme is applied to the control of a mobile robot in an indoor environment, and its performance is demonstrated by real experiments.
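The key transformation in a scheme like STSF is re-expressing an observation made at an earlier robot pose in the current robot frame, so that old and new readings of the same feature can be averaged. A toy sketch of that coordinate bookkeeping, assuming planar (x, y, θ) poses and a simple mean as the fusion rule (the paper's actual fusion law is not reproduced here):

```python
import numpy as np

def to_current_frame(point, pose_then, pose_now):
    """Re-express a 2-D point observed at pose_then in pose_now's frame.
    Poses are (x, y, theta) in the world frame."""
    x0, y0, th0 = pose_then
    x1, y1, th1 = pose_now
    # Local observation -> world coordinates.
    c0, s0 = np.cos(th0), np.sin(th0)
    wx = x0 + c0 * point[0] - s0 * point[1]
    wy = y0 + s0 * point[0] + c0 * point[1]
    # World coordinates -> current local frame.
    c1, s1 = np.cos(th1), np.sin(th1)
    dx, dy = wx - x1, wy - y1
    return np.array([c1 * dx + s1 * dy, -s1 * dx + c1 * dy])

def fuse_over_time(observations):
    """observations: list of (point_in_local_frame, pose) pairs, oldest
    first. Returns the mean of all points in the latest pose's frame."""
    _, pose_now = observations[-1]
    pts = [to_current_frame(p, pose, pose_now) for p, pose in observations]
    return np.mean(pts, axis=0)
```

This is the sense in which stored history substitutes for extra sensors: each past reading, once transformed, acts like one more simultaneous measurement of the same quantity.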

Landmark Detection Based on Sensor Fusion for Mobile Robot Navigation in a Varying Environment

  • Jin, Tae-Seok; Kim, Hyun-Sik; Kim, Jong-Wook
    • International Journal of Fuzzy Logic and Intelligent Systems, v.10 no.4, pp.281-286, 2010
  • We propose a space- and time-based sensor fusion method and a robust landmark-detection algorithm based on sensor fusion for mobile robot navigation. To fully utilize the information from the sensors, this paper first proposes a new sensor-fusion technique in which the data sets from previous moments are properly transformed and fused into the current data sets to enable accurate measurement. Exploration of an unknown environment is an important task for the new generation of mobile robots, which may navigate by means of a number of monitoring systems such as a sonar-sensing system or a visual-sensing system. The newly proposed STSF (Space and Time Sensor Fusion) scheme is applied to landmark recognition for mobile robot navigation in unstructured as well as structured environments, and the experimental results demonstrate the performance of the landmark recognition.

Implementation of underwater precise navigation system for a remotely operated mine disposal vehicle

  • Kim, Ki-Hun; Lee, Chong-Moo; Choi, Hyun-Taek; Lee, Pan-Mook
    • International Journal of Ocean System Engineering, v.1 no.2, pp.102-109, 2011
  • This paper describes the implementation of a precise underwater navigation solution using a multiple-sensor fusion technique based on USBL, GPS, DVL, and AHRS measurements for the operation of a remotely operated mine disposal vehicle (MDV). The estimation of accurate 6-DOF positions and attitudes is the key factor in executing dangerous and complicated missions. Two strategies are chosen to implement the precise underwater navigation. First, the sensor frames are aligned to the body frame to enhance the performance of the standalone dead-reckoning algorithm. Second, the absolute position measured by USBL is fused to prevent cumulative integration error. The heading alignment error is identified by comparing the measured absolute positions with the dead-reckoning results. The performance of the developed approach is evaluated with experimental data acquired by the MDV in a South Sea trial.
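The second strategy — bounding dead-reckoning drift with occasional absolute USBL fixes — can be illustrated with a toy planar model: DVL body-frame velocity is rotated by the AHRS heading and integrated, and whenever a USBL fix arrives it overrides the accumulated position. The reset policy and all names below are assumptions for illustration, not the paper's estimator.

```python
import numpy as np

def dead_reckon_step(pos, heading, v_body, dt):
    """Integrate body-frame velocity (u forward, v starboard) into a
    north-east position, using the AHRS heading."""
    u, v = v_body
    c, s = np.cos(heading), np.sin(heading)
    return pos + dt * np.array([c * u - s * v, s * u + c * v])

def navigate(pos0, steps):
    """steps: list of (heading, v_body, dt, usbl_fix_or_None).
    Returns the position track; a USBL fix resets accumulated drift."""
    pos = np.array(pos0, dtype=float)
    track = []
    for heading, v_body, dt, usbl in steps:
        pos = dead_reckon_step(pos, heading, v_body, dt)
        if usbl is not None:
            pos = np.asarray(usbl, dtype=float)  # absolute fix bounds drift
        track.append(pos.copy())
    return track
```

In a real implementation the fix would be blended through a filter rather than hard-reset, but the structure — high-rate DR prediction, low-rate absolute correction — is the same.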

Fuzzy Neural Network Based Sensor Fusion and Its Application to Mobile Robot in Intelligent Robotic Space

  • Jin, Tae-Seok; Lee, Min-Jung; Hashimoto, Hideki
    • International Journal of Fuzzy Logic and Intelligent Systems, v.6 no.4, pp.293-298, 2006
  • In this paper, a sensor-fusion-based robot navigation method for the autonomous control of a miniature human-interaction robot is presented. The navigation method blends the optimality of a Fuzzy Neural Network (FNN)-based control algorithm with the knowledge-representation and learning capabilities of the networked Intelligent Robotic Space (IRS). States of the robot and the IR space, for example the distance between the mobile robot and obstacles and the velocity of the mobile robot, are used as inputs to the fuzzy logic controller. The navigation strategy is based on a combination of fuzzy rules tuned for both goal approach and obstacle avoidance. To identify the environment, a sensor fusion technique is introduced in which the sensory data of ultrasonic sensors and a vision sensor are fused into the identification process. Preliminary experiments and results are shown to demonstrate the merit of the introduced navigation control algorithm.

Virtual Environment Building and Navigation of Mobile Robot using Command Fusion and Fuzzy Inference

  • Jin, Taeseok
    • Journal of the Korean Society of Industry Convergence, v.22 no.4, pp.427-433, 2019
  • This paper proposes a fuzzy inference model for map building and navigation for a mobile robot with an active camera, which intelligently navigates to the goal location in unknown environments using sensor fusion based on situational commands from the active camera sensor. Active cameras give a mobile robot the capability to estimate and track feature images over a hallway field of view. Instead of the "physical sensor fusion" method, which generates the trajectory of a robot from the environment model and sensory data, a command fusion method is used to govern the robot's navigation. The navigation strategy is based on a combination of fuzzy rules tuned for both goal approach and obstacle avoidance. To identify the environment, a command fusion technique is introduced in which the sensory data of the active camera sensor from the navigation experiments are fused into the identification process. Navigation performance improves on that achieved using fuzzy inference or command fusion alone. Experimental evidence is provided, demonstrating that the proposed method can be used reliably over a wide range of relative positions between the active camera and the feature images.

S-100 Drawing Instruction Analysis for S-101 ENC Development (S-101 전자해도 구현을 위한 S-100 Drawing Instruction 분석)

  • Kim, Youngjin; Park, Suhyun; Park, Daewon
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference, 2015.10a, pp.259-261, 2015
  • In order to provide safe and secure electronic navigational chart (ENC) information services, an e-Navigation system based on the S-100 standard should be established, fusing a variety of data such as AIS information, information on aids to navigation, maritime safety information, and weather information. To implement the S-101 ENC, a Rendering Engine must analyze the Drawing Instruction set generated by the Portrayal Engine of the S-100 General Portrayal Model; this analysis is a prerequisite for establishing the e-Navigation system. In this paper, we analyze the Drawing Instructions of the existing S-57 ENC and of the S-101 ENC as a basis for the implementation of the S-101 ENC.
