• Title/Abstract/Keyword: Navigation Sensor

Search results: 1,003 items (processing time: 0.032 s)

무인잠수정의 수중항법을 위한 센서융합 (Sensor Fusion for Underwater Navigation of Unmanned Underwater Vehicle)

  • 서주노
    • 한국군사과학기술학회지
    • /
    • Vol. 8 No. 4
    • /
    • pp.14-23
    • /
    • 2005
  • In this paper we propose a sensor fusion method for a navigation algorithm that estimates state vectors such as position and velocity for motion control from multi-sensor output measurements. The measurements used to estimate the state are a series of known, asynchronous multi-sensor outputs corrupted by measurement noise. This paper investigates an Extended Kalman Filtering method that merges asynchronous heading, heading rate, DVL velocity, and SSBL information into a single state vector. Kalman filters of different complexity, with biases and measurement noise, are investigated using theoretical data from MOERI's SAUV. All levels of filter complexity are shown to follow the real trajectories much more closely and smoothly than the basic underwater acoustic navigation system commonly used aboard underwater vehicles.
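The filter described above merges asynchronous heading, heading-rate, DVL velocity, and SSBL measurements into one state estimate. A common way to handle the asynchrony is to propagate the filter to each measurement's timestamp and then apply only that sensor's measurement model. The sketch below illustrates this pattern with a simplified linear filter; the state layout, process model, noise values, and sensor names are assumptions for illustration, not the paper's EKF.

```python
import numpy as np

# State: [x, y, vx, vy, psi] (position, velocity, heading) -- illustrative layout
def predict(x, P, dt, q=0.1):
    F = np.eye(5)
    F[0, 2] = F[1, 3] = dt                      # constant-velocity propagation
    Q = q * dt * np.eye(5)                      # simplistic process noise
    return F @ x, F @ P @ F.T + Q

def update(x, P, z, H, R):
    S = H @ P @ H.T + R                         # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)              # Kalman gain
    x = x + K @ (z - H @ x)
    P = (np.eye(len(x)) - K @ H) @ P
    return x, P

# Each sensor has its own (asynchronous) timestamps and measurement model.
H_SSBL = np.array([[1, 0, 0, 0, 0], [0, 1, 0, 0, 0]], float)   # position fix
H_DVL  = np.array([[0, 0, 1, 0, 0], [0, 0, 0, 1, 0]], float)   # velocity
H_HDG  = np.array([[0, 0, 0, 0, 1]], float)                    # heading

def fuse(measurements, x0, P0):
    """measurements: time-sorted list of (t, sensor_name, z, R)."""
    x, P, t_last = x0, P0, measurements[0][0]
    H = {"ssbl": H_SSBL, "dvl": H_DVL, "heading": H_HDG}
    for t, name, z, R in measurements:
        x, P = predict(x, P, t - t_last)        # propagate to the measurement time
        x, P = update(x, P, z, H[name], R)      # apply only this sensor's model
        t_last = t
    return x, P
```

Because each measurement is consumed in time order, sensors running at different rates need no special treatment beyond their own H and R.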

Virtual Environment Building and Navigation of Mobile Robot using Command Fusion and Fuzzy Inference

  • Jin, Taeseok
    • 한국산업융합학회 논문집
    • /
    • Vol. 22 No. 4
    • /
    • pp.427-433
    • /
    • 2019
  • This paper proposes a fuzzy inference model for map building and navigation of a mobile robot with an active camera, which navigates intelligently to the goal location in unknown environments using sensor fusion and situational commands based on the active camera sensor. Active cameras provide a mobile robot with the capability to estimate and track feature images over a hallway field of view. In this paper, instead of a "physical sensor fusion" method, which generates the trajectory of the robot from an environment model and sensory data, a command fusion method is used to govern the robot navigation. The navigation strategy is based on a combination of fuzzy rules tuned for both goal approach and obstacle avoidance. To identify the environments, a command fusion technique is introduced in which the sensory data of the active camera sensor are fused into the identification process. Navigation performance improves on that achieved using fuzzy inference alone and shows significant advantages over conventional command fusion techniques. Experimental evidence is provided, demonstrating that the proposed method can be used reliably over a wide range of relative positions between the active camera and the feature images.
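Command fusion of the kind described above blends the steering commands of separate behaviours (goal approach, obstacle avoidance) using fuzzy weights, instead of fusing raw sensor data into a single world model. The sketch below is a generic illustration; the membership shapes, command laws, and parameter values are assumptions, not the paper's tuned rule base.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fused_steering(goal_bearing_deg, obstacle_dist_m, obstacle_bearing_deg):
    # Behaviour commands (assumed simple laws for illustration)
    cmd_goal = np.clip(goal_bearing_deg, -30, 30)        # steer toward the goal
    cmd_avoid = -np.sign(obstacle_bearing_deg) * 30.0    # steer away from the obstacle

    # Fuzzy weights: "obstacle near" vs "obstacle far" (assumed membership shapes)
    w_avoid = tri(obstacle_dist_m, -1.0, 0.0, 1.5)       # high when the obstacle is close
    w_goal = 1.0 - w_avoid

    # Command fusion: weighted (defuzzified) blend of the behaviour commands
    return (w_goal * cmd_goal + w_avoid * cmd_avoid) / (w_goal + w_avoid)

# Example: goal 20 deg to the right, obstacle 0.5 m away slightly to the left
print(fused_steering(goal_bearing_deg=20.0, obstacle_dist_m=0.5, obstacle_bearing_deg=-10.0))
```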

SDINS에서 의사 자이로 바이어스 보상 기법 (Compensation of Pseudo Gyro Bias in SDINS)

  • 박정민
    • Journal of Positioning, Navigation, and Timing
    • /
    • Vol. 13 No. 2
    • /
    • pp.179-187
    • /
    • 2024
  • The performance of a Strapdown Inertial Navigation System (SDINS) relies heavily on the accuracy of sensor error calibration. Systematic calibration is usually employed when only a 2-axis turntable is available. For systematic calibration, the body frame is commonly defined with respect to sensor axes for ease of computation. The drawback of this approach is that sensor axes may undergo time-varying deflection under temperature change, causing pseudo gyro bias. The effect of pseudo gyro bias on navigation performance is negligible for low grade navigation systems. However, for higher grade systems undergoing rapid temperature change, the error is no longer negligible. This paper describes in detail conditions leading to the presence of pseudo gyro bias, and proposes two techniques for mitigating the error. Experimental results show that applying these techniques improves navigation performance for precision SDINS, especially under rapid temperature change.
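As a rough illustration of the mechanism described above (the small-angle model below is an assumption made for exposition, not the paper's derivation): if the body frame is tied to the sensor axes fixed at calibration, a temperature-induced deflection $\delta\boldsymbol{\psi}(t)$ of those axes leaves a residual rate error that, for a nearly stationary vehicle rotating only with the Earth, is slowly varying and therefore indistinguishable from a gyro bias.

```latex
% Illustrative small-angle model (assumed, not the paper's exact formulation)
\tilde{\boldsymbol{\omega}}_b
  \;=\; \bigl(\mathbf{I} + [\,\delta\boldsymbol{\psi}(t)\times]\bigr)\,\boldsymbol{\omega}_b
  \;\approx\; \boldsymbol{\omega}_b
  + \underbrace{\delta\boldsymbol{\psi}(t)\times\boldsymbol{\omega}_{ie}}_{\text{pseudo gyro bias}}
  \qquad\text{when } \boldsymbol{\omega}_b \approx \boldsymbol{\omega}_{ie}.
```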

확률론에 기반한 점자블록 추종 알고리즘 및 센서장치의 개발 (Development of Sensor Device and Probability-based Algorithm for Braille-block Tracking)

  • 노치원;이성하;강성철;홍석교
    • 제어로봇시스템학회논문지
    • /
    • Vol. 13 No. 3
    • /
    • pp.249-255
    • /
    • 2007
  • In a fire situation, it is difficult for a rescue robot to use sensors such as a vision sensor, ultrasonic sensor, or laser distance sensor, because dense smoke diffuses, refracts, or blocks light and sound. However, the braille blocks installed for the visually impaired at public places such as subway stations can be used as a map for an autonomous mobile robot's localization and navigation. In this paper, we developed a laser sensor scan device which can detect braille blocks in spite of dense smoke, and integrated the device into the robot developed at KIST to carry out rescue missions in various hazardous disaster areas. We implemented an MCL algorithm for the robot's pose estimation from the scanned data, transformed the braille block map into a topological map, and designed a nonlinear path tracking controller for autonomous navigation. Through various simulations and experiments, we verified that the developed laser sensor device and the proposed localization method are effective for autonomous tracking of braille blocks, and that the autonomous navigation robot system can be used for rescue under fire.
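Monte Carlo Localization (MCL), used above to estimate the robot's pose against the braille-block map, keeps a set of weighted pose particles that are moved by odometry and re-weighted by how well each particle explains the scanned block positions. The following is a generic, simplified sketch; the motion noise, likelihood model, and `expected_z` map lookup are placeholders, not the implementation on the KIST robot.

```python
import numpy as np

rng = np.random.default_rng(0)

def motion_update(particles, d, dtheta, noise=(0.02, 0.01)):
    """Move each particle [x, y, theta] by odometry (d, dtheta) plus noise."""
    n = len(particles)
    d_n = d + rng.normal(0, noise[0], n)
    th = particles[:, 2] + dtheta + rng.normal(0, noise[1], n)
    particles[:, 0] += d_n * np.cos(th)
    particles[:, 1] += d_n * np.sin(th)
    particles[:, 2] = th
    return particles

def measurement_update(particles, weights, z, expected_z, sigma=0.05):
    """Re-weight particles by the likelihood of the observed block offset z."""
    err = z - expected_z(particles)           # expected_z: per-particle map lookup
    weights *= np.exp(-0.5 * (err / sigma) ** 2)
    weights /= weights.sum()
    # Resample (multinomial) when the effective sample size collapses
    if 1.0 / np.sum(weights ** 2) < 0.5 * len(weights):
        idx = rng.choice(len(weights), size=len(weights), p=weights)
        particles = particles[idx]
        weights = np.full(len(weights), 1.0 / len(weights))
    return particles, weights
```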

An analysis on the Earth geoid surface variation effect for use of the tilt sensor in celestial navigation system

  • Suk, Byong-Suk;Yoon, Jae-Cheol;Lyou, Joon
    • 제어로봇시스템학회:학술대회논문집
    • /
    • ICCAS 2005
    • /
    • pp.1867-1870
    • /
    • 2005
  • Celestial navigation is one of the alternatives to GPS and can be used as a GPS backup. In a celestial navigation system using more than two star trackers, the vehicle's ground position can be solved from the star trackers' attitude information if the vehicle's local vertical or horizontal angle is given. In order to determine the ground position of a flight vehicle accurately, a highly accurate local vertical angle measurement is one of the most important factors for navigation performance. In this paper, the Earth's geophysical deflection is analyzed under the assumption that a modern electrolytic tilt sensor is used as the local vertical sensor of the celestial navigation system. According to the tilt sensor principle, the sensor measures the tilt angle from the gravity direction, which depends on the Earth geoid surface at the given position. In order to determine the local vertical angle from the tilt sensor measurement, the relationship between the direction of gravity and the direction of the Earth center should be analyzed. Using precision orbit determination software that includes the JGM-3 Earth geoid model, the direction of the Earth center and the direction of gravity are extracted and analyzed. Applying the vector inner product and cross product to the two extracted vectors, the magnitude and phase of the deflection angle between the direction of gravity and the direction of the Earth center are obtained. The results show that the angle difference varies as a function of latitude and altitude; the maximum difference of 0.094$^{\circ}$ occurs at 45$^{\circ}$ latitude for the 1000 km altitude case.
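The deflection between the gravity direction and the Earth-center direction described above follows directly from the vector inner and cross products. The sketch below shows the computation for two placeholder unit vectors; the example directions are illustrative, not JGM-3 output.

```python
import numpy as np

def deflection(u_gravity, u_center):
    """Angle (deg) between two direction vectors, plus a phase in the x-y plane."""
    u = u_gravity / np.linalg.norm(u_gravity)
    v = u_center / np.linalg.norm(u_center)
    c = np.cross(u, v)
    # atan2(|u x v|, u . v) is numerically robust for small angles
    angle = np.degrees(np.arctan2(np.linalg.norm(c), np.dot(u, v)))
    phase = np.degrees(np.arctan2(c[1], c[0]))   # orientation of the deflection
    return angle, phase

# Placeholder directions roughly 0.1 deg apart
print(deflection(np.array([0.0, 0.001745, -1.0]), np.array([0.0, 0.0, -1.0])))
```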

농업기계 내비게이션을 위한 INS/GPS 통합 연구 (Study on INS/GPS Sensor Fusion for Agricultural Vehicle Navigation System)

  • 노광모;박준걸;장영창
    • Journal of Biosystems Engineering
    • /
    • Vol. 33 No. 6
    • /
    • pp.423-429
    • /
    • 2008
  • This study investigated the effect of inertial navigation system (INS) / global positioning system (GPS) sensor fusion on agricultural vehicle navigation. An extended Kalman filter algorithm was adopted for INS/GPS sensor fusion in an integrated mode, and a vehicle dynamic model was used instead of a navigation state error model. The INS/GPS system consisted of a low-cost gyroscope, an odometer, and a GPS receiver, and its performance was tested through computer simulations. When the measurement noise of the GPS receiver was 10, 1.0, 0.5, and 0.2 m ($1\sigma$), the RMS position and heading errors of the INS/GPS system on a 5 m/s straight path were remarkably reduced, to 10%, 35%, 40%, and 60% of those obtained with the GPS receiver alone, respectively. The decrease of position and heading errors obtained by using INS/GPS rather than stand-alone GPS can provide more stable steering of agricultural equipment. Therefore, the low-cost INS/GPS system using the extended Kalman filter algorithm may enable autonomous navigation that meets required performance, such as stable steering and smaller position errors, even in slow-speed operation.
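The integrated filter described above predicts with a vehicle model driven by the gyro and odometer and corrects with GPS position fixes. Below is a minimal sketch using a unicycle-type model and assumed noise parameters; it is not the authors' vehicle dynamic model.

```python
import numpy as np

def ekf_predict(x, P, v_odo, w_gyro, dt, q_v=0.05, q_w=0.01):
    """State x = [E, N, psi]; predict with odometer speed and gyro yaw rate."""
    E, N, psi = x
    x_pred = np.array([E + v_odo * dt * np.cos(psi),
                       N + v_odo * dt * np.sin(psi),
                       psi + w_gyro * dt])
    F = np.array([[1, 0, -v_odo * dt * np.sin(psi)],
                  [0, 1,  v_odo * dt * np.cos(psi)],
                  [0, 0,  1]])                       # Jacobian of the motion model
    G = np.array([[dt * np.cos(psi), 0],
                  [dt * np.sin(psi), 0],
                  [0, dt]])
    Q = G @ np.diag([q_v**2, q_w**2]) @ G.T          # process noise mapped through inputs
    return x_pred, F @ P @ F.T + Q

def ekf_gps_update(x, P, z_gps, r_gps=1.0):
    """Update with a GPS East/North position fix."""
    H = np.array([[1.0, 0, 0], [0, 1.0, 0]])
    R = (r_gps**2) * np.eye(2)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z_gps - H @ x)
    P = (np.eye(3) - K @ H) @ P
    return x, P
```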

Sensor Data Fusion for Navigation of Mobile Robot With Collision Avoidance and Trap Recovery

  • Jeon, Young-Su;Ahn, Byeong-Kyu;Kuc, Tae-Yong
    • 제어로봇시스템학회:학술대회논문집
    • /
    • ICCAS 2003
    • /
    • pp.2461-2466
    • /
    • 2003
  • This paper presents a simple sensor fusion algorithm using a neural network for navigation of mobile robots with obstacle avoidance and trap recovery. The multiple sensors feed their data to the input layer of the neural network, activating the input nodes. The sensors used include optical encoders, ultrasonic sensors, infrared sensors, a magnetic compass sensor, and GPS sensors. The proposed sensor fusion algorithm is combined with the VFH (Vector Field Histogram) algorithm for obstacle avoidance and the AGPM (Adaptive Goal Perturbation Method), which sets adaptive virtual goals to escape trap situations. The experimental results show that the proposed low-level fusion algorithm is effective for real-time navigation of a mobile robot.
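One ingredient above, the Adaptive Goal Perturbation Method (AGPM), sets a temporary virtual goal when the robot is judged to be trapped, for example in a local minimum where it stops making progress. The sketch below illustrates that idea generically; the trap test and the perturbation rule are assumptions, not the AGPM formulation of the paper.

```python
import numpy as np

def is_trapped(dist_to_goal_window, eps=0.05):
    """Trap heuristic: almost no progress toward the goal over the recent window."""
    return len(dist_to_goal_window) >= 20 and \
        (dist_to_goal_window[0] - dist_to_goal_window[-1]) < eps

def virtual_goal(robot_xy, goal_xy, attempt, radius=1.0):
    """Perturb the goal sideways; widen the perturbation with each failed attempt."""
    to_goal = np.asarray(goal_xy) - np.asarray(robot_xy)
    heading = np.arctan2(to_goal[1], to_goal[0])
    side = 1 if attempt % 2 == 0 else -1                 # alternate left/right
    offset_angle = heading + side * np.pi / 2
    scale = radius * (1 + attempt // 2)                  # adapt the offset magnitude
    return np.asarray(goal_xy) + scale * np.array([np.cos(offset_angle),
                                                   np.sin(offset_angle)])

# Usage: if is_trapped(window): goal = virtual_goal(pose_xy, true_goal, attempt)
```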

AGV Navigation Using a Space and Time Sensor Fusion of an Active Camera

  • Jin, Tae-Seok;Lee, Bong-Ki;Lee, Jang-Myung
    • 한국항해항만학회지
    • /
    • Vol. 27 No. 3
    • /
    • pp.273-282
    • /
    • 2003
  • This paper proposes a sensor-fusion technique in which the data sets from previous moments are properly transformed and fused into the current data sets to enable accurate measurement of quantities such as the distance to an obstacle and the location of the service robot itself. In conventional fusion schemes, the measurement depends only on the current data sets; as a result, more sensors are required to measure a certain physical parameter or to improve the accuracy of the measurement. In this approach, instead of adding more sensors to the system, the temporal sequence of the data sets is stored and utilized to improve the measurement. The theoretical basis is illustrated by examples, and the effectiveness is proved through simulation. Finally, the new space and time sensor fusion (STSF) scheme is applied to the control of a mobile robot in an indoor environment, and the performance is demonstrated by real experiments.
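The core idea above is to carry earlier measurements forward in time: an observation taken in the robot frame at the previous instant is transformed through the robot's own motion into the current frame and fused with the current observation, as if an additional sensor were available. A minimal 2-D sketch under an assumed rigid-body motion and a simple variance-weighted fusion:

```python
import numpy as np

def transform_to_current(point_prev, dx, dy, dtheta):
    """Re-express a point observed in the previous robot frame in the current frame,
    given the robot motion (dx, dy, dtheta) expressed in the previous frame."""
    c, s = np.cos(dtheta), np.sin(dtheta)
    R = np.array([[c, -s], [s, c]])
    # Inverse motion: remove the translation, then rotate by -dtheta
    return R.T @ (np.asarray(point_prev) - np.array([dx, dy]))

def fuse(current_obs, propagated_obs, var_cur=0.04, var_prop=0.09):
    """Variance-weighted average; the propagated (older) observation gets a larger variance."""
    w_cur, w_prop = 1.0 / var_cur, 1.0 / var_prop
    return (w_cur * np.asarray(current_obs) +
            w_prop * np.asarray(propagated_obs)) / (w_cur + w_prop)

prev_in_current = transform_to_current([2.0, 0.5], dx=0.1, dy=0.0, dtheta=0.05)
print(fuse([1.92, 0.40], prev_in_current))
```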

Fuzzy Neural Network Based Sensor Fusion and It's Application to Mobile Robot in Intelligent Robotic Space

  • Jin, Tae-Seok;Lee, Min-Jung;Hashimoto, Hideki
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • Vol. 6 No. 4
    • /
    • pp.293-298
    • /
    • 2006
  • In this paper, a sensor fusion based robot navigation method for the autonomous control of a miniature human interaction robot is presented. The navigation method blends the optimality of a Fuzzy Neural Network (FNN) based control algorithm with the knowledge representation and learning capabilities of the networked Intelligent Robotic Space (IRS). States of the robot and the IR space, for example the distance between the mobile robot and obstacles and the velocity of the mobile robot, are used as the inputs of the fuzzy logic controller. The navigation strategy is based on a combination of fuzzy rules tuned for both goal approach and obstacle avoidance. To identify the environments, a sensor fusion technique is introduced in which the sensory data of ultrasonic sensors and a vision sensor are fused into the identification process. Preliminary experiments and results are shown to demonstrate the merit of the introduced navigation control algorithm.
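As a small illustration of input-level fusion of the kind described above (the confidence weights are assumptions, not the paper's FNN-based identification), the ultrasonic and vision readings can be combined into a single obstacle-distance input for the fuzzy controller:

```python
def fuse_distance(d_ultrasonic, d_vision, conf_ultrasonic=0.7, conf_vision=0.3):
    """Confidence-weighted obstacle distance used as a fuzzy controller input."""
    return (conf_ultrasonic * d_ultrasonic + conf_vision * d_vision) / \
        (conf_ultrasonic + conf_vision)

# Fuzzy sets such as "near"/"far" would then be evaluated on fuse_distance(0.8, 1.0)
print(fuse_distance(0.8, 1.0))
```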

Position DOP Analysis for Sensor Placement in the TDOA-based Localization System

  • Lim, Deok-Won;Kang, Hee-Won;Lee, Sang-Jeong;Hwang, Dong-Hwan
    • Journal of Electrical Engineering and Technology
    • /
    • Vol. 7 No. 6
    • /
    • pp.1009-1013
    • /
    • 2012
  • A relationship between the sensor placement and the PDOP (Position Dilution of Precision) is derived for the TDOA-based localization system, and the geometric condition on the sensor placement that minimizes the PDOP is analyzed based on the derived relationship. Through computer simulations, the effect of the sensor placement on the PDOP is observed.
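For reference, one common way to evaluate the PDOP of a TDOA fix for a candidate sensor placement (the choice of sensor 0 as the range-difference reference and the example geometry below are assumptions, not taken from the paper) is to form the geometry matrix from differences of unit line-of-sight vectors and take $\sqrt{\mathrm{trace}\,(H^{T}H)^{-1}}$:

```python
import numpy as np

def tdoa_pdop(sensors, target):
    """PDOP of a TDOA fix at `target` for the given sensor positions.
    sensors: (N, 3) array; sensor 0 is the reference for the range differences."""
    sensors = np.asarray(sensors, float)
    target = np.asarray(target, float)
    u = sensors - target
    u /= np.linalg.norm(u, axis=1, keepdims=True)   # unit vectors target -> sensor
    H = u[1:] - u[0]                                # rows: gradients of the range differences
    dop = np.linalg.inv(H.T @ H)
    return float(np.sqrt(np.trace(dop)))

# Example: four sensors at varying heights around the target (non-degenerate geometry)
sensors = [[0, 0, 0], [100, 0, 30], [0, 100, 30], [100, 100, 0]]
print(tdoa_pdop(sensors, target=[50, 50, 10]))
```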