• Title/Summary/Keyword: Robot navigation

Search results: 825

AGV Navigation Using a Space and Time Sensor Fusion of an Active Camera

  • Jin, Tae-Seok;Lee, Bong-Ki;Lee, Jang-Myung
    • Journal of Navigation and Port Research / v.27 no.3 / pp.273-282 / 2003
  • This paper proposes a sensor-fusion technique in which the data sets from previous moments are properly transformed and fused into the current data sets to enable accurate measurement, such as the distance to an obstacle and the location of the service robot itself. In conventional fusion schemes, the measurement depends only on the current data sets. As a result, more sensors are required to measure a certain physical parameter or to improve the accuracy of the measurement. In this approach, however, instead of adding more sensors to the system, the temporal sequence of data sets is stored and utilized to improve the measurement. The theoretical basis is illustrated by examples, and the effectiveness is proved through simulation. Finally, the new space and time sensor fusion (STSF) scheme is applied to the control of a mobile robot in an indoor environment, and the performance is demonstrated through real experiments.
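The core idea of such temporal fusion (re-expressing past range readings in the current robot frame, then fusing them with the latest reading) can be sketched as follows. The helpers `transform_to_current` and `fuse` are hypothetical, assuming a static point obstacle and independent Gaussian noise; this is a minimal illustration, not the paper's STSF formulation.

```python
import math

def transform_to_current(prev_range, prev_pose, cur_pose):
    # Locate the (assumed static) obstacle in the world frame from the
    # previous pose (x, y, heading) and the range measured at that moment.
    ox = prev_pose[0] + prev_range * math.cos(prev_pose[2])
    oy = prev_pose[1] + prev_range * math.sin(prev_pose[2])
    # Re-express that obstacle as a range from the current pose.
    return math.hypot(ox - cur_pose[0], oy - cur_pose[1])

def fuse(ranges, variances):
    # Inverse-variance weighted average of several range estimates.
    weights = [1.0 / v for v in variances]
    return sum(r * w for r, w in zip(ranges, weights)) / sum(weights)

# A 3.0 m reading taken 0.5 m behind the current pose maps to 2.5 m,
# and is fused with the current 2.4 m reading of equal trust:
past = transform_to_current(3.0, (0.0, 0.0, 0.0), (0.5, 0.0, 0.0))
estimate = fuse([past, 2.4], [0.04, 0.04])
```

In this toy setting the old reading acts exactly like an extra sensor, which is the point of reusing the temporal sequence instead of adding hardware.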

Robust Vision-Based Autonomous Navigation Against Environment Changes (환경 변화에 강인한 비전 기반 로봇 자율 주행)

  • Kim, Jungho;Kweon, In So
    • IEMEK Journal of Embedded Systems and Applications / v.3 no.2 / pp.57-65 / 2008
  • Recently, much research on intelligent robots has been conducted. An intelligent robot is capable of recognizing environments or objects to autonomously perform specific tasks using sensor readings. One of the fundamental problems in vision-based robot applications is to recognize where the robot is and to decide on a safe path for autonomous navigation. However, previous approaches consider only well-organized environments with no moving objects or environment changes. In this paper, we introduce a novel navigation strategy that handles occlusions caused by moving objects using various computer vision techniques. Experimental results demonstrate the capability to overcome such difficulties during autonomous navigation.


A Correction System of Odometry Error for Map Building of Mobile Robot Based on Sensor Fusion

  • Hyun, Woong-Keun
    • Journal of information and communication convergence engineering / v.8 no.6 / pp.709-715 / 2010
  • This paper presents a map building and localization system for a mobile robot. Map building and navigation is a complex problem because map integrity cannot be sustained by odometry alone, due to errors introduced by wheel slippage, distortion, and the simplified linearized odometry equations. For accurate localization, we propose a sensor fusion system using an encoder and an indoor GPS module as the relative and absolute sensors, respectively. To build a map, we developed a sensor-based navigation algorithm and a grid-based map building algorithm on embedded Linux. A wall-following decision engine, similar to an expert system, is proposed for map-building navigation. We proved the system's validity through field tests.
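The relative/absolute pairing described here (encoder for dead reckoning, indoor GPS for drift correction) is the classic predict/update pattern. A minimal one-dimensional Kalman-style sketch, under assumed noise variances that are purely illustrative and not from the paper:

```python
def fuse_odometry_with_fix(x, p, u, q, z, r):
    # Predict: dead-reckon with the encoder increment u (noise variance q).
    x, p = x + u, p + q
    # Update: correct accumulated drift with the absolute fix z (variance r).
    k = p / (p + r)                 # Kalman gain
    return x + k * (z - x), (1.0 - k) * p

# Encoder reports 1.0 m of travel; the indoor GPS reads 0.9 m.
# With equal trust in both, the estimate lands halfway at 0.95 m:
x, p = fuse_odometry_with_fix(0.0, 0.0, 1.0, 0.01, 0.9, 0.01)
```

Because odometry error grows without bound while the absolute fix does not, repeated cycles of this loop keep the variance `p` bounded, which is what preserves map integrity.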

Loosely Coupled LiDAR-visual Mapping and Navigation of AMR in Logistic Environments (실내 물류 환경에서 라이다-카메라 약결합 기반 맵핑 및 위치인식과 네비게이션 방법)

  • Choi, Byunghee;Kang, Gyeongsu;Roh, Yejin;Cho, Younggun
    • The Journal of Korea Robotics Society / v.17 no.4 / pp.397-406 / 2022
  • This paper presents an autonomous mobile robot (AMR) system and operation algorithms for logistics and factory facilities without magnetic guide-line installation. Unlike widely used AMR systems, we propose an EKF-based loosely coupled fusion of LiDAR measurements and visual markers. Our method first constructs an occupancy grid and a visual marker map in the mapping process and utilizes the prebuilt maps for precise localization. We also developed a waypoint-based navigation pipeline for robust autonomous operation in unconstrained environments. The proposed system estimates the robot pose by updating the state with the fusion of visual marker and LiDAR measurements. Finally, we tested the proposed method in indoor environments and existing factory facilities. The experimental results compare the performance of our system with a well-known LiDAR-based localization and navigation system.
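"Loosely coupled" means each sensor pipeline produces its own pose estimate, and the filter consumes those estimates as independent measurements rather than fusing raw data. A scalar sketch of that update pattern, with made-up numbers and variances (the paper's actual EKF operates on full poses):

```python
def loosely_coupled_update(x, p, measurements):
    # Each sensor contributes its own pose estimate z with variance r;
    # a loosely coupled filter applies them as independent scalar updates.
    for z, r in measurements:
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1.0 - k) * p
    return x, p

# LiDAR localization (trusted, r=0.1) and a visual marker (noisier, r=0.4)
# both correct an uncertain prior of 2.1 m:
pose, var = loosely_coupled_update(2.1, 1.0, [(2.0, 0.1), (2.2, 0.4)])
```

The order of updates does not change the final estimate here, which is what makes loose coupling convenient when sensors report at different rates.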

Development of a new omnidirectional robot with one spherical wheel (하나의 구형바퀴를 가지는 새로운 전 방향 이동로보트의 개발)

  • 최병준;이연정
    • 제어로봇시스템학회:학술대회논문집 / 1997.10a / pp.1605-1608 / 1997
  • In this paper, a new omnidirectional robot with one spherical wheel is proposed. The peculiar structure of the proposed mobile robot makes it possible not only to move sideways but also to be easy to implement. The wheel is driven by two stepping motors and equipped with eight infrared sensors. To prove the validity of the proposed robot, a path-traversal experiment was performed.


NAVIGATION CONTROL OF A MOBILE ROBOT (이동로보트의 궤도관제기법)

  • 홍문성;이상용;한민용
    • 제어로봇시스템학회:학술대회논문집 / 1989.10a / pp.226-229 / 1989
  • This paper presents a navigation control method for a vision-guided robot. The robot is equipped with one camera, an IBM/AT-compatible PC, and a sonar system. The robot can either follow a track specified on a monitor screen or navigate to a destination while avoiding any obstacles in its way. The robot finds its current position as well as its moving direction by taking an image of a circular pattern placed on the ceiling.


Vision Navigation System by Autonomous Mobile Robot

  • Shin, S.Y.;Lee, J.H.;Kang, H.
    • 제어로봇시스템학회:학술대회논문집 / 2001.10a / pp.146.3-146 / 2001
  • This work has been integrated into several navigation systems. This paper shows that the system recognizes difficult indoor roads and open areas without any specific marks such as painted guide lines or tape. In this method, the robot navigates with visual sensors, using visual information to guide itself along the road. An artificial neural network system is used to decide where to move. The system is designed with a USB web camera as the visual sensor.

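The decision step (mapping what the camera sees to a steering command) can be illustrated with a deliberately simple hand-written rule in place of the paper's trained neural network: split a binary free-space mask into thirds and steer toward the freest one. The mask values and the three-way split are illustrative assumptions.

```python
def steering_decision(free_mask):
    # free_mask: one list of 0/1 values per image row (1 = drivable pixel).
    # Split columns into left / center / right thirds and count free pixels.
    cols = len(free_mask[0])
    third = cols // 3
    votes = []
    for lo in (0, third, 2 * third):
        votes.append(sum(row[c] for row in free_mask for c in range(lo, lo + third)))
    # Steer toward the third with the most free space.
    return ("left", "straight", "right")[votes.index(max(votes))]

# A mask whose center columns are mostly free keeps the robot on the road:
mask = [[0, 0, 0, 1, 1, 1, 1, 0, 0],
        [0, 1, 0, 1, 1, 1, 0, 0, 0]]
direction = steering_decision(mask)
```

A trained network replaces this fixed vote with learned weights, which is what lets the system cope with roads that have no painted guide line or tape.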

Absolute Positioning System for Mobile Robot Navigation in an Indoor Environment (ICCAS 2004)

  • Yun, Jae-Mu;Park, Jin-Woo;Choi, Ho-Seek;Lee, Jang-Myung
    • 제어로봇시스템학회:학술대회논문집 / 2004.08a / pp.1448-1451 / 2004
  • Position estimation is one of the most important functions for a mobile robot navigating in an unstructured environment. Most previous localization schemes estimate the current position and pose of a mobile robot by applying various localization algorithms to information obtained from sensors mounted on the robot, or by recognizing an artificial landmark attached to a wall, or objects of the environment as natural landmarks, in the indoor environment. Several drawbacks of these schemes have been pointed out. To compensate for them, a new localization method is proposed that estimates the absolute position of the mobile robot by using a camera fixed on the ceiling of a corridor. The method also improves the success rate of position estimation by calculating the real size of an object. This scheme is not a relative localization, which decreases the position error through algorithms on noisy sensor data, but a kind of absolute localization. The effectiveness of the proposed localization scheme is demonstrated through experiments.

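The "real size of an object" step amounts to recovering a meters-per-pixel scale from a reference of known size, then mapping pixel offsets from the fixed ceiling camera to floor coordinates. A minimal sketch assuming the camera looks straight down and the floor is flat; all numbers are illustrative:

```python
def scale_from_known_object(real_size_m, pixel_extent):
    # Meters-per-pixel recovered from an object of known real size.
    return real_size_m / pixel_extent

def pixel_to_world(px, py, center, scale):
    # Map a pixel seen by the fixed ceiling camera to floor coordinates
    # relative to the point directly below the image center.
    return ((px - center[0]) * scale, (py - center[1]) * scale)

# A 0.5 m reference object spans 100 px, so 1 px covers 5 mm of floor;
# a robot detected 100 px right of the image center is then 0.5 m away:
s = scale_from_known_object(0.5, 100)
position = pixel_to_world(420, 240, (320, 240), s)
```

Because the camera is fixed in the environment rather than on the robot, each such fix is absolute: errors do not accumulate the way they do in relative, onboard-sensor localization.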