• Title/Abstract/Keyword: Vision-Based Navigation

Search results: 192

Development of an IGVM Integrated Navigation System for Vehicular Lane-Level Guidance Services

  • Cho, Seong Yun
    • Journal of Positioning, Navigation, and Timing / Vol. 5, No. 3 / pp.119-129 / 2016
  • This paper presents an integrated navigation system for accurate navigation-solution-based safety and convenience services in a vehicular augmented reality (AR) head-up display (HUD) system. An accurate navigation system is essential, in particular, for lane-level guidance services. To achieve this, an inertial navigation system (INS)/global positioning system (GPS)/vision/digital map (IGVM) integrated navigation system has been developed. This paper introduces the concept of the integrated navigation system and implements it based on a multi-model switching filter and on the vehicle status decided using GPS data and inertial measurement unit (IMU) measurements. The performance of the implemented navigation system is verified experimentally.
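The multi-model switching described in the abstract can be sketched as a status classifier that picks a filter model from GPS speed and IMU yaw rate. The thresholds, status names, and model table below are illustrative assumptions, not the paper's actual design.

```python
def vehicle_status(speed_mps, yaw_rate_dps, stop_thresh=0.2, turn_thresh=5.0):
    """Classify vehicle motion from GPS speed and IMU yaw rate (sketch)."""
    if speed_mps < stop_thresh:
        return "stationary"   # e.g. switch to a zero-velocity-update model
    if abs(yaw_rate_dps) > turn_thresh:
        return "turning"      # vision/map measurements may be down-weighted
    return "straight"         # full IGVM fusion

# Hypothetical mapping from vehicle status to the active filter model.
MODEL_BY_STATUS = {
    "stationary": "INS + ZUPT",
    "straight": "INS/GPS/vision/map",
    "turning": "INS/GPS",
}
```

The point of the pattern is that the filter bank, not one fixed model, is matched to the vehicle's current dynamics.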

Integrated Navigation Design Using a Gimbaled Vision/LiDAR System with an Approximate Ground Description Model

  • Yun, Sukchang;Lee, Young Jae;Kim, Chang Joo;Sung, Sangkyung
    • International Journal of Aeronautical and Space Sciences / Vol. 14, No. 4 / pp.369-378 / 2013
  • This paper presents a vision/LiDAR integrated navigation system that provides accurate relative navigation performance on a general ground surface in GNSS-denied environments. The ground surface considered during flight is approximated as a piecewise continuous model with flat and sloped surface profiles. The presented system consists of a strapdown IMU and an aided-sensor block comprising a vision sensor and a LiDAR on a stabilized gimbal platform. Two-dimensional optical flow vectors from the vision sensor and range information from the LiDAR to the ground are used to overcome the performance limit of the tactical-grade inertial navigation solution without a GNSS signal. In the filter realization, the INS error model is employed, with measurement vectors containing two-dimensional velocity errors and one differenced altitude in the navigation frame. In computing the altitude difference, the ground slope angle is estimated in a novel way through two bisectional LiDAR signals, under a practical assumption representing a general ground profile. Finally, the overall integrated system is implemented within the extended Kalman filter framework, and its performance is demonstrated through a simulation study with an aircraft flight trajectory scenario.
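Estimating a ground slope from two LiDAR ranges reduces to simple geometry: each beam's range and angle from the local vertical give a ground hit point, and the slope is the angle of the line through the two hits. This is an illustrative reconstruction of the idea, not the paper's exact formulation.

```python
import math

def ground_slope(r1, theta1, r2, theta2):
    """Slope angle (rad) of the line through two LiDAR ground hits.

    r1, r2: measured ranges; theta1, theta2: beam angles from the
    local vertical (rad). A level sensor platform is assumed.
    """
    # Project each range/angle pair to a ground point (x, z), z down-negative.
    x1, z1 = r1 * math.sin(theta1), -r1 * math.cos(theta1)
    x2, z2 = r2 * math.sin(theta2), -r2 * math.cos(theta2)
    # Slope is the inclination of the segment joining the two hits.
    return math.atan2(z2 - z1, x2 - x1)
```

Flat ground at any altitude yields a zero slope, since both hits share the same height; a differenced altitude measurement can then be corrected by this angle.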

영상기반항법을 위한 파티클 필터 기반의 특징점 추적 필터 설계 (Particle Filter Based Feature Points Tracking for Vision Based Navigation System)

  • 원대희;성상경;이영재
    • 한국항공우주학회지 / Vol. 40, No. 1 / pp.35-42 / 2012
  • This paper designs a particle-filter-based feature point tracking filter for vision-based navigation that maintains tracking performance even when the displacement of feature points is large. A dynamic model of the feature points is applied to improve the tracking performance of the conventional KLT (Kanade-Lucas-Tomasi) algorithm for large displacements, and a particle filter is used to reflect the irregular characteristics of image information. Comparing feature point tracking performance against the KLT algorithm on stored images confirms that the proposed algorithm maintains tracking even for large displacements.
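A particle filter for feature tracking cycles through predict, weight, and resample steps. The sketch below uses a constant-velocity dynamics model and a Gaussian measurement likelihood; the state layout, noise level, and resampling scheme are simplifying assumptions, not the paper's implementation.

```python
import math
import random

def particle_filter_track(particles, measurement, dt=1.0, noise=2.0):
    """One predict/update/resample cycle tracking a single feature point.

    particles: list of (x, y, vx, vy); measurement: detected (x, y).
    Returns (resampled particles, estimated position).
    """
    # Predict: propagate each particle with its velocity plus process noise.
    predicted = [(x + vx * dt + random.gauss(0, noise),
                  y + vy * dt + random.gauss(0, noise), vx, vy)
                 for (x, y, vx, vy) in particles]
    # Update: weight particles by Gaussian likelihood of the measurement.
    mx, my = measurement
    weights = [math.exp(-((x - mx) ** 2 + (y - my) ** 2) / (2 * noise ** 2))
               for (x, y, _, _) in predicted]
    total = sum(weights) or 1.0
    weights = [w / total for w in weights]
    # Resample: draw particles in proportion to their weights.
    resampled = random.choices(predicted, weights=weights, k=len(predicted))
    est = (sum(p[0] for p in resampled) / len(resampled),
           sum(p[1] for p in resampled) / len(resampled))
    return resampled, est
```

Because the dynamics model carries each particle forward by its velocity, a feature that jumps far between frames stays inside the particle cloud, which is the failure mode of a purely local KLT search.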

환경 변화에 강인한 비전 기반 로봇 자율 주행 (Robust Vision-Based Autonomous Navigation Against Environment Changes)

  • 김정호;권인소
    • 대한임베디드공학회논문지 / Vol. 3, No. 2 / pp.57-65 / 2008
  • Recently, many studies on intelligent robots have been conducted. An intelligent robot is capable of recognizing environments or objects to autonomously perform specific tasks using sensor readings. One of the fundamental problems in vision-based robot applications is to recognize where the robot is and to decide on a safe path for autonomous navigation. However, previous approaches consider only well-organized environments in which there are no moving objects or environmental changes. In this paper, we introduce a novel navigation strategy that handles occlusions caused by moving objects using various computer vision techniques. Experimental results demonstrate the capability to overcome such difficulties for autonomous navigation.


동적 환경에 강인한 장면 인식 기반의 로봇 자율 주행 (Scene Recognition based Autonomous Robot Navigation robust to Dynamic Environments)

  • 김정호;권인소
    • 로봇학회논문지 / Vol. 3, No. 3 / pp.245-254 / 2008
  • Recently, many vision-based navigation methods have been introduced as intelligent robot applications. However, many of these methods mainly focus on finding the image in a database that corresponds to a query image. Thus, if the environment changes, for example when objects move in the environment, a robot is unlikely to find consistent corresponding points with one of the database images. To solve this problem, we propose a novel navigation strategy that uses fast motion estimation and a practical scene recognition scheme prepared for the kidnapping problem, which is defined as the problem of re-localizing a mobile robot after it has undergone an unknown motion or visual occlusion. The algorithm is based on camera motion estimation to plan the robot's next movement and on an efficient outlier rejection algorithm for scene recognition. Experimental results demonstrate the capability of the vision-based autonomous navigation in dynamic environments.

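The abstract's outlier rejection step can be illustrated with a generic RANSAC-style filter: hypothesize a motion from one correspondence, then keep only the matches that agree with it. A pure-translation model is used here for brevity; the paper's actual algorithm is not specified in this listing.

```python
import random

def ransac_translation(matches, iters=100, thresh=3.0):
    """Keep only the correspondences consistent with one dominant image
    translation; everything else is rejected as an outlier.

    matches: list of ((x1, y1), (x2, y2)) point correspondences.
    """
    best = []
    for _ in range(iters):
        # Hypothesize a translation from one randomly chosen match.
        (x1, y1), (x2, y2) = random.choice(matches)
        dx, dy = x2 - x1, y2 - y1
        # Score the hypothesis by how many matches agree with it.
        inliers = [((a, b), (c, d)) for (a, b), (c, d) in matches
                   if abs(c - a - dx) < thresh and abs(d - b - dy) < thresh]
        if len(inliers) > len(best):
            best = inliers
    return best
```

Correspondences caused by moved objects fall outside the dominant motion and are discarded, so scene recognition is scored only on the static background.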

AGV 운행을 위한 비전기반 유도선 해석 기술 (A Vision Based Guideline Interpretation Technique for AGV Navigation)

  • 변성민;김민환
    • 한국멀티미디어학회논문지 / Vol. 15, No. 11 / pp.1319-1329 / 2012
  • The use of AGVs in production lines has recently been increasing, and magnetic-tape-guided AGVs, which are inexpensive and fast, are widely used. However, such AGV operation systems have the drawbacks of high installation cost and low flexibility in changing operation routes, which make them difficult to apply to small-quantity multi-product production systems or collaboration-based production systems. This paper presents a technique that uses camera vision to detect and interpret guidelines based on color tape or paint, which are very easy to install and modify. A method is presented for automatically analyzing guideline sections with complex structures, such as branching and merging points, so that AGV routes can be freely configured and changed, and a method for determining an appropriate guideline tracking direction for stable AGV operation is also presented. Real-time operation experiments with an actual industrial AGV implementing the proposed technique confirm that it can be applied stably in real industrial settings.
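The core of vision-based guideline following is reducing each camera frame to a lateral steering offset. The sketch below assumes the frame has already been binarized (guideline pixels set to 1, e.g. by color thresholding); the function names and the centroid-per-scanline scheme are illustrative, not the paper's algorithm.

```python
def guideline_center(scanline):
    """Column centroid of the guideline (tape/paint) pixels in one scanline
    of a binarized camera image, or None if no guideline pixel is present."""
    cols = [c for c, v in enumerate(scanline) if v]
    return sum(cols) / len(cols) if cols else None

def steering_offset(image):
    """Average guideline center over all scanlines, expressed as an offset
    from the image midline; the sign tells the AGV which way to steer."""
    width = len(image[0])
    centers = [guideline_center(row) for row in image]
    centers = [c for c in centers if c is not None]
    if not centers:
        return None  # guideline lost; the AGV should stop or search
    return sum(centers) / len(centers) - (width - 1) / 2
```

At a branch or merge point a scanline contains more than one pixel run, so a real interpreter must segment the runs before taking centroids; that is the harder part the paper addresses.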

의미론적 분할된 항공 사진을 활용한 영상 기반 항법 (Vision-based Navigation using Semantically Segmented Aerial Images)

  • 홍경우;김성중;박준우;방효충;허준회;김진원;박장호;서송원
    • 한국항공우주학회지 / Vol. 48, No. 10 / pp.783-789 / 2020
  • Vision-based navigation is an auxiliary navigation technique that can compensate for the weaknesses of a GPS/INS integrated navigation system; it determines the position of a flight vehicle by comparing aerial images taken from the vehicle with an existing database. However, the time at which the database was generated inevitably differs from the time of aerial image capture, and this difference produces different feature points between the two images. That is, because the two images are similar but not identical, a general image matching algorithm is difficult to apply to the navigation problem. This paper therefore proposes a method that uses semantic segmentation, an artificial intelligence technique, to classify the information needed for navigation in the aerial image before performing image matching. Semantic segmentation allows the two images to be matched robustly despite viewpoint and imaging-condition changes. The performance of the proposed method is verified through simulations and flight experiments and compared with the navigation results obtained using a general image matching algorithm.
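Once both the aerial image and the database are reduced to per-pixel class labels (e.g. road, building, vegetation), matching can be scored by label agreement rather than by raw pixel similarity. The sketch below slides a segmented query over a reference label map in one dimension; the label values and scoring are illustrative assumptions.

```python
def label_agreement(query, window):
    """Fraction of pixels whose semantic class labels agree."""
    total = agree = 0
    for qrow, wrow in zip(query, window):
        for q, w in zip(qrow, wrow):
            total += 1
            agree += (q == w)
    return agree / total

def best_column_offset(query, ref):
    """Slide the segmented query over the reference label map and return
    the column offset with the highest label agreement."""
    qw, rw = len(query[0]), len(ref[0])
    best_off, best_score = 0, -1.0
    for off in range(rw - qw + 1):
        window = [row[off:off + qw] for row in ref]
        score = label_agreement(query, window)
        if score > best_score:
            best_off, best_score = off, score
    return best_off, best_score
```

Because class labels are stable under lighting and seasonal changes that destroy raw feature points, the agreement score stays peaked at the true offset even when the database is years older than the flight imagery.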

Corridor Navigation of the Mobile Robot Using Image Based Control

  • Han, Kyu-Bum;Kim, Hae-Young;Baek, Yoon-Su
    • Journal of Mechanical Science and Technology / Vol. 15, No. 8 / pp.1097-1107 / 2001
  • In this paper, a wall-following navigation algorithm for a mobile robot using a mono vision system is described. The key points of mobile robot navigation are effective acquisition of environmental information and fast recognition of the robot's position; from this information, the mobile robot should be appropriately controlled to follow a desired path. To recognize the relative position and orientation of the robot with respect to the wall, the features of the corridor structure are extracted using the mono vision system, and the relative position of the robot, i.e., its offset distance and steering angle from the wall, is derived for a simple corridor geometry. To alleviate the computational burden of the image processing, a Kalman filter is used to reduce the search region for line detection in the image space. The robot is then controlled using this information to follow the desired path. The wall-following PD control scheme is composed of two parts, approaching control and orientation control, each performed through the steering and forward-driving motion of the robot. To verify the effectiveness of the proposed algorithm, real-time navigation experiments are performed. The experimental results verify the effectiveness and flexibility of the suggested algorithm in comparison with a purely encoder-guided mobile robot navigation system.

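The two-part PD scheme in the abstract combines an approaching term (offset distance from the wall) and an orientation term (heading angle to the wall) into one steering command. The gains and sign convention below are illustrative assumptions, not the paper's tuned values.

```python
def pd_steering(offset, offset_prev, angle, angle_prev, dt,
                kp_d=0.8, kd_d=0.2, kp_a=1.5, kd_a=0.3):
    """Steering command from the wall-offset distance error (approaching
    control) and the heading-angle error (orientation control).

    Positive offset = too far from the desired path on the positive side;
    positive angle = heading away from it. Derivative terms use the
    previous-sample errors over the interval dt.
    """
    approach = kp_d * offset + kd_d * (offset - offset_prev) / dt
    orient = kp_a * angle + kd_a * (angle - angle_prev) / dt
    return -(approach + orient)  # steer back toward the desired path
```

Splitting the controller this way lets the approaching gains be tuned for lateral convergence and the orientation gains for damping, instead of one coupled loop.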

무인로봇 정밀위치추정을 위한 전술통신 및 영상 기반의 통합항법 성능 분석 (The Performance Analysis of Integrated Navigation System Based on the Tactical Communication and VISION for the Accurate Localization of Unmanned Robot)

  • 최지훈;박용운;송재복;권인소
    • 한국군사과학기술학회지 / Vol. 14, No. 2 / pp.271-280 / 2011
  • This paper presents a navigation system, based on tactical communication and a vision system, applied to an unmanned robot for perimeter surveillance operations in outdoor environments. The GPS errors of the robot are compensated by the reference station of the C2 (command and control) vehicle, and WiBro (Wireless Broadband) is used for communication between the two systems. In outdoor environments, GPS signals can easily be blocked by trees and buildings; in such environments, however, a vision system is very effective because many features are available. With a feature map of the operational environment, the robot can estimate its position through image matching and pose estimation. In the navigation system, the operation mode is therefore switched by a navigation manager according to the environmental conditions. The experimental results show that the unmanned robot can estimate its position very accurately in outdoor environments.
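The navigation manager's mode switching can be sketched as a priority rule over simple environment conditions. The thresholds, mode names, and condition inputs below are illustrative assumptions, not the paper's actual switching logic.

```python
def select_nav_mode(gps_satellites, hdop, map_features_in_view):
    """Pick the localization source from simple environment conditions (sketch)."""
    if gps_satellites >= 6 and hdop < 2.0:
        return "DGPS"           # GPS corrected by the C2 vehicle's reference station
    if map_features_in_view >= 20:
        return "vision"         # image matching against the stored feature map
    return "dead-reckoning"     # fall back on odometry/INS until a source returns
```

The ordering encodes the trade-off described in the abstract: corrected GPS is preferred in the open, vision takes over where buildings and trees block the sky but provide features, and dead reckoning bridges the gaps.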