• Title/Summary/Keyword: Autonomous Landing


Vision-based Autonomous Landing System of an Unmanned Aerial Vehicle on a Moving Vehicle (무인 항공기의 이동체 상부로의 영상 기반 자동 착륙 시스템)

  • Jung, Sungwook; Koo, Jungmo; Jung, Kwangyik; Kim, Hyungjin; Myung, Hyun
    • The Journal of Korea Robotics Society / v.11 no.4 / pp.262-269 / 2016
  • Flight of an autonomous unmanned aerial vehicle (UAV) generally consists of four steps: take-off, ascent, descent, and finally landing. Among them, autonomous landing is a challenging task due to high risks and reliability problems. When the site where the UAV is supposed to land is moving or oscillating, the situation becomes far less predictable and much more difficult than landing on a stationary site. For these reasons, accurate and precise control is required for an autonomous landing system of a UAV on top of a moving vehicle that is rolling or oscillating while moving. In this paper, a vision-only landing algorithm using dynamic gimbal control is proposed. The camera systems applied in previous studies are fixed facing downward or forward, and their main disadvantage is a narrow field of view (FOV). By controlling the gimbal to track the target dynamically, this problem can be mitigated; furthermore, the system helps the UAV follow the target faster than a fixed camera alone. With an artificial tag on the landing pad, the relative position and orientation of the UAV are acquired, and these estimated poses are used for gimbal control and UAV control for safe and stable landing on a moving vehicle. Outdoor experimental results show that this vision-based algorithm performs well and can be applied to real situations.
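The gimbal-tracking idea above can be sketched with a pinhole camera model: given the detected tag's pixel location, compute the pan/tilt offsets that re-center it in the image (a minimal illustration, not the paper's controller; the intrinsics below are made up):

```python
import math

def gimbal_correction(u, v, cx, cy, fx, fy):
    """Angular offsets (rad) that re-center a tag detected at pixel (u, v).

    (cx, cy) is the principal point and (fx, fy) the focal lengths in
    pixels; a pinhole camera model is assumed.
    """
    pan = math.atan2(u - cx, fx)   # positive: tag is right of center
    tilt = math.atan2(v - cy, fy)  # positive: tag is below center
    return pan, tilt

# Tag detected 80 px right and 60 px below the image center of a
# 640x480 camera with an assumed focal length of 500 px:
pan, tilt = gimbal_correction(400, 300, 320, 240, 500, 500)
```

Feeding these offsets to the gimbal's rate loop keeps the tag near the image center, which is what widens the effective field of view relative to a fixed camera.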

Vision Processing for Precision Autonomous Landing Approach of an Unmanned Helicopter (무인헬기의 정밀 자동착륙 접근을 위한 영상정보 처리)

  • Kim, Deok-Ryeol; Kim, Do-Myoung; Suk, Jin-Young
    • Journal of Institute of Control, Robotics and Systems / v.15 no.1 / pp.54-60 / 2009
  • In this paper, a precision landing approach is implemented based on real-time image processing. A full-scale landmark for automatic landing is used. Canny edge detection is applied to identify the outer quadrilateral, while a circular Hough transform is used to recognize the inner circle. Position information on the ground landmark is uplinked to the unmanned helicopter via the ground control computer in real time so that the unmanned helicopter can control its approach for an accurate landing. A ground test and a couple of flight tests for the autonomous landing approach show that the image processing and automatic landing operation system perform well in the landing approach phase at altitudes from 20 m down to 1 m above ground level.
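The inner-circle recognition step can be illustrated with a minimal circular Hough transform (a toy NumPy sketch of the voting scheme for a known radius; a real pipeline would use an optimized implementation such as OpenCV's cv2.HoughCircles after cv2.Canny):

```python
import numpy as np

def hough_circle_center(edges, radius):
    """Vote for circle centres of a known radius in a binary edge map.

    Each edge pixel votes along a circle of the given radius around
    itself; the accumulator peak is the most likely centre.
    """
    h, w = edges.shape
    acc = np.zeros((h, w), dtype=int)
    thetas = np.linspace(0, 2 * np.pi, 90, endpoint=False)
    ys, xs = np.nonzero(edges)
    for y, x in zip(ys, xs):
        cy = np.round(y - radius * np.sin(thetas)).astype(int)
        cx = np.round(x - radius * np.cos(thetas)).astype(int)
        ok = (cy >= 0) & (cy < h) & (cx >= 0) & (cx < w)
        np.add.at(acc, (cy[ok], cx[ok]), 1)
    return np.unravel_index(np.argmax(acc), acc.shape)

# Synthetic edge ring of radius 10 centred at (20, 20):
edges = np.zeros((40, 40), dtype=bool)
t = np.linspace(0, 2 * np.pi, 200)
edges[np.round(20 + 10 * np.sin(t)).astype(int),
      np.round(20 + 10 * np.cos(t)).astype(int)] = True
center = hough_circle_center(edges, 10)
```

In practice the radius is also unknown, so the accumulator gains a third dimension over candidate radii; the fixed-radius case above shows the core voting idea.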

Design and Fabrication of Multi-rotor system for Vision based Autonomous Landing (영상 기반 자동 착륙용 멀티로터 시스템 설계 및 개발)

  • Kim, Gyou-Beom; Song, Seung-Hwa; Yoon, Kwang-Joon
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.12 no.6 / pp.141-146 / 2012
  • This paper introduces the development of a multi-rotor system and a vision-based autonomous landing system. The multi-rotor platform is modeled as a rigid body using the Newton-Euler formulation, and is simulated and tuned with an LQR control algorithm. The vision-based autonomous landing system uses a single camera mounted on the multi-rotor system. An augmented reality algorithm is used for marker detection, and the autonomous landing code is tested with the GCS for precision landing.
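The LQR tuning step can be sketched on a double integrator standing in for one translational axis of the multi-rotor (weights, time step, and the reduced model are illustrative, not the paper's):

```python
import numpy as np

def dlqr(A, B, Q, R, iters=500):
    """Discrete-time LQR gain via backward Riccati value iteration."""
    P = Q.copy()
    for _ in range(iters):
        K = np.linalg.solve(R + B.T @ P @ B, B.T @ P @ A)
        P = Q + A.T @ P @ (A - B @ K)
    return K

dt = 0.02
A = np.array([[1.0, dt], [0.0, 1.0]])   # position, velocity of one axis
B = np.array([[0.5 * dt ** 2], [dt]])   # acceleration input
Q = np.diag([10.0, 1.0])                # state weights (illustrative)
R = np.array([[0.1]])                   # control weight (illustrative)
K = dlqr(A, B, Q, R)

# Closed-loop poles of A - B K must lie inside the unit circle:
closed_loop = np.abs(np.linalg.eigvals(A - B @ K))
```

Tuning in this framework means adjusting Q and R until the simulated response (overshoot, settling) is acceptable, rather than hand-placing gains.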

Autonomous Landing on Small Bodies based on Discrete Sliding Mode Control (이산 슬라이딩 모드 제어를 이용한 소천체 자율 착륙 기법)

  • Lee, Juyoung
    • Journal of the Korean Society for Aeronautical & Space Sciences / v.45 no.8 / pp.647-661 / 2017
  • This paper presents a robust method for autonomous landing on small bodies. Autonomous landing is accomplished by generating and following reference position and attitude profiles. The position and attitude tracking controllers are based on discrete sliding mode control, which explicitly treats the discrete and impulsive nature of thruster operation. Vision-based inertial navigation provides the autonomous navigation solution for landing. Numerical simulation is carried out to evaluate the performance of the proposed method in a realistic situation with environmental uncertainties.
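The discrete sliding mode idea can be sketched with Gao's discrete reaching law regulating a double integrator (the gains, sampling time, and reaching law here are illustrative textbook choices, not taken from the paper):

```python
import math

def dsmc_step(x, v, T=0.01, lam=1.0, q=5.0, eps=0.5):
    """One step of a discrete sliding mode regulator on a double integrator.

    Sliding surface s = v + lam*x; Gao's discrete reaching law
    s[k+1] = (1 - q*T) s[k] - eps*T*sgn(s[k]) is enforced exactly by
    solving for the acceleration command u.
    """
    s = v + lam * x
    s_next = (1.0 - q * T) * s - eps * T * math.copysign(1.0, s)
    u = (s_next - v - lam * (x + T * v)) / T   # commanded acceleration
    x, v = x + T * v, v + T * u                # discrete plant update
    return x, v

x, v = 1.0, 0.0          # initial position offset and velocity
for _ in range(2000):    # 20 s at T = 0.01 s
    x, v = dsmc_step(x, v)
```

After the reaching phase the state chatters in a narrow quasi-sliding band around s = 0, then x decays roughly like exp(-lam*t); the discrete formulation makes the band width an explicit function of the sampling time, which is why it suits impulsive thruster operation.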

Vision-based Obstacle State Estimation and Collision Prediction using LSM and CPA for UAV Autonomous Landing (무인항공기의 자동 착륙을 위한 LSM 및 CPA를 활용한 영상 기반 장애물 상태 추정 및 충돌 예측)

  • Seongbong Lee; Cheonman Park; Hyeji Kim; Dongjin Lee
    • Journal of Advanced Navigation Technology / v.25 no.6 / pp.485-492 / 2021
  • Vision-based autonomous precision landing technology for UAVs requires precise position estimation and landing guidance. For safe landing, the system must also determine the safety of the landing point with respect to ground obstacles and guide the landing only when safety is ensured. In this paper, we propose vision-based navigation and algorithms for determining the safety of the landing point for autonomous precision landing. For vision-based navigation, a CNN is used to detect the landing pad, and the detection information is used to derive an integrated navigation solution. In addition, a Kalman filter is designed and applied to improve position estimation performance. To determine the safety of the landing point, we perform obstacle detection and position estimation in the same manner and estimate the speed of each obstacle using the least squares method (LSM). Whether a collision with an obstacle will occur is determined from the closest point of approach (CPA), calculated from the obstacle's estimated state. Finally, we perform flight tests to verify the proposed algorithm.
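The CPA-based collision check reduces to simple relative-motion geometry; a minimal sketch, assuming the obstacle's relative velocity has already been estimated (e.g. by an LSM line fit over recent positions):

```python
def closest_point_of_approach(rel_pos, rel_vel):
    """Time and miss distance of CPA for 2-D constant-velocity relative motion.

    rel_pos is the obstacle position relative to the UAV, rel_vel its
    relative velocity; t_cpa is clamped to >= 0 (no collision in the past).
    """
    rx, ry = rel_pos
    vx, vy = rel_vel
    vv = vx * vx + vy * vy
    t_cpa = 0.0 if vv == 0.0 else max(0.0, -(rx * vx + ry * vy) / vv)
    dx, dy = rx + vx * t_cpa, ry + vy * t_cpa
    return t_cpa, (dx * dx + dy * dy) ** 0.5

# Obstacle 10 m ahead with 1 m lateral offset, closing at 2 m/s:
t_cpa, d_cpa = closest_point_of_approach((10.0, 1.0), (-2.0, 0.0))
# Collision is declared when d_cpa falls below a chosen safety radius.
```

Here t_cpa = 5 s and d_cpa = 1 m, so with a 2 m safety radius this track would be flagged and the landing guidance inhibited.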

Design of Multisensor Navigation System for Autonomous Precision Approach and Landing

  • Soon, Ben K.H.; Scheding, Steve; Lee, Hyung-Keun; Lee, Hung-Kyu
    • Proceedings of the Korean Institute of Navigation and Port Research Conference / v.1 / pp.377-382 / 2006
  • Autonomous precision approach and landing of an aircraft in a remote landing zone presents several challenges. Firstly, the exact location, orientation, and elevation of the landing zone are not always known; secondly, the accuracy of the navigation solution is not always sufficient for this type of precision maneuver if DGPS is not available in close proximity. This paper explores an alternative approach for estimating the navigation parameters of the aircraft relative to the landing area using only time-differenced GPS carrier phase measurements and range measurements from a vision system. Distinct ground landmarks are placed before the landing zone. The positions of these landmarks are extracted by the vision system, and the ranges relative to these locations are used as measurements for an extended Kalman filter (EKF), in addition to the precise time-differenced GPS carrier phase measurements. The performance of this navigation algorithm is demonstrated using simulation.
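The range-to-landmark measurement update of such an EKF can be sketched for a bare 2-D position state (a minimal illustration; the carrier-phase measurements and the full navigation state are omitted, and all numbers are made up):

```python
import numpy as np

def ekf_range_update(x, P, landmark, r_meas, r_var):
    """EKF update with one range measurement to a known landmark.

    State x is 2-D position; the range Jacobian is the unit vector from
    the landmark to the vehicle.
    """
    dx = x - landmark
    r_pred = np.linalg.norm(dx)
    H = (dx / r_pred).reshape(1, 2)    # d(range)/d(position)
    S = H @ P @ H.T + r_var            # innovation covariance (1x1)
    K = P @ H.T / S                    # Kalman gain
    x = x + (K * (r_meas - r_pred)).ravel()
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Very uncertain prior at the origin; landmark at (10, 0), measured range 7 m:
x = np.array([0.0, 0.0])
P = np.eye(2) * 100.0
x, P = ekf_range_update(x, P, np.array([10.0, 0.0]), 7.0, 0.25)
```

A single range only constrains the along-landmark direction, which is why the updated covariance collapses in that axis while the perpendicular axis stays uncertain; multiple landmarks (or the carrier-phase deltas) resolve the rest.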


Design and Flight Test of Autonomous Landing Approach Algorithm for UAV (무인 항공기의 자동 착륙 접근 알고리즘 설계 및 비행시험)

  • Jeong, Minjeong; Ryu, Han-Seok; Park, Sanghyuk
    • Journal of the Korean Society for Aeronautical & Space Sciences / v.41 no.6 / pp.458-464 / 2013
  • This paper presents an algorithm for the autonomous landing approach of an unmanned aerial vehicle. The main purpose of the autonomous landing approach in this study is to enable a safe landing at night. When this function is engaged, a flight path command is generated from the aircraft's current position. The shortest combination of an initial circular arc, a straight line segment, and a final circular arc is chosen for the flight path that leads the aircraft to one end of the runway for landing. The algorithm is first validated through numerous simulations with various initial aircraft conditions, and then successfully validated through a number of flight tests.
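The arc-plus-line geometry behind such paths can be sketched for the simpler turn-then-straight case; the paper's algorithm also appends a final arc to align with the runway heading (the poses, radius, and candidate set below are illustrative):

```python
import math

def turn_then_straight(pos, heading, target, R, direction):
    """Length of an initial turn arc (radius R) plus a straight line to target.

    direction = +1 turns left (CCW), -1 turns right (CW). Returns inf if
    the target lies inside the turn circle (no tangent line exists).
    """
    cx = pos[0] - direction * R * math.sin(heading)   # turn-circle centre
    cy = pos[1] + direction * R * math.cos(heading)
    dx, dy = target[0] - cx, target[1] - cy
    d = math.hypot(dx, dy)
    if d < R:
        return math.inf
    beta = math.acos(R / d)                           # tangent offset angle
    phi = math.atan2(dy, dx)
    theta_s = math.atan2(pos[1] - cy, pos[0] - cx)    # start angle on circle
    theta_t = phi - direction * beta                  # tangent-point angle
    arc = (direction * (theta_t - theta_s)) % (2.0 * math.pi)
    if arc > 2.0 * math.pi - 1e-9:                    # numerical wrap guard
        arc = 0.0
    return R * arc + math.sqrt(d * d - R * R)

# Runway end 10 m straight ahead of an east-heading aircraft, R = 1 m:
best = min(turn_then_straight((0.0, 0.0), 0.0, (10.0, 0.0), 1.0, s)
           for s in (+1, -1))
```

Choosing the shortest path then amounts to evaluating every arc/line/arc candidate (left/right initial turn, left/right final turn) and taking the minimum, exactly as the abstract describes.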

Guidance Law for Vision-Based Automatic Landing of UAV

  • Min, Byoung-Mun; Tahk, Min-Jea; Shim, Hyun-Chul David; Bang, Hyo-Choong
    • International Journal of Aeronautical and Space Sciences / v.8 no.1 / pp.46-53 / 2007
  • In this paper, a guidance law for vision-based automatic landing of unmanned aerial vehicles (UAVs) is proposed. Automatic landing is a challenging but crucial capability for UAVs to achieve fully autonomous flight. In an autonomous landing maneuver, deciding where to land and generating the guidance commands to achieve a successful landing are significant problems. This paper focuses on the design of a guidance law applicable to the automatic landing problem of both fixed-wing and rotary-wing UAVs. The proposed guidance law generates an acceleration command as a control input, derived from a polynomial function of the specified time-to-go (t_go). The coefficients of the t_go polynomial are determined to satisfy terminal constraints. Nonlinear simulation results using fixed-wing and rotary-wing UAV models are presented.
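A classical instance of such a t_go-polynomial law is the minimum-effort double-integrator command whose 1/t_go and 1/t_go² coefficients follow from the terminal position and velocity constraints (a textbook stand-in for illustration, not necessarily the paper's exact law; the landing numbers are made up):

```python
def tgo_accel_cmd(p, v, p_f, v_f, t_go):
    """Acceleration command meeting p(t_f)=p_f and v(t_f)=v_f.

    Minimum-effort law for a double integrator: a polynomial in 1/t_go
    whose coefficients are fixed by the two terminal constraints.
    """
    return 6.0 * (p_f - p) / t_go ** 2 - (4.0 * v + 2.0 * v_f) / t_go

# Vertical-axis example: descend from 50 m at rest and touch down at
# -1 m/s sink rate in 10 s (simple Euler simulation, stopped just
# before touchdown to avoid the 1/t_go singularity).
p, v, t, dt, t_f = 50.0, 0.0, 0.0, 0.001, 10.0
while t_f - t > 0.05:
    a = tgo_accel_cmd(p, v, 0.0, -1.0, t_f - t)
    p, v, t = p + v * dt, v + a * dt, t + dt
```

The resulting trajectory is a cubic in time, so the same closed-form command serves both the descent profile and the flare toward the specified touchdown velocity.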

Design of Deep Learning-Based Automatic Drone Landing Technique Using Google Maps API (구글 맵 API를 이용한 딥러닝 기반의 드론 자동 착륙 기법 설계)

  • Lee, Ji-Eun; Mun, Hyung-Jin
    • Journal of Industrial Convergence / v.18 no.1 / pp.79-85 / 2020
  • Recently, the RPAS (Remotely Piloted Aircraft System), operated by remote control and autonomous navigation, has attracted increasing interest and use in various industries and public organizations, including delivery, firefighting, ambulance, and agricultural drones. The stability of self-controlled unmanned drones is also one of the biggest challenges to be solved as the drone industry develops. Drones should be able to fly along the path the autonomous flight control system sets and automatically perform an accurate landing at the destination. This study proposes a technique that checks arrival using images of the landing point and controls the landing at the correct point, compensating for errors in the drone's sensor and GPS location data. Reference imagery of the destination is received from the Google Maps API and learned; a drone equipped with a NAVIO2, a Raspberry Pi, and a camera takes images of the landing point and sends them to a server; and the drone's position is adjusted until the match exceeds a threshold, so that the drone can land automatically at the landing point.
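The arrival check can be illustrated with a plain normalized cross-correlation between the live camera frame and the stored reference tile (a simple stand-in for the paper's deep-learning match; the 0.8 threshold and the synthetic images are illustrative):

```python
import numpy as np

def images_match(live, reference, threshold=0.8):
    """Arrival decision via normalized cross-correlation of grayscale images.

    `live` is the drone camera frame, `reference` the stored map tile of
    the landing point; returns (decision, similarity score in [-1, 1]).
    """
    a = live.astype(float) - live.mean()
    b = reference.astype(float) - reference.mean()
    score = float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b)))
    return score >= threshold, score

rng = np.random.default_rng(0)
ref = rng.integers(0, 256, (64, 64))            # stand-in map tile
live = ref + rng.normal(0.0, 10.0, ref.shape)   # same scene plus sensor noise
other = rng.integers(0, 256, (64, 64))          # unrelated scene
arrived, score = images_match(live, ref)
elsewhere, _ = images_match(other, ref)
```

The same-scene pair scores near 1 despite the noise, while the unrelated scene scores near 0, so thresholding the score gives the "land only when the view matches the destination" gate the abstract describes.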