• Title/Summary/Keyword: Vision-based formation flight

Monocular Vision-Based Guidance and Control for a Formation Flight

  • Cheon, Bong-kyu;Kim, Jeong-ho;Min, Chan-oh;Han, Dong-in;Cho, Kyeum-rae;Lee, Dae-woo;Seong, Kie-jeong
    • International Journal of Aeronautical and Space Sciences, v.16 no.4, pp.581-589, 2015
  • This paper describes a monocular vision-based formation flight technology using two fixed-wing unmanned aerial vehicles. To measure the relative position and attitude of a leader aircraft, a monocular camera installed in the front of the follower aircraft captures an image of the leader, and position and attitude are estimated from the image using the KLT feature point tracker and the POSIT algorithm. To verify the feasibility of this vision processing algorithm, a field test was performed using two light sport aircraft, and the experimental results show that the proposed monocular vision-based measurement algorithm is feasible. Performance of the proposed formation flight technology was then verified using the X-Plane flight simulator. The formation flight simulation system consists of two PCs playing the roles of leader and follower. When the leader flies under user command, the follower tracks it using the designed guidance and PI control laws, with all information about the leader obtained from monocular vision. The simulation shows that guidance using relative attitude information tracks the leader better than guidance without attitude information, with absolute average relative-position errors of 2.88 m (X-axis), 2.09 m (Y-axis), and 0.44 m (Z-axis). (An illustrative code sketch of the measurement pipeline follows below.)
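
The measurement chain this abstract describes, KLT feature tracking on the follower's camera image followed by a POSIT-style pose solve against known leader geometry, can be sketched with OpenCV. This is a minimal, hypothetical sketch and not the paper's implementation: the marker layout and camera intrinsics are invented placeholders, and cv2.solvePnP (OpenCV's current pose solver) stands in for the classic POSIT routine.

    # Illustrative sketch only: KLT tracking of leader feature points, then a
    # PnP pose solve standing in for the paper's POSIT step. All numeric values
    # (marker layout, intrinsics) are hypothetical placeholders.
    import cv2
    import numpy as np

    # Hypothetical 3D marker positions (metres) on the leader airframe,
    # expressed in the leader's body frame; a real system would survey these.
    MODEL_POINTS = np.array([
        [ 0.0,  0.0, 0.0],   # nose
        [-1.2,  0.8, 0.0],   # left wingtip light
        [-1.2, -0.8, 0.0],   # right wingtip light
        [-2.0,  0.0, 0.3],   # tail beacon
    ], dtype=np.float64)

    # Hypothetical pinhole intrinsics of the follower's forward-facing camera.
    CAMERA_MATRIX = np.array([[800.0,   0.0, 320.0],
                              [  0.0, 800.0, 240.0],
                              [  0.0,   0.0,   1.0]])
    DIST_COEFFS = np.zeros(5)

    def track_points(prev_gray, gray, prev_pts):
        """Propagate the leader's marker points with pyramidal KLT optical flow.

        prev_pts: float32 array of shape (N, 1, 2) from the previous frame.
        """
        pts, status, _err = cv2.calcOpticalFlowPyrLK(
            prev_gray, gray, prev_pts, None, winSize=(21, 21), maxLevel=3)
        return pts, status.ravel().astype(bool)

    def estimate_relative_pose(image_pts):
        """Recover the leader's pose relative to the camera from tracked points.

        The paper uses POSIT; cv2.solvePnP plays the same role in this sketch.
        Returns (R, t): 3x3 rotation matrix and translation vector in metres.
        """
        ok, rvec, tvec = cv2.solvePnP(
            MODEL_POINTS, image_pts.reshape(-1, 1, 2).astype(np.float64),
            CAMERA_MATRIX, DIST_COEFFS, flags=cv2.SOLVEPNP_ITERATIVE)
        if not ok:
            return None, None
        R, _ = cv2.Rodrigues(rvec)   # rotation vector -> rotation matrix
        return R, tvec

In the paper's setup, the resulting relative position and attitude would then feed the follower's guidance and PI control laws; that control layer is not shown here.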

Pose Estimation of Leader Aircraft for Vision-based Formation Flight (영상기반 편대비행을 위한 선도기 자세예측 알고리즘)

  • Heo, Jin-Woo;Kim, Jeong-Ho;Han, Dong-In;Lee, Dae-Woo;Cho, Kyeum-Rae;Hur, Gi-Bong
    • Journal of the Korean Society for Aeronautical & Space Sciences, v.41 no.7, pp.532-538, 2013
  • This paper describes a vision-only attitude estimation technique for the leader aircraft in formation flight. Feature points in images obtained from the X-Plane simulator are extracted with the SURF (Speeded-Up Robust Features) algorithm, and the POSIT (Pose from Orthography and Scaling with Iteration) algorithm is used to estimate attitude. The results verify that attitude estimation using vision alone can achieve a small estimation error of 1.1° to 1.76°. (An illustrative code sketch follows below.)
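
A rough sketch of the front end the abstract describes: SURF feature extraction and matching, followed by conversion of an estimated rotation into leader attitude angles. This is illustrative only; it assumes the opencv-contrib build for SURF, uses a generic ratio-test matcher, and the Euler-angle convention is an assumption rather than the paper's definition. The pose solve that would produce the rotation matrix mirrors the solvePnP step in the previous sketch.

    # Illustrative sketch only: SURF keypoint matching against a stored leader
    # template, then Euler-angle extraction from an estimated rotation matrix.
    import cv2
    import numpy as np

    # SURF requires the opencv-contrib build (cv2.xfeatures2d).
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    matcher = cv2.BFMatcher(cv2.NORM_L2)

    def match_leader_features(template_gray, frame_gray):
        """Match SURF descriptors between a leader template image and the frame."""
        kp_t, des_t = surf.detectAndCompute(template_gray, None)
        kp_f, des_f = surf.detectAndCompute(frame_gray, None)
        good = []
        for pair in matcher.knnMatch(des_t, des_f, k=2):
            # Lowe ratio test keeps only distinctive correspondences.
            if len(pair) == 2 and pair[0].distance < 0.7 * pair[1].distance:
                good.append(pair[0])
        return kp_t, kp_f, good

    def rotation_to_euler(R):
        """Convert a rotation matrix to roll/pitch/yaw in degrees (3-2-1 sequence).

        The Euler convention here is an assumption made for illustration.
        """
        pitch = np.degrees(np.arcsin(-R[2, 0]))
        roll = np.degrees(np.arctan2(R[2, 1], R[2, 2]))
        yaw = np.degrees(np.arctan2(R[1, 0], R[0, 0]))
        return roll, pitch, yaw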

FPGA based HW/SW co-design for vision based real-time position measurement of an UAV

  • Kim, Young Sik;Kim, Jeong Ho;Han, Dong In;Lee, Mi Hyun;Park, Ji Hoon;Lee, Dae Woo
    • International Journal of Aeronautical and Space Sciences, v.17 no.2, pp.232-239, 2016
  • Recently, to increase the efficiency and mission success rate of UAVs (Unmanned Aerial Vehicles), the need for formation flight has grown. In general, GPS (Global Positioning System) is used to obtain the relative position of the leader with respect to the follower in formation flight, but it cannot be used in environments where GPS jamming occurs or communication is impossible. Therefore, in this study, monocular vision is used to measure relative position. PC-based vision processing systems are larger than embedded systems and are hard to install on small vehicles, so an FPGA-based processing board is used to keep the system small and compact. The processing system is divided into two blocks, PL (Programmable Logic) and PS (Processing System). The PL consists of many parallel logic arrays, is designed at the hardware level, and can handle large amounts of data quickly; the PS consists of a conventional processing unit, such as an ARM processor, on which the sequential parts of the algorithm run. The resulting HW/SW co-designed FPGA system processes the input images and measures the relative 3D position of the leader, with an RMSE accuracy of 0.42 cm to 0.51 cm. (An illustrative sketch of the PS/PL interface pattern follows below.)
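
The PS/PL split the abstract describes is commonly realized on Zynq-class devices by letting the ARM-side software read the fabric's results through memory-mapped registers. The sketch below shows only that generic interface pattern, with an invented base address, register map, and fixed-point scaling; it is not the paper's design, and a deployed system might use a kernel driver or bare-metal code instead of /dev/mem.

    # Illustrative sketch only: a PS (ARM/Linux) side polling a PL (FPGA fabric)
    # image-processing block through memory-mapped registers. Every address,
    # offset, and scale factor below is a hypothetical placeholder.
    import mmap
    import os
    import struct

    PL_BASE_ADDR = 0x43C00000     # assumed AXI-Lite base address of the PL block
    MAP_SIZE = 4096               # one page of register space
    REG_STATUS = 0x00             # bit 0: new measurement ready (assumed)
    REG_POS = (0x10, 0x14, 0x18)  # relative X, Y, Z outputs, signed fixed point
    FIXED_POINT_SCALE = 1.0 / 256.0  # assumed Q-format scaling to metres

    def read_relative_position():
        """Poll the PL block and return the leader's relative (x, y, z) in metres."""
        fd = os.open("/dev/mem", os.O_RDWR | os.O_SYNC)
        try:
            regs = mmap.mmap(fd, MAP_SIZE, offset=PL_BASE_ADDR)
            try:
                # Spin until the PL image pipeline flags a completed measurement.
                while not struct.unpack_from("<I", regs, REG_STATUS)[0] & 0x1:
                    pass
                raw = [struct.unpack_from("<i", regs, off)[0] for off in REG_POS]
                return tuple(v * FIXED_POINT_SCALE for v in raw)
            finally:
                regs.close()
        finally:
            os.close(fd)

Splitting the pipeline this way keeps the pixel-rate, data-parallel stages in the PL while the PS handles the sequential computation, which is the usual motivation for HW/SW co-design on small vehicles.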