
A Study on Control of Drone Swarms Using Depth Camera

  • Received : 2017.12.01
  • Accepted : 2018.06.05
  • Published : 2018.08.01

Abstract

General methods of controlling a drone are divided into manual control and automatic control, in which the drone moves along a predefined route. In manual control, a person must be able to determine the location and attitude of the drone and must have a controller to operate it remotely. When people control a drone, they gather information about its location and attitude with their eyes and receive its internal information, such as battery voltage and atmospheric pressure, through telemetry. They decide on the drone's movement based on the gathered information and control it with a radio device. The automatic control method, in which a drone finds its route by itself, is not much different from manual control by a human. Information about the drone's attitude is collected with gyro and acceleration sensors, and its internal information is delivered to the CPU digitally. The drone's location information is collected with GPS, atmospheric pressure sensors, camera sensors, and ultrasonic sensors. This paper presents an investigation into drone control by a remote computer. Instead of relying on the drone's automatic control functions, this approach has a computer observe the drone, determine its movement based on the observation results, and control it with a radio device. The computer, equipped with a depth camera, collects information, makes decisions, and controls the drone in a manner similar to a human operator, which makes the approach applicable to various fields. Its usability is further enhanced because it can control common commercial drones rather than drones specially manufactured for swarm flight. It can also be used to prevent drones from colliding with each other, to control access to a drone, and to control unauthorized drones.
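The observe-decide-command loop described above can be illustrated with a minimal sketch. The code below is not the authors' implementation; it assumes a depth camera that delivers per-pixel distances in millimetres, locates the drone by comparing each frame against a static background depth map, and turns the pixel error into roll/pitch commands with a simple proportional controller. The camera frames are simulated with NumPy arrays, and send_rc_command() is a hypothetical stand-in for the radio link.

```python
# Sketch of the depth-camera control loop: observe (background subtraction on
# depth frames), decide (proportional controller), command (radio-link stub).
import numpy as np

FRAME_SHAPE = (240, 320)       # depth frame resolution (rows, cols) - assumed
DEPTH_THRESHOLD_MM = 300       # depth change treated as a foreground object
TARGET_PIXEL = (120, 160)      # desired drone position in the image plane
KP = 0.5                       # proportional gain (illustrative value)

def detect_drone(background: np.ndarray, frame: np.ndarray):
    """Return the centroid of pixels whose depth differs from the static
    background by more than DEPTH_THRESHOLD_MM, or None if nothing moved."""
    diff = np.abs(frame.astype(np.int32) - background.astype(np.int32))
    ys, xs = np.nonzero(diff > DEPTH_THRESHOLD_MM)
    if xs.size == 0:
        return None
    return ys.mean(), xs.mean()

def decide_command(position):
    """Map the pixel error between the drone and the target to pitch/roll."""
    err_row = TARGET_PIXEL[0] - position[0]
    err_col = TARGET_PIXEL[1] - position[1]
    return {"pitch": KP * err_row, "roll": KP * err_col}

def send_rc_command(command):
    """Hypothetical radio-link stub; a real system would write RC channels."""
    print(f"RC command: pitch={command['pitch']:+.1f}, roll={command['roll']:+.1f}")

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    background = np.full(FRAME_SHAPE, 4000, dtype=np.uint16)   # empty-room depth map
    frame = background.copy()
    frame[100:110, 200:212] = 2500                              # simulated drone, 1.5 m closer
    frame += rng.integers(0, 20, FRAME_SHAPE).astype(np.uint16) # sensor noise

    position = detect_drone(background, frame)
    if position is not None:
        send_rc_command(decide_command(position))
    else:
        print("No drone detected in the depth frame")
```

In a real setup the simulated frames would be replaced by reads from the depth camera SDK, and the controller output would be mapped to the RC channels of an off-the-shelf drone, which is what allows the method to work with common commercial drones.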

Keywords
