• Title/Summary/Keyword: Handy motion control


Implementation and Verification of Deep Learning-based Automatic Object Tracking and Handy Motion Control Drone System (심층학습 기반의 자동 객체 추적 및 핸디 모션 제어 드론 시스템 구현 및 검증)

  • Kim, Youngsoo;Lee, Junbeom;Lee, Chanyoung;Jeon, Hyeri;Kim, Seungpil
    • IEMEK Journal of Embedded Systems and Applications / v.16 no.5 / pp.163-169 / 2021
  • In this paper, we implemented a deep learning-based automatic object tracking and handy motion control drone system and analyzed its performance. The system automatically detects and tracks targets by analyzing images from the drone's camera with deep learning algorithms: YOLO, MobileNet, and DeepSORT. These deep learning-based detection and tracking algorithms achieve both higher target detection accuracy and higher processing speed than the conventional color-based CAMShift algorithm. In addition, to make it easier to control the drone by hand from the ground control station, we classified hand motions and generated flight control commands through motion recognition using the YOLO algorithm. We confirmed that this deep learning-based target tracking and hand-motion control system tracks the target stably and allows the drone to be controlled easily.
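The detect-then-track loop described in this abstract can be sketched in miniature. The real system uses YOLO for per-frame detection and DeepSORT (appearance features plus Kalman filtering) for association; in the hypothetical sketch below, a greedy IoU matcher stands in for the tracker so the control flow is runnable without model weights. The function names and box format are illustrative assumptions, not the authors' implementation.

```python
def iou(a, b):
    """Intersection-over-union of two (x1, y1, x2, y2) boxes."""
    x1, y1 = max(a[0], b[0]), max(a[1], b[1])
    x2, y2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, x2 - x1) * max(0, y2 - y1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter) if inter else 0.0

def update_tracks(tracks, detections, threshold=0.3):
    """Greedily assign each detection to the best-overlapping track,
    spawning a new track ID when nothing overlaps enough.
    (A simplified stand-in for DeepSORT-style association.)"""
    next_id = max(tracks, default=0) + 1
    assigned = {}
    for det in detections:
        best_id, best_iou = None, threshold
        for tid, box in tracks.items():
            if tid not in assigned:
                score = iou(box, det)
                if score > best_iou:
                    best_id, best_iou = tid, score
        if best_id is None:
            best_id, next_id = next_id, next_id + 1
        assigned[best_id] = det
    return assigned

# One frame: an existing track near (10, 10) and a brand-new object.
tracks = {1: (10, 10, 50, 50)}
detections = [(12, 11, 52, 49), (200, 200, 240, 240)]
print(update_tracks(tracks, detections))
# → {1: (12, 11, 52, 49), 2: (200, 200, 240, 240)}
```

In the full system, the matched boxes would be fed back as the tracker state for the next frame, and the tracked target's position would drive the flight controller.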

A Study on the Effect of Pre-cue in Simple Reactions on Control-on-Display Interfaces

  • Lim, Ji-Hyoun;Choi, Jun-Young;Kim, Young-Su
    • Journal of the Ergonomics Society of Korea / v.30 no.4 / pp.563-569 / 2011
  • Objective: This study examines how pre-cues indicating the location of an upcoming visual stimulus affect finger-movement responses on control-on-display interfaces. Background: Previous research on pre-cues focused on attention allocation, and motion studies were limited to indirect control conditions. This study was designed to collect the exact landing point of finger-tap responses to a given visual stimulus. Method: Controlled visual stimuli and tasks were presented on a UI evaluation system built with mobile web standards; response accuracy and response time were measured. Of the 16 recruited participants, 11 completed the experiment. Results: Providing a pre-cue on the stimulus location affected both response time and response accuracy. The response bias, defined as the distance from the center of the stimulus to the finger-tap location, was larger when a pre-cue was given during one-handed operation. Conclusion: Given a pre-cue, response time decreases, but accuracy is penalized. Application: In designing touch-screen UIs, and more specifically visual components that also act as controllers, designers would do well to balance human perceptual and cognitive characteristics strategically.
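The response-bias measure this abstract defines (the distance from the stimulus center to the finger-tap location) can be computed directly. A minimal sketch, assuming Euclidean distance in display coordinates; the paper does not specify the metric or units, so both are assumptions here.

```python
import math

def response_bias(stimulus_center, tap_point):
    """Distance from the stimulus center to the finger-tap landing
    point, in whatever units the display reports (e.g. pixels).
    Euclidean distance is assumed; the paper only says 'distance'."""
    dx = tap_point[0] - stimulus_center[0]
    dy = tap_point[1] - stimulus_center[1]
    return math.hypot(dx, dy)

# A tap landing 3 px right of and 4 px below the target center:
print(response_bias((100, 100), (103, 104)))  # → 5.0
```

Averaging this value per condition (pre-cue vs. no pre-cue, one-handed vs. two-handed) would reproduce the kind of comparison the Results section reports.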