• Title/Summary/Keyword: ego-motion compensation

Search Results: 5

Fire Detection Algorithm for a Quad-rotor using Ego-motion Compensation (Ego-Motion 보정기법을 적용한 쿼드로터의 화재 감지 알고리즘)

  • Lee, Young-Wan;Kim, Jin-Hwang;Oh, Jeong-Ju;Kim, Hakil
    • Journal of Institute of Control, Robotics and Systems / v.21 no.1 / pp.21-27 / 2015
  • Conventional fire detection algorithms have been developed for images captured by a fixed camera, which makes them difficult to apply to a flying quad-rotor. To solve this problem, we propose a fire detection algorithm modified for a quad-rotor using ego-motion compensation. The proposed algorithm consists of color detection, motion detection, and fire determination using a randomness test. Color detection and the randomness test are adapted from an existing algorithm, while ego-motion compensation is applied at the motion detection step to compensate for the quad-rotor's motion using a planar projective transformation based on optical flow, the RANSAC algorithm, and a homography. With ego-motion compensation applied to the motion detection step, the proposed algorithm detected fires 83% of the time in hovering mode.
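The planar projective transformation at the heart of the compensation step can be sketched as follows. This is a minimal numpy illustration of homography estimation via the direct linear transform, assuming the point correspondences have already been supplied by optical flow and filtered to inliers by RANSAC (both omitted here); the function names are illustrative and not taken from the paper. The previous frame would be warped by the estimated H before frame differencing, so that only true scene motion such as flames survives.

```python
import numpy as np

def estimate_homography(src, dst):
    """Direct Linear Transform: fit H such that dst ~ H @ src in
    homogeneous coordinates. src, dst: (N, 2) matched points, N >= 4."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography is the null vector of A, i.e. the right singular
    # vector with the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)
    return H / H[2, 2]  # fix the arbitrary scale so H[2,2] == 1

def apply_homography(H, pts):
    """Map (N, 2) points through H (homogeneous multiply + dehomogenize)."""
    ph = np.hstack([pts, np.ones((len(pts), 1))])
    q = ph @ H.T
    return q[:, :2] / q[:, 2:3]
```

In a full pipeline, RANSAC would repeatedly call `estimate_homography` on random 4-point subsets of the flow correspondences and keep the model with the most inliers, rejecting matches on moving objects.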

Stabilization of Target Tracking with 3-axis Motion Compensation for Camera System on Flying Vehicle

  • Sun, Yanjie;Jeon, Dongwoon;Kim, Doo-Hyun
    • IEMEK Journal of Embedded Systems and Applications / v.9 no.1 / pp.43-52 / 2014
  • This paper presents a tracking system using images captured from a camera on a moving platform. A camera on an unmanned flying vehicle moves and shakes due to external factors such as wind and the ego-motion of the vehicle itself. This makes it difficult to track a target properly, and sometimes the target cannot be kept within the camera's field of view. To deal with this problem, we propose a new system for stable tracking of a target under such conditions. The tracking system combines target tracking with 3-axis camera motion compensation. We also use a simulation of flying-vehicle motion for efficient and safe testing. Our experimental results show that 3-axis motion compensation improves robustness and stability.
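The core idea of 3-axis compensation can be sketched in a few lines: the gimbal applies the inverse of the vehicle's measured attitude so the camera boresight stays fixed. This is a schematic numpy illustration only, assuming a Z-Y-X (yaw-pitch-roll) convention and an ideal gimbal that can track the commanded rotation; the paper's actual controller and axis conventions are not reproduced here.

```python
import numpy as np

def rot_x(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_y(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_z(a):
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def vehicle_rotation(roll, pitch, yaw):
    # Z-Y-X (yaw-pitch-roll) attitude of the flying vehicle.
    return rot_z(yaw) @ rot_y(pitch) @ rot_x(roll)

def gimbal_compensation(roll, pitch, yaw):
    # Command the inverse attitude so the camera boresight stays fixed
    # no matter how the vehicle rolls, pitches, or yaws.
    return vehicle_rotation(roll, pitch, yaw).T
```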

Human Tracking and Body Silhouette Extraction System for Humanoid Robot (휴머노이드 로봇을 위한 사람 검출, 추적 및 실루엣 추출 시스템)

  • Kwak, Soo-Yeong;Byun, Hye-Ran
    • The Journal of Korean Institute of Communications and Information Sciences / v.34 no.6C / pp.593-603 / 2009
  • In this paper, we propose a new integrated computer vision system designed to track multiple human beings and extract their silhouettes with an active stereo camera. The proposed system consists of three modules: detection, tracking, and silhouette extraction. Detection is performed by camera ego-motion compensation and disparity segmentation. For tracking, we present an efficient mean shift based tracking method in which the tracked objects are characterized by disparity-weighted color histograms. The silhouette is obtained by two-step segmentation: a trimap is estimated in advance and then incorporated into a graph cut framework for fine segmentation. The proposed system was evaluated against ground truth data and shown to detect and track multiple people well while producing high-quality silhouettes. The proposed system can assist gesture and gait recognition in the field of Human-Robot Interaction (HRI).
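A disparity-weighted color histogram can be built very compactly. The sketch below is one plausible reading of the idea, assuming each pixel's vote in a hue histogram is weighted by its stereo disparity so that nearer pixels (larger disparity) dominate the target model; the paper's exact color space, bin count, and weighting are not specified here.

```python
import numpy as np

def disparity_weighted_histogram(hue, disparity, n_bins=16):
    """Hue histogram in which each pixel's vote is weighted by its
    disparity, so nearer pixels dominate the appearance model.
    hue: (H, W) values in [0, 1); disparity: (H, W) non-negative."""
    bins = np.minimum((hue * n_bins).astype(int), n_bins - 1)
    hist = np.bincount(bins.ravel(), weights=disparity.ravel(),
                       minlength=n_bins)
    s = hist.sum()
    return hist / s if s > 0 else hist  # normalize to a distribution
```

Mean shift would then iteratively move the tracking window toward the location whose candidate histogram best matches this model (e.g. by the Bhattacharyya coefficient).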

Fast Video Stabilization Method Using Integral Image (적분 영상을 이용한 고속 비디오 안정화 기법)

  • Kwon, Young-Man;Lim, Myung-Jae;Oh, Byung-Hun
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.10 no.5 / pp.13-20 / 2010
  • In this article, we propose a new technique for fast video stabilization using an integral image. In the proposed technique, local and global motion are estimated by block matching on the integral image generated for each frame, and motion such as jitter is then compensated. We created various experimental jitter patterns to evaluate the effectiveness of the proposed technique, comparing its stabilization capability and execution time with those of existing methods. Through these experiments, we found that the proposed technique executed faster than existing techniques while still compensating jitter well.
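The integral image is what makes the block matching fast: once it is built, the sum over any rectangular block costs four lookups instead of a full scan. The sketch below shows this standard primitive only; the paper's specific matching criterion and search strategy are not reproduced.

```python
import numpy as np

def integral_image(img):
    """Summed-area table with a zero row and column prepended, so that
    ii[y, x] is the sum of img[:y, :x]."""
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
    ii[1:, 1:] = np.cumsum(np.cumsum(img, axis=0), axis=1)
    return ii

def block_sum(ii, top, left, h, w):
    """Sum of img[top:top+h, left:left+w] in O(1) via four corner lookups."""
    return (ii[top + h, left + w] - ii[top, left + w]
            - ii[top + h, left] + ii[top, left])
```

Block matching then compares such constant-time block statistics between consecutive frames over a search window to estimate the local motion vectors.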

Grandmothers' Experience of Child Care with Grandchildren: A Grounded Theory Approach (손자녀를 돌보는 외할머니의 양육경험: 근거이론적 접근)

  • Won, Mi-Ra;Lee, Sun-Hee;Kim, Hyun-Kyoung;Yoo, Hye-Yeong;Park, Jung-Uk
    • Korean Parent-Child Health Journal / v.15 no.1 / pp.39-49 / 2012
  • Purpose: This study examined grandmothers' experience of caring for their grandchildren to understand how it affects this stage of their lives, based on the methodology of grounded theory. Methods: This study used grounded theory, a qualitative research method, as its theoretical foundation. The subjects were 10 grandmothers who had reared their grandchildren for at least 6 months and volunteered to participate after listening to an explanation of the study. Data were collected through in-depth interviews, which were recorded and transcribed. Results: The core category for the participants was "solidifying the family relationship", which unfolded in five steps. The first step was the "affection motion" of rearing to help their daughters. The second was "conflict" due to physical constraints and the burden of child care. The third was "acceptance" based on compensation and support, and the fourth was "development" in search of their ego. The last step was the "integration" of family relationships. Conclusion: This study provides basic data for appropriate nursing interventions for grandmothers caring for their grandchildren, according to the five steps of the child care experience process.
