• Title/Summary/Keyword: Optical Camera


Collaborative Obstacle Avoidance Method of Surface and Aerial Drones based on Acoustic Information and Optical Image (음향정보 및 광학영상 기반의 수상 및 공중 드론의 협력적 장애물회피 기법)

  • Man, Dong-Woo;Ki, Hyeon-Seung;Kim, Hyun-Sik
    • The Transactions of The Korean Institute of Electrical Engineers
    • /
    • v.64 no.7
    • /
    • pp.1081-1087
    • /
    • 2015
  • Recently, research on aerial drones has been actively conducted in various areas, and research on surface and underwater drones is also being carried out in the marine domain. Surface drones essentially rely on acoustic information from sonar, and consequently have only local information for obstacle avoidance, as sonar is limited by its beam width and detection range. To overcome this, a more global method that utilizes optical images from a camera is required. In this regard, mounting the camera on an aerial drone is desirable, since obstacle detection by a camera on the surface drone itself is impossible in the presence of clutter. However, a dynamically floating aerial drone is unsuitable for long-term operation because of its high power consumption. To solve this problem, a collaborative obstacle avoidance method is proposed that is based on the acoustic information from the surface drone's sonar and the optical image from the camera of a statically floating aerial drone. To verify the performance of the proposed method, collaborative obstacle avoidance is executed with an MSD (Micro Surface Drone) equipped with an OAS (Obstacle Avoidance Sonar) and a BMAD (Balloon-based Micro Aerial Drone) equipped with a camera. The test results show the feasibility of real applications and the need for further studies.
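As a rough illustration of the fusion idea in this abstract, the sketch below unions the obstacle bearings reported by the surface drone's sonar (local) with those seen by the aerial drone's camera (global) and greedily picks the heading nearest the goal that clears both. The function name, the bearing-set representation, and the clearance/step values are all illustrative assumptions, not the authors' method.

```python
import math

def safe_heading(goal_bearing, sonar_bearings, camera_bearings,
                 clearance=math.radians(20), step=math.radians(5)):
    """Pick the heading closest to the goal that clears obstacles seen by
    BOTH sensors. Fusing by a simple union of obstacle bearings is an
    illustrative assumption, not the paper's actual fusion rule."""
    blocked = sonar_bearings | camera_bearings
    for i in range(36):                      # search outward from the goal
        for sign in (1, -1):
            h = goal_bearing + sign * i * step
            if all(abs(h - b) > clearance for b in blocked):
                return h
    return None                              # no safe heading found

# Sonar sees an obstacle dead ahead; the camera adds one the sonar missed.
h = safe_heading(0.0, {0.0}, {math.radians(25)})
print(round(math.degrees(h), 1))  # -25.0
```

The camera's global view here rules out the +25° detour that sonar alone would have accepted, which is the point of the collaboration.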

Transition-based Data Decoding for Optical Camera Communications Using a Rolling Shutter Camera

  • Kim, Byung Wook;Lee, Ji-Hwan;Jung, Sung-Yoon
    • Current Optics and Photonics
    • /
    • v.2 no.5
    • /
    • pp.422-430
    • /
    • 2018
  • The rolling shutter operation of CMOS cameras can be utilized in optical camera communications to transmit data from an LED to mobile devices such as smartphones. From temporally modulated light, a spatial flicker pattern is obtained in the captured image and used for signal recovery. Due to the degradation of rolling shutter images caused by light smear, motion blur, and focus blur, conventional decoding schemes for rolling shutter cameras based on the pattern widths of 'OFF' and 'ON' cannot guarantee robust communication performance in practical use. Aside from conventional techniques such as polynomial fitting, histogram equalization can be used to mitigate blurry light, but it requires additional computation, placing a burden on mobile devices. This paper proposes a transition-based decoding scheme for rolling shutter cameras that offers simple and robust data decoding in the presence of image degradation. Based on the designed synchronization pulse and data symbols modulated according to the LED dimming level, decoding is conducted by observing the transition patterns of two sequential symbol pulses. The extended symbol pulse caused by consecutive symbol pulses of the same level determines whether the second pulse should be included in the next bit decoding. The proposed method identifies the transition patterns of sequential symbol pulses rather than the pattern widths of 'OFF' and 'ON', and is thus simpler and more accurate. Experimental results confirm that the transition-based decoding scheme is robust even in the presence of blurry lights in the captured image at various dimming levels.
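The transition-based idea can be sketched as follows: binarize the rolling-shutter intensity profile, then decode by whether the level *changes* at each symbol boundary rather than by measuring ON/OFF band widths. The band extraction, thresholding, and the change-means-1 mapping below are illustrative assumptions, not the paper's exact symbol design.

```python
import numpy as np

def binarize_profile(profile, thresh=None):
    """Binarize a 1-D rolling-shutter intensity profile (e.g. row means)."""
    if thresh is None:
        thresh = profile.mean()          # global threshold; illustrative only
    return (profile > thresh).astype(int)

def transition_decode(levels, symbol_len):
    """Decode bits from level TRANSITIONS rather than ON/OFF widths.

    Illustrative scheme: a level change at a symbol boundary -> bit 1,
    no change -> bit 0. Width measurement never enters, so moderate blur
    that shifts band edges matters less than for width-based decoding.
    """
    bits = []
    prev = levels[0]
    for i in range(symbol_len, len(levels), symbol_len):
        cur = levels[i]
        bits.append(1 if cur != prev else 0)
        prev = cur
    return bits

# Synthetic profile: 4 symbols of 5 rows each, levels 0,1,1,0 -> bits 1,0,1
profile = np.array([0]*5 + [10]*5 + [10]*5 + [0]*5, dtype=float)
levels = binarize_profile(profile)
print(transition_decode(levels, symbol_len=5))  # [1, 0, 1]
```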

The Overview of CEU Development for a Payload

  • Kong, Jong-Pil;Heo, Haeng-Pal;Kim, Young-Sun;Park, Jong-Euk;Chang, Young-Jun
    • Proceedings of the KSRS Conference
    • /
    • v.2
    • /
    • pp.797-799
    • /
    • 2006
  • The electro-optical camera subsystem, as a payload of a satellite system, consists of the OM (Optical Module) and the CEU (Camera Electronics Unit); much of the camera subsystem's performance depends on the CEU, in which TDI CCDs (Time Delayed Integration Charge-Coupled Devices) perform the main imaging role by converting light intensity into a measurable voltage signal. It is therefore necessary to specify and design the CEU carefully at an early stage of development, covering the overall specifications, design considerations, calibration definitions, and test methods for key performance parameters. This paper describes the overview of CEU development. It lists the key requirement characteristics of the CEU hardware along with design considerations, describes the calibrations required for the CEU, and defines the test and evaluation conditions used in verifying the CEU's requirement specifications during acceptance testing, taking into account that CEU performance results vary considerably with test and evaluation conditions such as operational line rate, TDI level, and light intensity level.
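One reason the TDI level matters so much as a test condition is the standard TDI signal model: signal charge adds coherently over the N stages while shot noise grows only as its square root. The sketch below is a generic shot-noise-plus-read-noise model with illustrative numbers, not this CEU's actual noise budget.

```python
import math

def tdi_snr(signal_per_stage, read_noise, tdi_level):
    """Simplified SNR model for a TDI CCD (units: electrons).

    Signal adds coherently over the N TDI stages; shot noise variance
    equals the accumulated signal; a single readout at the end of the
    register contributes read_noise once. Illustrative model only.
    """
    signal = signal_per_stage * tdi_level
    shot_var = signal                      # Poisson: variance = mean
    return signal / math.sqrt(shot_var + read_noise**2)

for n in (1, 16, 64):
    print(n, round(tdi_snr(100.0, 30.0, n), 1))
```

With 100 e-/stage and 30 e- read noise this gives SNR ≈ 3.2, 32.0, and 74.9 for TDI levels 1, 16, and 64, which is why performance numbers quoted without the TDI level and light intensity level are not comparable.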


Pose Estimation of Ground Test Bed using Ceiling Landmark and Optical Flow Based on Single Camera/IMU Fusion (천정부착 랜드마크와 광류를 이용한 단일 카메라/관성 센서 융합 기반의 인공위성 지상시험장치의 위치 및 자세 추정)

  • Shin, Ok-Shik;Park, Chan-Gook
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.18 no.1
    • /
    • pp.54-61
    • /
    • 2012
  • In this paper, a pose estimation method for a satellite GTB (Ground Test Bed) using a vision/MEMS IMU (Inertial Measurement Unit) integrated system is presented. The GTB, used to verify a satellite system on the ground, is similar to a mobile robot that has thrusters and a reaction wheel as actuators and floats over the floor on compressed air. An EKF (Extended Kalman Filter) is used to fuse the MEMS IMU with a vision system consisting of a single camera and infrared LEDs serving as ceiling landmarks. The fusion filter generally uses the positions of feature points from the image as measurements. However, this approach can cause position error due to the bias of the MEMS IMU when the camera image is not available, if the bias is not properly estimated by the filter. Therefore, a fusion method is proposed that uses both the positions of the feature points and the camera velocity determined from the optical flow of the feature points. Experiments verify that the proposed method is more robust to IMU bias than the method that uses only the positions of the feature points.
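The benefit of adding an optical-flow velocity measurement can be shown with a much simpler model than the paper's EKF: a 1-D linear Kalman filter whose state includes an accelerometer bias, propagated with a biased IMU reading and corrected with both a landmark position and a flow-derived velocity. Everything below (dimensions, noise values, trajectory) is an illustrative reduction, not the authors' filter.

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 0.01
true_bias = 0.5                      # constant accelerometer bias (m/s^2)

# State: [position, velocity, accel bias]; the filter is fed the biased
# IMU reading a_m = a + b, so true accel = a_m - bias enters with -bias.
F = np.array([[1.0, dt, -0.5*dt**2],
              [0.0, 1.0, -dt      ],
              [0.0, 0.0,  1.0     ]])
B = np.array([0.5*dt**2, dt, 0.0])   # input matrix for measured accel
H = np.array([[1.0, 0.0, 0.0],       # landmark position from the camera
              [0.0, 1.0, 0.0]])      # velocity from optical flow
Q = np.eye(3) * 1e-8
R = np.diag([0.01**2, 0.02**2])

x = np.zeros(3)
P = np.eye(3)
pos = vel = 0.0
for k in range(2000):
    accel = np.sin(0.01 * k)                     # true acceleration
    pos += vel*dt + 0.5*accel*dt**2              # simulate truth
    vel += accel*dt
    x = F @ x + B * (accel + true_bias)          # propagate with biased IMU
    P = F @ P @ F.T + Q
    z = np.array([pos, vel]) + rng.normal(0, [0.01, 0.02])
    S = H @ P @ H.T + R                          # joint position+velocity update
    K = P @ H.T @ np.linalg.inv(S)
    x = x + K @ (z - H @ x)
    P = (np.eye(3) - K @ H) @ P

print(round(x[2], 2))                            # estimated bias, near 0.5
```

With the velocity row of H removed, the bias is only weakly observable through integrated position, which is the failure mode the abstract describes when camera images drop out.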

Novel Calibration Method for the Multi-Camera Measurement System

  • Wang, Xinlei
    • Journal of the Optical Society of Korea
    • /
    • v.18 no.6
    • /
    • pp.746-752
    • /
    • 2014
  • In a multi-camera measurement system, determining the external parameters, referred to as the calibration of the system, is one of the vital tasks. In this paper, a new geometrical calibration method based on the theory of the vanishing line is proposed. Using a planar target with three equally spaced parallel lines, the normal vector of the target plane can easily be determined in every camera coordinate system of the measurement system. By moving the target to more than two different positions, the rotation matrix can be determined from the related theory, i.e., the expression of the same vector in different coordinate systems. Moreover, the translation matrix can be derived from the known distance between the adjacent parallel lines. The main factors affecting the calibration are analyzed. Simulations show that the proposed method achieves robustness and accuracy. Experimental results show that the calibration accuracy reaches 1.25 mm over a range of about 0.5 m. Furthermore, this calibration method can also be used for auto-calibration of a multi-camera measurement system, as the feature of parallel lines exists widely.
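The rotation-from-corresponding-vectors step ("the expression of the same vector in different coordinate systems") is commonly solved with the Kabsch/SVD method: given the same direction, such as the target-plane normal at several poses, expressed in two camera frames, the aligning rotation falls out of an SVD. This sketch shows that step in isolation and is not the paper's implementation.

```python
import numpy as np

def rotation_from_directions(dirs_a, dirs_b):
    """Find rotation R such that dirs_b[i] ~= R @ dirs_a[i] (Kabsch/SVD).

    dirs_a, dirs_b: (n, 3) arrays of corresponding unit directions, e.g.
    the target-plane normal at n target poses seen from two cameras.
    Needs at least two non-parallel directions for a unique R.
    """
    A = np.asarray(dirs_a, float)
    B = np.asarray(dirs_b, float)
    H = A.T @ B                         # 3x3 correlation: sum of a_i b_i^T
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    return Vt.T @ D @ U.T               # det correction keeps it a rotation

# Ground-truth rotation between camera frames: 30 degrees about z.
c, s = np.cos(np.pi/6), np.sin(np.pi/6)
R_true = np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])
a = np.eye(3)                           # three target-pose normals (frame A)
b = (R_true @ a.T).T                    # same normals seen in frame B
R_est = rotation_from_directions(a, b)
print(np.allclose(R_est, R_true))       # True
```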

Three-Dimensional Measurement of Moving Surface Using Circular Dynamic Stereo

  • Lee, Man-Hyung;Hong, Suh-Il
    • Proceedings of the ICROS (Institute of Control, Robotics and Systems) Conference
    • /
    • 2001.10a
    • /
    • pp.101.3-101
    • /
    • 2001
  • By setting a refractor at a certain angle to the optical axis of the CCD camera lens, the image of a measuring point recorded on the image plane is displaced by an amount related to the distance between the camera and the measuring point. When the refractor, keeping its angle to the optical axis, is rotated at high speed during the camera's exposure, the image of a measuring point draws an annular streak. Since the size of the annular streak is inversely proportional to the distance between the camera and the measuring point, the 3D position of the measuring point can be obtained by processing the streak. In this paper, as one application of our system, the measurement of a moving surface is introduced. In order to measure the moving surface, multiple laser spots are projected onto the surface of the object. Each position of ...
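The stated inverse proportionality gives the depth recovery directly: r = k / z, so z = k / r, with the constant k fixed by the refractor angle, rotation geometry, and focal length. The calibration-from-one-reference-point step and all numbers below are illustrative assumptions, not values from the paper.

```python
def depth_from_streak(radius_px, k):
    """Circular dynamic stereo: streak radius r is inversely proportional
    to target distance z, i.e. r = k / z, so z = k / r. The constant k
    must be calibrated for the specific refractor/lens setup (assumed)."""
    return k / radius_px

# Calibrate k from one point at known depth: 40 px streak at 0.5 m.
k = 40.0 * 0.5                       # k = r * z = 20.0 (illustrative)
print(depth_from_streak(20.0, k))    # a 20 px streak -> 1.0 m
```

Because each laser spot yields its own streak, applying this per spot gives a sparse depth map of the moving surface in a single exposure.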
