• Title/Summary/Keyword: EO/IR


A Study of Test Method for Position Reporting Accuracy of Airborne Camera (항공기 탑재용 카메라 위치출력오차 측정방안 연구)

  • Song, Dae-Buem;Yoon, Yong-Eun
    • Journal of the Korea Institute of Military Science and Technology
    • /
    • v.16 no.5
    • /
    • pp.646-652
    • /
    • 2013
  • PRA (Position Reporting Accuracy) of an EO/IR (electro-optic/infrared) airborne camera is an important factor in geo-pointing accuracy. A rate table is generally used to measure the PRA of a gimbal-actuated camera such as an EO/IR sensor. However, it is not always possible to mount the EUT (Equipment Under Test) on a rate table because of the table's limits on the size and weight of the object; our EO/IR camera is too big and heavy to be placed on it. We therefore propose a new method of verifying the PRA of an airborne camera and assess its validity. The method uses a collimator, an angle measuring instrument, a 6-DOF motion simulator, an optical surface plate, a leveling laser, an inclinometer, and a poster (for alignment).
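The arithmetic behind a PRA figure reduces to comparing reported gimbal angles against reference angles from the external instruments. A minimal sketch, with hypothetical angle values standing in for real collimator/angle-instrument readings:

```python
import math

# Hypothetical example: gimbal angles reported by the camera vs. reference
# angles from an external angle-measuring instrument, in degrees.
reported  = [10.02, 20.05, 29.97, 40.04, 50.01]
reference = [10.00, 20.00, 30.00, 40.00, 50.00]

errors = [r - t for r, t in zip(reported, reference)]
mean_err = sum(errors) / len(errors)                           # bias
rms_err  = math.sqrt(sum(e * e for e in errors) / len(errors)) # RMS error

print(f"bias = {mean_err:.4f} deg, RMS = {rms_err:.4f} deg")
```

Bias and RMS together characterize position-reporting error: a large bias suggests a fixable alignment offset, while a large RMS indicates noise or nonlinearity in the angle reporting.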

Establishment of Test & Evaluation Criteria in the Military Electro-Optical / Infrared Devices (군용 EO/IR장비의 시험평가 기준 정립방안)

  • Park, Jong Wan
    • Journal of the Korea Institute of Military Science and Technology
    • /
    • v.19 no.5
    • /
    • pp.613-617
    • /
    • 2016
  • Parties involved in development, program managers, and test evaluators have experienced conflicts among themselves during test-evaluation planning and evaluation because of the absence of standardized criteria for military electro-optical (EO) and infrared (IR) devices. Objective test and evaluation criteria for EO/IR devices therefore need to be established. This paper adopts, respectively, a minimum visibility range of 15 km based on South Korea's average weather measurements, target resolutions of 3 bars from the Johnson criteria for EO devices and 4 bars from NATO STANAG-4347 for IR devices, and a 50 % probability as the evaluation criterion. Based on these, suitable criteria can be established for the weapon system under development, in consideration of the required capabilities of the demanding forces and the field environment.
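The bar counts above translate into range limits through simple angular geometry: a sensor must place the required number of resolvable cycles (bar pairs) across the target's critical dimension. A hedged sketch of that arithmetic, where the IFOV and target dimension are illustrative assumptions, not values from the paper:

```python
# Johnson-criteria arithmetic: max range at which a sensor resolves
# N cycles across a target of critical dimension h.
# IFOV and h below are assumed for illustration only.

IFOV = 50e-6        # instantaneous field of view per pixel [rad] (assumed)
cycle = 2 * IFOV    # one resolvable cycle (line pair) spans ~2 pixels
h = 2.3             # target critical dimension [m] (assumed)

for n_req, band in [(3, "EO"), (4, "IR")]:
    max_range = h / (n_req * cycle)   # small-angle approximation
    print(f"{band}: {n_req} cycles -> max range ~ {max_range / 1000:.1f} km")
```

Under these assumed numbers the stricter 4-bar IR requirement yields a shorter maximum evaluation range than the 3-bar EO requirement, which is why the two bands get separate criteria.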

Multi-task Architecture for Single Image Dynamic Blur Restoration and Motion Estimation (단일 영상 비균일 블러 제거를 위한 다중 학습 구조)

  • Jung, Hyungjoo;Jang, Hyunsung;Ha, Namkoo;Yeon, Yoonmo;Kwon, Ku yong;Sohn, Kwanghoon
    • Journal of Korea Multimedia Society
    • /
    • v.22 no.10
    • /
    • pp.1149-1159
    • /
    • 2019
  • We present a novel deep learning architecture for recovering a latent image from a single blurry image containing dynamic motion blur caused by object/camera movement. The proposed architecture consists of two sub-modules: blurry image restoration and optical flow estimation. The tasks are highly related in that object/camera movements cause blur artifacts, and those same movements are estimated through optical flow. An ablation study demonstrates that training the multi-task architecture jointly improves both tasks compared to handling them separately. Objective and subjective evaluations show that our method outperforms state-of-the-art deep-learning-based techniques.
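The core of joint multi-task training is a single objective that sums the per-task losses, so gradients from both tasks update the shared parameters. A minimal sketch of that combination; the loss values and weight `lam` are illustrative placeholders, not values from the paper:

```python
# Joint multi-task objective: L = L_restore + lambda * L_flow.
# Both terms backpropagate through the shared sub-modules during training.

def multitask_loss(restoration_loss, flow_loss, lam=0.1):
    """Weighted sum of the restoration and optical-flow losses."""
    return restoration_loss + lam * flow_loss

# Placeholder per-task loss values for illustration.
total = multitask_loss(restoration_loss=0.42, flow_loss=1.5)
print(total)
```

The weight `lam` balances the two tasks; in practice it is tuned so neither task dominates the shared representation.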

Imaging Performance Analysis of an EO/IR Dual Band Airborne Camera

  • Lee, Jun-Ho;Jung, Yong-Suk;Ryoo, Seung-Yeol;Kim, Young-Ju;Park, Byong-Ug;Kim, Hyun-Jung;Youn, Sung-Kie;Park, Kwang-Woo;Lee, Haeng-Bok
    • Journal of the Optical Society of Korea
    • /
    • v.15 no.2
    • /
    • pp.174-181
    • /
    • 2011
  • An airborne sensor is developed for remote sensing on an aerial vehicle (UV). The sensor is an optical payload for an electro-optical/infrared (EO/IR) dual-band camera that combines visible and IR imaging capabilities in a compact, lightweight package. It adopts a Ritchey-Chrétien telescope for the common front-end optics, with several relay optics that divide and deliver the EO and IR bands to a charge-coupled device (CCD) and an IR detector, respectively. The dual-band EO/IR camera is mounted on a two-axis gimbal that provides stabilized imaging and precision pointing in both the along-track and cross-track directions. We first investigate the mechanical deformations, displacements, and stresses of the EO/IR camera through finite element analysis (FEA) for five cases: three gravitational effects and two thermal conditions. For the gravitational effects, one gravitational acceleration (1 g) is applied along each of the +x, +y, and +z directions. The two thermal conditions are an overall temperature change from 20 °C to 30 °C and a temperature gradient across the primary mirror pupil from -5 °C to +5 °C. Optical performance, represented by the modulation transfer function (MTF), is then predicted by integrating the FEA results into optical design/analysis software. The analysis shows that the IR channel can sustain its designed imaging performance, i.e., MTF of 38% at 13 line pairs per mm (lpm), with refocus capability. Similarly, the EO channel keeps its designed performance (MTF of 73% at 27.3 lpm) except under the overall temperature change, where it experiences slight degradation (a 16% MTF drop) for the 20 °C overall temperature change.
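The MTF figures quoted above can be put in context with the diffraction-limited MTF of an ideal circular aperture, MTF(ν) = (2/π)[arccos(x) − x√(1 − x²)] with x = ν/νc and cutoff νc = 1/(λ·F#). A sketch of that standard formula; the wavelength and F-number below are illustrative assumptions, not the camera's actual parameters:

```python
import math

# Diffraction-limited MTF of an ideal circular aperture.
# v is spatial frequency in cycles/mm; wavelength in mm.

def diffraction_mtf(v, wavelength_mm, f_number):
    v_c = 1.0 / (wavelength_mm * f_number)   # cutoff frequency [cycles/mm]
    x = min(v / v_c, 1.0)                    # clamp beyond cutoff
    return (2 / math.pi) * (math.acos(x) - x * math.sqrt(1 - x * x))

# IR-band example: lambda = 4 um = 0.004 mm, F/4 (assumed values)
print(diffraction_mtf(13, 0.004, 4.0))   # ideal MTF at 13 lp/mm
```

A measured or FEA-predicted MTF (such as 38% at 13 lpm) is always below this ideal curve; the gap quantifies how much aberration and structural deformation cost relative to the diffraction limit.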

Automatic FOD Detection Test Using EO/IR Laser Light Camera (EO/IR Laser Light 카메라를 이용한 FOD 자동탐지 시험)

  • Shin, Hyun-Sung;Hong, Gyo-Young;Hong, Jae-Beom;Choi, Young-Soo;Kim, Yun-Seob
    • Journal of Advanced Navigation Technology
    • /
    • v.21 no.6
    • /
    • pp.638-642
    • /
    • 2017
  • FOD is a generic term for objects with the potential to pose a fatal risk to aircraft; it therefore requires attention in all areas of an airport. In particular, manual detection and collection of FOD on runways and aircraft movement areas is highly inefficient and uneconomical for airport operation, so it is essential to develop an automatic FOD detection system suited to the domestic environment. As part of the aviation safety technology development project, an automatic detection system for foreign objects in aircraft movement areas inside airports is under development. In this paper, we confirm the detection of foreign objects using an EO/IR camera at the Taean Airfield of Hanseo University; the EO camera is used during the day and the IR camera at night.
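At its simplest, camera-based FOD detection compares the current runway image against a clean reference frame and flags pixels whose change exceeds a threshold. A minimal sketch of that idea; the tiny 2-D "frames" and the threshold are illustrative stand-ins, not the system described in the paper:

```python
# Toy frame-differencing FOD detection: flag pixels that differ from a
# clean reference frame by more than a threshold. Values are illustrative.

reference = [
    [10, 10, 10, 10],
    [10, 10, 10, 10],
    [10, 10, 10, 10],
]
current = [
    [10, 10, 10, 10],
    [10, 90, 10, 10],   # a bright foreign object appears here
    [10, 10, 10, 10],
]

THRESHOLD = 30
detections = [
    (r, c)
    for r, row in enumerate(current)
    for c, val in enumerate(row)
    if abs(val - reference[r][c]) > THRESHOLD
]
print(detections)   # [(1, 1)]
```

A real system adds illumination normalization and false-alarm filtering, which is part of why separate EO (day) and IR (night) channels are used.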

Synthesis of Biodegradable Polymers with Carbon Dioxide (이산화탄소를 이용한 생분해성 고분자의 합성)

  • Shin Sang Chul;Shin Jae Shik;Lee Yoon Rae
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.5 no.6
    • /
    • pp.521-525
    • /
    • 2004
  • The biodegradation of poly(ethylene carbonate) (PEC) and its terpolymers has been investigated in vitro. PEC was synthesized from ethylene oxide (EO) and carbon dioxide, one of the greenhouse gases, using zinc glutarate as the catalyst. Carbonate terpolymers were prepared from EO, cyclohexene oxide (CHO), and carbon dioxide. High biodegradability was observed for PEC and the terpolymers containing EO, whereas very low biodegradation was shown for poly(propylene carbonate) (PPC) and poly(cyclohexene carbonate) (PCHC). Weight loss, FT-IR, and SEM were employed to characterize biodegradability.


Deep Multi-task Network for Simultaneous Hazy Image Semantic Segmentation and Dehazing (안개영상의 의미론적 분할 및 안개제거를 위한 심층 멀티태스크 네트워크)

  • Song, Taeyong;Jang, Hyunsung;Ha, Namkoo;Yeon, Yoonmo;Kwon, Kuyong;Sohn, Kwanghoon
    • Journal of Korea Multimedia Society
    • /
    • v.22 no.9
    • /
    • pp.1000-1010
    • /
    • 2019
  • Image semantic segmentation and dehazing are key tasks in computer vision. In recent years, research on both tasks has achieved substantial performance improvements with the development of convolutional neural networks (CNNs). However, most previous works on semantic segmentation assume that images are captured in clear weather, and they show degraded performance on hazy images with low contrast and faded colors. Meanwhile, dehazing aims to recover a clear image from an observed hazy image, an ill-posed problem that can be alleviated with additional information about the image. In this work, we propose a deep multi-task network for simultaneous semantic segmentation and dehazing. The proposed network takes a single hazy image as input and predicts a dense semantic segmentation map and a clear image. The visual information refined during the dehazing process can help the recognition task of semantic segmentation; conversely, the semantic features obtained during segmentation can provide color priors for objects, which can help the dehazing process. Experimental results demonstrate the effectiveness of the proposed multi-task approach, showing improved performance compared to separate networks.
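Dehazing is commonly posed with the atmospheric scattering model I(x) = J(x)·t(x) + A·(1 − t(x)), where J is the clear scene, t the transmission, and A the airlight. A minimal sketch that synthesizes haze on a toy "image" and inverts the model with known t and A; all values are illustrative, not from the paper:

```python
# Atmospheric scattering model: I = J*t + A*(1 - t).
# Inverse (given t and A):      J = (I - A)/t + A.
# Airlight, transmission, and intensities below are assumed values.

A = 0.9   # airlight (assumed)
t = 0.6   # transmission (assumed constant here for simplicity)

clear = [0.2, 0.5, 0.8]                       # toy clear-scene intensities J
hazy  = [j * t + A * (1 - t) for j in clear]  # forward model -> hazy image I

recovered = [(i - A) / t + A for i in hazy]   # invert the model
print([round(j, 6) for j in recovered])       # [0.2, 0.5, 0.8]
```

The ill-posedness the abstract mentions comes from t and A being unknown in practice; the multi-task network's semantic cues are one source of the extra information needed to estimate them.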

Deep Unsupervised Learning for Rain Streak Removal using Time-varying Rain Streak Scene (시간에 따라 변화하는 빗줄기 장면을 이용한 딥러닝 기반 비지도 학습 빗줄기 제거 기법)

  • Cho, Jaehoon;Jang, Hyunsung;Ha, Namkoo;Lee, Seungha;Park, Sungsoon;Sohn, Kwanghoon
    • Journal of Korea Multimedia Society
    • /
    • v.22 no.1
    • /
    • pp.1-9
    • /
    • 2019
  • Single-image rain removal is a typical inverse problem that decomposes an image into a background scene and rain streaks. Recent works have made substantial progress on the task thanks to the development of convolutional neural networks (CNNs). However, existing CNN-based approaches train the network with synthetically generated examples, which tend to bias the network toward synthetic scenes. In this paper, we present an unsupervised framework for removing rain streaks from real-world rainy images. We exploit the natural phenomenon that static rainy scenes share a common background but exhibit different rain streaks over time. Based on this observation, we train a Siamese network on pairs of real rain images so that it outputs identical backgrounds for each pair. To train the network, a real rainy dataset is constructed via web crawling. We show that our unsupervised framework outperforms recent CNN-based approaches trained in a supervised manner. Experimental results demonstrate the effectiveness of our framework on both synthetic and real-world datasets, showing improved performance over previous approaches.
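The Siamese training signal reduces to a consistency term: two rainy frames of the same static scene should map to the same background, so the objective penalizes the difference between the two predicted backgrounds. A minimal sketch; the tiny "predictions" are illustrative placeholders, not network outputs from the paper:

```python
# Siamese consistency loss: mean squared difference between the backgrounds
# predicted from two rainy frames of the same static scene.

def consistency_loss(bg_a, bg_b):
    """Mean squared difference between the two predicted backgrounds."""
    return sum((a - b) ** 2 for a, b in zip(bg_a, bg_b)) / len(bg_a)

pred_a = [0.30, 0.52, 0.70]   # background predicted from rainy frame A
pred_b = [0.32, 0.50, 0.70]   # background predicted from rainy frame B

print(consistency_loss(pred_a, pred_b))
```

Because the rain streaks differ between the two frames while the background does not, minimizing this loss pushes the network to suppress the streaks without ever needing a clean ground-truth image.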

Experiment on Automatic Detection of Airport Debris (FOD) using EO/IR Cameras and Radar (EO/IR 카메라 및 레이더를 이용한 공항 이물질(FOD) 자동탐지 실험)

  • Hong, Jae-Beom;Kang, Min-Soo;Kim, Yun-Seob;Kim, Min-Soo;Hong, Gyo-Young
    • Journal of Advanced Navigation Technology
    • /
    • v.22 no.6
    • /
    • pp.522-529
    • /
    • 2018
  • FOD refers to the various metallic and non-metallic foreign objects that pose a risk to aircraft. FOD occurs in all areas and at all times, including on runways, taxiways, and in maintenance facilities, and poses a fatal hazard to aircraft safety during taxiing, take-off, and landing. Rapid and effective detection and removal of FOD on runways is therefore required. As part of recent developments in aviation safety technology, automatic detection of debris on airport runways is under way. In this paper, we conducted an automated detection test using an EO/IR camera and radar at the Taean campus of Hanseo University to confirm normal detection during both day and night.

Design of Control Logic, and Experiment for Large Torque CMG (대형 토크 제어모멘트자이로의 제어로직 설계 및 실험)

  • Lee, Jong-Kuk;Song, Tae-Seong;Kang, Jeong-Min;Song, Deok-Ki;Kwon, Jun-Beom;Seo, Joong-Bo;Oh, Hwa-Suk;Cheon, Dong-Ik;Park, Sang-Sup;Lee, Jun-Yong
    • Journal of the Korean Society for Aeronautical & Space Sciences
    • /
    • v.49 no.4
    • /
    • pp.291-299
    • /
    • 2021
  • This paper presents the control logic for the momentum wheel and gimbals of a CMG system. First, the design of the momentum-wheel control logic is described in consideration of power consumption and stability. Second, the design of the gimbal control logic is explained, taking into account the resonance of the vibration absorber and stability. Third, the measurement configuration for the force and torque generated by the CMG is described. Fourth, the results of frequency- and time-response tests of the momentum wheel and gimbals are shown. Last, the measured force and torque generated by the CMG are discussed.
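The output torque a CMG generates follows the standard gyroscopic relation τ = ω_g × h: gimballing the spinning wheel's angular momentum h at rate ω_g produces a torque orthogonal to both. A minimal sketch of that relation; the numbers are illustrative, not the specifications of the CMG in the paper:

```python
# Gyroscopic output torque of a CMG: tau = omega_g x h.
# Momentum and gimbal-rate values below are assumed for illustration.

def cross(a, b):
    """Cross product of two 3-vectors."""
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

h       = [0.0, 0.0, 50.0]   # wheel angular momentum [N·m·s] (assumed)
omega_g = [1.0, 0.0, 0.0]    # gimbal rate [rad/s] (assumed)

tau = cross(omega_g, h)
print(tau)   # [0.0, -50.0, 0.0] -> 50 N·m output torque
```

This is why a CMG achieves large torque from a modest gimbal motor: the torque scales with the stored momentum h, not with the gimbal motor's own torque.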