• Title/Summary/Keyword: camera image

Development of a real-time crop recognition system using a stereo camera

  • Baek, Seung-Min;Kim, Wan-Soo;Kim, Yong-Joo;Chung, Sun-Ok;Nam, Kyu-Chul;Lee, Dae Hyun
    • Korean Journal of Agricultural Science
    • /
    • v.47 no.2
    • /
    • pp.315-326
    • /
    • 2020
  • In this study, a real-time crop recognition system was developed for an unmanned farm machine for upland farming. The system was developed based on a stereo camera, and an image processing framework was proposed that consists of disparity matching, localization of the crop area, and estimation of crop height with coordinate transformations. The performance was evaluated by attaching the crop recognition system to a tractor for five representative crops (cabbage, potato, sesame, radish, and soybean). The test conditions were set at 3 levels of distance to the crop (100, 150, and 200 cm) and 5 levels of camera height (42, 44, 46, 48, and 50 cm). The mean relative error (MRE) was used to compare the measured and estimated heights. The MRE of Chinese cabbage was the lowest at 1.70%, and the MRE of soybean was the highest at 4.97%; crops with a more uniform height distribution are considered to yield a lower MRE. Overall, the heights of all crops were estimated with an MRE of less than 5%. The developed crop recognition system can be applied to various agricultural machinery to enhance the accuracy of crop detection and its performance under various illumination conditions.
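
A minimal sketch of the disparity-to-height chain the abstract describes is given below, assuming a calibrated stereo pair; the intrinsics (fx, fy, cx, cy), baseline, camera mounting height, and pitch are illustrative parameters, not values from the paper.

```python
import numpy as np

def crop_point_height(u, v, disparity, fx, fy, cx, cy,
                      baseline_m, cam_height_m, cam_pitch_rad):
    """Back-project one stereo pixel and return its height above the ground.

    Sketches the 'disparity -> camera coordinates -> ground frame' steps
    mentioned in the abstract; all parameter values are illustrative.
    """
    Z = fx * baseline_m / disparity            # depth from disparity
    Y = (v - cy) * Z / fy                      # camera-frame vertical coordinate (image 'down' is +Y)
    # Rotate by the downward camera pitch into a ground-aligned frame
    Y_ground = Y * np.cos(cam_pitch_rad) + Z * np.sin(cam_pitch_rad)
    return cam_height_m - Y_ground             # height of the point above the soil

def mean_relative_error(measured, estimated):
    """Mean relative error (%) used in the paper to compare crop heights."""
    measured = np.asarray(measured, float)
    estimated = np.asarray(estimated, float)
    return 100.0 * np.mean(np.abs(estimated - measured) / measured)
```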

Timeline Synchronization of Multiple Videos Based on Waveform (소리 파형을 이용한 다수 동영상간 시간축 동기화 기법)

  • Kim, Shin;Yoon, Kyoungro
    • Journal of Broadcast Engineering
    • /
    • v.23 no.2
    • /
    • pp.197-205
    • /
    • 2018
  • Panoramic imaging is one of the technologies commonly used today, but technical difficulties still exist in panoramic video production. Without a special camera such as a 360-degree camera, making a panoramic video becomes even more difficult. To make a panoramic video, it is necessary to synchronize the timelines of multiple videos shot at multiple locations. However, timeline synchronization based on the cameras' internal clocks can introduce errors due to differences in the internal hardware. To solve this problem, timeline synchronization of multiple videos using visual or auditory information has been studied. Visual-information methods suffer from limited accuracy and long processing times, while existing audio-based methods fail to synchronize when the audio is noisy or contains no melody. Therefore, in this paper, we propose a timeline synchronization method for multiple videos based on the audio waveform. It shows higher synchronization accuracy and better temporal efficiency than video-information-based synchronization methods.
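
The paper's exact matching procedure is not reproduced here; a common waveform-based approach, shown as a sketch below, estimates the time offset between two recordings as the lag that maximizes the cross-correlation of their audio tracks (function and parameter names are illustrative).

```python
import numpy as np
from scipy.signal import correlate

def estimate_offset_seconds(audio_a, audio_b, sample_rate):
    """Estimate the time offset between two mono waveforms of the same event.

    Returns a positive value when the shared sound occurs later in audio_a
    than in audio_b; delaying audio_b by that amount aligns the two tracks.
    """
    # Remove DC so the correlation peak reflects waveform shape, not level
    a = np.asarray(audio_a, float) - np.mean(audio_a)
    b = np.asarray(audio_b, float) - np.mean(audio_b)
    corr = correlate(a, b, mode="full")
    lag_samples = np.argmax(corr) - (len(b) - 1)
    return lag_samples / float(sample_rate)
```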

Development of Three-Dimensional Gamma-ray Camera (방사선원 3차원 위치탐지를 위한 방사선 영상장치 개발)

  • Lee, Nam-Ho;Hwang, Young-Gwan;Park, Soon-Yong
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.19 no.2
    • /
    • pp.486-492
    • /
    • 2015
  • A radiation source imaging system is essential for responding to radiation leakage accidents and minimizing damage from radioactive materials, and it is expected to play an important role in nuclear plant decommissioning. In this study, the stereoscopic camera principle was applied to develop a new radiation imaging device that can extract the three-dimensional position of a radiation source. The radiation three-dimensional imaging device (K3-RIS) was designed as a compact structure consisting only of a radiation sensor, a CCD camera, and a pan-tilt unit. It features the acquisition of stereoscopic radiation images by position-change control, high-resolution detection by continuous scan-mode control, and stereoscopic image signal processing. A performance analysis test of K3-RIS was conducted with a gamma-ray source (Cs-137) in a radiation calibration facility. The results showed a performance error of less than 3% regardless of the distance to the source.
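
The reconstruction math is not detailed in the abstract; the basic stereoscopic relation behind imaging the same source from two positions separated by a known baseline is plain triangulation, as in this illustrative sketch.

```python
def triangulate_distance(px_first, px_second, focal_px, baseline_m):
    """Classic stereo triangulation: distance to the source from the pixel
    disparity observed between two acquisitions separated by baseline_m.

    px_first, px_second: horizontal pixel positions of the source peak in
    the two images; focal_px: focal length in pixels. Illustrative only.
    """
    disparity = float(px_first - px_second)
    if disparity == 0.0:
        raise ValueError("source position must shift between acquisitions")
    return focal_px * baseline_m / abs(disparity)  # distance along the optical axis
```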

An Adaptive Switching Mechanism for Three-Dimensional Hybrid Cameras (하이브리드 입체 카메라의 적응적인 스위칭 메커니즘)

  • Jang, Seok-Woo;Choi, Hyun-Jun;Lee, Suk-Yun;Huh, Moon-Haeng
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.14 no.3
    • /
    • pp.1459-1466
    • /
    • 2013
  • Recently, various types of three-dimensional cameras have been used to analyze surrounding environments. In this paper, we suggest a mechanism for adaptively switching between the active and passive cameras of a hybrid camera so that 3D image information can be extracted more accurately. The suggested method first obtains brightness and texture features representing the environment from the input images. It then adaptively selects the active or passive camera through rules that reflect the extracted features. In the experiments, a hybrid 3D camera consisting of passive and active cameras was set up, and the results show that the proposed method effectively chooses the appropriate camera within the hybrid system, making it possible to extract three-dimensional information more accurately.
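
The paper's actual rules are not given in the abstract; a toy illustration of brightness and texture features driving a switching decision might look like the following, where the Laplacian-variance texture measure and both thresholds are assumptions rather than the authors' design.

```python
import cv2
import numpy as np

def select_camera(gray_frame, brightness_thresh=60.0, texture_thresh=100.0):
    """Toy rule for switching between a passive (stereo) and an active
    (e.g. structured-light/ToF) camera from scene brightness and texture.

    gray_frame: 8-bit grayscale image; both thresholds are illustrative.
    """
    brightness = float(np.mean(gray_frame))                        # scene brightness
    texture = float(cv2.Laplacian(gray_frame, cv2.CV_64F).var())   # texture richness
    # Passive stereo needs both light and texture; otherwise use the active camera
    if brightness > brightness_thresh and texture > texture_thresh:
        return "passive"
    return "active"
```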

Development of PKNU3: A small-format, multi-spectral, aerial photographic system

  • Lee Eun-Khung;Choi Chul-Uong;Suh Yong-Cheol
    • Korean Journal of Remote Sensing
    • /
    • v.20 no.5
    • /
    • pp.337-351
    • /
    • 2004
  • Our laboratory developed the compact, multi-spectral, automatic aerial photographic system PKNU3 to allow greater flexibility in geological and environmental data collection. The system, still under development, consists of a color-infrared spectral camera capable of simultaneous photography in the visible and near-infrared bands; a thermal infrared camera; two computers, each with an 80-gigabyte storage capacity for images; an MPEG board that compresses and transfers data to the computers in real time; and the capability of operating from a helicopter platform. Before actual aerial photographic testing of PKNU3, we tested each sensor, analyzing the lens distortion, the sensitivity of the CCD in each band, and the response of the thermal infrared sensor. As of September 2004, PKNU3 development had reached the second phase of testing. In two aerial photographic tests, R, G, B, and IR images were taken simultaneously, and PKNU3 obtained images with an overlap rate of 70% using automatic recording at a 1-s interval. Further study is warranted to enhance the system with the addition of gyroscope and IMU units. We evaluated the PKNU3 system as a method of environmental remote sensing by comparing chlorophyll images derived from PKNU3 photographs. This assessment was supported by an existing study that found a modest improvement in the linear fit between chlorophyll measurements and the RVI, NDVI, and SAVI images derived from photographs taken by a Duncantech MS 3100, which has the same spectral configuration as the MS 4000 used in the PKNU3 system.
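
The vegetation indices mentioned above (RVI, NDVI, SAVI) are standard functions of the red and near-infrared bands; the sketch below uses the conventional SAVI soil-adjustment factor L = 0.5, which is a common default rather than a value stated in the paper.

```python
import numpy as np

def vegetation_indices(red, nir, soil_factor=0.5):
    """RVI, NDVI, and SAVI from red and near-infrared reflectance arrays.

    red, nir: float arrays of equal shape; soil_factor is the usual SAVI
    'L' parameter (0.5 by convention, not a value from the paper).
    """
    red = np.asarray(red, float)
    nir = np.asarray(nir, float)
    eps = 1e-9                                    # guard against division by zero
    rvi = nir / (red + eps)
    ndvi = (nir - red) / (nir + red + eps)
    savi = (1.0 + soil_factor) * (nir - red) / (nir + red + soil_factor + eps)
    return rvi, ndvi, savi
```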

Analysis of the MSC(Multi-Spectral Camera) Operational Parameters

  • Yong, Sang-Soon;Kong, Jong-Pil;Heo, Haeng-Pal;Kim, Young-Sun
    • Korean Journal of Remote Sensing
    • /
    • v.18 no.1
    • /
    • pp.53-59
    • /
    • 2002
  • The MSC is a payload on the KOMPSAT-2 satellite for earth remote sensing. The instrument images the earth in a push-broom motion with a swath width of 15 km and a GSD (Ground Sample Distance) of 1 m over the entire FOV (Field Of View) at an altitude of 685 km. The instrument is designed to have an on-orbit operation duty cycle of 20% over the mission lifetime of 3 years, with programmable gain/offset and on-board image data compression/storage functions. The MSC has one channel for panchromatic imaging and four channels for multi-spectral imaging, covering the spectral range from 450 nm to 900 nm using a TDI (Time Delay Integration) CCD (Charge Coupled Device) FPA (Focal Plane Assembly). The MSC hardware consists of three subsystems, the EOS (Electro-Optic camera Subsystem), PMU (Payload Management Unit), and PDTS (Payload Data Transmission Subsystem); each subsystem is currently under development and will be integrated and verified through functional and space environment tests. The verified MSC will be delivered to the spacecraft bus for AIT (Assembly, Integration and Test), and the KOMPSAT-2 satellite will be launched after verification through the IST (Integrated Satellite Test). In this paper, an introduction to the MSC, the configuration of the MSC electronics including the electrical interface, and the design of the CEU (Camera Electronic Unit) in the EOS are described. MSC operational parameters derived from the operation concept are discussed and analyzed to determine their influence on the system during future on-orbit operation.
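
As a rough illustration of how push-broom operational parameters interact, the line rate required for a given GSD can be estimated from the ground-track velocity; the sketch below uses a textbook circular-orbit approximation, not figures from the MSC design.

```python
import math

MU_EARTH = 3.986004418e14   # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000.0       # mean Earth radius, m

def pushbroom_line_rate(altitude_m, gsd_m):
    """Approximate line rate (lines/s) so that consecutive push-broom lines
    advance one GSD on the ground; circular-orbit approximation only."""
    r = R_EARTH + altitude_m
    v_orbital = math.sqrt(MU_EARTH / r)     # orbital speed
    v_ground = v_orbital * R_EARTH / r      # speed projected onto the ground track
    return v_ground / gsd_m

# Roughly 6,800 lines per second for a 685 km altitude and 1 m GSD
print(round(pushbroom_line_rate(685_000.0, 1.0)))
```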

Model Calculation of Total Radiances for KOMPSAT-2 MSC (다목적실용위성 2호 MSC 총복사량의 모델 계산)

  • 김용승;강치호
    • Korean Journal of Remote Sensing
    • /
    • v.17 no.3
    • /
    • pp.211-218
    • /
    • 2001
  • We calculated total radiances for the KOMPSAT-2 Multispectral Camera (MSC) using the MODTRAN radiative transfer model and examined the results. To simulate four seasonal conditions, we used mid-latitude winter and summer model atmospheres for the January 15 and July 15 calculations, and the US standard atmosphere for April 15 and October 15. The orbital parameters of KOMPSAT-2 and the seasonal solar zenith angles were taken into account. We assumed a meteorological range of 50 km for the tropospheric aerosol extinction and a surface albedo of 0.135, the global average clear-sky albedo. The MSC contract values are found to be considerably greater, over the MSC spectral range, than the total radiances calculated under these general conditions. It is also shown that the spectral behavior of the model results with a constant surface albedo differs from the pattern of the MSC contract values. From these results, it can be inferred that the forthcoming MSC images may be somewhat dark.
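
MODTRAN itself is a full radiative-transfer code; as a crude back-of-the-envelope counterpart, the top-of-atmosphere radiance of a Lambertian surface with no atmosphere reduces to a simple expression in albedo, solar irradiance, and solar zenith angle, sketched below with illustrative inputs.

```python
import math

def toa_radiance(albedo, solar_irradiance, solar_zenith_deg, earth_sun_dist_au=1.0):
    """Top-of-atmosphere spectral radiance (W m^-2 sr^-1 um^-1) for a
    Lambertian surface, neglecting the atmosphere entirely.

    Only a crude stand-in for a MODTRAN run; the 0.135 albedo below is the
    paper's assumption, while the band irradiance is merely illustrative.
    """
    mu_s = math.cos(math.radians(solar_zenith_deg))
    return albedo * solar_irradiance * mu_s / (math.pi * earth_sun_dist_au ** 2)

# Illustrative call: albedo 0.135, a band-mean solar irradiance of
# ~1800 W m^-2 um^-1, and a 45-degree solar zenith angle
print(toa_radiance(0.135, 1800.0, 45.0))
```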

Behavior of the Ultrasonically-atomized Liquid-fuel Flame Injected through a Slit-jet Nozzle (Slit-jet 노즐을 통해 분사되는 초음파 무화 액체연료 화염의 거동)

  • Kim, Min Cheol;Kim, Min Sung;Kim, Jeong Soo
    • Journal of the Korean Society of Propulsion Engineers
    • /
    • v.22 no.6
    • /
    • pp.1-10
    • /
    • 2018
  • An experimental study was performed on the behavior of a burner flame produced by burning liquid hydrocarbon fuel atomized by an ultrasonic transducer. The configurations of the flame and the combustion field were captured by both a high-speed camera and a thermographic camera, and the images were analyzed in detail through image post-processing. As a result, the combustion field grew and the reaction temperature rose as the combustion reaction strengthened with increasing carrier-gas flow rate. In addition, flame flickering was discussed through a comparative analysis of the variation between the visible flame and the IR (infrared) flame field. The flickering frequency of the flame was also confirmed through FFT (Fast Fourier Transform) analysis of the flame area.
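
The dominant flicker frequency can be read off as the strongest non-DC peak in the FFT of the frame-by-frame flame area, as in the sketch below; how the area is segmented from each frame (here simply "pixels above a brightness threshold") is an assumption, not the authors' exact procedure.

```python
import numpy as np

def flicker_frequency(flame_area, frame_rate_hz):
    """Dominant flicker frequency (Hz) from a time series of flame area.

    flame_area: one area value per video frame, e.g. the count of pixels
    above a brightness threshold; frame_rate_hz: camera frame rate.
    """
    area = np.asarray(flame_area, float)
    area = area - area.mean()                    # remove the mean (DC) level
    spectrum = np.abs(np.fft.rfft(area))
    freqs = np.fft.rfftfreq(len(area), d=1.0 / frame_rate_hz)
    return freqs[np.argmax(spectrum[1:]) + 1]    # skip the zero-frequency bin
```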

Development of a non-smoking billboard using an augmented reality function (증강현실기능을 이용한 금연 광고판 개발)

  • Hong, Jeong-Soo;Lee, Jin-Dong;Yun, Yong-Gyu;Yoo, Jeong-Ki
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2016.10a
    • /
    • pp.274-276
    • /
    • 2016
  • Recently, due to the increase in tobacco users, many problems have arisen. Not only smoking in public places but also indoor smoking harms non-smokers. Smoking booths have been installed, but their quality is considerably poor and purification devices are not correctly installed, which harms the people around the booths. In this paper, we introduce an "Augmented Reality Billboard": so that smokers effectively notice the non-smoking warning images and health warning messages, a Kinect camera sensor and augmented reality (AR) functions are used to recognize a person's motion and map it to the corresponding coordinate values.

Observations of the Aurora by Visible All-Sky Camera at Jang Bogo Station, Antarctica

  • Jee, Geonhwa;Ham, Young-Bae;Choi, Yoonseung;Kim, Eunsol;Lee, Changsup;Kwon, Hyuckjin;Trondsen, Trond S.;Kim, Ji Eun;Kim, Jeong-Han
    • Journal of Astronomy and Space Sciences
    • /
    • v.38 no.4
    • /
    • pp.203-215
    • /
    • 2021
  • Auroral observation began at Jang Bogo Station (JBS), Antarctica, in 2018 using a visible all-sky camera (v-ASC) to routinely monitor the aurora in association with simultaneous observations of the ionosphere, thermosphere, and magnetosphere at the station. In this article, the auroral observations are introduced together with the analysis procedure used to recognize the aurora in the v-ASC image data and to compute auroral occurrences, and initial results on their spatial and temporal distributions are presented. The auroral occurrences are mostly confined to the northern horizon in the evening sector, extend toward the zenith from the northwest to cover almost the entire sky disk over JBS at around 08 MLT (magnetic local time; 03 LT), and then retract to the northeast in the morning sector. Near magnetic local noon, the occurrences are distributed horizontally in the northern sky disk, which indicates auroral occurrences in the cusp region. These results indicate that JBS is located most of the time in the polar cap near the poleward boundary of the auroral oval on the nightside and approaches the oval more closely in the morning sector. At around 08 MLT (03 LT), JBS is located within the auroral oval, then moves away from it, and is finally located in the cusp region at magnetic local noon; the location of JBS is therefore well suited to investigating the variability of the poleward boundary of the auroral oval through long-term observations of auroral occurrences. Future plans for ground-based auroral observations near JBS are also presented.
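
The abstract does not spell out the occurrence-detection procedure; one common way to turn all-sky frames into an occurrence statistic is to threshold the background-subtracted brightness within the usable sky disk and count the fraction of frames exceeding it, as in the sketch below, where the brightness threshold and the 1% sky-fraction criterion are entirely illustrative.

```python
import numpy as np

def auroral_occurrence_rate(frames, sky_mask, background, threshold=50.0):
    """Fraction of all-sky frames showing auroral emission.

    frames: array of shape (n_frames, H, W) of calibrated v-ASC images.
    sky_mask: boolean (H, W) mask selecting the usable sky disk.
    background: (H, W) star/airglow background to subtract.
    threshold: brightness excess (counts) treated as aurora; illustrative.
    """
    occurrences = 0
    for frame in frames:
        excess = (frame - background)[sky_mask]
        # Call the frame 'auroral' if a non-trivial part of the sky is bright
        if np.mean(excess > threshold) > 0.01:
            occurrences += 1
    return occurrences / len(frames)
```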