• Title/Summary/Keyword: UAV remote sensing (UAV 원격탐사)

Search results: 118

NDVI Based on UAVs Mapping to Calculate the Damaged Areas of Chemical Accidents (화학물질사고 피해영역 산출을 위한 드론맵핑 기반의 정규식생지수 활용방안 연구)

  • Lim, Eontaek;Jung, Yonghan;Kim, Seongsam
    • Korean Journal of Remote Sensing, v.38 no.6_3, pp.1837-1846, 2022
  • Chemical accidents are increasing every year, and the spread and residue of released substances damage both human life and the environment. Compared with investigations of human casualties, environmental damage investigations make it harder to determine the geographical scope and timing of the damage. Given the shortage of professional investigators, an efficient quantitative evaluation method is urgently needed. To address this, this study conducted a chemical accident investigation using unmanned aerial vehicles (UAVs) equipped with various sensors. The damaged area was calculated from the orthomosaic image, and the strength of agreement was evaluated using the normalized difference vegetation index (NDVI) image; the resulting Cohen's kappa coefficient was 0.649 (against a threshold of 0.7). However, the analysis was performed at the level of individual NDVI pixels, so a chemical accident investigation approach that overcomes this limitation is still needed.
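
As a rough illustration of the two quantities this abstract reports, the sketch below computes NDVI from red and near-infrared reflectance and Cohen's kappa between a predicted damage mask and a reference mask. It is a minimal sketch with synthetic arrays and an assumed NDVI cutoff, not the authors' processing chain.

```python
# Minimal sketch (not the paper's code): NDVI from red/NIR reflectance and
# Cohen's kappa between a predicted damage mask and a reference mask.
# The synthetic arrays and the 0.4 NDVI cutoff are assumptions for illustration.
import numpy as np

def ndvi(red, nir, eps=1e-9):
    """Normalized Difference Vegetation Index: (NIR - Red) / (NIR + Red)."""
    red, nir = red.astype(float), nir.astype(float)
    return (nir - red) / (nir + red + eps)

def cohens_kappa(pred, ref):
    """Cohen's kappa for two binary (0/1) maps of equal shape."""
    pred, ref = pred.ravel().astype(int), ref.ravel().astype(int)
    po = np.mean(pred == ref)                              # observed agreement
    p1, r1 = pred.mean(), ref.mean()                       # marginal proportions
    pe = p1 * r1 + (1 - p1) * (1 - r1)                     # chance agreement
    return (po - pe) / (1 - pe)

# Hypothetical orthomosaic bands and a reference damage mask
rng = np.random.default_rng(0)
red = rng.uniform(0.05, 0.30, (100, 100))
nir = rng.uniform(0.20, 0.60, (100, 100))
reference_damage = rng.integers(0, 2, (100, 100))

damage_pred = (ndvi(red, nir) < 0.4).astype(int)           # low NDVI -> damaged (assumed cutoff)
print("kappa =", round(cohens_kappa(damage_pred, reference_damage), 3))
```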

Development of Chinese Cabbage Detection Algorithm Based on Drone Multi-spectral Image and Computer Vision Techniques (드론 다중분광영상과 컴퓨터 비전 기술을 이용한 배추 객체 탐지 알고리즘 개발)

  • Ryu, Jae-Hyun;Han, Jung-Gon;Ahn, Ho-yong;Na, Sang-Il;Lee, Byungmo;Lee, Kyung-do
    • Korean Journal of Remote Sensing, v.38 no.5_1, pp.535-543, 2022
  • Drones are used in agriculture to diagnose crop growth and provide information through imagery. With high-spatial-resolution drone images, growth information can be produced for each plant object, but this requires accurate object detection and efficient separation of adjacent objects. The purpose of this study is to develop a Chinese cabbage object detection algorithm using drone-observed multispectral reflectance images and computer vision techniques. Drone images were captured between 7 and 15 days after Chinese cabbage planting from 2018 to 2020. The thresholds of the object detection algorithm were set using the 2019 data, and the algorithm was evaluated on images from 2018 and 2019. The vegetation area was first classified using its spectral reflectance characteristics. Morphological techniques such as dilation and erosion, together with image segmentation that considers object size, were then applied to improve object detection accuracy within the vegetation area. The precision of the developed algorithm exceeded 95.19%, and its recall and accuracy exceeded 95.4% and 93.68%, respectively; the F1-score exceeded 0.967 in both evaluation years. The center locations of the Chinese cabbage objects extracted by the algorithm will be used to support decision-making during the crop growing season.
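
A minimal sketch of the kind of pipeline the abstract outlines: classify vegetation by spectral reflectance (here a simple NDVI threshold), clean the mask with erosion and dilation, and filter connected components by an assumed object size. The band arrays, thresholds, and size limits are placeholders, not the study's calibrated values.

```python
# Minimal sketch (not the authors' algorithm): NDVI-based vegetation
# classification followed by morphological cleaning and a size filter on
# connected components. Band arrays, thresholds, and size limits are placeholders.
import numpy as np
from scipy import ndimage

def detect_objects(red, nir, ndvi_thresh=0.5, min_px=30, max_px=500):
    ndvi = (nir - red) / (nir + red + 1e-9)
    veg = ndvi > ndvi_thresh                                # vegetation by spectral reflectance
    veg = ndimage.binary_erosion(veg, iterations=1)         # remove small speckle
    veg = ndimage.binary_dilation(veg, iterations=1)        # restore object extent
    labels, n = ndimage.label(veg)                          # split the mask into objects
    centers = []
    for i in range(1, n + 1):
        obj = labels == i
        if min_px <= obj.sum() <= max_px:                   # keep cabbage-scale objects only
            centers.append(ndimage.center_of_mass(obj))     # (row, col) of the object center
    return centers

rng = np.random.default_rng(1)
red = rng.uniform(0.05, 0.25, (200, 200))
nir = rng.uniform(0.30, 0.70, (200, 200))
print(len(detect_objects(red, nir)), "candidate objects")
```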

A standardized procedure on building spectral library for identifying hazardous chemicals mixed in rivers using UAV-based hyperspectral technique (드론 기반 초분광 영상을 활용한 하천수 혼합 유해화학물질 식별을 위한 분광라이브러리 구축 표준화 방안)

  • Gwon, Yeonghwa;Kim, Dongsu;You, Hojun
    • Proceedings of the Korea Water Resources Association Conference, 2020.06a, pp.161-161, 2020
  • Public concern is growing over river water pollution caused by algal blooms driven by climate change and high summer temperatures, and by chemical accidents such as chemical and oil spills. In particular, hazardous chemicals released in chemical accidents are harmful on human contact, contaminate the air, water, and soil, and cause damage such as discoloration or necrosis of nearby crops, so appropriate measures and responses are required. To prevent harm to public health and the environment from hazardous chemical spills, the Ministry of Environment enacted the Chemicals Control Act and the Act on Registration and Evaluation of Chemical Substances to manage hazardous chemicals and respond to chemical accidents. However, accident monitoring currently relies on field personnel checking dust and odors near factories, or on detection sensors installed only at limited locations where spills are of concern; active detection is therefore difficult in areas without sensors, the spatial distribution of chemicals cannot be mapped, and initial response is limited. Meanwhile, hyperspectral imagery has recently been used to identify land cover, vegetation, and water quality by analyzing the spectral characteristics unique to each material. This suggests that hyperspectral sensors could also detect chemicals, but research on detecting chemicals in rivers with hyperspectral sensors remains scarce. Accordingly, this study acquired hyperspectral images of 18 hazardous chemicals and built a spectral library to determine whether the chemicals can be distinguished from one another using hyperspectral imagery. In addition, a characteristic spectral library was constructed by designating the ranges of spectral bands that show material-specific features, and a standard procedure for this process was presented. Building the spectral library and the characteristic spectral library for the 18 hazardous chemicals according to the proposed procedure confirmed that the chemicals can be identified. If the hazardous chemical spectral library database is expanded in future research and applied to real-time monitoring, it is expected to support rapid detection of and response to chemical accidents.
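
The lookup step implied by a spectral library can be prototyped with a spectral angle comparison, as in the sketch below; the wavelength grid, library entries, and observed spectrum are hypothetical stand-ins, not the 18 chemicals measured in the study.

```python
# Illustrative sketch of a spectral-library lookup by Spectral Angle Mapper;
# the wavelength grid, library entries, and observed spectrum are hypothetical,
# not measured data from the study.
import numpy as np

def spectral_angle(a, b):
    """Angle (radians) between two spectra; smaller means more similar."""
    cos = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.arccos(np.clip(cos, -1.0, 1.0))

wavelengths = np.linspace(400, 1000, 150)                   # nm, assumed sensor range
library = {                                                 # hypothetical library entries
    "chemical_A": np.exp(-((wavelengths - 550) / 60) ** 2),
    "chemical_B": np.exp(-((wavelengths - 750) / 80) ** 2),
}

def identify(pixel_spectrum, library):
    """Return the library entry with the smallest spectral angle."""
    return min(library, key=lambda name: spectral_angle(pixel_spectrum, library[name]))

rng = np.random.default_rng(2)
observed = np.exp(-((wavelengths - 560) / 60) ** 2) + rng.normal(0, 0.02, wavelengths.size)
print(identify(observed, library))
```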


Response of Structural, Biochemical, and Physiological Vegetation Indices Measured from Field-Spectrometer and Multi-Spectral Camera Under Crop Stress Caused by Herbicide (마늘의 제초제 약해에 대한 구조적, 생화학적, 생리적 계열 식생지수 반응: 지상분광계 및 다중분광카메라를 활용하여)

  • Ryu, Jae-Hyun;Moon, Hyun-Dong;Cho, Jaeil;Lee, Kyung-do;Ahn, Ho-yong;So, Kyu-ho;Na, Sang-il
    • Korean Journal of Remote Sensing, v.37 no.6_1, pp.1559-1572, 2021
  • The response of vegetation under crop stress conditions was evaluated using structural, biochemical, and physiological vegetation indices derived from unmanned aerial vehicle (UAV) images and field-spectrometer data. A high concentration of herbicide was sprayed at different growth stages of garlic to induce crop stress, and the above-ground dry matter of garlic in the experimental area (EA) decreased by about 46.2-84.5% compared with the control area. The structural vegetation indices responded clearly to this crop damage, and spectral reflectance in the near-infrared wavelengths consistently decreased in the EA. Most biochemical vegetation indices reflected the crop stress conditions, but the physiological vegetation indices were difficult to interpret because of the effect of vinyl mulching. The difference in the rate of decrease of the vegetation indices after the herbicide spray was 2.3% on average for the structural vegetation indices and 1.3-4.1% for the normalization-based vegetation indices. These results indicate that appropriate vegetation indices should be selected according to the type of crop stress and the cultivation environment, and that normalization-based vegetation indices minimize the difference between measurements made at different spatial scales.
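
As a small illustration of the three index families the abstract contrasts, the sketch below derives one structural (NDVI), one biochemical (red-edge chlorophyll index), and one physiological (PRI) index from a synthetic field-spectrometer reflectance curve; the band centers and reflectance curve are assumptions, not the study's data.

```python
# Minimal sketch (not the study's processing chain): one index from each
# family computed from a synthetic field-spectrometer reflectance curve.
# Band centers are nominal and the reflectance curve is simulated.
import numpy as np

wl = np.arange(350, 1001)                                   # 1 nm wavelength grid, assumed
rng = np.random.default_rng(3)
refl = np.clip(0.05 + 0.4 / (1 + np.exp(-(wl - 720) / 15))  # red-edge-like curve
               + rng.normal(0, 0.005, wl.size), 0, 1)

def band(center, width=10):
    """Mean reflectance in a window of +/- width/2 nm around a band center."""
    sel = (wl >= center - width / 2) & (wl <= center + width / 2)
    return refl[sel].mean()

ndvi = (band(800) - band(670)) / (band(800) + band(670))    # structural
ci_re = band(800) / band(720) - 1                           # biochemical (red-edge chlorophyll)
pri = (band(531) - band(570)) / (band(531) + band(570))     # physiological
print(f"NDVI={ndvi:.3f}, CIre={ci_re:.3f}, PRI={pri:.3f}")
```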

Multi-resolution SAR Image-based Agricultural Reservoir Monitoring (농업용 저수지 모니터링을 위한 다해상도 SAR 영상의 활용)

  • Lee, Seulchan;Jeong, Jaehwan;Oh, Seungcheol;Jeong, Hagyu;Choi, Minha
    • Korean Journal of Remote Sensing, v.38 no.5_1, pp.497-510, 2022
  • Agricultural reservoirs are essential structures for water supply during dry periods on the Korean peninsula, where water resources are unevenly distributed in time. Efficient water management therefore requires systematic and effective monitoring of small and medium-sized reservoirs. Synthetic Aperture Radar (SAR), with its all-weather observation capability, provides a means of continuous monitoring. This study evaluates the applicability of SAR to monitoring such reservoirs using Sentinel-1 (10 m resolution) and Capella X-SAR (1 m resolution) imagery of the Chari (CR), Galjeon (GJ), and Dwitgol (DG) reservoirs in Ulsan, Korea. Water detection results obtained with a Z fuzzy function-based threshold (Z-thresh) and with Chan-Vese (CV), an object-based segmentation algorithm, were quantitatively evaluated against the UAV-detected water boundary (UWB). Accuracy metrics for Z-thresh were 0.87, 0.89, and 0.77 (at CR, GJ, and DG, respectively) using Sentinel-1 and 0.78, 0.72, and 0.81 using Capella, and improvements were observed when CV was applied (Sentinel-1: 0.94, 0.89, 0.84; Capella: 0.92, 0.89, 0.93). Waterbody boundaries detected from Capella agreed relatively well with the UWB; however, its high resolution also led to false detections and missed detections caused by speckle noise. Masking with supplementary optical-sensor images improved accuracy by up to 13%. With further development of accurate and precise SAR-based water detection techniques, continuous monitoring of available water quantity is expected to enable more effective water resource management.
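
The Z fuzzy thresholding step can be prototyped as below: a Z-shaped membership function scores backscatter (in dB) for water-likeness, the score is cut at 0.5 to form a water mask, and the mask is compared with a reference. The membership bounds, synthetic scene, and stand-in reference are all assumptions, not the paper's implementation; the Chan-Vese step could likewise be prototyped with an off-the-shelf routine such as scikit-image's chan_vese.

```python
# Illustrative sketch of the Z fuzzy thresholding idea, not the paper's
# implementation: score backscatter (dB) with a Z-shaped membership, cut at
# 0.5 for a water mask, and compare with a reference mask. The membership
# bounds, synthetic scene, and stand-in reference are assumptions.
import numpy as np

def z_membership(x, a, b):
    """Z-shaped fuzzy function: 1 below a, 0 above b, smooth in between."""
    x = np.asarray(x, dtype=float)
    mid = (a + b) / 2
    return np.where(x <= a, 1.0,
           np.where(x <= mid, 1 - 2 * ((x - a) / (b - a)) ** 2,
           np.where(x <= b, 2 * ((x - b) / (b - a)) ** 2, 0.0)))

rng = np.random.default_rng(4)
sigma0_db = rng.normal(-14, 4, (300, 300))                  # hypothetical SAR backscatter scene
water_mask = z_membership(sigma0_db, a=-22, b=-16) >= 0.5   # dark pixels -> water

uav_reference = sigma0_db < -18                             # stand-in for the UAV water boundary
print(f"overall accuracy vs. reference: {np.mean(water_mask == uav_reference):.2f}")
```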

Review of applicability of Turbidity-SS relationship in hyperspectral imaging-based turbid water monitoring (초분광영상 기반 탁수 모니터링에서의 탁도-SS 관계식 적용성 검토)

  • Kim, Jongmin;Kim, Gwang Soo;Kwon, Siyoon;Kim, Young Do
    • Journal of Korea Water Resources Association, v.56 no.12, pp.919-928, 2023
  • Rainfall in Korea is concentrated in the summer flood season. In particular, as concentrated rainfall increases under abnormal rainfall and weather conditions, large volumes of turbid water flow into dams and, combined with reservoir overturn, lead to prolonged turbid water events. Much research on turbid water prediction is being conducted to address these problems. Predicting turbid water requires turbidity data from the upstream inflow, but the spatial and temporal resolution of current data is insufficient. Improving temporal resolution requires developing a Turbidity-SS conversion equation, and improving spatial resolution requires a multi-parameter water quality instrument (YSI), Laser In-Situ Scattering and Transmissometry (LISST), and hyperspectral sensors. Sensor-based measurement can improve the spatial resolution of turbid water observations by providing line- and surface-scale data. In addition, the LISST-200X can collect particle-size data, so it can be used to derive Turbidity-SS conversion equations by particle fraction (clay : silt : sand). Among recent remote sensing methods, the spatial distribution of turbid water can be mapped by combining UAVs, which offer higher spatial and temporal resolution than other platforms, with hyperspectral sensors of high spectral and radiometric resolution. Therefore, in this study, Turbidity-SS conversion equations were derived by particle fraction through laboratory analysis using the LISST-200X and YSI-EXO, and sensor-based field measurements were made with a UAV (Matrice 600) carrying a hyperspectral sensor (microHSI 410 SHARK). From these data, the spatial distributions of turbidity and suspended sediment concentration were presented, along with the turbidity calculated from the measured suspended sediment concentration using the Turbidity-SS conversion equation. In this way, we reviewed the applicability of the Turbidity-SS conversion equation and assessed the current status of turbid water occurrence.
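
A Turbidity-SS conversion equation of the sort described here is often fitted as a power law; the sketch below fits SS = a · Turbidity^b by least squares in log space on synthetic paired measurements. The functional form and the numbers are assumptions, not the study's calibration.

```python
# Assumed sketch, not the study's calibration: fit a power-law Turbidity-SS
# conversion equation, SS = a * Turbidity^b, by least squares in log space
# on synthetic paired measurements.
import numpy as np

rng = np.random.default_rng(5)
turbidity_ntu = np.linspace(5, 300, 40)                     # paired samples (synthetic)
ss_mg_l = 1.4 * turbidity_ntu ** 0.95 * np.exp(rng.normal(0, 0.05, turbidity_ntu.size))

# log(SS) = log(a) + b * log(Turbidity); np.polyfit returns [slope, intercept]
b, log_a = np.polyfit(np.log(turbidity_ntu), np.log(ss_mg_l), 1)
a = np.exp(log_a)
print(f"SS = {a:.2f} * Turbidity^{b:.2f}")

def turbidity_to_ss(ntu):
    """Apply the fitted conversion to new turbidity observations."""
    return a * np.asarray(ntu, dtype=float) ** b

print(turbidity_to_ss([10, 50, 150]))                       # mg/L estimates
```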

Utilization of Weather, Satellite and Drone Data to Detect Rice Blast Disease and Track its Propagation (벼 도열병 발생 탐지 및 확산 모니터링을 위한 기상자료, 위성영상, 드론영상의 공동 활용)

  • Jae-Hyun Ryu;Hoyong Ahn;Kyung-Do Lee
    • Korean Journal of Agricultural and Forest Meteorology, v.25 no.4, pp.245-257, 2023
  • Rice, the representative crop of the Republic of Korea, is cultivated over extensive areas every year, which reduces its resistance to pests and diseases. Rice blast, one of the major rice diseases, can cause significant yield losses when it occurs on a large scale, so early detection and effective control are essential. Drone-based crop monitoring is valuable for detecting abnormal growth, but capturing images frequently to catch potential rice blast occurrences consumes considerable labor and resources. The purpose of this study is to detect rice blast disease early using remote sensing data, such as drone and satellite images, together with weather data. Satellite images were helpful in identifying rice cultivation fields, and paddy fields were detected effectively using vegetation and water indices. Air temperature, relative humidity, and the number of rainy days were then used to calculate the risk of rice blast occurrence; when the calculated risk increases, disease development is more likely, and drone measurements are performed at that time. Spectral reflectance changes in the red and near-infrared wavelength regions were observed at the locations where rice blast disease occurred. Clusters of low vegetation index values appeared at these locations, and time-series drone images made it possible to track the spread of the disease from these points. Finally, drone images captured before harvest were used to generate spatial information on the incidence of rice blast disease in each field.
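
Purely as an illustration of combining the three weather inputs named in the abstract into a single risk score, the sketch below blends temperature, relative humidity, and recent rainy-day counts; the weights, optimum temperature, and cutoffs are placeholders, not the risk model used in the study.

```python
# Purely illustrative sketch: combine air temperature, relative humidity, and
# recent rainy days into a 0-1 risk score. The weights, optimum temperature,
# and cutoffs are placeholders, not the risk model used in the study.
import numpy as np

def blast_risk(temp_c, rh_pct, rainy_days_last7):
    """Higher score means weather more favorable to rice blast infection."""
    temp_score = np.exp(-((temp_c - 25.0) / 4.0) ** 2)      # assumed optimum around 25 C
    rh_score = np.clip((rh_pct - 80.0) / 20.0, 0.0, 1.0)    # sustained high humidity
    rain_score = np.clip(rainy_days_last7 / 5.0, 0.0, 1.0)  # frequent leaf wetness
    return float(np.mean([temp_score, rh_score, rain_score]))

print(round(blast_risk(24.5, 93.0, 4), 2))                  # warm, wet week -> elevated risk
```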

Derivation of Green Coverage Ratio Based on Deep Learning Using MAV and UAV Aerial Images (유·무인 항공영상을 이용한 심층학습 기반 녹피율 산정)

  • Han, Seungyeon;Lee, Impyeong
    • Korean Journal of Remote Sensing, v.37 no.6_1, pp.1757-1766, 2021
  • The green coverage ratio is the ratio of the green coverage area to the land area, and it is used as a practical urban greening index. It is normally calculated from the land cover map, but the low spatial resolution and inconsistent production cycle of land cover maps make it difficult to calculate the green coverage area correctly and to analyze green coverage precisely. Therefore, this study proposes a new method for calculating the green coverage area using aerial images and deep neural networks. Aerial images enable precise analysis thanks to their high resolution and relatively regular acquisition cycles, and deep learning can automatically detect the green coverage area in them. Local governments acquire manned aerial images for various purposes every year, and these can be used to calculate the green coverage ratio quickly; however, precise analysis may be difficult because details such as acquisition date, resolution, and sensor cannot be selected or modified. This limitation can be supplemented with unmanned aerial vehicles, which can carry various sensors and acquire high-resolution images through low-altitude flight. Accordingly, the green coverage ratio was calculated from both types of aerial images and the proposed method was verified experimentally; the ratio could be calculated with high accuracy for all green cover types. However, the green coverage ratio calculated from manned aerial images had limitations in complex environments. The unmanned aerial images used to compensate for this yielded a highly accurate green coverage ratio even in complex environments, and additional band images enabled more precise detection of green areas. In the future, the green coverage ratio is expected to be calculated effectively by using newly acquired unmanned aerial imagery to supplement existing manned aerial imagery.
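
Once a segmentation model has produced a green/non-green mask, the ratio itself is a simple area computation; the sketch below assumes a hypothetical mask and ground sampling distance rather than the study's deep-learning output.

```python
# Minimal sketch, assuming a segmentation model has already produced a
# per-pixel green/non-green mask: the green coverage ratio is the green area
# divided by the land area. The mask and ground sampling distance are hypothetical.
import numpy as np

def green_coverage_ratio(green_mask, land_mask, gsd_m=0.25):
    """Green area / land area; gsd_m is the pixel size in meters."""
    green_area_m2 = np.count_nonzero(green_mask & land_mask) * gsd_m ** 2
    land_area_m2 = np.count_nonzero(land_mask) * gsd_m ** 2
    return green_area_m2 / land_area_m2

rng = np.random.default_rng(6)
land = np.ones((500, 500), dtype=bool)                      # whole tile treated as land (assumed)
green = rng.random((500, 500)) > 0.6                        # stand-in for a deep-learning mask
print(f"green coverage ratio: {green_coverage_ratio(green, land):.2%}")
```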