• Title/Summary/Keyword: 유출 성능 (runoff/leakage performance)


Water temperature prediction of Daecheong Reservoir by a process-guided deep learning model (역학적 모델과 딥러닝 모델을 융합한 대청호 수온 예측)

  • Kim, Sung Jin;Park, Hyungseok;Lee, Gun Ho;Chung, Se Woong
    • Proceedings of the Korea Water Resources Association Conference
    • /
    • 2021.06a
    • /
    • pp.88-88
    • /
    • 2021
  • The use of data-driven machine learning and deep learning models in water resources and water quality management has grown rapidly in recent years. However, because deep learning models are black boxes that learn the patterns and relationships inherent in the data without regard for the classical conservation laws of mass, momentum, and energy, they can produce predictions that violate physical laws. In addition, since their predictive performance depends strongly on the amount of training data and the choice of variables, a model trained without high-quality data can suffer from large bias and variance, making accurate prediction difficult. To compensate for these shortcomings of data-driven modeling, studies that couple process-based numerical models with deep learning models to exploit the strengths of both approaches have recently been pursued actively (Read et al., 2019). Process-Guided Deep Learning (PGDL) trains a deep learning model so that it respects physical laws and is used as an alternative that addresses the lack of physical consistency in purely data-driven models. A PGDL model introduces additional variables through which the physical laws can be evaluated and, during parameter optimization, adds a penalty to the cost function whenever a physical law is violated, so that the trained model satisfies the physical conservation laws. The purpose of this study is to develop a PGDL model that combines a process-based model with a deep learning model to predict water temperature by depth in Daecheong Reservoir and to evaluate its applicability. The process-based model was CE-QUAL-W2, a two-dimensional laterally averaged hydrodynamic and water quality model, which simulated the water temperature and energy budget of Daecheong Reservoir for two years (2017-2018). Meteorological data (air temperature, dew point temperature, wind direction, wind speed, cloud cover), hydrological data (reservoir water level, inflow and outflow), and water temperature data were collected to build and calibrate the CE-QUAL-W2 model, which adequately reproduced the water level variation and the depth-wise time series of water temperature. For the same period, an LSTM (Long Short-Term Memory) recurrent neural network was developed to predict water temperature by depth; the dependent variable was high-frequency depth-wise water temperature collected with a thermistor chain, and the independent variables were air temperature, wind speed, relative humidity, precipitation, shortwave radiation, and longwave radiation. The LSTM parameters were optimized by supervised learning so as to minimize the RMSE between predicted and observed values. The PGDL model was built with the same period and input data as the LSTM model and was trained with a penalty added to the cost function whenever the energy budget obtained from the process-based model was not satisfied, so that the physical conservation law was honored; the depth-wise water temperature predictions were then compared and analyzed.
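
As a rough, framework-agnostic sketch of the penalty-based training idea described above (not the authors' implementation): the function name, the form of the penalty, and the parameters `lam` and `tol` are illustrative assumptions, and `energy_residual` stands in for the mismatch against the CE-QUAL-W2 energy budget.

```python
import numpy as np

def pgdl_loss(t_pred, t_obs, energy_residual, lam=1.0, tol=0.0):
    """Supervised RMSE plus a penalty on violations of the lake energy budget.

    t_pred, t_obs   : predicted / observed depth-wise water temperatures
    energy_residual : imbalance between the heat-content change implied by t_pred
                      and the surface fluxes supplied by the process-based model
    lam, tol        : penalty weight and tolerance (illustrative values)
    """
    rmse = np.sqrt(np.mean((t_pred - t_obs) ** 2))
    # penalize only the part of the energy imbalance that exceeds the tolerance
    penalty = np.mean(np.maximum(np.abs(energy_residual) - tol, 0.0))
    return rmse + lam * penalty
```

In an actual training loop this term would be evaluated with the deep learning framework's tensors so that the penalty contributes to the gradients.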


Review of Thermodynamic Sorption Model for Radionuclides on Bentonite Clay (벤토나이트와 방사성 핵종의 열역학적 수착 모델 연구)

  • Jeonghwan Hwang;Jung-Woo Kim;Weon Shik Han;Won Woo Yoon;Jiyong Lee;Seonggyu Choi
    • Economic and Environmental Geology
    • /
    • v.56 no.5
    • /
    • pp.515-532
    • /
    • 2023
  • Bentonite, which predominantly consists of expandable clay minerals, is considered a suitable buffer material for a high-level radioactive waste disposal repository because of its large swelling capacity and low permeability. In addition, bentonite has a large cation exchange capacity and specific surface area and thus effectively retards the transport of leaked radionuclides to the surrounding environment. This study reviews thermodynamic sorption models for four radionuclides (U, Am, Se, and Eu) and eight bentonites. The thermodynamic sorption models and optimized sorption parameters were then analyzed in detail with respect to the experimental conditions used in previous studies. The optimized sorption parameters showed that the thermodynamic sorption models depend on experimental conditions such as the type and concentration of the radionuclide, ionic strength, major competing cations, temperature, solid-to-liquid ratio, carbonate species, and the mineralogical properties of the bentonite. These results imply that thermodynamic sorption models optimized under specific experimental conditions carry large uncertainty when applied to other environmental conditions.
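
The models reviewed above are full thermodynamic (surface-complexation and ion-exchange) formulations, which are well beyond a short example. As a much simpler stand-in illustrating how sorption parameters are typically optimized against batch experiments, the sketch below fits an empirical Langmuir isotherm; the data values and starting guesses are purely hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical batch-sorption data: equilibrium aqueous concentration (mol/L)
# and sorbed concentration (mol/kg). Values are illustrative only.
c_eq = np.array([1e-7, 5e-7, 1e-6, 5e-6, 1e-5])
q    = np.array([2.1e-4, 9.8e-4, 1.8e-3, 7.5e-3, 1.3e-2])

def langmuir(c, q_max, k):
    """Langmuir isotherm: q = q_max * K * c / (1 + K * c)."""
    return q_max * k * c / (1.0 + k * c)

(q_max_fit, k_fit), _ = curve_fit(langmuir, c_eq, q, p0=[1e-2, 1e5])
print(f"fitted q_max = {q_max_fit:.3e} mol/kg, K = {k_fit:.3e} L/mol")
```

A real surface-complexation fit would additionally account for ionic strength, pH, competing cations, and carbonate speciation, which is exactly why the review finds the optimized parameters to be condition-dependent.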

Time-series Change Analysis of Quarry using UAV and Aerial LiDAR (UAV와 LiDAR를 활용한 토석채취지의 시계열 변화 분석)

  • Dong-Hwan Park;Woo-Dam Sim
    • Journal of the Korean Association of Geographic Information Studies
    • /
    • v.27 no.2
    • /
    • pp.34-44
    • /
    • 2024
  • Recently, abnormal weather caused by climate change has led to a rapid increase in natural disasters such as floods, landslides, and soil outflows. In Korea, more than 63% of the land is vulnerable to slope disasters because of its mountainous terrain, and quarries, where soil and rock are excavated, carry a high risk of landslides both inside and outside the worksite. Accordingly, this study built DEMs from UAV imagery and aerial LiDAR for monitoring a quarry, carried out a time-series change analysis, and proposed an optimal DEM construction method for monitoring the site. For DEM construction, UAV- and LiDAR-based point clouds were generated, and the ground was extracted with three algorithms: Aggressive Classification (AC), Conservative Classification (CC), and Standard Classification (SC). The accuracy of the UAV- and LiDAR-based DEMs produced with each algorithm was evaluated by comparison with a digital-map-based DEM.
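
A minimal sketch of the kind of DEM-to-DEM accuracy comparison described above, assuming the test and reference DEMs have already been resampled to the same grid; the function name, nodata value, and sample grids are illustrative, not taken from the paper.

```python
import numpy as np

def dem_accuracy(dem_test, dem_ref, nodata=-9999.0):
    """Compare a UAV/LiDAR-derived DEM against a reference DEM on the same grid
    and report mean bias and RMSE of the elevation differences (metres)."""
    mask = (dem_test != nodata) & (dem_ref != nodata)
    diff = dem_test[mask] - dem_ref[mask]
    return diff.mean(), np.sqrt(np.mean(diff ** 2))

# Illustrative 3x3 grids; real grids would be read from raster files.
ref  = np.array([[101.2, 101.5, 101.9],
                 [102.0, 102.3, 102.8],
                 [103.1, 103.4, 103.9]])
test = ref + np.random.normal(0.0, 0.15, ref.shape)

bias, rmse = dem_accuracy(test, ref)
print(f"bias = {bias:+.3f} m, RMSE = {rmse:.3f} m")
```

Differencing DEMs from successive epochs in the same way yields the cut-and-fill volumes used for time-series change analysis of the quarry.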

A Study on the Applicability of Torrefied Wood Flour Natural Material Based Coagulant to Removal of Dissolved Organic Matter and Turbidity (용존성 유기물질 및 탁도 제거를 위한 반탄화목분 천연재료 혼합응집제의 적용성에 관한 연구)

  • PARK, Hae Keum;KANG, Seog Goo
    • Journal of the Korean Wood Science and Technology
    • /
    • v.48 no.4
    • /
    • pp.472-487
    • /
    • 2020
  • With the emergence of abnormal climate caused by rapid industrialization, the importance and cost of water quality management are increasing every year. In Korea, to manage total phosphorus (T-P) and total nitrogen (T-N), the major water pollutants, coagulants are injected at sewage treatment plants to remove organic matter. However, when PAC (poly aluminium chloride) is dosed in excess, secondary pollution can occur, so studies on natural-material-based coagulants are under way in Korea. This study evaluated the applicability of a mixed coagulant by analyzing T-P, T-N, and turbidity in primary settling pond effluent in order to derive the optimum mixing ratio of PAC and torrefied wood flour. With 10% PAC and 1% torrefied wood flour, the maximum removal efficiency was 92% for T-P and about 22% for T-N; because T-N includes many positively charged compounds carrying the same charge as the mixed coagulant, it was not removed effectively. Turbidity removal was about 91%, so 1% torrefied wood flour was determined to be the optimum addition. When the PAC concentration was reduced to 7%, the maximum removal efficiencies were 91% for T-P, 32% for T-N, and 90% for turbidity. In a coagulation performance comparison with 10% PAC, the mixed coagulant containing 1% torrefied wood flour achieved performance similar to commercial coagulants while reducing the PAC dose by 30-50%, so the reduced PAC dosage yields an economic benefit and is expected to allow stable water treatment with less secondary pollution.
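
For reference, the removal efficiencies quoted above are computed from influent and effluent concentrations in the usual way. The sketch below shows the calculation; the concentration pairs are illustrative values chosen only so that the resulting percentages resemble the reported figures, not the paper's raw measurements.

```python
def removal_efficiency(c_influent, c_effluent):
    """Percent removal of a water quality parameter across the coagulation step."""
    return 100.0 * (c_influent - c_effluent) / c_influent

# Illustrative influent/effluent pairs (mg/L for T-P and T-N, NTU for turbidity).
for name, c_in, c_out in [("T-P", 2.50, 0.20), ("T-N", 18.0, 14.0), ("turbidity", 45.0, 4.1)]:
    print(f"{name}: {removal_efficiency(c_in, c_out):.0f}% removed")
```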

Evaluation of bias and uncertainty in snow depth reanalysis data over South Korea (한반도 적설심 재분석자료의 오차 및 불확실성 평가)

  • Jeon, Hyunho;Lee, Seulchan;Lee, Yangwon;Kim, Jinsoo;Choi, Minha
    • Journal of Korea Water Resources Association
    • /
    • v.56 no.9
    • /
    • pp.543-551
    • /
    • 2023
  • Snow is an essential climate factor that affects the climate system and the surface energy balance, and it plays a crucial role in the water balance by storing solid water during winter for spring runoff and groundwater recharge. In this study, snow depth data from the Local Data Assimilation and Prediction System (LDAPS), the Modern-Era Retrospective analysis for Research and Applications, Version 2 (MERRA-2), and ERA5-Land were statistically evaluated for applicability in South Korea. A statistical comparison with Automated Synoptic Observing System (ASOS) ground observations provided by the Korea Meteorological Administration (KMA) showed that LDAPS and ERA5-Land were highly correlated with the observations, with correlation coefficients above 0.69, but LDAPS showed a large error, with an RMSE of 0.79 m. MERRA-2 had a lower correlation coefficient of 0.17 because it reported a constant value for some periods and therefore did not adequately capture the increases and decreases in snow depth. The comparison between LDAPS and ASOS showed relatively high performance near Gangwon Province, where average snowfall is high, and low performance in the southern region, where average snowfall is low. Finally, the error variances of the four independent snow depth datasets used in this study were estimated through triple collocation (TC), and a merged snow depth dataset was produced using weighting factors. Among the reanalysis data, the error variance was largest for LDAPS, followed by MERRA-2 and ERA5-Land, so LDAPS received a lower weight. In addition, because the spatial distribution of ERA5-Land snow depth shows little variability, the TC-merged snow depth showed a spatial distribution similar to that of MERRA-2, which has a low spatial resolution. Considering the correlation, error, and uncertainty of the data, ERA5-Land is the most suitable for snow-related analysis in South Korea. LDAPS, which correlates well with the other data but tends to be overestimated, could be actively used for high-resolution representation of regional and climatic diversity if appropriate corrections are applied.
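
A minimal sketch of classical covariance-based triple collocation and inverse-variance merging as described above. It assumes three collocated datasets with mutually independent errors that are linearly related to the truth; TC operates on triplets, so the way the study combined its four datasets is not reproduced here, and the weighting scheme shown is simply the standard least-squares choice.

```python
import numpy as np

def triple_collocation(x, y, z):
    """Estimate the error variance of each of three collocated time series
    using the covariance-notation TC estimator."""
    c = np.cov(np.vstack([x, y, z]))  # 3x3 covariance matrix
    var_x = c[0, 0] - c[0, 1] * c[0, 2] / c[1, 2]
    var_y = c[1, 1] - c[0, 1] * c[1, 2] / c[0, 2]
    var_z = c[2, 2] - c[0, 2] * c[1, 2] / c[0, 1]
    return var_x, var_y, var_z

def merge_by_inverse_variance(series, error_vars):
    """Merge collocated series, weighting each by the inverse of its TC error variance."""
    w = 1.0 / np.asarray(error_vars)
    w = w / w.sum()
    return np.sum(w[:, None] * np.asarray(series), axis=0)

# x, y, z: collocated snow depth series (e.g., reanalysis values sampled at ASOS sites)
```

Datasets with large TC error variance (LDAPS in the study) automatically receive small weights in the merged product.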

Design and Implementation of a Web Application Firewall with Multi-layered Web Filter (다중 계층 웹 필터를 사용하는 웹 애플리케이션 방화벽의 설계 및 구현)

  • Jang, Sung-Min;Won, Yoo-Hun
    • Journal of the Korea Society of Computer and Information
    • /
    • v.14 no.12
    • /
    • pp.157-167
    • /
    • 2009
  • Recently, leaks of confidential and personal information have been occurring on the Internet more frequently than ever before. Most of these security incidents are caused by attacks on vulnerabilities in carelessly developed web applications. Existing firewalls and intrusion detection systems cannot detect attacks on web applications, and signature-based detection has a limited ability to catch new threats. Therefore, much research on detecting attacks against web applications employs anomaly-based detection using web traffic analysis. Research on anomaly detection from normal web traffic focuses on three problems: how to analyze the given traffic accurately, the system performance needed to inspect application payloads in packets so that application-layer attacks can be detected, and the maintenance cost of the many newly installed network security devices. The UTM (Unified Threat Management) system was proposed to solve all of these security problems at once but is not widely used because of its low efficiency and high cost, and the web filter that implements one of the UTM functions cannot adequately detect the variety of recent sophisticated attacks on web applications. To address these problems, web application firewalls have been studied as a new class of network security system. Because such studies focus on speeding up packet processing with expensive dedicated hardware, the cost of deploying a web application firewall is high, and current anomaly-based detection technologies that ignore the characteristics of the web application produce many false positives and false negatives. To reduce false positives and false negatives, this study proposes a real-time anomaly detection method based on analyzing the length of the parameter values contained in web clients' requests. It also designs a WAF (Web Application Firewall) that can be applied to low-priced or legacy systems and process application data without dedicated hardware, and it suggests a way to resolve the sluggish performance caused by copying packets into the application area for processing. Consequently, this study shows how to deploy an effective web application firewall at low cost at a time when adding yet another security system is considered burdensome because of the many network security systems already in use.
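
To make the parameter-value-length idea concrete, here is a toy sketch (not the authors' detector): it profiles the length of each query-string parameter value seen in normal traffic and flags requests whose values deviate strongly from that profile. The class name, the threshold rule, and the example URLs are all illustrative assumptions.

```python
from urllib.parse import urlparse, parse_qsl
from collections import defaultdict
import math

class ParamLengthProfile:
    """Length-based anomaly detection for web request parameters."""

    def __init__(self, k=3.0):
        self.k = k
        # per-parameter running statistics: count, mean, M2 (Welford's algorithm)
        self.stats = defaultdict(lambda: [0, 0.0, 0.0])

    def train(self, url):
        """Update the per-parameter length statistics from a normal request."""
        for name, value in parse_qsl(urlparse(url).query):
            n, mean, m2 = self.stats[name]
            n += 1
            d = len(value) - mean
            mean += d / n
            m2 += d * (len(value) - mean)
            self.stats[name] = [n, mean, m2]

    def is_anomalous(self, url):
        """Flag the request if any parameter value length is a strong outlier."""
        for name, value in parse_qsl(urlparse(url).query):
            n, mean, m2 = self.stats.get(name, [0, 0.0, 0.0])
            if n < 2:
                continue  # unseen parameter: a policy decision, ignored here
            std = math.sqrt(m2 / (n - 1))
            if abs(len(value) - mean) > self.k * max(std, 1.0):
                return True
        return False

waf = ParamLengthProfile()
for u in ["/item?id=42&q=book", "/item?id=7&q=pen"]:
    waf.train(u)
print(waf.is_anomalous("/item?id=42&q=" + "A" * 500))  # True: far longer than the profile
```

A production WAF would of course combine this with other features and handle POST bodies, headers, and encodings; the sketch only illustrates the length heuristic the abstract refers to.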

Evaluation of Application Possibility for Floating Marine Pollutants Detection Using Image Enhancement Techniques: A Case Study for Thin Oil Film on the Sea Surface (영상 강화 기법을 통한 부유성 해양오염물질 탐지 기술 적용 가능성 평가: 해수면의 얇은 유막을 대상으로)

  • Soyeong Jang;Yeongbin Park;Jaeyeop Kwon;Sangheon Lee;Tae-Ho Kim
    • Korean Journal of Remote Sensing
    • /
    • v.39 no.6_1
    • /
    • pp.1353-1369
    • /
    • 2023
  • In the event of a disaster at sea, the scale of damage varies with weather effects such as wind, currents, and waves, so damage must be minimized by establishing an appropriate response plan through quick on-site identification. In particular, pollutants that exist as a thin film on the sea surface are difficult to identify because of their relatively low viscosity and surface tension compared with other pollutants discharged into the sea. Therefore, this study developed an algorithm that detects floating pollutants on the sea surface in RGB images from imaging equipment that can easily be used in the field, and evaluated its performance with input data acquired in real waters. The algorithm uses image enhancement techniques to improve the contrast between the intensity of pollutants and that of the general sea surface; a background threshold is then found through histogram analysis, floating matter other than pollutants is removed, and finally the pollutants are classified. To evaluate the developed algorithm, a sea trial using substitute materials was performed. Most of the floating marine pollutants were detected, although false detections occurred in areas with strong waves. Nevertheless, the detection results were about three times better than those of the existing single-threshold method. The results of this R&D are expected to be useful for on-site response activities by detecting floating marine pollutants that were previously difficult to identify with the naked eye.
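
A compact sketch of the enhance-then-threshold idea described above, assuming a grayscale image derived from the RGB frame; the percentile stretch, the modal-bin background estimate, and the 0.15 offset are illustrative choices, not the paper's algorithm, and the sign of the offset would flip if the film appears brighter than the background rather than darker.

```python
import numpy as np

def detect_slick(gray, low_pct=2, high_pct=98, offset=0.15):
    """Stretch contrast, estimate the background sea-surface intensity from the
    histogram mode, and flag pixels darker than the background by `offset`."""
    lo, hi = np.percentile(gray, [low_pct, high_pct])
    stretched = np.clip((gray - lo) / max(hi - lo, 1e-6), 0.0, 1.0)
    hist, edges = np.histogram(stretched, bins=64)
    background = edges[np.argmax(hist)]      # modal sea-surface intensity
    return stretched < (background - offset)  # boolean pollutant mask

# usage: mask = detect_slick(rgb_image.mean(axis=2) / 255.0)
```

Small connected components in the resulting mask would then be removed to discard floating matter other than the pollutant, as the abstract describes.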

A Comparative Study of Subset Construction Methods in OSEM Algorithms using Simulated Projection Data of Compton Camera (모사된 컴프턴 카메라 투사데이터의 재구성을 위한 OSEM 알고리즘의 부분집합 구성법 비교 연구)

  • Kim, Soo-Mee;Lee, Jae-Sung;Lee, Mi-No;Lee, Ju-Hahn;Kim, Joong-Hyun;Kim, Chan-Hyeong;Lee, Chun-Sik;Lee, Dong-Soo;Lee, Soo-Jin
    • Nuclear Medicine and Molecular Imaging
    • /
    • v.41 no.3
    • /
    • pp.234-240
    • /
    • 2007
  • Purpose: In this study, we propose a block-iterative method for reconstructing Compton-scattered data. We show that the well-known expectation maximization (EM) approach, along with its accelerated version based on the ordered-subsets principle, can be applied to image reconstruction for a Compton camera, and we compare several methods of constructing subsets for optimal performance of the algorithms. Materials and Methods: Three reconstruction algorithms were implemented: simple backprojection (SBP), EM, and ordered-subset EM (OSEM). For OSEM, the projection data were grouped into subsets in a predefined order. Three schemes for choosing non-overlapping subsets were considered: scatter-angle-based subsets, detector-position-based subsets, and subsets based on both scatter angle and detector position. EM and OSEM with 16 subsets were run for 64 and 4 iterations, respectively. Each algorithm was evaluated in terms of computation time and normalized mean-squared error. Results: Both EM and OSEM clearly outperformed SBP in accuracy. OSEM with 16 subsets and 4 iterations, which is equivalent in the number of data passes to standard EM with 64 iterations, was approximately 14 times faster than standard EM. For OSEM, all three subset-selection schemes yielded similar computation times and normalized mean-squared errors. Conclusion: Our results show that the OSEM algorithm, which has proven useful in emission tomography, can also be applied to image reconstruction for a Compton camera. With properly chosen subset construction methods and a moderate number of subsets, OSEM significantly improves computational efficiency while preserving the quality of the standard EM reconstruction. OSEM with subsets based on both scatter angle and detector position is the most suitable choice.
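
A generic OSEM sketch for a linear forward model y ≈ A x with nonnegative x, to make the subset update explicit; the interleaved-row subset ordering shown here is only one possibility and does not reproduce the Compton camera system model (conical projections) or the angle/position-based groupings compared in the paper.

```python
import numpy as np

def osem(y, A, n_subsets=16, n_iter=4, eps=1e-12):
    """Ordered-subset EM: one multiplicative MLEM update per subset per iteration."""
    n_proj, n_vox = A.shape
    x = np.ones(n_vox)                                   # uniform initial image
    subsets = [np.arange(s, n_proj, n_subsets) for s in range(n_subsets)]
    for _ in range(n_iter):
        for idx in subsets:
            A_s, y_s = A[idx], y[idx]
            ratio = y_s / (A_s @ x + eps)                # measured / estimated projections
            x *= (A_s.T @ ratio) / (A_s.T @ np.ones(len(idx)) + eps)
    return x
```

With 16 subsets and 4 iterations, the loop touches the data as often as 64 iterations of plain EM (one subset per update), which is the comparison reported above.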

Development of GIS based Water Quality Simulation System for Han River and Kyeonggi Bay Area (한강과 경기만 지역 GIS 기반 통합수질모의 시스템 개발)

  • Lee, Chol-Young;Kim, Kye-Hyun
    • Journal of Korea Spatial Information System Society
    • /
    • v.10 no.4
    • /
    • pp.77-88
    • /
    • 2008
  • Demand for managing the water quality of the west coastal region has been growing because of large-scale urbanization along the coastal zone, the prospective application of TMDL (Total Maximum Daily Load) regulation to the Han River, and disasters such as the oil spill in Taean, Chungnam. However, no system has been developed for this purpose, and the demand for effective GIS-based water quality management that can monitor the water environment and propose best management alternatives for the Han River and Kyeonggi Bay has increased. This study focused on developing an integrated water quality management system for the Han River basin and its estuary area connected to Kyeonggi Bay to support integrated water quality management and planning. Integration was achieved on a GIS basis by spatially linking water quality attributes with location information. A GIS database was built to estimate the amounts of generated and discharged water pollutants according to the TMDL technical guide, and it supplied the input data for two water quality models (WASP7 for the Han River and EFDC for the coastal area) used to forecast water quality and to suggest BMPs (Best Management Practices). The BOD, TN, and TP results from WASP7 were used as input for EFDC. The study identified critical areas with relatively high pollutant loadings, showed that the locations discharging pollutant loads into the river and seasonal factors affect water quality, and quantitatively verified the relationship between the water quality of the river and that of its estuary. The results showed that the GIS-based integrated system can be used as a tool for assessing the status quo of water quality and proposing economically effective BMPs to mitigate water pollution. Further studies are needed to improve the system's capabilities, such as adding decision-making and cost-benefit analysis functions, and to develop a concrete methodology for water quality management using the system.
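
The TMDL-style load estimation mentioned above ultimately reduces to flow times concentration with a unit conversion. The sketch below shows that conversion only; the function name and the example values are illustrative and unrelated to the paper's database or models.

```python
def daily_load_kg(flow_m3s, conc_mgL):
    """Daily pollutant load (kg/day) from mean flow (m^3/s) and concentration (mg/L).

    1 m^3/s * 1 mg/L = 86,400 m^3/day * 1 g/m^3 = 86.4 kg/day.
    """
    return flow_m3s * conc_mgL * 86.4

# e.g., a tributary at 12 m^3/s with a BOD of 3.2 mg/L contributes about 3,318 kg/day
print(daily_load_kg(12.0, 3.2))
```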


Ventilation Effect of the Greenhouse with Folding Panel Type Windows (패널굴절방식 환기창 온실의 환기효과)

  • Kim, Jin-Young;Lee, Si-Young;Kim, Hyun-Hwan;Chun, Hee;Yun, In-Hak
    • Journal of Bio-Environment Control
    • /
    • v.11 no.1
    • /
    • pp.5-11
    • /
    • 2002
  • In this study, a new natural ventilation window was developed to control the greenhouse environment without forced ventilation during the hot season. Its ventilation effect was investigated in an experimental greenhouse designed with side-wall panels and folding panels for natural ventilation. The folding-panel ventilation window opens the upper part of the side wall and the top of the roof using two hinges, located at the bottom of the side wall and at the roof panel, that hold one edge of each panel while the other edge travels along a guide rail. The developed window has a top vent with a maximum travel of X = l(1 - cos θ) = 848.5 mm and a side vent with a maximum travel of Y = (l/2) sin θ = 1,184.4 mm at the theoretical opening angle θ = 45°, where l is the panel length. It took 4.5 minutes to open the roof vent fully; the temperatures at heights of 1.2 m and 0.8 m began to decrease one minute after opening started and reached an equilibrium state, maintaining a 3-4°C difference, two minutes after the vents were fully open. The air exchange rate was 15.2-39.3 h⁻¹, higher than the 10-15 h⁻¹ of continuous-vent and Venlo-type greenhouses, and the temperature drop achieved by the ventilation windows was twice that of a Venlo-type greenhouse.
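
A small sketch of the opening-geometry formulas quoted above. The panel lengths are back-calculated from the reported maximum travels at 45° (they are not stated in the abstract), so they should be read as implied, illustrative values.

```python
import math

def roof_opening(panel_len_mm, theta_deg):
    """Top-vent travel X = l * (1 - cos(theta)) for a panel hinged at one edge."""
    return panel_len_mm * (1.0 - math.cos(math.radians(theta_deg)))

def side_opening(panel_len_mm, theta_deg):
    """Side-vent travel Y = (l / 2) * sin(theta)."""
    return 0.5 * panel_len_mm * math.sin(math.radians(theta_deg))

# Panel lengths implied by the reported maxima at theta = 45 deg (back-calculated):
l_roof = 848.5 / (1.0 - math.cos(math.radians(45)))   # about 2,897 mm
l_side = 2.0 * 1184.4 / math.sin(math.radians(45))    # about 3,350 mm

print(round(roof_opening(l_roof, 45), 1))  # 848.5 mm
print(round(side_opening(l_side, 45), 1))  # 1184.4 mm
```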