• Title/Abstract/Keyword: Data-driven models

Search results: 257

Long-term Rainfall-Runoff Simulation Using an LSTM-MLP Artificial Neural Network Ensemble

  • 안성욱;강동호;성장현;김병식
    • 한국수자원학회논문집 / Vol. 57, No. 2 / pp.127-137 / 2024
  • The physical models commonly used for water resources management are difficult to set up and run because of their input-data requirements, and the modeler's subjective judgment can be introduced. To address these limitations, studies using data-driven models such as machine learning have recently been active in the water resources field. In this study, long-term rainfall-runoff simulation of the Osipcheon basin in Samcheok, Gangwon-do, was performed using observational data only. Three input data groups were constructed from meteorological data (meteorological observations; daily precipitation and potential evapotranspiration; daily precipitation minus potential evapotranspiration), each was used to train an LSTM (Long Short-Term Memory) artificial neural network, and the results were compared and analyzed. LSTM-Model 1, which used only the meteorological observations, performed best; an MLP artificial neural network was then added to it to build six LSTM-MLP ensemble models for simulating the long-term runoff of the Osipcheon basin. Comparing the LSTM and LSTM-MLP models, both produced broadly similar results, but the MAE, MSE, and RMSE of the LSTM-MLP decreased relative to the LSTM, with a particular improvement in the low-flow range. Given this improvement in the low-flow range, ensemble models beyond LSTM-MLP, such as those using CNNs, are expected to be highly useful for large basins where building and running physical models takes a long time, and for constructing flow duration curves in ungauged basins where input data are scarce.
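The error metrics and ensemble averaging described above can be sketched in a few lines. This is a minimal illustration with hypothetical runoff numbers, not the study's actual LSTM or MLP models:

```python
import numpy as np

def metrics(obs, sim):
    """Return MAE, MSE, RMSE between observed and simulated runoff."""
    err = np.asarray(sim, dtype=float) - np.asarray(obs, dtype=float)
    mae = np.mean(np.abs(err))
    mse = np.mean(err ** 2)
    return mae, mse, np.sqrt(mse)

def ensemble_mean(predictions):
    """Average member predictions (one simple way to combine LSTM and MLP outputs)."""
    return np.mean(np.asarray(predictions, dtype=float), axis=0)

# Hypothetical daily runoff values for illustration only.
obs = [1.0, 2.0, 4.0, 3.0]
lstm_pred = [1.2, 1.8, 4.5, 2.5]   # stand-in for an LSTM member
mlp_pred = [0.9, 2.1, 3.8, 3.1]    # stand-in for an MLP member

ens = ensemble_mean([lstm_pred, mlp_pred])
mae, mse, rmse = metrics(obs, ens)
```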

Applications of Machine Learning Models for the Estimation of Reservoir CO2 Emissions

  • 유지수;정세웅;박형석
    • 한국물환경학회지 / Vol. 33, No. 3 / pp.326-333 / 2017
  • Lakes and reservoirs have been reported as important sources of carbon emissions to the atmosphere in many countries. Although field experiments and theoretical investigations based on fundamental gas exchange theory have proposed quantitative amounts of the Net Atmospheric Flux (NAF) in various climate regions, there are still large uncertainties in global-scale estimation. Mechanistic models can be used for understanding and estimating the temporal and spatial variations of the NAFs, considering the complicated hydrodynamic and biogeochemical processes in a reservoir, but these models require extensive and expensive datasets and model parameters. On the other hand, data-driven machine learning (ML) algorithms are likely to be alternative tools to estimate the NAFs in response to independent environmental variables. The objective of this study was to develop random forest (RF) and multi-layer artificial neural network (ANN) models for the estimation of the daily CO2 NAFs in Daecheong Reservoir, located on the Geum River of Korea, and to compare the models' performance against the multiple linear regression (MLR) model proposed in a previous study (Chung et al., 2016). As a result, the RF and ANN models showed much enhanced performance in the estimation of high NAF values, while the MLR model significantly underestimated them. Cross-validation with 10-fold random sampling was applied to evaluate the performance of the three models, and indicated that the ANN model performed best, followed by the RF and MLR models.
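The 10-fold random-sampling cross-validation used to compare the three models comes down to index splitting. A minimal sketch follows; the sample size is illustrative, and no actual RF/ANN/MLR fitting is shown:

```python
import numpy as np

def kfold_indices(n_samples, k=10, seed=0):
    """Yield (train_idx, test_idx) pairs for k-fold cross-validation
    after a random shuffle of the sample indices."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_samples)
    folds = np.array_split(idx, k)
    for i in range(k):
        test = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train, test

# Each candidate model would be fit on `train` and scored on `test`;
# averaging the per-fold scores ranks the models.
splits = list(kfold_indices(100, k=10))
```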

Identifying Stakeholder Perspectives on Data Industry Regulation in South Korea

  • Lee, Youhyun;Jung, Il-Young
    • Journal of Information Science Theory and Practice / Vol. 9, No. 3 / pp.14-30 / 2021
  • Data innovation is at the core of the Fourth Industrial Revolution. While the catastrophic COVID-19 pandemic has accelerated the societal shift toward a data-driven society, the direction of overall data regulation remains unclear and data policy experts have yet to reach a consensus. This study identifies and examines the ideal regulator models of data-policy experts and suggests an appropriate method for developing policy in the data economy. To identify different typologies of data regulation, this study used Q methodology with 42 data policy experts, including public officers, researchers, entrepreneurs, and professors, and additional focus group interviews (FGIs) with six data policy experts. Using a Q survey, this study discerns four types of data policy regulators: proactive activists, neutral conservatives, pro-protection idealists, and pro-protection pragmatists. Based on the results of the analysis and FGIs, this study suggests three practical policy implications for framing a nation's data policy. It also discusses possibilities for exploring diverse methods of data industry regulation, underscoring the value of identifying regulatory issues in the data industry from a social science perspective.

Study on the Material Parameter Extraction of the Overlay Model for Low Cycle Fatigue (LCF) Analysis

  • 김상호;카비르 후마이언;여태인
    • 한국자동차공학회논문집 / Vol. 18, No. 1 / pp.66-73 / 2010
  • This work focused on material parameter extraction for isothermal cyclic deformation analysis, for which the Chaboche (combined nonlinear isotropic and kinematic hardening) and Overlay (multilinear hardening) models are normally used. In this study, all parameters were derived based on Overlay theory. A simple method is suggested for finding the best material parameters for cyclic deformation analysis prior to the isothermal LCF (Low Cycle Fatigue) analysis. The parameter extraction was done using 400-series stainless steel data published in the reference papers. For simple and quick review of the parameters extracted by the suggested method, a 1D FORTRAN program was developed, which reduced the time needed to check the material data tremendously. For application to the FE code ABAQUS, a user subroutine for the material models was developed by means of UMAT (User Material Subroutine), and the stabilized hysteresis loops obtained by the numerical analysis were in good agreement with test results.
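A minimal sketch of the Overlay (multilinear hardening) idea: parallel elastic-perfectly-plastic layers whose stresses sum, giving a piecewise-linear response under monotonic loading. The layer moduli and yield stresses below are hypothetical, and cyclic hysteresis tracking is omitted:

```python
import numpy as np

def overlay_stress(strain, E_layers, yield_layers):
    """Uniaxial stress of an Overlay material under monotonic loading:
    each parallel layer responds elastically up to its own yield stress,
    then carries that yield stress; the total is the sum over layers."""
    total = 0.0
    for E, sy in zip(E_layers, yield_layers):
        total += np.clip(E * strain, -sy, sy)  # layer saturates at +/- its yield stress
    return total

# Two hypothetical layers: the combined response is bilinear in strain.
s1 = overlay_stress(0.001, [100e3, 100e3], [50.0, 200.0])
s2 = overlay_stress(0.004, [100e3, 100e3], [50.0, 200.0])
```

With both layers elastic the slope is the sum of the moduli; once the softer layer yields, the slope drops to the remaining layer's modulus, which is how the multilinear curve arises.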

Line Based Transformation Model (LBTM) for high-resolution satellite imagery rectification

  • Shaker, Ahmed;Shi, Wenzhong
    • 대한원격탐사학회 / Proceedings of ACRS 2003 ISRS / pp.225-227 / 2003
  • Traditional photogrammetry and satellite image rectification techniques have been developed based on control points for many decades. These techniques are derived from linking points in image space to the corresponding points in object space through rigorous collinearity or coplanarity conditions. Recently, digital imagery has provided the opportunity to use features as well as points for image rectification. Such implementations have mainly been based on rigorous models that incorporate geometric constraints into the bundle adjustment, and they could not be applied to the new high-resolution satellite imagery (HRSI) due to the absence of sensor calibration and satellite orbit information. This research is an attempt to establish a new Line Based Transformation Model (LBTM), which is based on linear features only, or on linear features with a number of ground control points, instead of the traditional models that use only Ground Control Points (GCPs) for satellite imagery rectification. The new model does not require any further information about the sensor model or satellite ephemeris data. Synthetic as well as real data have been used to check the validity and fidelity of the new approach, and the results showed that the LBTM can be used efficiently for rectifying HRSI.
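For contrast with the LBTM, the traditional point-based rectification step can be sketched as a least-squares 2D affine fit from ground control points. The coordinates below are synthetic:

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares 2D affine fit from control points:
    [x', y'] = A @ [x, y] + t. This is the point-based baseline
    that the LBTM generalizes with line correspondences."""
    src = np.asarray(src, float)
    dst = np.asarray(dst, float)
    G = np.hstack([src, np.ones((len(src), 1))])  # design matrix [x y 1]
    # Solve separately for each output coordinate.
    px, *_ = np.linalg.lstsq(G, dst[:, 0], rcond=None)
    py, *_ = np.linalg.lstsq(G, dst[:, 1], rcond=None)
    A = np.array([px[:2], py[:2]])
    t = np.array([px[2], py[2]])
    return A, t

# Synthetic GCP pairs: scale (2, 3) plus shift (2, 3).
src = [(0, 0), (1, 0), (0, 1), (1, 1)]
dst = [(2, 3), (4, 3), (2, 6), (4, 6)]
A, t = fit_affine(src, dst)
```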


Support vector ensemble for incipient fault diagnosis in nuclear plant components

  • Ayodeji, Abiodun;Liu, Yong-kuo
    • Nuclear Engineering and Technology / Vol. 50, No. 8 / pp.1306-1313 / 2018
  • The randomness and incipient nature of certain faults in reactor systems warrant a robust and dynamic detection mechanism. Existing models and methods for fault diagnosis using different mathematical/statistical inferences lack the capability to detect incipient and novel faults. To this end, we propose a fault diagnosis method that utilizes the flexibility of the data-driven Support Vector Machine (SVM) for component-level fault diagnosis. The technique integrates separately built, separately trained, specialized SVM modules capable of component-level fault diagnosis into a coherent intelligent system, with each SVM module monitoring sub-units of the reactor coolant system. To evaluate the model, marginal faults selected from the failure mode and effect analysis (FMEA) are simulated in the steam generator and pressure boundary of the Chinese CNP300 PWR (Qinshan I NPP) reactor coolant system, using a best-estimate thermal-hydraulic code, RELAP5/SCDAP Mod4.0. A multiclass SVM model is trained with component-level parameters that represent the steady state and selected faults in the components. For optimization purposes, we considered and compared the performance of different multiclass models in MATLAB, using different coding matrices as well as different kernel functions, on representative data derived from the simulation of Qinshan I NPP. An optimum predictive model - the Error Correcting Output Code (ECOC) with a TernaryComplete coding matrix - was obtained from the experiments and utilized to diagnose the incipient faults. Some of the important diagnostic results and heuristic model evaluation methods are presented in this paper.
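The ECOC decoding step, in which class codewords are compared against the signs returned by the binary learners, can be sketched as follows. The coding matrix here is a simple hypothetical one-vs-all design, not the TernaryComplete matrix used in the study:

```python
import numpy as np

# Hypothetical one-vs-all coding matrix for 3 fault classes:
# each row is a class codeword, each column a binary learner.
CODE = np.array([[ 1, -1, -1],
                 [-1,  1, -1],
                 [-1, -1,  1]])

def ecoc_decode(binary_outputs, code=CODE):
    """Pick the class whose codeword is closest, in Hamming distance,
    to the signs returned by the binary learners."""
    signs = np.sign(binary_outputs)
    dist = np.sum(code != signs, axis=1)
    return int(np.argmin(dist))

# The learners vote positive mostly for class 1 (second codeword row):
pred = ecoc_decode(np.array([-0.2, 0.9, -0.4]))
```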

Experimental Study and Correlation of the Solid-liquid Equilibrium of Some Amino Acids in Binary Organic Solvents

  • Mustafa Jaipallah Abualreish;Adel Noubigh
    • Korean Chemical Engineering Research / Vol. 62, No. 2 / pp.173-180 / 2024
  • Under ordinary atmospheric conditions, the gravimetric technique was used to measure the solubility of L-cysteine (L-Cys) and L-alanine (L-Ala) in various solvents, including methyl alcohol, ethyl acetate, and mixtures of the two, in the range of 283.15 K to 323.15 K. In both the individual solvents and their mixtures, the solubility of L-Cys and L-Ala rose with increasing temperature. At constant temperature in the selected mixed solvents, however, the solubility declined with decreasing initial mole fraction of methyl alcohol. To assess the relative utility of four solubility models, we fitted the solubility data using the Jouyban-Acree (J-A), van't Hoff-Jouyban-Acree (V-J-A), Apelblat-Jouyban-Acree (A-J-A), and Ma models, and then evaluated the RAD and RMSD values. The dissolution was also found to be an entropy-driven spontaneous mixing process in these solvents, as determined from the thermodynamic parameters obtained with the van't Hoff model. To support the industrial crystallization of L-cysteine and L-alanine and contribute to future theoretical research, we have determined the experimental solubility, correlation equations, and thermodynamic parameters of the selected amino acids during the dissolution process.
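The van't Hoff correlation used above, ln(x) = a + b/T, can be fitted by least squares; a negative slope b implies a positive apparent dissolution enthalpy, consistent with solubility rising with temperature. The solubility values below are hypothetical:

```python
import numpy as np

R = 8.314  # gas constant, J/(mol*K)

def fit_vant_hoff(T, x):
    """Fit ln(x) = a + b/T by least squares and return (a, b) plus
    the apparent dissolution enthalpy dH = -R * b."""
    T = np.asarray(T, float)
    y = np.log(np.asarray(x, float))
    G = np.column_stack([np.ones_like(T), 1.0 / T])
    (a, b), *_ = np.linalg.lstsq(G, y, rcond=None)
    return a, b, -R * b

# Hypothetical mole-fraction solubilities rising with temperature.
T = [283.15, 293.15, 303.15, 313.15, 323.15]
x = [0.010, 0.014, 0.019, 0.026, 0.034]
a, b, dH = fit_vant_hoff(T, x)
```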

Design of Menu Structures for the Human Interfaces of Electronic Products

  • 곽지영;한성호
    • 대한산업공학회/한국경영과학회 1995 Spring Joint Conference Proceedings; Chonnam National University; 28-29 Apr. 1995 / pp.534-544 / 1995
  • Many electronic products employ menu-driven interfaces for user-system dialogue. Unlike software user interfaces, a small single-line display, such as a Liquid Crystal Display, is typically used to present menu items. Since the display can show only a single menu item at a time, more serious navigation problems are expected with single-line display menus (SDM). This study attempts to provide a set of unique guidelines for the design of the SDM based on empirical results. A human factors experiment was conducted to examine the effects of four design variables: menu structure, user experience, navigation aid, and number of targets. The usability of the design alternatives was measured quantitatively in four different aspects: speed, accuracy, inefficiency of navigation, and subjective user preference. Analysis of variance was used to test the statistical effects of the design variables and their interaction effects. A set of design guidelines was drawn from the results, which can be applied to the design of human-system interfaces for a wide variety of electronic consumer products using such displays. Since more generalized guidelines could be provided by constructing prediction models based on the empirical data, powerful performance models are also required for the SDM. As a preliminary study, a survey was done on performance models for ordinary computer menus.


Development of Water Quality Modeling in the United States

  • Ambrose, Robert B.;Wool, Tim A.;Barnwell, Thomas O.
    • Environmental Engineering Research / Vol. 14, No. 4 / pp.200-210 / 2009
  • The modern era of water quality modeling in the United States began in the 1960s. Pushed by advances in computer technology as well as environmental sciences, water quality modeling evolved through five broad periods: (1) initial model development with mainframe computers (1960s - mid 1970s), (2) model refinement and generalization with minicomputers (mid 1970s - mid 1980s), (3) model standardization and support with microcomputers (mid 1980s - mid 1990s), (4) better model access and performance with faster desktop computers running Windows and local area networks linked to the Internet (mid 1990s - early 2000s), and (5) model integration and widespread use of the Internet (early 2000s - present). Improved computer technology continues to drive improvements in water quality models, including more detailed environmental analysis (spatially and temporally), better user interfaces and GIS software, more accessibility to environmental data from on-line repositories, and more robust modeling frameworks linking hydrodynamics, water quality, watershed and atmospheric models. Driven by regulatory needs and advancing technology, water quality modeling will continue to improve to better address more complicated water bodies and pollutant types, and more complicated management questions. This manuscript describes historical trends in water quality model development in the United States, reviews current efforts, and projects promising future directions.

A Partition Technique of UML-based Software Models for Multi-Processor Embedded Systems

  • 김종필;홍장의
    • 정보처리학회논문지D / Vol. 15D, No. 1 / pp.87-98 / 2008
  • As demands on the performance of the hardware components of embedded systems grow, the methods used to develop the software that runs on them are also affected. In particular, on expensive hardware architectures such as MPSoC, software-side consideration is essential for efficient resource use and improved performance. This study therefore presents a partitioning technique for software tasks that takes multiprocessor-based hardware architectures into account during embedded software development. The proposed technique transforms a UML-based software model into a CBCFG (Constraints-Based Control Flow Graph) and partitions it into software components with parallelism and data dependency taken into account. The results can serve as input for platform-dependent model development of embedded software and for task performance estimation.
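The parallelism-and-dependency idea behind the partitioning can be sketched as level-grouping of a task dependency graph: tasks in the same level have no mutual data dependencies and are candidates for separate processors. This is a simplified stand-in for the paper's CBCFG-based technique, with a hypothetical task graph:

```python
from collections import defaultdict

def level_partition(edges, tasks):
    """Group tasks of a dependency graph into levels: tasks within a
    level have no data dependencies on each other, so they could run
    in parallel on separate processors."""
    indeg = {t: 0 for t in tasks}
    succ = defaultdict(list)
    for a, b in edges:            # a must finish before b starts
        succ[a].append(b)
        indeg[b] += 1
    level = [t for t in tasks if indeg[t] == 0]
    levels = []
    while level:
        levels.append(sorted(level))
        nxt = []
        for t in level:
            for s in succ[t]:
                indeg[s] -= 1
                if indeg[s] == 0:
                    nxt.append(s)
        level = nxt
    return levels

# Hypothetical task graph: A feeds B and C; B and C feed D.
levels = level_partition([("A", "B"), ("A", "C"), ("B", "D"), ("C", "D")],
                         ["A", "B", "C", "D"])
```

B and C land in the same level, reflecting that they depend only on A and not on each other.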