• Title/Summary/Keyword: Pre-processor

Development and Validation of A Decision Support System for the Real-time Monitoring and Management of Reservoir Turbidity Flows: A Case Study for Daecheong Dam (실시간 저수지 탁수 감시 및 관리를 위한 의사결정지원시스템 개발 및 검증: 대청댐 사례)

  • Chung, Se-Woong;Jung, Yong-Rak;Ko, Ick-Hwan;Kim, Nam-Il
    • Journal of Korea Water Resources Association / v.41 no.3 / pp.293-303 / 2008
  • Reservoir turbidity flows degrade the efficiency and sustainability of water supply systems in many countries located in the monsoon climate region. A decision support system called RTMMS, aimed at assisting reservoir operations, was developed for the real-time monitoring, modeling, and management of turbidity flows induced by flood runoff in Daecheong reservoir. RTMMS consists of a real-time data acquisition module that collects and stores field monitoring data, a data assimilation module that assists the pre-processing of model input data, a two-dimensional numerical model for the simulation of reservoir hydrodynamics and turbidity, and a post-processor that aids the analysis of simulation results and alternative management scenarios. RTMMS was calibrated using field data obtained during the flood season of 2004 and applied to real-time simulations of flood events that occurred in July 2006 to assess its predictive capability. The system showed fairly satisfactory performance in reproducing the density flow regimes and the fate of turbidity plumes in the reservoir, with a computation time efficient enough for real-time application. The configuration of RTMMS suggested in this study can be adopted in many reservoirs with similar turbidity issues for better management of water supply utilities and the downstream aquatic ecosystem.
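
As a rough illustration of the module chain described in this abstract (real-time data acquisition, input pre-processing, a two-dimensional simulation, and post-processing), the following Python sketch wires hypothetical stand-ins together; the class and function names, units, and the trivial placeholder "model" are assumptions, not the authors' RTMMS code.

```python
# Hypothetical sketch of an RTMMS-style pipeline: acquire field data ->
# assemble model input -> run a 2-D turbidity model -> post-process results.
# All names are illustrative, not the actual implementation.
from dataclasses import dataclass
from typing import List

@dataclass
class TurbidityRecord:
    timestamp: str        # observation time
    inflow_cms: float     # inflow discharge (m^3/s), assumed unit
    turbidity_ntu: float  # turbidity (NTU)

def acquire(records: List[TurbidityRecord]) -> List[TurbidityRecord]:
    """Real-time data acquisition: keep only physically plausible records."""
    return [r for r in records if r.inflow_cms >= 0 and r.turbidity_ntu >= 0]

def assimilate(records: List[TurbidityRecord]) -> dict:
    """Pre-processing: build a model input set (boundary conditions)."""
    return {
        "times": [r.timestamp for r in records],
        "inflow": [r.inflow_cms for r in records],
        "turbidity": [r.turbidity_ntu for r in records],
    }

def run_2d_model(inputs: dict) -> dict:
    """Placeholder for the 2-D hydrodynamic/turbidity simulation."""
    peak = max(inputs["turbidity"], default=0.0)
    return {"predicted_peak_turbidity": peak}

def post_process(results: dict) -> None:
    """Post-processor: summarize results for reservoir operators."""
    print(f"Predicted peak turbidity: {results['predicted_peak_turbidity']:.1f} NTU")

if __name__ == "__main__":
    obs = [TurbidityRecord("2006-07-15T00:00", 850.0, 120.0),
           TurbidityRecord("2006-07-15T01:00", 910.0, 140.0)]
    post_process(run_2d_model(assimilate(acquire(obs))))
```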

A Proposal of Personal Information DB Encryption Assurance Framework (개인정보 DB 암호화 검증 프레임웍 제안)

  • Ko, Youngdai;Lee, Sang-Jin
    • Journal of the Korea Institute of Information Security & Cryptology / v.24 no.2 / pp.397-409 / 2014
  • According to the Personal Information Protection Act (PIPA), which was legislated in March 2011, an individual or company that handles personal information, called a personal information processor, must encrypt certain kinds of personal information kept in its database. For convenience, we call this DB Encryption in this paper. Law enforcement and implementation agencies are accordingly strengthening their supervision of whether DB Encryption is properly applied and implemented as required by the PIPA. However, the process of DB Encryption is complicated and difficult, and there are many factors to consider in practice. For example, the process involves numerous considerations and requirements, such as preliminary analysis and design, actual application, and testing, and there are also points to consider regarding related system components, business processes, and time and cost. Although many factors are significantly associated with DB Encryption, concrete and realistic validation criteria are still somewhat lacking. In this paper, we propose a realistic DB Encryption Assurance Framework that is acceptable and reasonable both for performing the PIPA duty (from the perspective of the individual or company) and as a standard direction for the inspection and verification of DB Encryption (from the perspective of law enforcement).
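
The DB Encryption the abstract refers to is, at its core, column-level encryption of sensitive fields. Below is a minimal sketch using AES-256-GCM from the Python `cryptography` package; the field name, key handling, and cipher choice are illustrative assumptions, not the cipher suite or procedure mandated by the PIPA or by the proposed framework.

```python
# Minimal sketch of column-level "DB Encryption" of one personal data field,
# assuming AES-256-GCM from the `cryptography` package.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)  # in practice, fetch from a key management system
aesgcm = AESGCM(key)

def encrypt_field(plaintext: str, aad: bytes = b"resident_reg_no") -> bytes:
    """Encrypt one column value; the 12-byte nonce is prepended to the ciphertext."""
    nonce = os.urandom(12)
    return nonce + aesgcm.encrypt(nonce, plaintext.encode("utf-8"), aad)

def decrypt_field(blob: bytes, aad: bytes = b"resident_reg_no") -> str:
    """Split off the nonce and recover the original column value."""
    nonce, ciphertext = blob[:12], blob[12:]
    return aesgcm.decrypt(nonce, ciphertext, aad).decode("utf-8")

token = encrypt_field("900101-1234567")  # hypothetical resident registration number
assert decrypt_field(token) == "900101-1234567"
```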

A Modified grid-based KIneMatic wave STOrm Runoff Model (ModKIMSTORM) (I) - Theory and Model - (격자기반 운동파 강우유출모형 KIMSTORM의 개선(I) - 이론 및 모형 -)

  • Jung, In Kyun;Lee, Mi Seon;Park, Jong Yoon;Kim, Seong Joon
    • KSCE Journal of Civil and Environmental Engineering Research / v.28 no.6B / pp.697-707 / 2008
  • The grid-based KIneMatic wave STOrm Runoff Model (KIMSTORM) by Kim (1998) predicts the temporal variation and spatial distribution of overland flow, subsurface flow, and stream flow in a watershed. The model, programmed in C++ on the Unix operating system, adopts a single flow-path algorithm for the water balance simulation of flow at each grid element. In this study, we improved the model by converting the code to FORTRAN 90 on the MS Windows operating system and named the result ModKIMSTORM. The improved functions are the addition of the GAML (Green-Ampt & Mein-Larson) infiltration model, control of the paddy runoff rate by flow depth and Manning's roughness coefficient, addition of a baseflow layer, treatment of both spatial and point rainfall data, development of pre- and post-processors, and development of an automatic model evaluation function using five evaluation criteria (Pearson's coefficient of determination, the Nash-Sutcliffe model efficiency, the deviation of runoff volume, the relative error of the peak runoff rate, and the absolute error of the time to peak runoff). The modified model adopts the Shell sort algorithm to enhance computational performance. Input data are accepted in raster and MS Excel formats, and the model outputs, namely soil moisture, discharge, flow depth, and velocity, are generated in BSQ, ASCII grid, binary grid, and raster formats.
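
The five evaluation criteria named in the abstract have standard definitions; the sketch below implements them under commonly used formulations (the paper's exact definitions, e.g. the sign convention of the volume deviation, may differ).

```python
# Illustrative implementations of the five hydrograph evaluation criteria
# listed in the abstract. `obs` and `sim` are observed/simulated series.
import numpy as np

def r_squared(obs, sim):
    """Pearson's coefficient of determination (square of the correlation)."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return np.corrcoef(obs, sim)[0, 1] ** 2

def nash_sutcliffe(obs, sim):
    """Nash-Sutcliffe model efficiency: 1 - SSE / variance of observations."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def runoff_volume_deviation(obs, sim):
    """Deviation of runoff volume, in percent of the observed volume."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * (sim.sum() - obs.sum()) / obs.sum()

def peak_relative_error(obs, sim):
    """Relative error of the peak runoff rate, in percent."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 100.0 * (sim.max() - obs.max()) / obs.max()

def time_to_peak_abs_error(obs, sim, dt_hours=1.0):
    """Absolute error of the time to peak runoff, in hours."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return abs(int(np.argmax(sim)) - int(np.argmax(obs))) * dt_hours
```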

An Implementation of OTB Extension to Produce TOA and TOC Reflectance of LANDSAT-8 OLI Images and Its Product Verification Using RadCalNet RVUS Data (Landsat-8 OLI 영상정보의 대기 및 지표반사도 산출을 위한 OTB Extension 구현과 RadCalNet RVUS 자료를 이용한 성과검증)

  • Kim, Kwangseob;Lee, Kiwon
    • Korean Journal of Remote Sensing / v.37 no.3 / pp.449-461 / 2021
  • Analysis Ready Data (ARD) for optical satellite images is a pre-processed product generated by applying the spectral characteristics and viewing parameters of each sensor. Atmospheric correction is one of the fundamental and complicated topics; it is used to produce Top-of-Atmosphere (TOA) and Top-of-Canopy (TOC) reflectance from multi-spectral image sets. Most remote sensing software provides algorithms or processing schemes dedicated to those corrections for the Landsat-8 OLI sensor. Furthermore, Google Earth Engine (GEE) provides direct access to Landsat reflectance products, USGS-based ARD (USGS-ARD), in the cloud environment. We implemented an atmospheric correction extension for the Orfeo ToolBox (OTB), an open-source remote sensing software package for manipulating and analyzing high-resolution satellite images. This is the first such tool, because OTB has not provided calibration modules for any Landsat sensor. Using this extension, we conducted an absolute atmospheric correction on Landsat-8 OLI images of Railroad Valley, United States (RVUS) and validated the reflectance products against the RVUS reflectance data sets in the RadCalNet portal. The results showed that the reflectance products generated by the OTB extension for Landsat differed by less than 5% from the RadCalNet RVUS data. In addition, we performed a comparative analysis with reflectance products obtained from other open-source tools, namely the QGIS Semi-Automatic Classification Plugin and SAGA, as well as with the USGS-ARD products. Compared to the other two open-source tools, the reflectance products from the OTB extension showed high consistency with those of USGS-ARD, within an acceptable level of the RadCalNet RVUS measurement range. In this study, the atmospheric correction processor in the OTB extension was verified, demonstrating its applicability to other satellite sensors such as the Compact Advanced Satellite (CAS)-500 series or new optical satellites.
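
For context, the standard Landsat-8 OLI TOA reflectance conversion applies the reflectance rescaling coefficients and a sun-elevation correction taken from the scene's MTL metadata. The sketch below shows that calculation in Python as a generic illustration of the step, not the OTB extension's actual implementation; the example coefficient values are made up.

```python
# TOA reflectance from quantized DN values using MTL rescaling coefficients:
# rho' = M_rho * Q_cal + A_rho, then divide by sin(sun elevation).
import numpy as np

def toa_reflectance(dn: np.ndarray,
                    reflectance_mult: float,   # REFLECTANCE_MULT_BAND_x from MTL
                    reflectance_add: float,    # REFLECTANCE_ADD_BAND_x from MTL
                    sun_elevation_deg: float   # SUN_ELEVATION from MTL
                    ) -> np.ndarray:
    """Convert DN values to sun-angle-corrected TOA reflectance."""
    rho_prime = reflectance_mult * dn.astype(np.float64) + reflectance_add
    return rho_prime / np.sin(np.deg2rad(sun_elevation_deg))

# Example with illustrative coefficient magnitudes (not from a real scene).
dn_band4 = np.array([[8000, 9500], [10200, 7600]], dtype=np.uint16)
rho = toa_reflectance(dn_band4, 2.0e-5, -0.1, 55.0)
```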

Measuring the Public Service Quality Using Process Mining: Focusing on N City's Building Licensing Complaint Service (프로세스 마이닝을 이용한 공공서비스의 품질 측정: N시의 건축 인허가 민원 서비스를 중심으로)

  • Lee, Jung Seung
    • Journal of Intelligence and Information Systems / v.25 no.4 / pp.35-52 / 2019
  • As public services are provided in various forms, including e-government, public demand for service quality is increasing. Although continuous measurement and improvement of public service quality are needed, traditional surveys are costly and time-consuming and have limitations. There is therefore a need for an analytical technique that can measure the quality of public services quickly and accurately at any time, based on the data those services generate. In this study, we analyzed the quality of public services using process mining techniques on the building licensing complaint service of N city, a case that both provides the data necessary for analysis and can be spread to other institutions through public service quality management. We conducted process mining on a total of 3,678 building licensing complaints in N city over two years from January 2014 and identified the process maps and the departments with high frequency and long processing times. According to the results, some departments were crowded at certain points in time while others handled relatively few cases, and there was reasonable suspicion that an increase in the number of complaints increases the time required to complete them. The time required to complete a complaint varied from same-day completion to one year and 146 days. The cumulative frequency of the top four departments (the Sewage Treatment Division, the Waterworks Division, the Urban Design Division, and the Green Growth Division) exceeded 50%, and the cumulative frequency of the top nine departments exceeded 70%; the high-frequency departments were few, and the workload was highly unbalanced among departments. Most complaints followed a variety of different process patterns. The analysis shows that the number of 'complement' decisions has the greatest impact on the length of a complaint, which is interpreted to mean that a 'complement' decision lengthens the overall completion time because it requires a physical period in which the complainant revises the documents and submits them again. To address this, the overall processing time can be drastically reduced if complainants prepare thoroughly before or while filing, informed by the causes of and solutions for 'complement' decisions in other complaints. Clarifying and disclosing these causes and solutions as part of the system's data helps complainants prepare in advance and gives them confidence that documents prepared from the disclosed information will be accepted, making the complaint process sufficiently transparent and predictable. Documents prepared from pre-disclosed information are likely to be processed without problems, which not only shortens the processing period but also improves work efficiency from the processor's point of view by eliminating renegotiation and duplicated tasks. The results of this study can be used to find departments with a high burden of complaints at certain points in time and to manage workforce allocation between departments flexibly. In addition, because the analysis identified patterns in which departments participate in consultations depending on the characteristics of a complaint, the results can be used for automation or for recommending consultation departments.
Furthermore, by using the various data generated during the complaint process together with machine learning techniques, the patterns of the complaint process can be discovered; turning these patterns into algorithms and applying them to the system can support the automation and intelligence of civil complaint processing. This study is expected to suggest future public service quality improvements through process mining analysis of civil services.
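
As a hedged illustration of the kind of event-log analysis the study describes (case throughput times, department workload shares, and flagging cases with a 'complement' decision), the following pandas sketch works on a tiny hypothetical log; the column names, activities, and values are assumptions, not the N city data.

```python
# Event-log analysis sketch: throughput time per case, department workload
# shares, and cases containing a 'complement' decision.
import pandas as pd

# Each row is one processing step of one complaint case (hypothetical data).
log = pd.DataFrame({
    "case_id":    ["C1", "C1", "C1", "C2", "C2"],
    "department": ["Urban Design Division", "Sewage Treatment Division",
                   "Urban Design Division", "Waterworks Division",
                   "Green Growth Division"],
    "activity":   ["receive", "consult", "approve", "receive", "complement"],
    "timestamp":  pd.to_datetime(["2014-01-02", "2014-01-10", "2014-02-01",
                                  "2014-03-05", "2014-03-06"]),
})

# Throughput time per case (first event to last event).
throughput = (log.groupby("case_id")["timestamp"]
                 .agg(lambda t: t.max() - t.min())
                 .rename("throughput"))

# Department workload and its cumulative share, as in the top-k department analysis.
dept_freq = log["department"].value_counts()
dept_share = (dept_freq.cumsum() / dept_freq.sum()).rename("cumulative_share")

# Cases that include a 'complement' decision, flagged for duration comparison.
has_complement = (log.assign(is_comp=log["activity"].eq("complement"))
                     .groupby("case_id")["is_comp"].any())

print(throughput, dept_share, has_complement, sep="\n\n")
```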