• Title/Summary/Keyword: Generate Data


Prediction of critical heat flux for narrow rectangular channels in a steady state condition using machine learning

  • Kim, Huiyung;Moon, Jeongmin;Hong, Dongjin;Cha, Euiyoung;Yun, Byongjo
    • Nuclear Engineering and Technology
    • /
    • v.53 no.6
    • /
    • pp.1796-1809
    • /
    • 2021
  • The subchannel of a research reactor used to generate high power density is designed to be narrow and rectangular and comprises plate-type fuels operating under downward flow conditions. Critical heat flux (CHF) is a crucial parameter for estimating the safety of a nuclear fuel; hence, this parameter should be accurately predicted. Here, machine learning is applied for the prediction of CHF in a narrow rectangular channel. Although machine learning can effectively analyze large amounts of complex data, its application to CHF, particularly for narrow rectangular channels, remains challenging because of the limited flow conditions available in existing experimental databases. To resolve this problem, we used four CHF correlations to generate pseudo-data for training an artificial neural network. We also propose a network architecture that includes pre-training and prediction stages to predict and analyze the CHF. The trained neural network predicted the CHF with an average error of 3.65% and a root-mean-square error of 17.17% for the test pseudo-data, and with errors of 0.9% and 26.4%, respectively, for the experimental data, which were not used during training. Finally, machine learning was applied to quantitatively investigate the parametric effects on the CHF in narrow rectangular channels under downward flow conditions.
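The pre-train-then-fine-tune idea described above can be sketched as follows. This is a minimal illustration only: the toy correlation, network size, and variable ranges are assumptions, not the paper's four CHF correlations or its architecture.

```python
# Sketch of pseudo-data pre-training followed by fine-tuning on a small
# experimental set. The correlation below is a hypothetical stand-in.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

def toy_chf_correlation(G, x_e, P):
    """Stand-in correlation: CHF falls with quality and rises with
    mass flux and pressure (shape only, not physically calibrated)."""
    return 1000.0 + 0.8 * G - 2500.0 * x_e + 5.0 * P

# Stage 1: generate pseudo-data over a wide range of flow conditions.
G = rng.uniform(50, 5000, 20000)      # mass flux [kg/m^2 s]
x_e = rng.uniform(-0.2, 0.6, 20000)   # exit quality [-]
P = rng.uniform(100, 1600, 20000)     # pressure [kPa]
X = np.column_stack([G, x_e, P])
y = toy_chf_correlation(G, x_e, P) * rng.normal(1.0, 0.02, 20000)

scaler = StandardScaler().fit(X)
net = MLPRegressor(hidden_layer_sizes=(64, 64), warm_start=True,
                   max_iter=300, random_state=0)
net.fit(scaler.transform(X), y)           # pre-training on pseudo-data

# Stage 2: fine-tune on a (much smaller) experimental set.
X_exp, y_exp = X[:200], y[:200] * 1.05    # placeholder "experiments"
net.max_iter = 100
net.fit(scaler.transform(X_exp), y_exp)   # warm_start continues training
```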

A novel risk assessment approach for data center structures

  • Cicek, Kubilay;Sari, Ali
    • Earthquakes and Structures
    • /
    • v.19 no.6
    • /
    • pp.471-484
    • /
    • 2020
  • Previous earthquakes show that structural safety evaluations should include the evaluation of nonstructural components. Failure of nonstructural components can affect the operational capacity of critical facilities, such as hospitals and fire stations, which can increase the number of deaths. Additionally, failure of nonstructural components may result in economic, architectural, and historical losses for the community. Accelerations and random vibrations must remain under predefined limits in structures housing high-technology equipment, data centers in this case. Failure of server equipment and anchored server racks is investigated in this study. A probabilistic study is completed for a low-rise rigid sample structure. The structure is investigated in two versions: (i) a conventional fixed-base structure and (ii) one with a base isolation system. Seismic hazard assessment is completed for the selected site. Monte Carlo simulations are generated with selected parameters. Uncertainties in both the structural parameters and the mechanical properties of the isolation system are included in the simulations. Anchorage failure and vibration failures are investigated. Different methods of generating fragility curves are used. The site-specific annual hazard curve is used to generate risk curves for the two structures. A risk matrix is proposed for the design of data centers. Results show that base isolation systems significantly reduce the failure probability on higher floors. It was also found that, in terms of accelerations, base isolation systems are more sensitive to earthquake characteristics than to variability in structural and mechanical properties. Another outcome is that the code-provided anchorage failure limits are more vulnerable than the random-vibration failure limits of server equipment.
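The Monte Carlo fragility workflow the abstract describes can be illustrated with a small sketch: sample uncertain response parameters, count failures per intensity level, and fit a lognormal fragility curve. The demand model, capacity, and all numbers below are illustrative assumptions, not the paper's structural model.

```python
# Minimal sketch of Monte-Carlo-based fragility curve generation.
import numpy as np
from scipy.stats import norm
from scipy.optimize import curve_fit

rng = np.random.default_rng(1)
capacity = 0.5                      # assumed anchorage capacity [g]
ims = np.linspace(0.05, 1.5, 30)    # intensity measures (PGA, g)

def frac_failed(im, n=2000):
    # Uncertain amplification from structure + isolation properties.
    amp = rng.lognormal(mean=np.log(1.8), sigma=0.3, size=n)
    demand = im * amp               # peak equipment acceleration
    return np.mean(demand > capacity)

p_fail = np.array([frac_failed(im) for im in ims])

# Fit lognormal fragility P(f|IM) = Phi((ln IM - ln theta) / beta).
def fragility(im, theta, beta):
    return norm.cdf(np.log(im / theta) / beta)

(theta, beta), _ = curve_fit(fragility, ims, p_fail, p0=[0.3, 0.4])
print(f"median capacity ~ {theta:.2f} g, dispersion beta ~ {beta:.2f}")
```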

Generation of daily temperature data using monthly mean temperature and precipitation data

  • Moon, Kyung Hwan;Song, Eun Young;Wi, Seung Hwan;Seo, Hyung Ho;Hyun, Hae Nam
    • Korean Journal of Agricultural and Forest Meteorology
    • /
    • v.20 no.3
    • /
    • pp.252-261
    • /
    • 2018
  • This study was conducted to develop a method for generating daily maximum and minimum temperatures from monthly data. We analyzed 30 years of daily weather data from 23 meteorological stations in South Korea and derived the parameters describing the annual trend (center value ($\hat{U}$), amplitude (C), and deviation (T)) and the daily fluctuation (A, B) of the daily maximum and minimum temperature. National average values are used for the C, T, A, and B parameters, while the center value is derived from the annual average data of each station. First, daily weather data were generated according to the occurrence of rainfall, then calibrated using the monthly data, and finally the daily maximum and minimum temperatures were generated. With this method, we could generate daily weather data whose distributions were more than 95% similar to the recorded data at all 23 stations. In addition, the method produced Growing Degree Days (GDD) similar to the historical data and could be applied to unsurveyed areas. This method is useful for generating daily data when only monthly data are available, as with climate change scenarios.
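A minimal sketch of this kind of weather generator follows, reusing the parameter names from the abstract (center value U, amplitude C, deviation T for the annual trend; A, B for daily fluctuation). The harmonic form, rainfall model, and all numbers are assumptions, not the paper's fitted model.

```python
# Generate daily maximum temperatures, then calibrate to monthly means.
import numpy as np

rng = np.random.default_rng(2)
days = np.arange(365)
U, C, T = 12.0, 14.0, 25.0      # center value, amplitude, phase [days]
A, B = 2.0, 1.2                 # daily fluctuation scales (dry, wet days)

rainy = rng.random(365) < 0.3   # assumed rainfall occurrence
trend = U - C * np.cos(2 * np.pi * (days - T) / 365.0)   # annual cycle
noise = np.where(rainy, B, A) * rng.standard_normal(365)
tmax = trend + noise

# Calibration: shift each month so its mean matches the given monthly mean.
month_ends = np.cumsum([31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31])
month_of_day = np.searchsorted(month_ends, days, side='right')
monthly_target = np.array([2., 4., 9., 15., 20., 24.,
                           27., 28., 23., 17., 10., 4.])  # example input
for m in range(12):
    sel = month_of_day == m
    tmax[sel] += monthly_target[m] - tmax[sel].mean()
```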

Automatic Generation of GCP Chips from High Resolution Images using SUSAN Algorithms

  • Um Yong-Jo;Kim Moon-Gyu;Kim Taejung;Cho Seong-Ik
    • Proceedings of the KSRS Conference
    • /
    • 2004.10a
    • /
    • pp.220-223
    • /
    • 2004
  • Automatic image registration is an essential element of remote sensing because remote sensing systems generate enormous amounts of data comprising multiple observations of the same features at different times and by different sensors. The general process of automatic image registration includes three steps: 1) extraction of the features to be used in the matching process, 2) the feature matching strategy and accurate matching process, and 3) resampling of the data based on the correspondence computed from the matched features. For steps 2) and 3), we have successfully developed an algorithm for automated registration of satellite images using RANSAC (Random Sample Consensus). For step 1), however, a human operator is still needed to generate GCP chips, which is a time-consuming, laborious, and expensive process. The main idea of this research is that GCP chips can be generated automatically with corner detection algorithms, without GPS surveys or human intervention, if a systematically corrected satellite image of acceptable positional accuracy is available. In this research, we use the SUSAN (Smallest Univalue Segment Assimilating Nucleus) algorithm to detect corners. The SUSAN algorithm is known as one of the most robust corner detection algorithms in the field of computer vision. However, high-resolution images contain so many corners that the corner points from the SUSAN algorithm must be reduced to avoid redundancy. In our experiment, we automatically generate GCP chips from geo-level IKONOS images using the SUSAN algorithm. We then extract reference coordinates from the IKONOS images and DEM data and filter the corner points using texture analysis. Finally, we apply both the GCP chips collected automatically by the proposed method and the GCPs selected by an operator to our in-house automatic precision correction algorithm. The compared results are presented to show the GCP quality.
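A simplified SUSAN corner response can be written in a few lines: compare each neighbourhood pixel's brightness with the nucleus, accumulate the similar area (USAN), and mark pixels whose USAN falls below a geometric threshold. Window size, thresholds, and the chip extraction below are illustrative assumptions, not the paper's tuned pipeline.

```python
# Minimal SUSAN corner response over a grayscale image (numpy only).
import numpy as np

def susan_corners(img, radius=3, t=25.0, frac=0.5):
    """Boolean corner mask: pixels whose USAN (area of the neighbourhood
    with brightness similar to the nucleus) is small."""
    h, w = img.shape
    ys, xs = np.mgrid[-radius:radius + 1, -radius:radius + 1]
    disk = (ys**2 + xs**2) <= radius**2
    offsets = np.argwhere(disk) - radius
    usan = np.zeros((h, w))
    pad = np.pad(img.astype(float), radius, mode='edge')
    for dy, dx in offsets:
        shifted = pad[radius + dy:radius + dy + h,
                      radius + dx:radius + dx + w]
        usan += np.exp(-((shifted - img) / t) ** 6)  # soft similarity
    g = frac * disk.sum()                            # geometric threshold
    return usan < g                                  # corners: USAN < g

# Usage sketch: cut a small GCP chip around each detected corner.
img = np.random.randint(0, 255, (128, 128)).astype(float)  # stand-in
corners = np.argwhere(susan_corners(img))
chips = [img[max(y - 16, 0):y + 16, max(x - 16, 0):x + 16]
         for y, x in corners[:10]]
```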


Acceleration Method for Integral Imaging Generation of Volume Data based on CUDA

  • Park, Chan;Jeong, Ji-Seong;Park, Jae-Hyeung;Kwon, Ki-Chul;Kim, Nam;Yoo, Kwan-Hee
    • The Journal of the Korea Contents Association
    • /
    • v.11 no.3
    • /
    • pp.9-17
    • /
    • 2011
  • Recently, with the advent of stereoscopic 3D TV, the activation of 3D stereoscopic content is expected. Research on autostereoscopic 3D displays has been carried out to relieve the discomfort of stereoscopic 3D displays. This research requires generating the elemental images for a lens array. As the number of lenses in the array increases, generating the elemental images takes considerable time, and even more so for large volume data. To address this problem, we propose in this paper a method for generating the elemental images using OpenCL based on CUDA. We evaluate the proposed method on a PC equipped with one of a Tesla C1060, GeForce 9800GT, or Quadro FX 3800 graphics card. Experimental results show that the proposed method achieves almost 20 times better performance than a recent research result [11].
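The per-lens independence that makes this problem GPU-friendly can be shown with a toy CPU version: each lens renders its elemental image independently (here by shearing and summing depth slices), so each loop iteration maps naturally onto a GPU thread block. The projection model and sizes below are assumptions, not the paper's renderer.

```python
# Toy elemental-image generation from volume data via per-lens shear
# projection; each (ly, lx) iteration is an independent GPU-sized task.
import numpy as np

vol = np.random.rand(32, 64, 64)      # stand-in volume: (depth, y, x)
nlens, epix = 8, 16                   # 8x8 lens array, 16x16 px per lens
elemental = np.zeros((nlens * epix, nlens * epix))

for ly in range(nlens):
    for lx in range(nlens):           # each lens: an independent task
        # per-lens viewing direction: offset grows with lens index
        oy, ox = ly - nlens // 2, lx - nlens // 2
        img = np.zeros((epix, epix))
        for z, slab in enumerate(vol):            # shear each depth slice
            sy, sx = (oy * z) // 16, (ox * z) // 16
            sl = np.roll(slab, (sy, sx), axis=(0, 1))
            img += sl[:epix, :epix]               # crop to elemental size
        elemental[ly * epix:(ly + 1) * epix,
                  lx * epix:(lx + 1) * epix] = img / len(vol)
```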

ON THE CONSTRUCTION OF A SURFACE FROM DISCRETE DERIVATIVE DATA AND ITS EXTENDED SURFACE USING THE LEAST SQUARES METHOD

  • Kim, Hoi-Sub
    • Journal of applied mathematics & informatics
    • /
    • v.4 no.2
    • /
    • pp.387-396
    • /
    • 1997
  • For given discrete derivative data in a rectangular region, we propose a method to generate an approximated surface that fits the given derivative data in the region and extends smoothly to a sufficiently large rectangular region. Such an extension is necessary when generating the surface in NC (numerical control) machining.
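A generic version of this construction is gradient integration by least squares: build sparse finite-difference operators, stack them, and solve for the surface that best matches the given derivative data (the solution is determined up to an additive constant). The sketch below is a standard formulation, not necessarily the paper's exact scheme.

```python
# Reconstruct a surface from discrete derivative data by least squares.
import numpy as np
from scipy.sparse import lil_matrix, vstack
from scipy.sparse.linalg import lsqr

n = 20                                    # n x n grid on [0,1]^2
yy, xx = np.mgrid[0:n, 0:n] / (n - 1.0)
fx_data = 2 * xx                          # given derivative data for
fy_data = 3 * np.ones_like(yy)            # the test surface x^2 + 3y

def diff_op(n, axis):
    """Sparse forward-difference operator on an n x n grid."""
    D = lil_matrix(((n - 1) * n, n * n))
    row = 0
    for i in range(n):
        for j in range(n):
            if axis == 0 and i < n - 1:
                D[row, i * n + j], D[row, (i + 1) * n + j] = -1.0, 1.0
                row += 1
            elif axis == 1 and j < n - 1:
                D[row, i * n + j], D[row, i * n + j + 1] = -1.0, 1.0
                row += 1
    return D.tocsr() * (n - 1)            # scale by 1/h, h = 1/(n-1)

Dy, Dx = diff_op(n, 0), diff_op(n, 1)
A = vstack([Dx, Dy])
b = np.concatenate([fx_data[:, :-1].ravel(), fy_data[:-1, :].ravel()])
f = lsqr(A, b)[0].reshape(n, n)           # surface, up to a constant
```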

Development of Intelligent Database system for softground instrumentation management

  • 우철웅;장병욱
    • Proceedings of the Korean Society of Agricultural Engineers Conference
    • /
    • 1999.10c
    • /
    • pp.618-624
    • /
    • 1999
  • For many soft-ground embankment projects, instrumentation programs for stability and settlement management are essential. These programs usually generate large volumes of data, which can be used for further research. Database techniques are the most effective means of managing such data. Data produced by soft-ground embankment instrumentation cannot be used as-is and must be reprocessed using geotechnical analysis techniques. In this study, an intelligent database system for soft ground, called IDSIM, was developed to examine the applicability of intelligent databases. IDSIM analyzes instrumentation data automatically and successfully presents the results through a Web/DB interface.
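A hypothetical miniature of such a system: a settlement table plus one automatic rule that flags gauges exceeding a management criterion. The schema, data, and criterion are invented for illustration; IDSIM's actual schema and geotechnical analyses are not specified in the abstract.

```python
# Toy instrumentation database with one automatic analysis rule.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""CREATE TABLE settlement (
    gauge_id TEXT, day INTEGER, value_mm REAL)""")
rows = [("SP-1", d, 2.0 * d ** 0.5) for d in range(1, 31)]
con.executemany("INSERT INTO settlement VALUES (?,?,?)", rows)

# "Intelligent" step: flag gauges whose settlement rate over the last
# week exceeds a management criterion (assumed 0.1 mm/day here).
for gauge, rate in con.execute("""
        SELECT gauge_id, (MAX(value_mm) - MIN(value_mm)) / 7.0
        FROM settlement WHERE day > 23 GROUP BY gauge_id"""):
    if rate > 0.1:
        print(f"{gauge}: rate {rate:.2f} mm/day exceeds criterion")
```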


Quantitative Application of TM Data in Shallow Geological Structure Reconstruction

  • Yang, Liu;Liqun, Zou;Mingxin, Liu
    • Proceedings of the KSRS Conference
    • /
    • 2003.11a
    • /
    • pp.1313-1315
    • /
    • 2003
  • This paper studies a quantitative analysis method for shallow geological structure reconstruction from remote-sensing data, using TM data from western China as an example. A new method of computing the attitude of geological contacts from remote-sensing data is developed and assessed. We generate several geological profiles from the remotely derived measurements to constrain the shallow geological structure reconstruction in three dimensions.
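Computing the attitude (strike and dip) of a contact from remotely derived points reduces, in its simplest form, to the classical three-point problem: fit a plane to points on the contact and read the orientation off its gradient. The sketch below shows that standard computation; the paper's own formulation may differ.

```python
# Three-point attitude computation for a geological contact.
import numpy as np

# Points on one contact, e.g. traced on imagery + DEM: (east, north, elev)
pts = np.array([[500.0, 200.0, 1210.0],
                [740.0, 930.0, 1145.0],
                [120.0, 640.0, 1280.0]])

# Fit the plane z = a*x + b*y + c (exact for three points).
A = np.column_stack([pts[:, 0], pts[:, 1], np.ones(3)])
a, b, c = np.linalg.lstsq(A, pts[:, 2], rcond=None)[0]

dip = np.degrees(np.arctan(np.hypot(a, b)))         # true dip angle
dip_dir = np.degrees(np.arctan2(-a, -b)) % 360       # down-dip azimuth
strike = (dip_dir - 90) % 360                        # right-hand rule
print(f"strike {strike:.1f} deg, dip {dip:.1f} deg toward {dip_dir:.1f}")
```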


Analysis of Network Traffic using Classification and Association Rule

  • 이창언;김응모
    • Journal of the Korea Society for Simulation
    • /
    • v.11 no.4
    • /
    • pp.15-23
    • /
    • 2002
  • As the network environment and application services have recently become more complex and diverse, the need for effective network management has grown. In this paper we introduce a scheme to extract useful information for network management by analyzing the traffic data in user login files. For this purpose we use classification and association rules based on the episode concept in data mining. Since login data inherently have time-series characteristics, conventional data mining algorithms cannot be applied directly. We therefore generate virtual transactions, classify the transactions above a threshold value within a time window, and simulate the classification algorithm.
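The windowing step the abstract calls "virtual transactions" can be sketched directly: slide a time window over the login events, collect co-occurring events into itemsets, and count pair supports as simple association rules. The window size and event data below are illustrative assumptions.

```python
# Build virtual transactions from timestamped events, count pair support.
from collections import Counter
from itertools import combinations

# (timestamp_sec, event) records from a login/audit file (toy data)
events = [(1, "login:alice"), (3, "ftp"), (5, "http"),
          (62, "login:bob"), (64, "http"), (66, "ssh"),
          (120, "login:alice"), (123, "http")]

window = 30  # seconds
transactions = []
for t0, _ in events:
    tx = frozenset(e for t, e in events if t0 <= t < t0 + window)
    if len(tx) > 1:                     # keep multi-event windows only
        transactions.append(tx)

pair_counts = Counter(p for tx in transactions
                      for p in combinations(sorted(tx), 2))
for (a, b), n in pair_counts.most_common(3):
    print(f"{a} & {b}: support {n / len(transactions):.2f}")
```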


Fast Visualization Technique and Visual Analytics System for Real-time Analyzing Stream Data

  • Jeong, Seongmin;Yeon, Hanbyul;Jeong, Daekyo;Yoo, Sangbong;Kim, Seokyeon;Jang, Yun
    • Journal of the Korea Computer Graphics Society
    • /
    • v.22 no.4
    • /
    • pp.21-30
    • /
    • 2016
  • A risk management system should be able to support decision making within a short time by analyzing stream data in real time. Many analytic systems rely on CPU computation and disk-based databases, which become problematic when analyzing stream data in real time. Stream data are produced at various periods, from 1 ms to 1 hour or 1 day. A single sensor generates only a small amount of data, but tens of thousands of sensors generate a huge amount. If hundreds of thousands of sensors generate 1 GB of data per second, a CPU-based system cannot analyze them in real time. Analyzing stream data therefore requires fast processing and scalability. In this paper, we present a fast visualization technique that combines a hybrid database with GPU computation. To evaluate our technique, we demonstrate a visual analytics system that analyzes pipeline leaks using sensor and tweet data.
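The hybrid (memory plus disk) storage idea can be miniaturized as a hot ring buffer that downsamples evicted samples into a cold archive, which is what keeps the visualization path fast. Sizes and the min/mean/max aggregation are assumptions; the GPU rendering stage is not shown.

```python
# Hot/cold split for stream data: raw recent samples in memory,
# aggregated buckets in a (stand-in) disk archive.
from collections import deque
import random

HOT_CAPACITY = 1_000
hot = deque(maxlen=HOT_CAPACITY)   # most recent raw samples
cold = []                          # downsampled archive (stand-in for disk)
_bucket = []

def _archive(sample, bucket_size=100):
    _bucket.append(sample)
    if len(_bucket) == bucket_size:    # keep min/mean/max per bucket
        cold.append((min(_bucket), sum(_bucket) / bucket_size,
                     max(_bucket)))
        _bucket.clear()

def ingest(sample):
    if len(hot) == HOT_CAPACITY:
        _archive(hot[0])               # oldest sample is about to be evicted
    hot.append(sample)

for _ in range(5_000):
    ingest(random.gauss(0.0, 1.0))
print(len(hot), "hot samples,", len(cold), "cold buckets")
```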