• Title/Summary/Keyword: Generate Data


Pipeline wall thinning rate prediction model based on machine learning

  • Moon, Seongin;Kim, Kyungmo;Lee, Gyeong-Geun;Yu, Yongkyun;Kim, Dong-Jin
    • Nuclear Engineering and Technology
    • /
    • v.53 no.12
    • /
    • pp.4060-4066
    • /
    • 2021
  • Flow-accelerated corrosion (FAC) of carbon steel piping is a significant problem in nuclear power plants. The basic process of FAC is now relatively well understood; however, existing models for predicting the wall-thinning rate under FAC conditions are not sufficiently accurate. Herein, we propose a methodology for constructing pipe wall-thinning rate prediction models using artificial neural networks and a convolutional neural network, confined to a straight pipe without geometric changes. Furthermore, a methodology for generating training data is proposed to efficiently train the neural networks for a machine learning-based FAC prediction model. We conclude that machine learning can be used to construct pipe wall-thinning rate prediction models and to optimize the number of training datasets. The proposed methodology can efficiently generate a large dataset from FAC tests to develop a wall-thinning rate prediction model for real situations.
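
A minimal sketch of the kind of model the abstract describes: a one-hidden-layer neural network trained by gradient descent. The input features and the thinning-rate function below are hypothetical stand-ins for the paper's FAC test measurements, not its actual dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for FAC test data: inputs are hypothetical scaled
# process variables (e.g. temperature, flow velocity, pH).
X = rng.uniform(0.0, 1.0, size=(200, 3))
y = (0.8 * X[:, 0] + 0.5 * X[:, 1] ** 2 - 0.3 * X[:, 2]).reshape(-1, 1)

# One-hidden-layer MLP trained with plain full-batch gradient descent.
W1 = rng.normal(0, 0.5, (3, 8)); b1 = np.zeros(8)
W2 = rng.normal(0, 0.5, (8, 1)); b2 = np.zeros(1)

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, h @ W2 + b2

losses = []
lr = 0.1
for _ in range(500):
    h, pred = forward(X)
    err = pred - y
    losses.append(float(np.mean(err ** 2)))
    # Backpropagate the mean-squared-error gradient.
    dW2 = h.T @ err / len(X); db2 = err.mean(0)
    dh = err @ W2.T * (1 - h ** 2)
    dW1 = X.T @ dh / len(X); db1 = dh.mean(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(losses[0], losses[-1])
```

The training loss should fall steadily, mirroring the paper's point that such a model can be fitted once enough training data are generated.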

A Study on Operating Profile Creation Method for Risk-Based Oversight (위험기반 항공안전감독을 위한 운영 프로파일 생성기법 연구)

  • Kim, Yong;Jeong, Hyunjin;Sim, Yeongmin
    • Journal of the Korean Society for Aviation and Aeronautics
    • /
    • v.26 no.4
    • /
    • pp.149-154
    • /
    • 2018
  • Risk Based Oversight is a way of performing oversight in which planning is driven by a combination of risk profile and safety performance. Risk Based Oversight enables the prioritization and allocation of a State's safety-management resources commensurate with the safety risk profile of each service provider. This paper presents the concept of Risk Based Oversight and applies it to the current Korean aviation safety oversight process. In particular, this study presents a method for generating operating profiles, one of the key concepts of Risk Based Oversight: operating profiles can be generated from the Operations Specifications of Part 121 airlines. A practical study was conducted to generate the operating profiles and scoped DCTs using a Part 121 airline's Operations Specifications.

Generation of modern satellite data from Galileo sunspot drawings by deep learning

  • Lee, Harim;Park, Eunsu;Moon, Young-Jae
    • The Bulletin of The Korean Astronomical Society
    • /
    • v.46 no.1
    • /
    • pp.41.1-41.1
    • /
    • 2021
  • We generate solar magnetograms and EUV images from Galileo sunspot drawings using a deep learning model based on conditional generative adversarial networks. We train the model using pairs of sunspot drawings from the Mount Wilson Observatory (MWO) and their corresponding magnetograms (or UV/EUV images) from the SDO (Solar Dynamics Observatory) satellite, covering 2011 to 2015 except for every June and December. We evaluate the model by comparing pairs of actual magnetograms (or UV/EUV images) with the corresponding AI-generated ones for June and December. Our results show that the bipolar structures of the AI-generated magnetograms are consistent with those of the originals, and that their unsigned magnetic fluxes (or intensities) agree well with the original values. Applying this model to the Galileo sunspot drawings of 1612, we generate HMI-like magnetograms and AIA-like EUV images of the sunspots. We hope that the EUV intensities can be used to estimate solar EUV irradiance over long historical timescales.

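
The month-based train/evaluation split described in the abstract can be sketched as below; the dates are illustrative placeholders, not the actual MWO/SDO observation records.

```python
from datetime import date

# Hypothetical one-observation-per-month pairing; the paper pairs MWO
# sunspot drawings with SDO magnetograms/EUV images over 2011-2015.
observations = [date(2011 + y, m, 15) for y in range(5) for m in range(1, 13)]

# Every June and December is held out for evaluation, as in the abstract.
train = [d for d in observations if d.month not in (6, 12)]
test = [d for d in observations if d.month in (6, 12)]

print(len(train), len(test))  # 50 training months, 10 evaluation months
```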

Utilisation of IoT Systems as Entropy Source for Random Number Generation

  • Oguzhan ARSLAN;Ismail KIRBAS
    • International Journal of Computer Science & Network Security
    • /
    • v.24 no.4
    • /
    • pp.77-86
    • /
    • 2024
  • Using random numbers to represent uncertainty and unpredictability is essential in many industries. This is crucial in disciplines such as computer science, cryptography, and statistics, where randomness helps guarantee the security and dependability of systems and procedures. In computer science, random number generation is used to generate passwords, keys, and other security tokens, as well as to add randomness to algorithms and simulations. According to recent research, the hardware random number generators used in billions of Internet of Things devices do not produce enough entropy. This article describes how raw data gathered by IoT system sensors can be used to generate random numbers for cryptographic systems, and evaluates the quality of the resulting numbers. The results were validated by successfully passing the FIPS 140-1 and NIST 800-22 test suites.
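
A minimal sketch of the idea: condition raw sensor samples into random bytes with a cryptographic hash. Here a seeded pseudo-random Gaussian stands in for real sensor noise (a real deployment must use genuine physical readings, or the scheme gains no entropy), and the monobit count is only a crude stand-in for the full FIPS 140-1 / NIST 800-22 suites.

```python
import hashlib
import random
import struct

random.seed(42)

def read_sensor():
    # Stand-in for a raw IoT sensor reading (e.g. a noisy ADC value).
    return random.gauss(25.0, 0.5)

def random_bytes(n_blocks=32):
    """Condition batches of raw sensor samples into random bytes via SHA-256."""
    out = bytearray()
    for _ in range(n_blocks):
        samples = struct.pack("8d", *(read_sensor() for _ in range(8)))
        out += hashlib.sha256(samples).digest()
    return bytes(out)

data = random_bytes()
bits = "".join(f"{b:08b}" for b in data)
ones = bits.count("1")
# Crude monobit check in the spirit of FIPS 140-1: ones should be close to zeros.
print(len(bits), ones)
```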

ICAIM: An Improved CAIM Algorithm for Knowledge Discovery

  • Yaowapanee, Piriya;Pinngern, Ouen
    • Institute of Control, Robotics and Systems (ICROS): Conference Proceedings
    • /
    • 2004.08a
    • /
    • pp.2029-2032
    • /
    • 2004
  • The quantity of data has increased rapidly in recent years, causing data overload and making it difficult to search for required information. Methods for eliminating redundant data are therefore needed; one efficient approach is Knowledge Discovery in Databases (KDD). In general, data fall into two types: continuous and discrete. This paper describes an algorithm that transforms continuous attributes into discrete ones. We present Improved Class-Attribute Interdependence Maximization (ICAIM), a discretization algorithm designed to work with supervised data. The algorithm does not require the user to predefine the number of intervals. ICAIM improves on CAIM by using a significance test to determine which intervals should be merged. Our goal is to generate a minimal number of discrete intervals while improving classification accuracy. We tested the algorithm on the iris plant dataset (IRIS) and compared it with the CAIM algorithm.

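
For reference, the standard CAIM criterion that ICAIM builds on can be computed as below; the significance test ICAIM adds for merging intervals is not shown, and the toy data are illustrative.

```python
from collections import Counter

def caim(intervals, labels, values):
    """CAIM criterion: (1/n) * sum over intervals of max_class_count^2 / interval_total.

    intervals: list of (lo, hi] boundary pairs covering the attribute range.
    Higher scores mean intervals are more class-pure.
    """
    score = 0.0
    for lo, hi in intervals:
        counts = Counter(l for v, l in zip(values, labels) if lo < v <= hi)
        if counts:
            m_r = sum(counts.values())
            score += max(counts.values()) ** 2 / m_r
    return score / len(intervals)

# Toy data: one continuous attribute with two classes.
values = [0.1, 0.2, 0.3, 0.8, 0.9, 1.0]
labels = ["a", "a", "a", "b", "b", "b"]

good = caim([(0.0, 0.5), (0.5, 1.0)], labels, values)    # splits classes cleanly
bad = caim([(0.0, 0.25), (0.25, 1.0)], labels, values)   # mixes classes
print(good, bad)
```

A discretizer chooses the boundary set that maximizes this score, which is why the clean split scores higher here.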

Pattern Data Extraction and Generation Algorithm for A Computer Controlled Pattern Sewing Machine (컴퓨터 제어 패턴 재봉기를 위한 패턴 데이타 추출 및 생성 알고리즘)

  • Yun, Sung-yong;Baik, Sang-hyun;Kim, Il-hwan
    • Journal of Industrial Technology
    • /
    • v.19
    • /
    • pp.179-187
    • /
    • 1999
  • The computer pattern sewing machine is an automatic sewing machine controlled by an input pattern. Even a novice can run it quickly and reliably for various tasks, such as sewing a button, a belt ring, or an airbag. The pattern-processing software, the main software of the machine, edits and modifies pattern data through online teaching or off-line editing, sets up parameters, and calculates the moving distance of the working area on the x-y axes. In this paper we propose an algorithm that generates pattern data for sewing by simplifying image data. The pattern data are composed of outline primitives such as dots, lines, circles, arcs, and curves. These data must be converted into sewing data, which include the sewing parameters, the moving distance of the working area on the x-y axes, the thread settings, and the spindle speed.

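
A minimal sketch of the outline-to-sewing-data conversion for one primitive type; the function name and stitch-length parameter are hypothetical, and a full implementation would also cover circles, arcs, curves, and the sewing parameters mentioned in the abstract.

```python
import math

def line_to_stitches(p0, p1, stitch_len):
    """Sample stitch points along a line primitive at a fixed stitch length.

    Returns evenly spaced (x, y) needle positions from p0 to p1 inclusive.
    """
    dx, dy = p1[0] - p0[0], p1[1] - p0[1]
    length = math.hypot(dx, dy)
    n = max(1, math.ceil(length / stitch_len))
    return [(p0[0] + dx * i / n, p0[1] + dy * i / n) for i in range(n + 1)]

# A 10 mm horizontal segment with a 3 mm maximum stitch length.
pts = line_to_stitches((0.0, 0.0), (10.0, 0.0), 3.0)
print(pts)
```

Rounding the step count up keeps every stitch at or below the requested length, which is usually the safe direction for a needle-positioning machine.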

On statistical Computing via EM Algorithm in Logistic Linear Models Involving Non-ignorable Missing data

  • Jun, Yu-Na;Qian, Guoqi;Park, Jeong-Soo
    • Proceedings of the Korean Statistical Society Conference
    • /
    • 2005.11a
    • /
    • pp.181-186
    • /
    • 2005
  • Many data sets obtained from surveys or medical trials include missing observations. When such data sets are analyzed, it is common to use only the complete cases; however, this can introduce large biases or inefficiency. In this paper, we consider a method for estimating parameters in logistic linear models with a non-ignorable missing-data mechanism. A binomial response and a normal explanatory model for the missing data are used. We fit the model using the EM algorithm: in the E-step, the Metropolis-Hastings algorithm and Monte Carlo techniques generate samples for the missing data, and in the M-step, Newton-Raphson maximizes the likelihood function. Asymptotic variances of the MLEs are derived, and the standard errors and parameter estimates are compared.

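
The E-step sampling can be illustrated with a generic random-walk Metropolis-Hastings sampler; the standard-normal target below is a toy stand-in for the paper's conditional distribution of the missing data given the current parameter estimates.

```python
import math
import random

random.seed(1)

def metropolis_hastings(log_target, x0, n, step=1.0):
    """Random-walk Metropolis-Hastings: propose x + N(0, step), accept with
    probability min(1, target(prop)/target(x)). In the paper's E-step such a
    sampler draws the missing data under the current parameter estimates."""
    x, out = x0, []
    for _ in range(n):
        prop = x + random.gauss(0.0, step)
        if math.log(random.random()) < log_target(prop) - log_target(x):
            x = prop
        out.append(x)
    return out

# Toy target: standard normal log-density (up to an additive constant).
samples = metropolis_hastings(lambda x: -0.5 * x * x, 0.0, 20000)
burned = samples[5000:]  # discard burn-in
mean = sum(burned) / len(burned)
var = sum((s - mean) ** 2 for s in burned) / len(burned)
print(mean, var)
```

After burn-in, the chain's mean and variance should approximate the target's 0 and 1, which is the property the Monte Carlo E-step relies on.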

Design of Collecting System for Traffic Information using Loop Detector and Piezzo Sensor (루프검지기와 피에조 센서를 이용한 교통정보 수집시스템 설계)

  • Yang, Seung-Hun;Han, Kyong-Ho
    • Proceedings of the KIEE Conference
    • /
    • 2000.07d
    • /
    • pp.2956-2958
    • /
    • 2000
  • This paper describes the design of a real-time traffic data acquisition system using a loop detector and a piezo sensor. The loop detector is the cheapest way to measure speed, and the piezo sensor is used to detect vehicle axle information. An ISA-slot-based I/O board is designed for data acquisition; a PC processes the raw traffic data and transfers them to the host system. A simulation kit was built with toy cars, a simulated loop detector, and a piezo sensor. The data acquisition system collects traffic data for up to 10 highway lanes, such as vehicle count, speed, length, axle count, and distance between axles. The data are processed to generate traffic counts and vehicle classifications for use in ITS. The system architecture and simulation data are included, and the system will be tested in field operation.

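
The speed and axle quantities described above reduce to simple timing arithmetic. The function below is a hypothetical sketch: the loop-to-piezo spacing is an assumed value, and the real system reads these event timestamps from the ISA I/O board.

```python
def vehicle_stats(loop_t, piezo_times, spacing_m=2.0):
    """Derive speed and axle info from one loop timestamp and piezo axle hits.

    loop_t: time (s) the vehicle triggers the loop detector.
    piezo_times: times (s) each axle strikes the piezo strip.
    spacing_m: assumed loop-to-piezo distance in metres.
    """
    speed_mps = spacing_m / (piezo_times[0] - loop_t)  # loop-to-piezo transit
    axle_count = len(piezo_times)
    axle_gaps = [
        speed_mps * (t2 - t1)                          # time gap -> distance
        for t1, t2 in zip(piezo_times, piezo_times[1:])
    ]
    return speed_mps * 3.6, axle_count, axle_gaps      # km/h, count, metres

# Two-axle vehicle: loop at t=0 s, axles strike at 0.10 s and 0.24 s.
kmh, axles, gaps = vehicle_stats(0.00, [0.10, 0.24])
print(kmh, axles, gaps)
```

The axle count and inter-axle distances are exactly the features the abstract says are used for vehicle classification.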

k-NN Join Based on LSH in Big Data Environment

  • Ji, Jiaqi;Chung, Yeongjee
    • Journal of information and communication convergence engineering
    • /
    • v.16 no.2
    • /
    • pp.99-105
    • /
    • 2018
  • k-Nearest neighbor join (k-NN Join) is a computationally intensive algorithm that is designed to find k-nearest neighbors from a dataset S for every object in another dataset R. Most related studies on k-NN Join are based on single-computer operations. As the data dimensions and data volume increase, running the k-NN Join algorithm on a single computer cannot generate results quickly. To solve this scalability problem, we introduce the locality-sensitive hashing (LSH) k-NN Join algorithm implemented in Spark, an approach for high-dimensional big data. LSH is used to map similar data onto the same bucket, which can reduce the data search scope. In order to achieve parallel implementation of the algorithm on multiple computers, the Spark framework is used to accelerate the computation of distances between objects in a cluster. Results show that our proposed approach is fast and accurate for high-dimensional and big data.
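
A single-machine sketch of the LSH bucketing idea (the paper distributes the per-bucket search with Spark); the random-hyperplane signatures and the fallback full scan are choices of this sketch, not details from the paper.

```python
from collections import defaultdict

import numpy as np

rng = np.random.default_rng(0)

def lsh_knn_join(R, S, k=3, n_planes=6):
    """Approximate k-NN join: bucket points by random-hyperplane sign
    signatures, then search exact neighbours only within the matching bucket."""
    planes = rng.normal(size=(n_planes, R.shape[1]))
    sig = lambda X: [tuple(row) for row in (X @ planes.T > 0)]
    buckets = defaultdict(list)
    for i, s in enumerate(sig(S)):
        buckets[s].append(i)
    result = {}
    for i, s in enumerate(sig(R)):
        cand = list(buckets.get(s, range(len(S))))  # empty bucket -> full scan
        d = np.linalg.norm(S[cand] - R[i], axis=1)
        result[i] = [cand[j] for j in np.argsort(d)[:k]]
    return result

S = rng.normal(size=(500, 8))
R = S[:5] + 0.01 * rng.normal(size=(5, 8))  # queries near known points
nn = lsh_knn_join(R, S)
print(nn)
```

Because similar points tend to share a signature, each query compares against only its bucket instead of all of S, which is the search-scope reduction the abstract describes.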

Effect of Input Data Video Interval and Input Data Image Similarity on Learning Accuracy in 3D-CNN

  • Kim, Heeil;Chung, Yeongjee
    • International Journal of Internet, Broadcasting and Communication
    • /
    • v.13 no.2
    • /
    • pp.208-217
    • /
    • 2021
  • 3D-CNN is one of the deep learning techniques for learning time-series data. However, this three-dimensional learning can generate many parameters, requiring high-performance hardware or significantly slowing training. We use a 3D-CNN to learn hand gestures, find the parameters that yield the highest accuracy, and then analyze how the accuracy of the 3D-CNN varies with changes to the input data, without any structural changes to the network. First, we choose the interval of the input data, adjusting the ratio of the stop interval to the gesture interval. Second, the corresponding interframe mean value is obtained by measuring and normalizing image similarity through interclass 2D cross-correlation analysis. The experiments demonstrate that changes in the input data affect learning accuracy without structural changes to the 3D-CNN. In this paper, we propose these two methods for changing the input data; experimental results show that the input data can affect the accuracy of the model.
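
The interframe similarity measure can be illustrated with zero-mean normalized cross-correlation; this is a generic formulation, not necessarily the exact normalization the paper uses.

```python
import numpy as np

def ncc(a, b):
    """Zero-mean normalized cross-correlation between two frames.

    Returns 1.0 for identical frames, -1.0 for inverted ones, and values
    in between for partial similarity."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom)

rng = np.random.default_rng(0)
frame = rng.random((32, 32))
same = ncc(frame, frame)
noisy = ncc(frame, frame + 0.05 * rng.random((32, 32)))
inv = ncc(frame, -frame)
print(same, noisy, inv)
```

Averaging such per-pair scores over the frames of a gesture clip gives an interframe mean similarity of the kind the abstract normalizes and compares across classes.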