• Title/Summary/Keyword: large pipeline

225 search results

Pipeline wall thinning rate prediction model based on machine learning

  • Moon, Seongin;Kim, Kyungmo;Lee, Gyeong-Geun;Yu, Yongkyun;Kim, Dong-Jin
    • Nuclear Engineering and Technology / v.53 no.12 / pp.4060-4066 / 2021
  • Flow-accelerated corrosion (FAC) of carbon steel piping is a significant problem in nuclear power plants. The basic process of FAC is now understood relatively well; however, prediction models of the wall-thinning rate under FAC conditions are still not reliably accurate. Herein, we propose a methodology for constructing pipe wall-thinning rate prediction models using artificial neural networks and a convolutional neural network, confined to a straight pipe without geometric changes. Furthermore, a methodology for generating training data is proposed to efficiently train the neural network for a machine learning-based FAC prediction model. We conclude that machine learning can be used to construct pipe wall-thinning rate prediction models and to optimize the number of training datasets. The proposed methodology can be applied to efficiently generate a large dataset from an FAC test to develop a wall-thinning rate prediction model for real situations.
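As a rough illustration of the idea in this abstract, and only under invented assumptions (the paper does not give its inputs or architecture here), the mapping from FAC test conditions to a thinning rate can be sketched with hypothetical features (flow velocity, temperature) and a single linear neuron standing in for the paper's networks:

```python
import random

# Hypothetical sketch: the actual FAC inputs and network architecture are not
# given in the abstract, so a tiny linear model is trained on synthetic
# (flow velocity, temperature) -> thinning-rate data as a stand-in.
random.seed(0)

def synth_rate(velocity, temp):
    # assumed toy relation: thinning rate grows with velocity and temperature
    return 0.8 * velocity + 0.3 * temp + random.gauss(0.0, 0.01)

data = [((v / 10.0, t / 100.0), synth_rate(v / 10.0, t / 100.0))
        for v in range(1, 11) for t in range(20, 120, 10)]

# one linear neuron trained by stochastic gradient descent
w = [0.0, 0.0]
b = 0.0
lr = 0.05
for _ in range(2000):
    for (x1, x2), y in data:
        pred = w[0] * x1 + w[1] * x2 + b
        err = pred - y
        w[0] -= lr * err * x1
        w[1] -= lr * err * x2
        b -= lr * err

def predict(velocity, temp):
    return w[0] * velocity + w[1] * temp + b

print(round(predict(0.5, 0.5), 2))  # thinning-rate estimate for mid-range inputs
```

The same loop also illustrates the dataset-size question the paper raises: one can shrink `data` and watch the fit degrade to find how many samples are actually needed.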

Information Requirements for Model-based Monitoring of Construction via Emerging Big Visual Data and BIM

  • Han, Kevin K.;Golparvar-Fard, Mani
    • International conference on construction engineering and project management / 2015.10a / pp.317-320 / 2015
  • Documenting work-in-progress on construction sites using images captured with smartphones, point-and-shoot cameras, and Unmanned Aerial Vehicles (UAVs) has gained significant popularity among practitioners. The spatial and temporal density of these large-scale site image collections and the availability of 4D Building Information Models (BIM) provide a unique opportunity to develop BIM-driven visual analytics that can quickly and easily detect and visualize construction progress deviations. Building on these emerging sources of information, this paper presents a pipeline for model-driven visual analytics of construction progress. It particularly focuses on the following key steps: 1) capturing, transferring, and storing images; 2) BIM-driven analytics to identify performance deviations; and 3) visualizations that enable root-cause assessments of performance deviations. Using several real-world case studies, the paper discusses the information requirements, as well as the challenges and opportunities for improvement in data collection, plan preparation, progress-deviation analysis (particularly under limited visibility), and the transformation of identified deviations into performance metrics that enable root-cause assessments.
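The core of step 2 (BIM-driven deviation analysis) can be sketched in a few lines, under invented assumptions: the element names, dates, and observed states below are placeholders for what a real 4D BIM schedule and an image-based as-built model would supply.

```python
from datetime import date

# Hypothetical sketch of BIM-driven progress-deviation detection.
planned = {  # element -> date it should be complete per the 4D BIM
    "column_A1": date(2015, 3, 1),
    "slab_L2": date(2015, 4, 15),
    "wall_W3": date(2015, 5, 1),
}
observed_complete = {"column_A1"}  # elements detected as built in site images

def progress_deviations(planned, observed_complete, today):
    """Return elements that should be complete by `today` but are not observed."""
    return sorted(e for e, d in planned.items()
                  if d <= today and e not in observed_complete)

print(progress_deviations(planned, observed_complete, date(2015, 4, 20)))
# behind-schedule elements feed the root-cause assessment step
```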


Bioinformatics services for analyzing massive genomic datasets

  • Ko, Gunhwan;Kim, Pan-Gyu;Cho, Youngbum;Jeong, Seongmun;Kim, Jae-Yoon;Kim, Kyoung Hyoun;Lee, Ho-Yeon;Han, Jiyeon;Yu, Namhee;Ham, Seokjin;Jang, Insoon;Kang, Byunghee;Shin, Sunguk;Kim, Lian;Lee, Seung-Won;Nam, Dougu;Kim, Jihyun F.;Kim, Namshin;Kim, Seon-Young;Lee, Sanghyuk;Roh, Tae-Young;Lee, Byungwook
    • Genomics & Informatics / v.18 no.1 / pp.8.1-8.10 / 2020
  • The explosive growth of next-generation sequencing data has resulted in ultra-large-scale datasets and ensuing computational problems. In Korea, the amount of genomic data has been increasing rapidly in recent years. Leveraging these big data requires researchers to use large-scale computational resources and analysis pipelines. A promising solution for addressing this computational challenge is cloud computing, where CPUs, memory, storage, and programs are accessible in the form of virtual machines. Here, we present a cloud computing-based system, Bio-Express, that provides user-friendly, cost-effective analysis of massive genomic datasets. Bio-Express is loaded with predefined multi-omics data analysis pipelines, which are divided into genome, transcriptome, epigenome, and metagenome pipelines. Users can employ predefined pipelines or create a new pipeline for analyzing their own omics data. We also developed several web-based services for facilitating downstream analysis of genome data. The Bio-Express web service is freely available at https://www.bioexpress.re.kr/.
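The notion of composing an analysis run from predefined steps, as Bio-Express's user-built pipelines do, can be sketched generically; the step names and functions below are invented stand-ins, not the system's real genome-analysis tools.

```python
# Hypothetical sketch of pipeline composition from predefined steps.
def trim_reads(reads):
    return [r[:50] for r in reads]          # stand-in for read trimming

def align(reads):
    return [{"read": r, "pos": i} for i, r in enumerate(reads)]  # stand-in aligner

def call_variants(alignments):
    return [a["pos"] for a in alignments if a["pos"] % 2 == 0]   # stand-in caller

def run_pipeline(steps, data):
    """Feed each step's output into the next, as a pipeline runner would."""
    for step in steps:
        data = step(data)
    return data

genome_pipeline = [trim_reads, align, call_variants]
print(run_pipeline(genome_pipeline, ["ACGT" * 20, "TTGA" * 20, "GGCC" * 20]))
```

Swapping a step in the list is all it takes to define a new pipeline, which is the design point such systems rely on.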

An Implementation of the $5\times5$ CNN Hardware and the Pre/Post Processor ($5\times5$ CNN 하드웨어 및 전/후 처리기 구현)

  • Kim Seung-Soo;Jeon Heung-Woo
    • Journal of the Korea Institute of Information and Communication Engineering / v.10 no.5 / pp.865-870 / 2006
  • Cellular neural networks (CNNs) have shown vast computing power for image processing despite the simplicity of their structure. However, it is impossible to implement CNN hardware with the same enormous number of cells as there are pixels in a practical large image. In this paper, $5\times5$ CNN hardware and a pre/post processor, which can process a real large image with a time-multiplexing scheme, are implemented. The implemented $5\times5$ CNN hardware and pre/post processor are applied to edge detection of a $256\times256$ Lena image to evaluate their performance. The total number of blocks in the time-multiplexing process is about 4,000, and control pulses are needed to perform the pipelined operation on each block. The experimental results show that the implemented $5\times5$ CNN hardware and pre/post processor can be used for real large-image processing.
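The time-multiplexing idea can be sketched in software: a chip with only a 5×5 cell array visits a 256×256 image block by block. The stride of 4 below is an assumption (keeping a 1-pixel overlap at block boundaries), chosen because it yields 64 × 64 = 4096 blocks, consistent with the "about 4,000 blocks" figure in the abstract.

```python
# Hypothetical sketch of time-multiplexed block processing on a 5x5 CNN array.
WIDTH, BLOCK, STRIDE = 256, 5, 4

def block_origins(width=WIDTH, stride=STRIDE):
    """Top-left corners of the blocks fed to the 5x5 array, one per time slot."""
    return [(r, c) for r in range(0, width - 1, stride)
                   for c in range(0, width - 1, stride)]

def process_block(image, r, c):
    # stand-in for the CNN template operation (here: a trivial edge measure)
    block = [row[c:c + BLOCK] for row in image[r:r + BLOCK]]
    return max(max(row) for row in block) - min(min(row) for row in block)

origins = block_origins()
print(len(origins))  # 4096 time-multiplexed blocks for a 256x256 image
```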

NGSOne: Cloud-based NGS data analysis tool (NGSOne: 클라우드 기반의 유전체(NGS) 데이터 분석 툴)

  • Kwon, Chang-hyuk;Kim, Jason;Jang, Jeong-hwa;Ahn, Jae-gyoon
    • Journal of Platform Technology / v.6 no.4 / pp.87-95 / 2018
  • With the decrease in sequencing cost, many national projects that analyze 0.1 to 1 million people are now in progress. However, a large portion of the budget of these projects is dedicated to constructing cluster systems or purchasing servers, due to the lack of programs or systems that can handle large amounts of data simultaneously. In this study, we developed NGSOne, a client program that is easy to use even for biologists and performs SNP analysis on hundreds or more whole-genome and whole-exome samples without building a dedicated server or cluster environment. DRAGEN, BWA/GATK, and Isaac/Strelka2, which are representative SNP analysis tools, were evaluated, and DRAGEN showed the best performance in terms of execution time and number of errors. NGSOne can also be extended to various analysis tools beyond SNP analysis.

Hazard Distance from Hydrogen Accidents (수소가스사고의 피해범위)

  • Jo, Young-Do
    • Journal of the Korean Institute of Gas / v.16 no.1 / pp.15-21 / 2012
  • An analysis was performed of the hazard distances of hydrogen accidents such as jet release, jet fire, and vapor cloud explosion (VCE), and simplified equations are proposed to predict the hazard distances for setting up safety distances for the gas dispersion, fire, and explosion following a hydrogen gas release. For a small release rate, such as from a pin-hole, the hazard distance from jet dispersion is longer than that from a jet fire; the hazard distance is directly proportional to the square root of the pressure and to the diameter of the hole, and may reach several tens of meters. For a large release rate, such as from a full-bore rupture of a pipeline or a large hole in a storage vessel, the hazard distance from a large jet fire is longer than that from an unconfined vapor cloud explosion, and may reach several hundred meters. A hydrogen filling station in an urban area can hardly comply with the safety-distance criterion (the minimum separation between the station and buildings) if a large hydrogen release is used as the accident scenario for setting the safety distance. Therefore, accidents involving large hydrogen releases must be prevented with safety devices, and the safety distance may be set based on a small release rate. If there is any possibility of a large release, however, densely occupied buildings such as schools and hospitals should be separated by several hundred meters.
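The scaling law stated for the small-release case (hazard distance proportional to hole diameter and to the square root of pressure) can be written as a small worked sketch; the constant `K` is an invented placeholder, not the coefficient derived in the paper.

```python
import math

# Hypothetical sketch of d = K * d_hole * sqrt(P); K is an assumed placeholder.
K = 10.0  # units chosen so the result is in metres (assumption)

def hazard_distance(hole_diameter_m, pressure_bar):
    return K * hole_diameter_m * math.sqrt(pressure_bar)

# Doubling the hole diameter doubles the distance; doubling the pressure
# scales it by sqrt(2), as the proportionality implies.
d1 = hazard_distance(0.01, 100.0)
d2 = hazard_distance(0.01, 200.0)
print(round(d2 / d1, 3))
```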

A Comparative Study of Waste Collection Technologies (생활폐기물 수거 방법의 비교 연구)

  • Jung, Young Hoon;Suh, Sang-Ho;Kim, Hyoung-Ho
    • The KSFM Journal of Fluid Machinery / v.16 no.3 / pp.48-53 / 2013
  • Due to urbanization, many people live in cities. City life is convenient for residents, but a highly populated city has several environmental problems. During the transport of the large amounts of municipal waste generated in cities, automobile emissions and traffic jams occur. Waste collection in cities has mainly been done using labour and delivery trucks; this is the conventional method. Recently, new technologies such as automated waste collection systems and capsule transportation have been introduced. Conventional collection based on labour and truck delivery does not require a large initial investment, but it incurs ongoing operational costs for both the labour and the trucks. In contrast, automated waste collection and capsule transportation require a high initial investment, but they can significantly reduce pollutant emissions, truck-induced traffic jams, and the actual collection cost per ton. For effective municipal waste treatment in cities, the new collection technologies could be properly combined with the conventional method.
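The trade-off described (low capital and high operating cost versus high capital and low operating cost) is a break-even calculation; all figures below are invented placeholders, not values from the study.

```python
# Hypothetical break-even sketch of the cost trade-off; numbers are invented.
conventional = {"initial": 0.0, "per_year": 10.0}   # labour + trucks
automated = {"initial": 60.0, "per_year": 2.0}      # pipe network, low operation

def total_cost(option, years):
    return option["initial"] + option["per_year"] * years

def break_even_year(a, b, horizon=100):
    """First year at which option b is no more expensive than option a."""
    for year in range(horizon + 1):
        if total_cost(b, year) <= total_cost(a, year):
            return year
    return None

print(break_even_year(conventional, automated))
```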

Development of Stress-Modified Fracture Strain Criterion for Ductile Fracture of API X65 Steel (API X65 강의 연성파괴 해석을 위한 삼축응력 영향을 고려한 파괴변형률 기준 개발)

  • Oh Chang-Kyun;Kim Yun-Jae;Park Jin-Moo;Baek Jong-Hyun;Kim Woo-Sik
    • Transactions of the Korean Society of Mechanical Engineers A / v.29 no.12 s.243 / pp.1621-1628 / 2005
  • This paper presents a stress-modified fracture strain for API X65 steel used for gas pipelines, as a function of stress triaxiality. To determine the stress-modified fracture strain, tension tests of bars with four different notch radii, made of API X65 steel, are first performed, from which true fracture strains are determined as a function of the notch radius. Detailed elastic-plastic, large-strain finite element (FE) analyses are then performed to estimate the variation of stress triaxiality in the notched bars with load. Combining the experimental and FE results provides the true fracture strain as a function of stress triaxiality, which is regarded as a criterion for ductile fracture. Application of the developed stress-modified fracture strain to the failure prediction of gas pipes made of API X65 steel with various types of defects is discussed.
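Criteria of this type are commonly written as an exponential decay of fracture strain with triaxiality; the sketch below uses that generic form with invented constants `A`, `B`, `C`, not the values calibrated for API X65 in the paper.

```python
import math

# Hypothetical sketch of a stress-modified fracture strain criterion of the
# common exponential form; A, B, C are placeholder constants, not the paper's.
A, B, C = 3.0, 1.5, 0.1

def fracture_strain(triaxiality):
    """True fracture strain as a decreasing function of stress triaxiality."""
    return A * math.exp(-B * triaxiality) + C

# higher triaxiality (sharper notch) -> lower fracture strain
for t in (0.33, 1.0, 2.0):
    print(t, round(fracture_strain(t), 3))
```

In an FE damage analysis, an element would be taken to fail once its accumulated plastic strain reaches `fracture_strain` at its local triaxiality.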

Machine Learning Approach to Estimation of Stellar Atmospheric Parameters

  • Han, Jong Heon;Lee, Young Sun;Kim, Young kwang
    • The Bulletin of The Korean Astronomical Society / v.41 no.2 / pp.54.2-54.2 / 2016
  • We present a machine learning approach to estimating stellar atmospheric parameters, effective temperature (Teff), surface gravity (log g), and metallicity ([Fe/H]), for stars observed during the course of the Sloan Digital Sky Survey (SDSS). For training a neural network, we randomly sampled the SDSS data with stellar parameters available from the SEGUE Stellar Parameter Pipeline (SSPP) to cover the parameter space as widely as possible. We selected stars not included in the training sample as a validation sample to determine the accuracy and precision of each parameter. We also divided the training and validation samples into four groups covering signal-to-noise ratios (S/N) of 10-20, 20-30, 30-50, and over 50 to assess the effect of S/N on the parameter estimation. Comparing the network-driven parameters with the SSPP ones, we find uncertainties of 73-123 K in Teff, 0.18-0.42 dex in log g, and 0.12-0.25 dex in [Fe/H], depending on the S/N range adopted. We conclude that these precisions are high enough to study the chemical and kinematic properties of the Galactic disk and halo stars, and we will attempt to apply this technique to the Large Sky Area Multi-Object Fiber Spectroscopic Telescope (LAMOST), which plans to obtain about 8 million stellar spectra, in order to estimate stellar parameters.
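The validation step (scatter of network output against SSPP values per S/N bin) can be sketched as follows; the numbers are invented stand-ins, not SDSS data.

```python
from statistics import pstdev

# Hypothetical sketch of per-S/N-bin uncertainty estimation; data are invented.
validation = [  # (s_to_n, teff_network, teff_sspp)
    (12, 5050, 5180), (15, 6020, 5900), (18, 4800, 4690),
    (35, 5500, 5460), (40, 6100, 6150), (45, 5210, 5180),
]

def scatter_by_bin(rows, lo, hi):
    """Standard deviation of (network - SSPP) residuals for lo <= S/N < hi."""
    residuals = [net - ref for sn, net, ref in rows if lo <= sn < hi]
    return pstdev(residuals)

low = scatter_by_bin(validation, 10, 20)
high = scatter_by_bin(validation, 30, 50)
print(round(low), round(high))  # the scatter shrinks as S/N grows
```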


A Study on Various Application Technologies Using Coal Bed Methane (Coal Bed Methane을 사용한 다양한 응용 기술에 대한 고찰)

  • CHO, WONJUN;LEE, JESEOL;YU, HYEJIN;LEE, HYUN CHAN;JU, WOO SUNG;LIM, OCKTAEK
    • Journal of Hydrogen and New Energy / v.29 no.1 / pp.130-137 / 2018
  • This paper discusses the potential uses and applications of coal bed methane (CBM) in various industries. One option for gas monetization is gas to power (GTP), sometimes called gas to wire (GTW). Electric power can be an intermediate product, as in mineral refining, where electricity is used to refine bauxite into aluminum; or it can be an end product distributed into a large utility power grid. For stranded gas, far from regional markets, the integration of ammonia and urea plants makes commercial sense. These new applications, if established, could lead to a surge in demand for methanol plants.