• Title/Summary/Keyword: Monte Carlo techniques

Failure Probability Assessment of an API 5L X52 Gas Pipeline with a Wall-thinned Section

  • Lee Sang-Min;Yun Kang-Ok;Chang Yoon-Suk;Choi Jae-Boong;Kim Young-Jin
    • International Journal of Precision Engineering and Manufacturing / v.7 no.3 / pp.24-29 / 2006
  • Pressurized gas pipelines are subject to harmful effects from both the surrounding environment and the materials passing through them. Reliable assessment procedures, including fracture mechanics analyses, are required to maintain their integrity. Currently, integrity assessments are performed using conventional deterministic approaches, even though many uncertainties hinder rational evaluation. Therefore, in this study, a probabilistic approach was considered for gas pipeline evaluations. The objectives were to estimate the failure probability of a corroded pipeline in the gas and oil industries and to propose limited operating conditions for different types of loading. To achieve these objectives, a probabilistic assessment program was developed using a reliability index and simulation techniques, and applied to evaluate the failure probabilities of a corroded API-5L-X52 gas pipeline subjected to internal pressures, bending moments, and combined loadings. The results demonstrated the potential of the probabilistic integrity assessment program.
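
The following is a minimal sketch of the kind of simulation-based failure-probability estimate described above: random strength, wall-thickness, and corrosion-depth values are drawn and the fraction of samples violating a burst-pressure limit state is counted. The limit-state formula, distributions, and all numerical values are assumed for illustration only and are not the paper's assessment model.

```python
# Illustrative Monte Carlo failure-probability sketch (NOT the paper's assessment
# program): the limit-state formula, distributions, and parameter values below
# are assumed for demonstration only.
import numpy as np

rng = np.random.default_rng(0)
N = 1_000_000

# Assumed random variables for a corroded pipe segment
yield_strength = rng.normal(360e6, 20e6, N)      # Pa, X52-class steel (assumed)
thickness      = rng.normal(17.5e-3, 0.5e-3, N)  # m, nominal wall (assumed)
defect_depth   = rng.normal(5.0e-3, 1.0e-3, N)   # m, corrosion depth (assumed)
diameter       = 0.762                           # m, outer diameter (assumed)
p_op           = 9.0e6                           # Pa, operating pressure (assumed)

# Simplified flow-stress burst model for a long corroded defect (illustrative)
flow_stress = 1.1 * yield_strength
p_burst = 2.0 * flow_stress * (thickness - defect_depth) / diameter

# Limit state g = p_burst - p_op; failure when g < 0
pf = np.mean(p_burst < p_op)
print(f"Estimated failure probability: {pf:.2e}")
```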

Learning the Covariance Dynamics of a Large-Scale Environment for Informative Path Planning of Unmanned Aerial Vehicle Sensors

  • Park, Soo-Ho;Choi, Han-Lim;Roy, Nicholas;How, Jonathan P.
    • International Journal of Aeronautical and Space Sciences / v.11 no.4 / pp.326-337 / 2010
  • This work addresses trajectory planning for unmanned aerial vehicle sensors used to take measurements of large nonlinear systems, with the aim of improving estimation and prediction of those systems. Thoroughly understanding the global system state typically requires probabilistic state estimation. Thus, the goal is to find trajectories such that the measurements along each trajectory minimize the expected error of the predicted state of the system. The considerable nonlinearity of the dynamics governing these systems necessitates computationally costly Monte Carlo estimation techniques to update the state distribution over time. This computational burden renders planning infeasible, since the search process must calculate the covariance of the posterior state estimate for each candidate path. To resolve this challenge, this work proposes replacing the computationally intensive numerical prediction process with an approximate covariance dynamics model learned by nonlinear time-series regression. An autoregressive time-series model fitted with a regularized least squares algorithm yields accurate and efficient parametric models. When used for trajectory optimization, the learned covariance dynamics are shown to outperform other approximation strategies, such as linearization and partial ensemble propagation, in terms of accuracy and speed, with examples from simplified weather forecasting.
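
As a rough illustration of the learned covariance-dynamics idea, the sketch below fits an autoregressive model to a scalar variance time series with regularized least squares (ridge regression). The synthetic data, AR order, and regularization weight are assumptions for demonstration, not the paper's setup.

```python
# A minimal sketch of fitting an autoregressive model to a (scalar) variance
# time series with regularized least squares, in the spirit of the approximate
# covariance-dynamics model described above. Data, AR order, and ridge weight
# are assumed for illustration.
import numpy as np

rng = np.random.default_rng(1)
# Assumed "true" variance trajectory, standing in for an expensive ensemble run
t = np.arange(500)
sigma2 = 1.0 + 0.5 * np.sin(0.05 * t) + 0.05 * rng.standard_normal(t.size)

p, lam = 4, 1e-2                      # AR order and ridge weight (assumed)
X = np.column_stack([sigma2[i:len(sigma2) - p + i] for i in range(p)])
y = sigma2[p:]

# Ridge (regularized least squares) solution: w = (X'X + lam*I)^-1 X'y
w = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)

# One-step-ahead prediction of the covariance dynamics
pred = X @ w
print("RMS one-step prediction error:", np.sqrt(np.mean((pred - y) ** 2)))
```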

Estimation of Drought Rainfall According to Consecutive Duration and Return Period Using Probability Distribution (확률분포에 의한 지속기간 및 빈도별 가뭄우량 추정)

  • Lee, Soon Hyuk;Maeng, Sung Jin;Ryoo, Kyong Sik
    • Proceedings of the Korea Water Resources Association Conference / 2004.05b / pp.1103-1106 / 2004
  • The objective of this study is to derive design drought rainfall using the L-moment methodology, including tests of homogeneity, independence, and outliers for annual minimum monthly rainfall data from 57 rainfall stations in Korea, for consecutive durations of 1, 2, 4, 6, 9, and 12 months. To select an appropriate distribution for the annual minimum monthly rainfall data at each station, the generalized extreme value (GEV), generalized logistic (GLO), and generalized Pareto (GPA) distributions are applied, and their appropriateness is judged by the L-moment ratio diagram and the Kolmogorov-Smirnov (K-S) test. For the annual minimum monthly rainfall measured at each station and that simulated by Monte Carlo techniques, the parameters of the selected GEV and GPA distributions are calculated by the L-moment methodology and the design drought rainfall is derived. Through comparative analysis of the design drought rainfall derived from the GEV and GPA distributions at each station, the optimal design drought rainfall by rainfall station is provided.
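
The sketch below illustrates the L-moment mechanics referenced above: sample L-moments are computed from probability-weighted moments, GEV parameters are estimated with Hosking's approximation, and a return-period quantile is evaluated. The synthetic sample is assumed, and the example uses the standard maxima-convention GEV, whereas the paper analyzes annual minimum monthly rainfall, so this only demonstrates the fitting procedure.

```python
# Illustrative L-moment estimation of GEV parameters and a design quantile.
# The sample is synthetic and the maxima-convention GEV is used; this only
# sketches the mechanics, not the paper's drought analysis.
import numpy as np
from math import gamma, log

rng = np.random.default_rng(2)
x = np.sort(rng.gamma(shape=4.0, scale=20.0, size=60))  # synthetic rainfall (mm)

# Sample probability-weighted moments and L-moments (ascending order)
n = x.size
j = np.arange(1, n + 1)
b0 = x.mean()
b1 = np.sum((j - 1) / (n - 1) * x) / n
b2 = np.sum((j - 1) * (j - 2) / ((n - 1) * (n - 2)) * x) / n
L1, L2, L3 = b0, 2 * b1 - b0, 6 * b2 - 6 * b1 + b0
t3 = L3 / L2                                  # L-skewness

# Hosking's approximation for the GEV shape parameter
c = 2.0 / (3.0 + t3) - log(2.0) / log(3.0)
k = 7.8590 * c + 2.9554 * c ** 2
alpha = L2 * k / ((1 - 2.0 ** (-k)) * gamma(1 + k))   # scale
xi = L1 - alpha * (1 - gamma(1 + k)) / k              # location

# Design value for a given non-exceedance probability F (e.g. a 20-year event)
F = 1.0 - 1.0 / 20.0
x_T = xi + alpha / k * (1 - (-log(F)) ** k)
print(f"GEV(k={k:.3f}, alpha={alpha:.1f}, xi={xi:.1f}); 20-yr design value = {x_T:.1f} mm")
```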

Failure Probability Assessment of Gas Pipelines Considering Wall-Thinning Phenomenon (감육현상을 고려한 가스배관의 파손확률 평가)

  • Lee Sang-Min;Yun Kang-Ok;Chang Yoon-Suk;Choi Jae-Boong;Kim Young-Jin
    • Journal of the Korean Society for Precision Engineering / v.22 no.10 s.175 / pp.158-166 / 2005
  • Pressurized gas pipelines are subject to harmful effects from both the surrounding environment and the materials transmitted through them. To maintain their integrity, reliable assessment procedures, including fracture mechanics analyses, are required. Up to now, integrity assessments have been performed using conventional deterministic approaches, even though many uncertainties hinder a rational evaluation. In this respect, a probabilistic approach is considered an appropriate method for gas pipeline evaluation. The objectives of this paper are to estimate the failure probability of corroded pipelines in gas and oil plants and to propose limited operating conditions under different types of loading. To do this, a probabilistic assessment program using a reliability index and simulation techniques was developed and applied to evaluate the failure probabilities of corroded API-5L-X52/X60 gas pipelines subjected to internal pressure, bending moment, and combined loading. The evaluation results showed a promising applicability of the probabilistic integrity assessment program.
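
Complementing the simulation sketch given earlier for the companion paper, the following minimal example illustrates the reliability-index side of such an assessment with a mean-value first-order (Cornell) index for a resistance-minus-load limit state. The statistics are assumed, not taken from the paper.

```python
# A minimal mean-value first-order (FOSM) reliability-index sketch for a
# resistance-minus-load limit state. All statistics below are assumed.
import math

mu_R, sd_R = 13.0, 1.4   # burst-pressure capacity, MPa (assumed)
mu_S, sd_S = 9.0, 0.5    # operating load, MPa (assumed)

# Linear limit state g = R - S with independent R and S
mu_g = mu_R - mu_S
sd_g = math.hypot(sd_R, sd_S)
beta = mu_g / sd_g                           # Cornell reliability index
pf = 0.5 * math.erfc(beta / math.sqrt(2))    # Pf = Phi(-beta)
print(f"beta = {beta:.2f}, Pf = {pf:.2e}")
```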

Application of Probabilistic Health Risk Analysis in Life Cycle Assessment -Part I : A General Framework for Uncertainty and Variability Analysis of Health Risk in Life Cycle Assessment (전과정평가에 있어 확률론적 건강영향분석기법 적용 -Part I : 전과정평가에 있어 확률론적 위해도 분석기법 적용방안에 관한 연구)

  • Choi, Kwang-Soo;Park, Jae-Sung
    • Journal of Environmental Impact Assessment / v.9 no.3 / pp.185-202 / 2000
  • Uncertainty and variability in Life Cycle Assessment (LCA) have been key issues in LCA methodology, drawing on techniques from other research areas such as the social and political sciences. Variability is understood as stemming from inherent variations in the real world, while uncertainty comes from inaccurate measurements, lack of data, model assumptions, and the like. Related articles were reviewed to classify, distinguish, and elaborate the application of probabilistic/stochastic health risk analysis in LCA. The concepts of focal zones, streamlining techniques, scenario modelling, and Monte Carlo/Latin Hypercube risk analysis were applied to the uncertainty/variability analysis of health risk in LCA. The results show that this general framework, a multi-disciplinary methodology combining probabilistic health risk assessment and LCA, benefits the decision-making process by supplying information about input/output data sensitivity, health effect priorities, and health risk distributions. Further research is needed on case studies using this methodology.
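
As a small illustration of Monte Carlo/Latin Hypercube uncertainty propagation of the kind mentioned above, the sketch below stratifies the unit interval, maps the stratified samples through assumed marginal distributions, and propagates them through a toy risk expression. The model form, distributions, and parameter values are all assumptions for demonstration.

```python
# A small Latin Hypercube sampling sketch for propagating uncertainty through a
# toy risk-type model; all distributions and parameter values are assumed.
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
N = 1000

def lhs_uniform(n, rng):
    """One stratified uniform(0,1) point per equal-probability bin, in random order."""
    u = (np.arange(n) + rng.random(n)) / n
    return rng.permutation(u)

# Map stratified uniforms through assumed marginal distributions
intake   = stats.lognorm(s=0.5, scale=2.0).ppf(lhs_uniform(N, rng))            # exposure, mg/day (assumed)
potency  = stats.triang(c=0.5, loc=0.01, scale=0.04).ppf(lhs_uniform(N, rng))  # per mg/kg-day (assumed)
bodymass = stats.norm(70, 10).ppf(lhs_uniform(N, rng))                         # kg (assumed)

risk = intake * potency / bodymass
print("median risk:", np.median(risk), "95th percentile:", np.percentile(risk, 95))
```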

A Study on Periodic Buffer Allocation for Program Master Schedule (프로그램 공정계획을 위한 주기적 버퍼 설치에 관한 고찰)

  • Koo Kyo-Jin
    • Proceedings of the Korean Institute of Construction Engineering and Management / autumn / pp.81-87 / 2001
  • In a dynamically changing environment, the manager of a maintenance and remodeling (M/R) program is confronted with the increasing complexity of coordinating multiple projects that share constrained resources. The root causes of this complexity, uncertainty and interdependence, lead to internal disruptions of activities and to chain reactions of disturbance propagation that deteriorate the stability and manageability of the program. This paper evaluates previous endeavors to apply production control and management techniques to the construction industry, and investigates the possibility of applying other management concepts and theories to organizational program management. In particular, this paper proposes a buffer allocation model in which periodic buffers are allocated in the flows of program constraint resources to stabilize the program master schedule, instead of protecting individual activities. Comparative experiments by Monte Carlo simulation illustrate the improved performance of the proposed model in terms of the program's goals: productivity, flexibility, and long-term stability.
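
A very small Monte Carlo comparison in the spirit of the experiments mentioned above is sketched below: the probability of missing a committed completion date is estimated for a serial activity chain with and without a periodic buffer allowance. The network, duration distributions, and buffer policy are assumed, and in this serial sketch only the total buffer matters, so it does not capture the paper's allocation of buffers in resource flows.

```python
# Illustrative Monte Carlo estimate of schedule-commitment risk with and without
# a periodic buffer allowance. All numbers and the serial network are assumed.
import numpy as np

rng = np.random.default_rng(4)
N, n_act = 20_000, 12
planned = 5.0                                          # planned duration per activity, days (assumed)
durations = rng.triangular(3, 5, 8, size=(N, n_act))   # uncertain actual durations (assumed)

total = durations.sum(axis=1)

# Unbuffered master schedule: commit to the sum of planned durations
commit_plain = planned * n_act
# Periodically buffered master schedule: a 2-day buffer after every 4th activity
buffer_days, period = 2.0, 4
commit_buffered = commit_plain + buffer_days * (n_act // period)

print(f"P(miss commitment) without buffers: {np.mean(total > commit_plain):.2f}")
print(f"P(miss commitment) with periodic buffers: {np.mean(total > commit_buffered):.2f}")
```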

Comprehensive Performance Analysis of Interconnect Variation by Double and Triple Patterning Lithography Processes

  • Kim, Youngmin;Lee, Jaemin;Ryu, Myunghwan
    • JSTS:Journal of Semiconductor Technology and Science / v.14 no.6 / pp.824-831 / 2014
  • In this study, structural variations and overlay errors caused by the multiple patterning lithography techniques used to print narrow parallel metal interconnects are investigated. Resistance and capacitance parasitics of six parallel interconnect lines printed by double patterning lithography (DPL) and triple patterning lithography (TPL) are extracted with a field solver. Wide parameter variations in both the DPL and TPL processes are analyzed to determine their impact on signal propagation. Simulations of 10% parameter variations in the metal lines show delay variations of up to 20% and 30% in DPL and TPL, respectively. Monte Carlo statistical analysis shows that the TPL process results in a 21% larger standard deviation in delay than the DPL process. Crosstalk simulations are conducted to analyze the dependency on the switching conditions of neighboring wires. As expected, opposite signal transitions in the neighboring wires significantly degrade the speed of signal propagation, and the impact is larger for the C-worst metals patterned by the TPL process than for those patterned by the DPL process. As a result, both DPL and TPL produce large variations in parasitics and delay; therefore, an accurate understanding of the interconnect parameter variations introduced by multiple patterning lithography, together with proper margins in circuit designs, is necessary.
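
The sketch below shows, in miniature, how a Monte Carlo analysis can map interconnect R/C variation to delay variation using a first-order Elmore-style delay expression. The driver, load, nominal parasitics, and the 10% variation figure are assumed for illustration rather than extracted values from the paper.

```python
# Monte Carlo sketch of interconnect R/C variation mapped to delay variation via
# a first-order Elmore-style delay expression. All values below are assumed.
import numpy as np

rng = np.random.default_rng(6)
N = 100_000

R_drv, C_load = 1_000.0, 5e-15   # driver resistance (ohm), load capacitance (F), assumed
Rw_nom, Cw_nom = 500.0, 20e-15   # nominal wire parasitics (assumed)

# 10% (1-sigma) variation on wire R and C, e.g. from linewidth/overlay variation
Rw = Rw_nom * (1 + 0.10 * rng.standard_normal(N))
Cw = Cw_nom * (1 + 0.10 * rng.standard_normal(N))

# First-order Elmore-style delay for driver + distributed wire + load
delay = 0.69 * R_drv * (Cw + C_load) + 0.38 * Rw * Cw + 0.69 * Rw * C_load

mean, std = delay.mean(), delay.std()
print(f"mean delay = {mean*1e12:.2f} ps, sigma = {std*1e12:.2f} ps ({100*std/mean:.1f}%)")
```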

GPS Pull-In Search Using Reverse Directional Finite Rate of Innovation (FRI)

  • Kong, Seung-Hyun;Yoo, Kyungwoo
    • Journal of Positioning, Navigation, and Timing / v.3 no.3 / pp.107-116 / 2014
  • When an incoming Global Positioning System (GPS) signal is acquired, pull-in search performs a finer search of the Doppler frequency of the incoming signal so that the phase lock loop can be quickly stabilized and the receiver can produce an accurate pseudo-range measurement. However, increasing the accuracy of the Doppler frequency estimation often involves a higher computational cost for weaker GPS signals, which delays the position fix. In this paper, we show that the Doppler frequency detectable by a long coherent auto-correlation can be accurately estimated using a complex-weighted sum of consecutive short coherent auto-correlation outputs with a different Doppler frequency hypothesis, and by exploiting this we propose a noise-resistant, low-cost, and highly accurate Doppler frequency and phase estimation technique based on a reverse-directional application of the finite rate of innovation (FRI) technique. We provide a performance and computational complexity analysis to show the feasibility of the proposed technique, and compare its performance to conventional techniques using numerous Monte Carlo simulations.
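
The example below is not the paper's reverse-directional FRI estimator; it is a much simpler phase-progression estimator included only to show why consecutive short coherent correlation outputs carry fine Doppler information. The signal model and parameter values are assumed.

```python
# Simple phase-progression Doppler estimate from consecutive short coherent
# correlation outputs (NOT the paper's FRI method). Signal model is assumed.
import numpy as np

rng = np.random.default_rng(7)
f_dopp, T_short, K = 123.4, 1e-3, 20   # residual Doppler (Hz), short coherent time (s), blocks
snr_amp = 5.0                          # correlation amplitude relative to unit noise (assumed)

# Complex outputs of K consecutive short coherent correlations (assumed model):
# each block's output phasor rotates by 2*pi*f_dopp*T_short per block.
k = np.arange(K)
y = snr_amp * np.exp(1j * 2 * np.pi * f_dopp * T_short * k) \
    + (rng.standard_normal(K) + 1j * rng.standard_normal(K)) / np.sqrt(2)

# Estimate the per-block phase increment from consecutive products, then Doppler
dphi = np.angle(np.sum(y[1:] * np.conj(y[:-1])))
f_est = dphi / (2 * np.pi * T_short)
print(f"true residual Doppler = {f_dopp} Hz, estimated = {f_est:.1f} Hz")
```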

Batch Time Interval and Initial State Estimation using GMM-TS for Target Motion Analysis (GMM-TS를 이용한 표적기동분석용 배치구간 및 초기상태 추정 기법)

  • Kim, Woo-Chan;Song, Taek-Lyul
    • Journal of Institute of Control, Robotics and Systems / v.18 no.3 / pp.285-294 / 2012
  • With bearing-only measurements, the target motion state cannot be obtained directly, so TMA (Target Motion Analysis) is needed in this situation. TMA is a nonlinear estimation technique used in passive SONAR systems and is one of the important techniques for underwater combat management systems. TMA can be divided into two parts: batch estimation and sequential estimation. Although sequential estimation is preferable for reducing the computational load and for adapting to target maneuvers, batch estimation is still required to obtain the target initial state vector needed for convergence of the sequential estimation. Selection of the batch time interval, which depends on observability, is critical to TMA performance. Batch estimation in general uses a predetermined batch time interval. In this paper, we propose a new method called BTIS (Batch Time Interval and Initial State Estimation). The proposed BTIS estimates the target initial state and determines the batch time interval sequentially by using a bank of GMM-TS (Gaussian Mixture Measurement-Track Splitting) filters. The performance of the proposed method is verified by a Monte Carlo simulation study.
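
The GMM-TS filter bank and batch-interval logic of the paper are beyond a short sketch; the toy example below only illustrates the underlying nonlinearity of bearings-only estimation by reweighting samples from a position prior with a single bearing-measurement likelihood. Geometry, noise levels, and sample counts are assumed.

```python
# Toy weighted-sample illustration of a bearing-only measurement update
# (NOT the paper's GMM-TS filter bank). All values are assumed.
import numpy as np

rng = np.random.default_rng(5)
true_pos = np.array([3000.0, 4000.0])          # m, target position (assumed)
sigma_b = np.deg2rad(1.0)                      # bearing noise std (assumed)
z = np.arctan2(true_pos[1], true_pos[0]) + sigma_b * rng.standard_normal()

# Crude stand-in for mixture components: samples from a broad Gaussian prior
M = 500
prior_mean, prior_std = np.array([2500.0, 4500.0]), 1500.0
comps = prior_mean + prior_std * rng.standard_normal((M, 2))

# Weight each sample by the bearing-measurement likelihood (wrapped residual)
pred_bearing = np.arctan2(comps[:, 1], comps[:, 0])
resid = (z - pred_bearing + np.pi) % (2 * np.pi) - np.pi
w = np.exp(-0.5 * (resid / sigma_b) ** 2)
w /= w.sum()

post_mean = w @ comps
print("posterior position estimate:", post_mean)
```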

RISK-INFORMED REGULATION: HANDLING UNCERTAINTY FOR A RATIONAL MANAGEMENT OF SAFETY

  • Zio, Enrico
    • Nuclear Engineering and Technology / v.40 no.5 / pp.327-348 / 2008
  • A risk-informed regulatory approach implies that risk insights be used as a supplement to deterministic information for safety decision-making purposes. In this view, the use of risk assessment techniques is expected to lead to improved safety and a more rational allocation of the limited resources available. On the other hand, it is recognized that uncertainties affect both deterministic safety analyses and risk assessments. For the risk-informed decision-making process to be effective, adequate representation and treatment of such uncertainties are mandatory. In this paper, the risk-informed regulatory framework is examined with a focus on the uncertainty issue. Traditionally, probability theory has provided the language and mathematics for the representation and treatment of uncertainty. More recently, other mathematical structures have been introduced. In particular, the Dempster-Shafer theory of evidence is illustrated here as a generalized framework encompassing probability theory and possibility theory. The special case of probability theory is addressed only as a term of comparison, given that it is a well-known subject, while the special case of possibility theory is amply illustrated. An example of combining probability and possibility to treat the uncertainty in the parameters of an event tree is presented.
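
A minimal sketch of the kind of hybrid probability/possibility propagation discussed above follows: one branch probability of a two-event top event is sampled from a probability distribution, the other is represented by a triangular possibility distribution handled through alpha-cuts, producing lower and upper expectations of the top-event probability. All numbers are assumed for illustration.

```python
# Hybrid probability/possibility propagation sketch for a two-branch top event.
# The distributions, the triangular possibility, and all values are assumed.
import numpy as np

rng = np.random.default_rng(8)
N = 50_000

# Probabilistic branch: failure probability p1 sampled from an assumed lognormal
p1 = np.clip(rng.lognormal(mean=np.log(1e-3), sigma=0.5, size=N), 0, 1)

# Possibilistic branch: triangular possibility for p2, core 2e-2, support [5e-3, 5e-2]
lo, mode, hi = 5e-3, 2e-2, 5e-2
alphas = np.linspace(0.0, 1.0, 11)
lower = lo + alphas * (mode - lo)              # alpha-cut lower bounds
upper = hi - alphas * (hi - mode)              # alpha-cut upper bounds

# For each alpha-cut, the top-event probability p1*p2 lies in an interval;
# averaging over the Monte Carlo samples gives lower/upper expectations.
E_low = [np.mean(p1 * l) for l in lower]
E_high = [np.mean(p1 * u) for u in upper]
for a, el, eh in zip(alphas, E_low, E_high):
    print(f"alpha={a:.1f}: E[p_top] in [{el:.2e}, {eh:.2e}]")
```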