• Title/Summary/Keyword: Monte Carlo Sampling

The analysis on the Energy Distribution Function for Electron in SiH4-Ar Gas Mixtures (SiH4-Ar혼합기체의 전자분포함수 해석)

  • Kim, Sang-Nam
    • The Transactions of the Korean Institute of Electrical Engineers P
    • /
    • v.53 no.2
    • /
    • pp.65-69
    • /
    • 2004
  • This paper calculates and analyses the electron swarm transport coefficients that describe the electric conduction characteristics of pure Ar, pure $SiH_4$, and Ar-$SiH_4$ gas mixtures ($SiH_4$ at 0.5%, 2.5%, 5%) over the range E/N = 0.01~300 [Td] and P = 0.1, 1, 5.0 [Torr], by the Monte Carlo method and the backward prolongation method of the Boltzmann equation, i.e. by computer simulation rather than expensive experimental equipment. The results were obtained using electron collision cross sections together with TOF, PT, and SST sampling, and were compared with experimental data determined by other authors. This confirms the reliability of the electron collision cross sections and shows the practical value of computer simulation. The electron swarm parameters in argon were drastically changed by adding a small amount of mono-silane. The electron drift velocity in these mixtures showed unusual behaviour as a function of E/N: it had a negative slope in the medium range of E/N, and the slope was not smooth but contained a small hump. The longitudinal diffusion coefficient showed a corresponding feature in its dependence on E/N. A two-term approximation of the Boltzmann equation and Monte Carlo simulation were used to study the electron transport coefficients.
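A minimal sketch, assuming lumped constant cross sections, of the collision-sampling step at the heart of such a Monte Carlo swarm simulation: free-flight times are drawn from an exponential distribution set by the total collision frequency, and the colliding species (Ar or SiH4) is chosen in proportion to its partial collision frequency. The gas density, mixture fraction, cross sections, and electron speed below are illustrative placeholders, not data from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

N_GAS = 3.3e22        # gas number density [m^-3] (~1 Torr, room temperature), placeholder
FRAC_SIH4 = 0.025     # 2.5 % SiH4 in Ar, one of the mixture ratios mentioned above
SIGMA_AR = 1.0e-20    # lumped Ar collision cross section [m^2], placeholder constant
SIGMA_SIH4 = 3.0e-20  # lumped SiH4 collision cross section [m^2], placeholder constant

def collision_frequency(speed):
    """Total collision frequency nu(v) = N * v * sum_i x_i * sigma_i."""
    return N_GAS * speed * ((1.0 - FRAC_SIH4) * SIGMA_AR + FRAC_SIH4 * SIGMA_SIH4)

def sample_flight_and_target(speed):
    """Draw an exponential free-flight time and the species the electron hits."""
    nu = collision_frequency(speed)
    t_flight = -np.log(rng.random()) / nu
    p_sih4 = (FRAC_SIH4 * SIGMA_SIH4) / (
        (1.0 - FRAC_SIH4) * SIGMA_AR + FRAC_SIH4 * SIGMA_SIH4)
    target = "SiH4" if rng.random() < p_sih4 else "Ar"
    return t_flight, target

# Statistics of 100,000 sampled collisions for an electron of roughly 1 eV
speed = 5.9e5  # m/s
targets = [sample_flight_and_target(speed)[1] for _ in range(100_000)]
print("fraction of collisions with SiH4:", targets.count("SiH4") / len(targets))
```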

An Off-Site Consequence Modeling for Accident Using Monte Carlo Method (몬테칼로 방법을 사용할 사고후 영향 평가모델)

  • Chang Sun Kang;Sae Yul Lee
    • Nuclear Engineering and Technology
    • /
    • v.16 no.3
    • /
    • pp.136-140
    • /
    • 1984
  • A new model is presented to evaluate the risk from a nuclear facility following accidents, directly incorporating on-site meteorological data through the Monte Carlo method. To estimate the radiological detriment to the surrounding population at large (the collective dose equivalent), the probability distribution of each meteorological element, based on on-site data, is analyzed to generate atmospheric dispersion conditions. Random sampling is used to select the dispersion conditions at any given time of effluent release. The meteorological conditions (wind direction, speed, and stability) are assumed to be mutually independent, with each condition satisfying the Markov condition. As a sample study, the risk of KNU-1 following a large LOCA was calculated. The calculated collective dose equivalent for the population within the 50-mile region from the large LOCA, at the 50 percent confidence level, is $2.0\times10^2$ man-sievert.
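The sampling loop the abstract describes can be sketched as follows: each trial draws a wind direction, wind speed, and stability class from (assumed independent) empirical distributions, evaluates a dose for that dispersion condition, and the confidence level is read from the resulting distribution of doses. The probability tables and the toy dose model are placeholders, not the paper's KNU-1 data.

```python
import numpy as np

rng = np.random.default_rng(1)

# Placeholder frequency tables; the real model builds these from on-site data
directions = np.arange(16)                   # 16 compass sectors
p_dir = np.full(16, 1.0 / 16.0)              # uniform wind rose (placeholder)
speeds = np.array([1.0, 3.0, 6.0, 10.0])     # wind-speed bins [m/s] (placeholder)
p_speed = np.array([0.2, 0.4, 0.3, 0.1])
stabilities = np.array(["A", "D", "F"])      # Pasquill stability classes (placeholder)
p_stab = np.array([0.2, 0.6, 0.2])

def toy_collective_dose(direction, speed, stab):
    """Placeholder dose model: larger for stable classes and low wind speeds."""
    stab_factor = {"A": 0.5, "D": 1.0, "F": 2.0}[str(stab)]
    sector_population = 1.0 + 0.1 * float(direction)   # placeholder population weight
    return 100.0 * stab_factor * sector_population / speed   # man-sievert, arbitrary scale

n = 20_000
dose = np.empty(n)
for i in range(n):                           # one effluent-release time per trial
    d = rng.choice(directions, p=p_dir)
    v = rng.choice(speeds, p=p_speed)
    s = rng.choice(stabilities, p=p_stab)
    dose[i] = toy_collective_dose(d, v, s)

print("collective dose at the 50% confidence level:", np.percentile(dose, 50))
```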

Reliability Analysis of Seismically Induced Slope Deformations (신뢰성 기법을 이용한 지진으로 인한 사면 변위해석)

  • Kim, Jin-Man
    • Journal of the Korean Geotechnical Society
    • /
    • v.23 no.3
    • /
    • pp.111-121
    • /
    • 2007
  • The paper presents a reliability-based method that can capture the impact of the uncertainty in seismic loading. The proposed method incorporates probabilistic concepts into the classical limit-equilibrium and Newmark-type deformation techniques. The risk of damage is then computed by Monte Carlo simulation. A random-process model and the RMS hazard method are introduced to generate seismic motions for use in the seismic slope analyses. Geotechnical variability and sampling errors are also considered. The results of the reliability analyses indicate that, in a highly seismically active region, characterization of the earthquake hazard is the more critical factor, and characterization of the soil properties has a relatively small effect on the computed risk of slope failure and excessive slope deformation. The results are applicable to both circular and non-circular slip-surface failure modes.
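A minimal sketch of coupling a Newmark-type rigid-block integration with Monte Carlo sampling, so that variability in the yield acceleration and ground-motion intensity propagates into a probability of excessive displacement. The synthetic ground motion, the distributions, and the 10 cm limit are placeholder assumptions, not the paper's random-process/RMS-hazard model.

```python
import numpy as np

rng = np.random.default_rng(2)
g = 9.81                       # gravity [m/s^2]
dt = 0.01                      # time step [s]
t = np.arange(0.0, 20.0, dt)   # 20 s synthetic record

def newmark_displacement(acc, a_yield):
    """One-directional rigid sliding block: accumulate slip only while sliding."""
    rel_vel, disp = 0.0, 0.0
    for a in acc:
        if rel_vel > 0.0 or a > a_yield:
            rel_vel = max(rel_vel + (a - a_yield) * dt, 0.0)
            disp += rel_vel * dt
    return disp

n_trials, limit = 1000, 0.10   # Monte Carlo trials, 10 cm displacement limit
exceed = 0
for _ in range(n_trials):
    pga = rng.lognormal(mean=np.log(0.30 * g), sigma=0.3)         # placeholder hazard
    acc = pga * np.sin(2.0 * np.pi * 1.5 * t) * np.exp(-0.1 * t)  # toy ground motion
    ky = rng.lognormal(mean=np.log(0.12 * g), sigma=0.25)         # soil variability
    if newmark_displacement(acc, ky) > limit:
        exceed += 1

print("P(displacement > 10 cm) ≈", exceed / n_trials)
```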

Reliability-Based Design Optimization Using Kriging Metamodel with Sequential Sampling Technique (순차적 샘플링과 크리깅 메타모델을 이용한 신뢰도 기반 최적설계)

  • Choi, Kyu-Seon;Lee, Gab-Seong;Choi, Dong-Hoon
    • Transactions of the Korean Society of Mechanical Engineers A
    • /
    • v.33 no.12
    • /
    • pp.1464-1470
    • /
    • 2009
  • An RBDO approach is proposed that is based on a sampling method with a Kriging metamodel and Constraint Boundary Sampling (CBS), a sequential sampling method for generating metamodels. The major advantage of the proposed RBDO approach is that it does not require the Most Probable failure Point (MPP), which is essential for First-Order Reliability Method (FORM)-based RBDO approaches. Monte Carlo Sampling (MCS), the best-known sampling method for reliability analysis, is used to assess the reliability of the constraints. In addition, a Cumulative Distribution Function (CDF) of the constraints is approximated from the empirical distribution function using the Moving Least Squares (MLS) method. The probability of failure and its analytic sensitivities can then be obtained from this approximate CDF of the constraints. Moreover, the concept of inactive design is adopted to improve the numerical efficiency of the proposed approach. The computational accuracy and efficiency of the proposed RBDO approach are demonstrated on numerical and engineering problems.
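A minimal sketch of the reliability-assessment step: once a metamodel of a constraint is available, Monte Carlo Sampling estimates its probability of failure directly, with no MPP search. A simple closed-form function stands in for the Kriging metamodel here, and the input distributions are arbitrary placeholders.

```python
import numpy as np

rng = np.random.default_rng(3)

def surrogate_g(x1, x2):
    """Stand-in for a Kriging prediction of a limit state; failure when g < 0."""
    return x1 ** 2 * x2 / 20.0 - 1.0

n = 1_000_000
x1 = rng.normal(3.5, 0.3, n)   # placeholder random design variables
x2 = rng.normal(2.0, 0.3, n)
g = surrogate_g(x1, x2)
pf = np.mean(g < 0.0)
print("MCS estimate of the failure probability:", pf)
```

In the approach described above, the empirical distribution of such samples is additionally smoothed with Moving Least Squares so that the failure probability and its sensitivities become analytic functions.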

A Bayesian Approach to Geophysical Inverse Problems (베이지안 방식에 의한 지구물리 역산 문제의 접근)

  • Oh Seokhoon;Chung Seung-Hwan;Kwon Byung-Doo;Lee Heuisoon;Jung Ho Jun;Lee Duk Kee
    • Geophysics and Geophysical Exploration
    • /
    • v.5 no.4
    • /
    • pp.262-271
    • /
    • 2002
  • This study presents a practical procedure for the Bayesian inversion of geophysical data. We applied geostatistical techniques to acquire prior model information, and then adopted the Markov Chain Monte Carlo (MCMC) method to infer the characteristics of the marginal distributions of the model parameters. For the Bayesian inversion of dipole-dipole array resistivity data, we used indicator kriging and simulation techniques to generate cumulative distribution functions from Schlumberger array resistivity data and well-logging data, and obtained prior information by cokriging and simulations from covariogram models. The indicator approach makes it possible to incorporate non-parametric information into the probability density function. We also adopted the MCMC approach, based on Gibbs sampling, to examine the characteristics of the a posteriori probability density function and the marginal distribution of each parameter.
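A minimal Gibbs-sampling sketch of the MCMC step: each model parameter is drawn in turn from its full conditional, and the retained samples give the marginal distribution of each parameter. A standard bivariate normal with an assumed correlation stands in for the real resistivity-model posterior.

```python
import numpy as np

rng = np.random.default_rng(4)
rho = 0.8                       # assumed posterior correlation between two parameters
n_iter, burn_in = 20_000, 2_000

m1, m2 = 0.0, 0.0
samples = np.empty((n_iter, 2))
for i in range(n_iter):
    # full conditionals of a standard bivariate normal with correlation rho
    m1 = rng.normal(rho * m2, np.sqrt(1.0 - rho ** 2))
    m2 = rng.normal(rho * m1, np.sqrt(1.0 - rho ** 2))
    samples[i] = (m1, m2)

post = samples[burn_in:]        # discard burn-in, keep the rest
print("marginal mean/std of m1:", post[:, 0].mean(), post[:, 0].std())
print("marginal mean/std of m2:", post[:, 1].mean(), post[:, 1].std())
```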

A Study on the Performance Characteristics of a Disk-type Drag Pump (원판형 드래그펌프의 성능특성에 관한 연구)

  • Hwang, Young-Kyu;Heo, Joong-Sik
    • Proceedings of the KSME Conference
    • /
    • 2001.06e
    • /
    • pp.643-648
    • /
    • 2001
  • The direct simulation Monte Carlo (DSMC) method is applied to investigate the steady and unsteady flow fields of a single-stage disk-type drag pump. Two different kinds of pumps are considered: the first is a rotor-rotor combination, and the second is a rotor-stator combination. The pumping channels are cut into the rotor and stator, each of which has 10 Archimedes' spiral blades. In the present DSMC method, the variable hard sphere model is used as the molecular model, and the no-time-counter method is employed as the collision sampling technique. For the simulation of diatomic gas flows, the Borgnakke-Larsen phenomenological model is adopted to redistribute the translational and internal energies. The DSMC results are in good agreement with the experimental data.
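A minimal sketch of the no-time-counter collision-sampling step of DSMC for a single cell: a fixed number of candidate pairs is drawn per time step, and each pair is accepted with probability proportional to its relative speed. The particle count, cell volume, weighting factor, cross section, and temperature are placeholder values, not those of the drag-pump simulation.

```python
import numpy as np

rng = np.random.default_rng(5)

N = 500            # simulated particles in the cell (placeholder)
F_N = 1.0e12       # real molecules represented by one simulated particle
V_cell = 1.0e-9    # cell volume [m^3]
dt = 1.0e-6        # time step [s]
sigma = 4.0e-19    # hard-sphere collision cross section [m^2] (placeholder)
vel = rng.normal(0.0, 300.0, size=(N, 3))   # thermal velocities [m/s]

cr_max = 2000.0    # running estimate of the maximum relative speed [m/s]
n_cand = int(0.5 * N * (N - 1) * F_N * sigma * cr_max * dt / V_cell)

accepted = 0
for _ in range(n_cand):
    i, j = rng.choice(N, size=2, replace=False)   # random candidate pair
    cr = np.linalg.norm(vel[i] - vel[j])
    if rng.random() < cr / cr_max:                # NTC acceptance test
        accepted += 1                             # a real code would now collide i and j
print("candidate pairs:", n_cand, "accepted collisions:", accepted)
```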

Parallel processing in structural reliability

  • Pellissetti, M.F.
    • Structural Engineering and Mechanics
    • /
    • v.32 no.1
    • /
    • pp.95-126
    • /
    • 2009
  • The present contribution addresses the parallelization of advanced simulation methods for structural reliability analysis, which have recently been developed for large-scale structures with a high number of uncertain parameters. In particular, the Line Sampling method and the Subset Simulation method are considered. The proposed parallel algorithms exploit the parallelism associated with the possibility of simultaneously performing independent FE analyses. For the Line Sampling method, a parallelization scheme is proposed both for the actual sampling process and for the statistical gradient estimation method used to identify the so-called important direction of the Line Sampling scheme. Two parallelization strategies are investigated for the Subset Simulation method: the first consists of the embarrassingly parallel advancement of distinct Markov chains; in this case the speedup is bounded by the number of chains advanced simultaneously. The second parallel Subset Simulation algorithm utilizes the concept of speculative computing. Speedup measurements in the context of the FE model of a multistory building (24,000 DOFs) show a reduction of the wall-clock time to a very manageable amount (<10 minutes for Line Sampling and $\approx$1 hour for Subset Simulation). The measurements, conducted on clusters of multi-core nodes, also indicate a strong sensitivity of the parallel performance to the load level of the nodes, in terms of the number of simultaneously used cores. This performance degradation is related to memory bottlenecks during the modal analysis required in each FE analysis.
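A minimal sketch of the embarrassingly parallel pattern exploited above: independent samples (plain Monte Carlo here, standing in for Line Sampling lines or advances of distinct Markov chains) are distributed over worker processes, since each underlying "FE analysis" is independent. The limit-state function is a cheap placeholder for a real FE run.

```python
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def fe_analysis_batch(args):
    """Placeholder for a batch of expensive, independent FE-based evaluations."""
    seed, batch_size = args
    rng = np.random.default_rng(seed)
    x = rng.normal(0.0, 1.0, size=(batch_size, 10))    # 10 uncertain parameters
    g = 3.0 - x.sum(axis=1) / np.sqrt(10.0)            # toy limit state, exact Pf = Phi(-3)
    return np.count_nonzero(g < 0.0)

if __name__ == "__main__":
    n_workers, batch = 4, 250_000
    jobs = [(seed, batch) for seed in range(n_workers)]
    with ProcessPoolExecutor(max_workers=n_workers) as pool:
        failures = sum(pool.map(fe_analysis_batch, jobs))
    print("parallel MCS failure probability:", failures / (n_workers * batch))
```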

Global sensitivity analysis improvement of rotor-bearing system based on the Genetic Based Latine Hypercube Sampling (GBLHS) method

  • Fatehi, Mohammad Reza;Ghanbarzadeh, Afshin;Moradi, Shapour;Hajnayeb, Ali
    • Structural Engineering and Mechanics
    • /
    • v.68 no.5
    • /
    • pp.549-561
    • /
    • 2018
  • The Sobol method is applied as a powerful variance-decomposition technique in the field of global sensitivity analysis (GSA). The paper is devoted to increasing the convergence speed of the extracted Sobol indices using a newly proposed sampling technique called genetic-based Latin hypercube sampling (GBLHS). This technique is an improved version of restricted Latin hypercube sampling (LHS), with an optimization algorithm inspired by the genetic algorithm in a new approach. The new approach optimizes the minimax value of LHS arrays by manipulating the array indices as chromosomes in a genetic algorithm. The improved Sobol method is implemented to perform factor prioritization and factor fixing for an uncertain, comprehensive high-speed rotor-bearing system. The finite element method is employed for the rotor-bearing modeling, considering the Eshleman-Eubanks assumption and the interaction of the axial force with the rotor whirling behavior. The performance of the GBLHS technique is compared with Monte Carlo simulation (MCS), LHS, and optimized LHS (minimax criterion). Comparison of the GBLHS with the other techniques demonstrates its capability to increase the convergence speed of the sensitivity indices and to improve the computational time of the GSA.
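A minimal sketch of the two ingredients the GBLHS technique builds on: a plain Latin hypercube design and a space-filling distance criterion that a genetic algorithm could optimize by permuting column indices. The sample size, dimension, and the use of the minimum pairwise distance as the criterion are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

rng = np.random.default_rng(6)

def latin_hypercube(n_samples, n_dims, rng):
    """One point per equal-probability stratum in every dimension."""
    u = (rng.random((n_samples, n_dims)) + np.arange(n_samples)[:, None]) / n_samples
    for d in range(n_dims):
        u[:, d] = rng.permutation(u[:, d])   # decouple the strata across dimensions
    return u

def min_pairwise_distance(x):
    """Space-filling criterion a GA could improve by permuting column indices."""
    diff = x[:, None, :] - x[None, :, :]
    dist = np.sqrt((diff ** 2).sum(axis=-1))
    return dist[np.triu_indices(len(x), k=1)].min()

design = latin_hypercube(50, 3, rng)
print("minimum pairwise distance of this LHS design:", min_pairwise_distance(design))
```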

Serviceability reliability analysis of cable-stayed bridges

  • Cheng, Jin;Xiao, Ru-Cheng
    • Structural Engineering and Mechanics
    • /
    • v.20 no.6
    • /
    • pp.609-630
    • /
    • 2005
  • A reliability analysis method is proposed in this paper through a combination of the advantages of the response surface method (RSM), the finite element method (FEM), the first-order reliability method (FORM), and the importance sampling updating method. The accuracy and efficiency of the method are demonstrated through several numerical examples. The method is then used to estimate the serviceability reliability of cable-stayed bridges. The effects of geometric nonlinearity and of randomness in loading, material, and geometry are considered. The example cable-stayed bridge is the Second Nanjing Bridge in China, with a main span length of 628 m. The results show that cable sag, which is part of the geometric nonlinearity of cable-stayed bridges, has a major effect on their reliability. Finally, the random variables most influential on the reliability of cable-stayed bridges are identified by a sensitivity analysis.
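A minimal sketch of the importance-sampling updating idea: samples are drawn from a normal density centred at a design point (such as FORM would supply) and reweighted by the ratio of the true standard-normal density to the sampling density. The limit-state function and design point are placeholders, not the bridge model.

```python
import numpy as np

rng = np.random.default_rng(7)

def limit_state(u):
    """Toy limit state in standard normal space; failure when g <= 0."""
    return 3.0 - u[:, 0] - 0.5 * u[:, 1]

design_point = np.array([2.4, 1.2])   # placeholder design point, e.g. from FORM
n = 200_000
u = rng.multivariate_normal(design_point, np.eye(2), size=n)

# log of the ratio: standard normal density / shifted sampling density
log_w = -0.5 * (u ** 2).sum(axis=1) + 0.5 * ((u - design_point) ** 2).sum(axis=1)
pf = np.mean(np.where(limit_state(u) <= 0.0, np.exp(log_w), 0.0))
print("importance-sampling estimate of Pf:", pf)   # exact answer: Phi(-3/sqrt(1.25))
```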

A BAYESIAN APPROACH FOR A DECOMPOSITION MODEL OF SOFTWARE RELIABILITY GROWTH USING A RECORD VALUE STATISTICS

  • Choi, Ki-Heon;Kim, Hee-Cheul
    • Journal of applied mathematics & informatics
    • /
    • v.8 no.1
    • /
    • pp.243-252
    • /
    • 2001
  • The points of failure of a decomposition process are defined to be the union of the points of failure from two component point processes for software reliability systems. Because sampling from the likelihood function of the decomposition model is difficult, the Gibbs sampler can be applied in a straightforward manner. A Markov Chain Monte Carlo method with data augmentation is developed to compute the features of the posterior distribution. For model determination, we explore the prequential conditional predictive ordinate criterion, which selects as best the model with the largest posterior likelihood among models built from all possible subsets of the component intensity functions. A numerical example with a simulated data set is given.
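A minimal sketch, under assumed conjugate Gamma priors and synthetic data, of Gibbs sampling with data augmentation for a union (superposition) of two failure point processes: latent labels assign each observed failure to a component, and each component's intensity is then drawn from its conditional Gamma distribution. Homogeneous Poisson components are assumed purely for illustration; the paper's record-value-statistics model is more elaborate.

```python
import numpy as np

rng = np.random.default_rng(8)

T, n_fail = 100.0, 60         # observation window and simulated total failure count
a, b = 1.0, 1.0               # Gamma(a, b) prior on each component rate (placeholder)
lam = np.array([0.3, 0.3])    # initial component rates
n_iter, burn_in = 5000, 500
draws = np.empty((n_iter, 2))

for it in range(n_iter):
    # data augmentation: label each failure with its generating component
    p1 = lam[0] / lam.sum()
    n1 = rng.binomial(n_fail, p1)
    counts = np.array([n1, n_fail - n1])
    # conjugate update: lambda_k | labels ~ Gamma(a + n_k, rate = b + T)
    lam = rng.gamma(a + counts, 1.0 / (b + T))
    draws[it] = lam

post = draws[burn_in:]
print("posterior means of the component rates:", post.mean(axis=0))
```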