• Title/Summary/Keyword: Monte Carlo sampling


Development of Computer Code for Simulation of Multicomponent Aerosol Dynamics -Uncertainty and Sensitivity Analysis- (다성분 에어로졸계의 동특성 묘사를 위한 전산 코드의 개발 -불확실성 및 민감도 해석-)

  • Na, Jang-Hwan;Lee, Byong-Whi
    • Nuclear Engineering and Technology / v.19 no.2 / pp.85-98 / 1987
  • To analyze aerosol dynamics in severe accidents of LMFBRs, a new computer code, MCAD (Multicomponent Aerosol Dynamics), has been developed. The code can treat a two-component aerosol system, using the relative collision probability of each particle class, across sequences of accident scenarios. Coagulation and removal mechanisms incorporating Brownian diffusion and gravitational sedimentation are included in the model. To account for particle geometry, the code uses a density correction factor and shape factors. The code is verified against the experimental results of the NSPP-300 series and compared with another code; at present, it fits the experimental results well and agrees with the existing code. Because the input variables are highly uncertain, uncertainty and sensitivity analyses are required as a supplement to code development. In this analysis, 14 variables are selected. The input variables are combined by an experimental design method and Latin hypercube sampling, and the results are applied to the response surface method to assess the degree of regression. Stepwise regression gives insight into which variables are significant as time elapses and into their reasonable ranges. By applying the Monte Carlo method to the regression model built from the LHS results, the confidence level of the MCAD results and their variables is improved.
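
The Latin hypercube sampling step that this abstract (and several entries below) relies on can be sketched as follows. This is a minimal, illustrative implementation of stratified sampling on the unit hypercube, not the MCAD code; the function name and stratum layout are my own.

```python
import random

def latin_hypercube(n_samples, n_dims, seed=0):
    """Draw a Latin hypercube sample on the unit hypercube [0, 1)^d:
    each axis is split into n_samples equal strata, and exactly one
    point falls in every stratum of every axis."""
    rng = random.Random(seed)
    columns = []
    for _ in range(n_dims):
        # one uniform draw inside each stratum, then shuffle the strata
        col = [(k + rng.random()) / n_samples for k in range(n_samples)]
        rng.shuffle(col)
        columns.append(col)
    # transpose into a list of n_samples points
    return [tuple(col[i] for col in columns) for i in range(n_samples)]

pts = latin_hypercube(10, 2)
# each axis has exactly one point per stratum [k/10, (k+1)/10)
for d in range(2):
    assert sorted(int(p[d] * 10) for p in pts) == list(range(10))
```

The shuffle decorrelates the axes, so marginal coverage stays even in every dimension even for small sample sizes.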


A new high-order response surface method for structural reliability analysis

  • Li, Hong-Shuang;Lu, Zhen-Zhou;Qiao, Hong-Wei
    • Structural Engineering and Mechanics / v.34 no.6 / pp.779-799 / 2010
  • In order to capture high-order effects of the actual limit state function, a new response surface method based on a high-order approximation concept is proposed for structural reliability analysis. Hermite polynomials are used to determine the highest orders of the input random variables, and the sampling points for determining these orders are located at the Gaussian points of Gauss-Hermite integration. Cross terms between two random variables are added to the response surface function, to improve approximation accuracy, only when their percent contributions to the total variation of the limit state function are significant; this strategy yields a significant reduction in computational cost. Because of the cross terms, additional sampling points, placed at two-dimensional off-axis Gaussian points in the plane of the two significant variables, are required to determine the coefficients of the approximated limit state function. All available sampling points are employed to construct the final response surface function, and Monte Carlo simulation is then carried out on it to estimate the failure probability. Owing to the use of high-order polynomials, the proposed method is more accurate than the traditional second-order or linear response surface methods, and it is much more efficient than the available high-order response surface method with little loss of accuracy. The efficiency and accuracy of the proposed method, compared with various available response surface methods, are illustrated by five numerical examples.
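
The final step of this workflow, crude Monte Carlo on a cheap fitted surrogate rather than on the expensive limit state function, can be sketched as below. The surrogate here is a toy quadratic I made up for illustration, not one of the paper's fitted response surfaces, and the inputs are assumed independent standard normals.

```python
import random

def failure_probability(surrogate, n=100_000, seed=1):
    """Crude Monte Carlo estimate of P[g(X) < 0] for independent
    standard-normal inputs, evaluated on a cheap surrogate g
    (g < 0 is the failure domain)."""
    rng = random.Random(seed)
    failures = 0
    for _ in range(n):
        x1, x2 = rng.gauss(0, 1), rng.gauss(0, 1)
        if surrogate(x1, x2) < 0:
            failures += 1
    return failures / n

# toy quadratic standing in for a fitted response surface function
g = lambda x1, x2: 3.0 - x1 - 0.2 * x2 ** 2
pf = failure_probability(g)
```

Because each surrogate evaluation is a few arithmetic operations, even 10^5-10^6 samples cost almost nothing; this is what makes surrogate-based reliability methods efficient.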

Bayesian analysis of finite mixture model with cluster-specific random effects (군집 특정 변량효과를 포함한 유한 혼합 모형의 베이지안 분석)

  • Lee, Hyejin;Kyung, Minjung
    • The Korean Journal of Applied Statistics / v.30 no.1 / pp.57-68 / 2017
  • Clustering algorithms attempt to partition a finite set of objects into a potentially predetermined number of nonempty subsets. Gibbs sampling of a normal mixture of linear mixed regressions with a Dirichlet prior distribution calculates posterior probabilities when the number of clusters is known. Our approach provides simultaneous partitioning and parameter estimation together with the computation of classification probabilities. A Monte Carlo study of curve estimation showed that the model is useful for function estimation. Examples are given to show how these models perform on real data.
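
The alternation at the heart of Gibbs sampling for mixtures, sampling cluster labels given parameters, then parameters given labels, can be sketched with a deliberately stripped-down model: two normal components with unit variances, equal weights, and flat priors on the means. This is not the paper's mixed-regression model, just the basic mechanism.

```python
import math
import random

def gibbs_two_normals(data, iters=300, seed=0):
    """Toy Gibbs sampler for a two-component N(mu_k, 1) mixture:
    alternate between (1) sampling each label given the means and
    (2) sampling each mean given its cluster members."""
    rng = random.Random(seed)
    mu = [min(data), max(data)]               # crude initialization
    labels = [0] * len(data)
    for _ in range(iters):
        for i, y in enumerate(data):          # step 1: labels
            w0 = math.exp(-0.5 * (y - mu[0]) ** 2)
            w1 = math.exp(-0.5 * (y - mu[1]) ** 2)
            labels[i] = 0 if rng.random() < w0 / (w0 + w1) else 1
        for k in (0, 1):                      # step 2: means
            ys = [y for y, z in zip(data, labels) if z == k]
            if ys:  # posterior of the mean is N(ybar, 1/n_k)
                mu[k] = rng.gauss(sum(ys) / len(ys), len(ys) ** -0.5)
    return sorted(mu)

rng = random.Random(1)
data = ([rng.gauss(0, 1) for _ in range(60)]
        + [rng.gauss(5, 1) for _ in range(60)])
mu_lo, mu_hi = gibbs_two_normals(data)
```

After burn-in the sampled labels give exactly the classification probabilities the abstract mentions: the fraction of iterations in which point i carried label k.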

Precise System Models using Crystal Penetration Error Compensation for Iterative Image Reconstruction of Preclinical Quad-Head PET

  • Lee, Sooyoung;Bae, Seungbin;Lee, Hakjae;Kim, Kwangdon;Lee, Kisung;Kim, Kyeong-Min;Bae, Jaekeon
    • Journal of the Korean Physical Society / v.73 no.11 / pp.1764-1773 / 2018
  • A-PET is a quad-head PET scanner developed for use in small-animal imaging. Its volumetric field of view (FOV) measures 46.1 × 46.1 × 46.1 mm³, and the gap between the detector modules has been minimized to provide a highly sensitive system. However, such a small FOV together with the quad-head geometry causes image-quality degradation, the main factor being the mispositioning of events caused by the penetration effect in the detector. In this paper, we propose a precise method for modelling the system at the high spatial resolution of the A-PET using an LOR (line of response)-based ML-EM (maximum likelihood expectation maximization) that allows for penetration effects. The proposed system model provides the detection probability of every possible ray path via crystal sampling methods: for the ray-path sampling, sub-LORs are defined by connecting the sampling points of a crystal pair, and the detection probability of each sub-LOR is incorporated into the model by calculating the penetration effect. For comparison, we used a standard LOR-based model and a Monte Carlo-based modeling approach, and evaluated the reconstructed images using both the National Electrical Manufacturers Association NU 4-2008 standards and the Geant4 Application for Tomographic Emission simulation toolkit (GATE). Average full widths at half maximum (FWHM) over different locations of 1.77 mm and 1.79 mm are obtained using the proposed system model and the standard LOR system model (which does not include penetration effects), respectively. The standard deviation of the uniform region in the NEMA image-quality phantom is 2.14% for the proposed method and 14.3% for the LOR system model, indicating that the proposed model outperforms the standard LOR-based model.

Suggestions for Enhancing Sampling-Based Approach of Seismic Probabilistic Risk Assessment (샘플링기반 지진 확률론적 리스크평가 접근법 개선을 위한 제언)

  • Kwag, Shinyoung;Eem, Seunghyun;Choi, Eujeong;Ha, Jeong Gon;Hahm, Daegi
    • Journal of the Computational Structural Engineering Institute of Korea / v.34 no.2 / pp.77-84 / 2021
  • A sampling-based approach was devised as a nuclear seismic probabilistic risk assessment (SPRA) method to account for the partially correlated relationships between components. However, because this method is based on sampling, a large number of samples must be drawn to estimate the results accurately. In this study, we therefore suggest an effective approach to improve the existing sampling method. Its main features are as follows: in place of the existing Monte Carlo sampling (MCS) approach, the Latin hypercube sampling (LHS) method, which enables effective sampling in multiple dimensions, is introduced into the SPRA method, and the degree of segmentation of the seismic intensity is determined with respect to the final seismic risk result. Applying the suggested approach to an actual nuclear power plant as an example, the accuracy of the results was observed to be almost the same as that of the existing method, while the efficiency, in terms of the total number of samples drawn, improved by a factor of two. It was also confirmed that the LHS-based method improves the accuracy of the solution for small sample sizes.
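
The efficiency claim here, that LHS needs fewer samples than MCS for the same accuracy, is easy to demonstrate on a one-dimensional toy problem (the real SPRA model is of course far more elaborate; the response function and sample counts below are my own choices).

```python
import random

def mcs_mean(f, n, rng):
    """Plain Monte Carlo estimate of E[f(U)], U ~ Uniform(0, 1)."""
    return sum(f(rng.random()) for _ in range(n)) / n

def lhs_mean(f, n, rng):
    """LHS (here: 1-D stratified) estimate: one draw per stratum."""
    return sum(f((k + rng.random()) / n) for k in range(n)) / n

rng = random.Random(42)
f = lambda u: u ** 2                    # toy response, true mean 1/3
reps, n = 200, 50
mcs = [mcs_mean(f, n, rng) for _ in range(reps)]
lhs = [lhs_mean(f, n, rng) for _ in range(reps)]

def spread(xs):                         # population standard deviation
    m = sum(xs) / len(xs)
    return (sum((x - m) ** 2 for x in xs) / len(xs)) ** 0.5

# stratification gives a visibly tighter estimator at the same cost
assert spread(lhs) < spread(mcs)
```

For a fixed budget of 50 evaluations per estimate, the scatter of the LHS estimator across repetitions is far smaller, which is exactly the "same accuracy with half the samples" effect the abstract reports.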

Application of Judgement Post-Stratification to Extended Producer Responsibility System (생산자 책임재활용 제도를 위한 혼입비율 조사에서 Judgement Post-Stratification의 활용)

  • Choi, Wan-Suk;Lim, Jo-Han;Lim, Jong-Ho;Kim, Hyun-Joong
    • Communications for Statistical Applications and Methods / v.15 no.1 / pp.105-115 / 2008
  • Judgement post-stratification is a sampling method developed by MacEachern et al. (2004). This article suggests that judgement post-stratification can be a good alternative to simple random sampling when analyzing real-world environmental data. Accurately measuring the output of a recycling facility has become an important task since the EPR (Extended Producer Responsibility) system took effect in 2003. However, the total weight of materials processed in a recycling facility may not be a proper measure, because the materials are frequently mingled with non-recyclable materials. It is therefore necessary to estimate the mixture ratio of non-recyclable materials among the total materials admitted to the facility. Unfortunately, the sample size in a recycling facility is restricted by the inconvenience of the sampling procedure, including safety, odor, time, and the classification of non-recyclable materials. In this article, we show, using Monte Carlo simulation, the relative efficiency of judgement post-stratification over simple random sampling for equal sample sizes. Furthermore, we apply judgement post-stratification to the 2004 recycling data and show that it can replace simple random sampling even with fewer observations.
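
A rough sketch of the judgement post-stratification mechanism, hedged heavily: each measured unit is ranked by eye within a small comparison set, and measurements are averaged within rank strata. The noisy "judgment" function, population, and set size below are illustrative stand-ins, not the estimator or data from the paper.

```python
import random

def jps_estimate(pop, n, set_size, seed=0):
    """Judgement post-stratification sketch: rank each measured unit
    within a comparison set of size set_size using a noisy judgment
    value, then average the per-rank stratum means."""
    rng = random.Random(seed)
    judge = lambda y: y + rng.gauss(0, 0.3)   # imperfect visual ranking
    strata = [[] for _ in range(set_size)]
    for _ in range(n):
        unit = rng.choice(pop)                # the unit actually measured
        comps = [rng.choice(pop) for _ in range(set_size - 1)]
        rank = sum(judge(c) < judge(unit) for c in comps)
        strata[rank].append(unit)
    stratum_means = [sum(s) / len(s) for s in strata if s]
    return sum(stratum_means) / len(stratum_means)

pop = [i / 100 for i in range(100)]           # toy population, mean 0.495
est = jps_estimate(pop, n=400, set_size=3)
```

Since each rank is equally likely, averaging the stratum means stays unbiased for the population mean, while an informative ranking reduces within-stratum variability, which is where the efficiency gain over simple random sampling comes from.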

Real time orbit estimation using asynchronous multiple RADAR data fusion (비동기 다중 레이더 융합을 통한 실시간 궤도 추정 알고리즘)

  • Song, Ha-Ryong;Moon, Byoung-Jin;Cho, Dong-Hyun
    • Aerospace Engineering and Technology / v.13 no.2 / pp.66-72 / 2014
  • This paper introduces an asynchronous multiple-radar fusion algorithm for space-object tracking. To estimate the orbital motion of a space object, a multiple-radar scenario is described in which radars with different sampling time indices jointly measure a single object. STK/ODTK is utilized to generate a realization of the orbital motion and the joint coverage of the multiple radars. An asynchronous fusion algorithm is then adopted to enhance the estimation performance over the intervals in which multiple radars measure at the same time instants. Monte Carlo simulation results demonstrate that the proposed asynchronous multi-sensor fusion scheme performs better than a single linearized Kalman filter in terms of root-mean-square error.
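
The core idea of fusing asynchronous sensor streams, merge the measurements into one time-ordered stream and let the filter predict over whatever gap precedes each update, can be sketched with a scalar Kalman filter. This is a drastically simplified stand-in (random-walk state, two simulated "radars" with offset sampling times), not the paper's orbit estimator.

```python
import random

def kalman_1d(measurements, q=0.01, r=1.0):
    """Scalar Kalman filter on a time-ordered, fused measurement
    stream of (time, value) pairs: predict across the elapsed gap
    (process noise q per unit time), then update (noise variance r)."""
    x, p, t_prev = 0.0, 10.0, None
    for t, z in sorted(measurements):   # time-order the fused stream
        dt = 0.0 if t_prev is None else t - t_prev
        p += q * dt                     # predict: inflate uncertainty
        k = p / (p + r)                 # Kalman gain
        x += k * (z - x)                # update with the measurement
        p *= 1 - k
        t_prev = t
    return x

rng = random.Random(3)
truth = 5.0
# two "radars" observing the same object at interleaved, offset times
radar_a = [(t, truth + rng.gauss(0, 1)) for t in range(0, 20, 2)]
radar_b = [(t + 1, truth + rng.gauss(0, 1)) for t in range(0, 20, 2)]
est = kalman_1d(radar_a + radar_b)
```

Because the prediction step handles an arbitrary `dt`, the filter absorbs both streams without any assumption that the radars are synchronized.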

In-situ monitoring and reliability analysis of an embankment slope with soil variability

  • Bai, Tao;Yang, Han;Chen, Xiaobing;Zhang, Shoucheng;Jin, Yuanshang
    • Geomechanics and Engineering / v.23 no.3 / pp.261-273 / 2020
  • This paper presents an efficient method, utilizing user-defined computer functional codes, to determine the reliability of an embankment slope with spatially varying soil properties in real time. The soils' mechanical properties vary with the soil layers, which have different degrees of compaction and moisture content levels. Latin hypercube sampling (LHS) for the degree of compaction and Kriging simulation of the moisture-content variation were adopted and programmed to predict their respective spatial distributions, which were subsequently used to characterize the spatial distribution of the soil shear strengths. The shear-strength parameters were then integrated into the Geostudio command file to determine the safety factor of the embankment slope. An explicit metamodel for the performance function, built with the Kriging method, was established and coded to efficiently compute the failure probability of the slope under varying moisture contents. Sensitivity analysis showed that the proposed method significantly reduces the computational time compared to Monte Carlo simulation; about 300 LHS-based Geostudio computations were needed to balance precision and efficiency in determining the failure probability. The results also revealed that an embankment slope is prone to a high failure risk when the degree of compaction is low and the moisture content is high.

Probabilistic Safety Assessment for High Level Nuclear Waste Repository System

  • Kim, Taw-Woon;Woo, Kab-Koo;Lee, Kun-Jai
    • Journal of Radiation Protection and Research / v.16 no.1 / pp.53-72 / 1991
  • An integrated model is developed in this paper for the performance assessment of a high-level radioactive waste repository. The integrated model consists of two simple mathematical models: a multiple-barrier failure model of the repository system, based on constant failure rates, which provides source terms to the biosphere, and a biosphere model with multiple pathways by which radionuclides reach humans. For the parametric uncertainty and sensitivity analysis of the risk assessment, Latin hypercube sampling and rank correlation techniques are applied to this model. The former is cost-effective for large computer programs because it yields a smaller error in estimating the output distribution, even with fewer runs, than the crude Monte Carlo technique; the latter is well suited to generating a dependence structure among samples of the input parameters, and is also used to identify the most sensitive, or important, parameter groups among the given inputs. This combination of mathematical modelling and statistical analysis should provide useful insights for decision-making on radioactive waste repository selection and for future research on uncertain and sensitive input parameters.
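
The rank-correlation screening used here to flag important parameters amounts to computing Spearman's coefficient between each sampled input and the model output. A minimal version (the toy inputs and response below are my own; ties are assumed absent, as with continuous LHS draws):

```python
import random

def spearman(xs, ys):
    """Spearman rank correlation: Pearson correlation of the ranks.
    A common screen for 'important' inputs in LHS uncertainty studies."""
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order):
            r[i] = rank
        return r
    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx)
           * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

rng = random.Random(7)
dominant = [rng.random() for _ in range(200)]   # strong driver of output
noise = [rng.random() for _ in range(200)]      # weak driver
output = [3 * a + 0.1 * b for a, b in zip(dominant, noise)]
```

Because it works on ranks, the measure is insensitive to monotone nonlinearities in the model, which is why it is preferred over plain correlation in this kind of screening.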


Geostatistical Analysis of Soil Enzyme Activities in Mud Flat of Korea

  • Jung, Soohyun;Lee, Seunghoon;Park, Joonhong;Seo, Juyoung;Kang, Hojeong
    • Ecology and Resilient Infrastructure / v.4 no.2 / pp.93-96 / 2017
  • Spatial variations of physicochemical and microbiological variables were examined to understand the spatial heterogeneity of an intertidal flat. Variograms were constructed to assess the spatial autocorrelation of each variable by geostatistical analysis, and spatial correlations between pairs of variables were evaluated with a Cross-Mantel test using a Monte Carlo procedure (999 permutations). Water content, organic matter content, pH, nitrate, sulfate, chloride, dissolved organic carbon (DOC), four extracellular enzyme activities (β-glucosidase, N-acetyl-glucosaminidase, phosphatase, arylsulfatase), and bacterial diversity in the soil were measured along a transect perpendicular to the shoreline. Most variables showed strong spatial autocorrelation or no spatial structure, except for DOC, suggesting that complex interactions between the physicochemical and microbiological properties of the sediment might control DOC. The intertidal flat sediment appeared to be spatially heterogeneous. Bacterial diversity was found to be spatially correlated with the enzyme activities, and chloride and sulfate were spatially correlated with the microbial properties, indicating that salinity in the coastal environment would influence the spatial distribution of microbially mediated decomposition capacities. Overall, the spatial distributions of physicochemical and microbiological properties in intertidal flat sediment should be considered when designing sampling schemes for studying decomposition processes there.
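
The variogram construction this study relies on can be sketched for a one-dimensional transect like the one sampled here. This is a generic empirical semivariogram, not the study's geostatistical software; the smooth test signal and lag tolerance are illustrative choices of mine.

```python
import math

def semivariogram(coords, values, lags, tol=0.5):
    """Empirical semivariogram along a 1-D transect: for each lag h,
    average half the squared value difference over all point pairs
    whose separation distance is within tol of h."""
    gamma = []
    for h in lags:
        diffs = [
            0.5 * (values[i] - values[j]) ** 2
            for i in range(len(coords))
            for j in range(i + 1, len(coords))
            if abs(abs(coords[i] - coords[j]) - h) <= tol
        ]
        gamma.append(sum(diffs) / len(diffs) if diffs else float("nan"))
    return gamma

# a smoothly varying transect: nearby points are similar (autocorrelated)
xs = list(range(40))
vals = [math.sin(x / 6) for x in xs]
g1, g10 = semivariogram(xs, vals, lags=[1, 10])
```

For a spatially autocorrelated variable the semivariance rises with lag (here g1 < g10); a variable with no spatial structure produces a flat variogram, which is the distinction the study draws for DOC.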