• Title/Summary/Keyword: Monte Carlo sampling


Streamflow Generation by Bootstrap Method and Skewness (Bootstrap 방법에 의한 하천유출량 모의와 왜곡도)

  • Kim, Byung-Sik;Kim, Hung-Soo;Seoh, Byung-Ha
    • Journal of Korea Water Resources Association / v.35 no.3 / pp.275-284 / 2002
  • In this study, a method of random resampling of residuals from stochastic models such as the Monte-Carlo model, the lag-one autoregressive model (AR(1)), and the periodic lag-one autoregressive model (PAR(1)) has been adopted to generate a large number of long traces of annual and monthly streamflows. The main advantage of this resampling scheme, called the Bootstrap method, is that it does not rely on an assumption about the population distribution. The Bootstrap is a method for estimating the statistical distribution by resampling the data. When the data are a random sample from a distribution, the Bootstrap method can be implemented (among other ways) by sampling the data randomly with replacement. This procedure has been applied to the Yongdam site to check the performance of the Bootstrap method for streamflow generation, and then the statistics of the historical and generated streamflows have been computed and compared. It has been shown that both the conventional and Bootstrap methods reproduce the mean, standard deviation, and serial correlation fairly well, but the Bootstrap technique reproduces the skewness better than the conventional ones. Thus, it has been noted that the Bootstrap method might be more appropriate for the preservation of skewness.
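The resampling-with-replacement step described in this abstract can be illustrated with a short sketch. The AR(1) fitting and the flow record below are hypothetical placeholders, not the Yongdam data or the authors' exact model; only the bootstrap idea (drawing residuals randomly with replacement instead of from a fitted distribution) is the point.

```python
import numpy as np

rng = np.random.default_rng(42)

def generate_bootstrap_ar1(flows, n_years, n_traces):
    """Generate synthetic flow traces from an AR(1) model whose residuals
    are resampled with replacement (bootstrap) rather than drawn from a
    fitted parametric distribution."""
    mean, std = flows.mean(), flows.std(ddof=1)
    z = (flows - mean) / std                      # standardize historical flows
    phi = np.corrcoef(z[:-1], z[1:])[0, 1]        # lag-one autocorrelation
    residuals = z[1:] - phi * z[:-1]              # AR(1) residuals

    traces = np.empty((n_traces, n_years))
    for t in range(n_traces):
        x = rng.choice(z)                         # random initial state
        for i in range(n_years):
            eps = rng.choice(residuals)           # bootstrap: resample a residual
            x = phi * x + eps
            traces[t, i] = mean + std * x         # back-transform to flow units
    return traces

# usage with a hypothetical historical record
historical = np.array([120., 95., 143., 110., 88., 132., 101., 99., 150., 115.])
synthetic = generate_bootstrap_ar1(historical, n_years=100, n_traces=1000)
print(synthetic.mean(), synthetic.std())
```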

A Study on the properties of the Multiensemble Sampling method (Multiensemble Sampling 방법의 속성에 대한 연구)

  • Han, Kyu-Kwang
    • The Journal of Natural Sciences / v.15 no.1 / pp.11-34 / 2005
  • It is no exaggeration to say that the productivity of research using computer simulations of complex molecular systems, such as biomolecules, depends on the ability of the sampling algorithm to explore the relevant parts of configuration space. In this study, we investigate the properties of multiensemble sampling (MES), one of the solutions that surmount the limitations of conventional sampling algorithms. We work out practical, systematic ways of using the MES efficiently to explore distantly separated regions in configuration space. In this work, a more generalized form of the weighting function for MES is used and 'cavity formation in water' is simulated using Monte Carlo. By investigating the correlation between the simulation parameters and the efficiency of the method, we propose a practical way of maximizing the power of the MES. We applied this approach to 'cavity formation in water' and were able to explore the parts of configuration space relevant to cavities of radius from 0 to 5.6 Å in a single simulation.
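The MES weighting-function machinery is more involved than can be shown here; the sketch below illustrates the underlying idea with a plain one-dimensional reweighted (biased) Metropolis Monte Carlo run instead: sample from a weighted distribution that flattens an energy barrier, then recover averages in the original ensemble by dividing out the weight. The toy potential and all parameter choices are assumptions, not the authors' model of cavity formation in water.

```python
import numpy as np

rng = np.random.default_rng(0)

def energy(x):
    # toy double-well potential; barriers like this are what biased sampling helps cross
    return (x**2 - 1.0)**2

def log_weight(x):
    # biasing weight that flattens the barrier: sample from exp(-E)*exp(+E/2) = exp(-E/2)
    return 0.5 * energy(x)

def metropolis_biased(n_steps=200_000, step=0.5):
    x = 0.0
    samples = np.empty(n_steps)
    for i in range(n_steps):
        x_new = x + rng.normal(scale=step)
        # acceptance under the biased (weighted) distribution exp(-E + log_weight)
        log_a = (-energy(x_new) + log_weight(x_new)) - (-energy(x) + log_weight(x))
        if np.log(rng.random()) < log_a:
            x = x_new
        samples[i] = x
    return samples

samples = metropolis_biased()
# reweight back to the unbiased ensemble: <A> = sum(A/w) / sum(1/w)
w = np.exp(log_weight(samples))
estimate = np.sum(samples**2 / w) / np.sum(1.0 / w)   # unbiased estimate of <x^2>
print(estimate)
```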


Seismic Reliability Analysis of Offshore Wind Turbine with Twisted Tripod Support using Subset Simulation Method (부분집합 시뮬레이션 방법을 이용한 꼬인 삼각대 지지구조를 갖는 해상풍력발전기의 지진 신뢰성 해석)

  • Park, Kwang-Yeun;Park, Wonsuk
    • Journal of the Computational Structural Engineering Institute of Korea / v.32 no.2 / pp.125-132 / 2019
  • This paper presents a seismic reliability analysis method for an offshore wind turbine with a twisted tripod support structure under earthquake loading. A three-dimensional dynamic finite element model is proposed to consider the nonlinearity of the ground-pile interactions and the geometrical characteristics of the twisted tripod support structure, in which out-of-plane displacement occurs even under in-plane lateral loadings. For the evaluation of seismic reliability, the failure probability is calculated for the maximum horizontal displacement of the pile head, obtained from time history analyses using artificial earthquakes for the design return periods. The subset simulation method using Markov chain Monte Carlo (MCMC) sampling is proposed for efficient reliability analysis, considering that the limit state equation must be evaluated by nonlinear time history analysis. The proposed method can be applied to the reliability evaluation and design criteria development of offshore wind turbines with twisted tripod support structures, for which two-dimensional models and static analysis cannot produce accurate results.
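Subset simulation expresses a small failure probability as a product of larger conditional probabilities estimated level by level with MCMC. The sketch below uses a toy linear limit-state function in place of the nonlinear time history analysis, and a simple random-walk Metropolis move rather than the modified Metropolis sampler commonly used in practice; it illustrates the estimator, not the paper's implementation.

```python
import numpy as np

rng = np.random.default_rng(1)

def g(x):
    # hypothetical limit-state response; "failure" means g(x) exceeds the threshold
    return np.sum(x, axis=-1)

def subset_simulation(dim=10, threshold=12.0, n=2000, p0=0.1):
    """Estimate the small failure probability P[g(X) > threshold] for standard
    normal X as a product of conditional probabilities, one per level."""
    x = rng.standard_normal((n, dim))
    y = g(x)
    prob = 1.0
    while True:
        b = np.quantile(y, 1.0 - p0)               # intermediate threshold for this level
        if b >= threshold:
            return prob * np.mean(y > threshold)   # final conditional probability
        prob *= p0
        seeds = x[np.argsort(y)[-int(p0 * n):]]    # samples already beyond b
        chains = []
        for s in seeds:
            cur = s.copy()
            for _ in range(int(1 / p0)):
                cand = cur + 0.8 * rng.standard_normal(dim)
                # random-walk Metropolis targeting N(0, I) restricted to {g > b}
                log_a = -0.5 * (cand @ cand - cur @ cur)
                if np.log(rng.random()) < log_a and g(cand) > b:
                    cur = cand
                chains.append(cur.copy())
        x = np.array(chains)
        y = g(x)

print(subset_simulation())   # reference value: P[N(0, 10) > 12] is about 7e-5
```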

Estimation of Interaction Effects among Nucleotide Sequence Variants in Animal Genomes

  • Lee, Chaeyoung;Kim, Younyoung
    • Asian-Australasian Journal of Animal Sciences / v.22 no.1 / pp.124-130 / 2009
  • Estimating genetic interaction effects in animal genomes is one of the most challenging tasks because the phenotypic variation for economically important traits might be largely explained by interaction effects among multiple nucleotide sequence variants under various environmental exposures. Genetic improvement of economic animals would be expected from understanding multi-locus genetic interaction effects associated with economic traits. Most analyses in animal breeding and genetics, however, have excluded the possibility of genetic interaction effects from their analytical models. This review discusses the history of estimating genetic interactions and the difficulties in analyzing the interaction effects. Furthermore, two recently developed methods for assessing genetic interactions are introduced to animal genomics. One is the restricted partition method, a nonparametric grouping-based approach that iteratively merges the genotypes with the smallest difference into a new group; the other is the Bayesian method, which draws inferences about the genetic interaction effects from their marginal posterior distributions and attains the marginalization of the joint posterior distribution through Gibbs sampling as a Markov chain Monte Carlo method. Further development of appropriate and efficient methods for assessing genetic interactions is urgently needed to achieve an accurate understanding of the genetic architecture of complex traits in economic animals.
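As a toy illustration of the Gibbs-sampling idea mentioned above (drawing from full conditionals so that the collected draws approximate each parameter's marginal posterior), the following sketch samples a bivariate normal; it is unrelated to the genetic-interaction model itself, and all values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(2)

def gibbs_bivariate_normal(rho=0.8, n_iter=5000):
    """Toy Gibbs sampler: alternately draw each parameter from its full
    conditional; the collected draws of one parameter approximate its
    marginal distribution, i.e. the marginalization is done by sampling."""
    theta1, theta2 = 0.0, 0.0
    draws = np.empty((n_iter, 2))
    for i in range(n_iter):
        # full conditional of theta1 given theta2 (bivariate normal, unit variances)
        theta1 = rng.normal(rho * theta2, np.sqrt(1 - rho**2))
        # full conditional of theta2 given theta1
        theta2 = rng.normal(rho * theta1, np.sqrt(1 - rho**2))
        draws[i] = theta1, theta2
    return draws

draws = gibbs_bivariate_normal()
print(draws.mean(axis=0), draws.std(axis=0))   # marginal summaries of each parameter
```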

Comparison of Estimation Methods in NONMEM 7.2: Application to a Real Clinical Trial Dataset (실제 임상 데이터를 이용한 NONMEM 7.2에 도입된 추정법 비교 연구)

  • Yun, Hwi-Yeol;Chae, Jung-Woo;Kwon, Kwang-Il
    • Korean Journal of Clinical Pharmacy / v.23 no.2 / pp.137-141 / 2013
  • Purpose: This study compared the performance of new NONMEM estimation methods using a population analysis dataset collected from a clinical study that consisted of 40 individuals and 567 observations after a single oral dose of glimepiride. Method: The NONMEM 7.2 estimation methods tested were first-order conditional estimation with interaction (FOCEI), importance sampling (IMP), importance sampling assisted by mode a posteriori (IMPMAP), iterative two-stage (ITS), stochastic approximation expectation-maximization (SAEM), and Markov chain Monte Carlo Bayesian (BAYES), using a two-compartment open model. Results: The parameters estimated by IMP, IMPMAP, ITS, SAEM, and BAYES were similar to those estimated using FOCEI, and the objective function value (OFV) used as a model diagnostic was significantly lower for FOCEI, IMPMAP, SAEM, and BAYES than for IMP. Parameter precision in terms of the estimated standard error was good for FOCEI, IMP, IMPMAP, and BAYES. The run time for the model analysis was shortest with BAYES. Conclusion: The new estimation methods in NONMEM 7.2 performed similarly in terms of parameter estimation, but in terms of parameter precision and model run time, BAYES was the most suitable for analyzing this dataset.

NUCLEAR DATA UNCERTAINTY AND SENSITIVITY ANALYSIS WITH XSUSA FOR FUEL ASSEMBLY DEPLETION CALCULATIONS

  • Zwermann, W.;Aures, A.;Gallner, L.;Hannstein, V.;Krzykacz-Hausmann, B.;Velkov, K.;Martinez, J.S.
    • Nuclear Engineering and Technology / v.46 no.3 / pp.343-352 / 2014
  • Uncertainty and sensitivity analyses with respect to nuclear data are performed with depletion calculations for BWR and PWR fuel assemblies specified in the framework of the UAM-LWR Benchmark Phase II. For this, the GRS sampling-based tool XSUSA is employed together with the TRITON depletion sequences from the SCALE 6.1 code system. Uncertainties in multiplication factors and nuclide inventories are determined, as well as the main contributors to these result uncertainties, by calculating importance indicators. The corresponding neutron transport calculations are performed with the deterministic discrete-ordinates code NEWT. In addition, the Monte Carlo code KENO in multi-group mode is used to demonstrate a method by which the number of neutron histories per calculation run can be substantially reduced, compared to a calculation for the nominal case without uncertainties, while uncertainties and sensitivities are obtained with almost the same accuracy.
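A generic sampling-based uncertainty and sensitivity analysis of the kind XSUSA automates can be sketched as follows: sample perturbed input parameters from assumed uncertainties, run the calculation for each sample, and report the spread of the result together with rank-correlation importance indicators. The stand-in model and uncertainty values below are illustrative assumptions, not SCALE/TRITON results.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(3)

def model(params):
    # stand-in for a depletion/transport calculation; returns a scalar result (e.g. k-eff)
    return 1.0 + 0.02 * params[0] - 0.005 * params[1] + 0.001 * params[2]

n_samples, n_params = 500, 3
# sample input (nuclear data) perturbations from their assumed uncertainties
sigma = np.array([1.0, 0.5, 2.0])
inputs = rng.normal(0.0, sigma, size=(n_samples, n_params))
outputs = np.array([model(p) for p in inputs])

print("result mean :", outputs.mean())
print("result stdev:", outputs.std(ddof=1))          # propagated uncertainty
# importance indicators: rank correlation between each input and the result
for j in range(n_params):
    r, _ = spearmanr(inputs[:, j], outputs)
    print(f"parameter {j}: Spearman rank correlation = {r:.2f}")
```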

Prediction of Blank Thickness Variation in a Deep Drawing Process Using Deep Neural Network (심층 신경망 기반 딥 드로잉 공정 블랭크 두께 변화율 예측)

  • Park, K.T.;Park, J.W.;Kwak, M.J.;Kang, B.S.
    • Transactions of Materials Processing / v.29 no.2 / pp.89-96 / 2020
  • The finite element method has been widely applied in sheet metal forming processes. However, the finite element method is computationally expensive and time consuming. In order to tackle this problem, surrogate modeling methods have been proposed. An artificial neural network (ANN) is one such surrogate model and has been well studied over the past decades. However, when it comes to ANNs with two or more hidden layers, the so-called deep neural networks (DNN), there is a distinct lack of research. We chose DNNs as our surrogate model to predict the behavior of sheet metal in the deep drawing process. The thickness variation is selected as the output of the DNN in order to evaluate workpiece feasibility. The input variables of the DNN are the die radius, the die corner radius, and the blank holder force. Finite element analysis was conducted to obtain data for surrogate model construction and testing. Sampling points were determined by full factorial, Latin hypercube, and Monte Carlo methods. We investigated the performance of the DNN according to its structure (number of nodes and number of layers) and then compared it with a radial basis function surrogate model for various sampling methods and sample sizes. The results show that our DNN could be used as an efficient surrogate model for the deep drawing process.
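A minimal sketch of the surrogate-modeling workflow described above, with Monte Carlo sampling of the design space and a multilayer perceptron standing in for the DNN: the analytic response function below is a placeholder for the finite element thickness-variation results, and the variable bounds are assumptions.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(4)

def fe_thickness_variation(x):
    # stand-in for the finite element result (thickness variation);
    # a real study would run FE analysis at each sample point instead
    die_radius, corner_radius, holder_force = x.T
    return (0.05 * die_radius - 0.03 * corner_radius + 0.002 * holder_force
            + 0.01 * np.sin(die_radius * corner_radius))

# Monte Carlo sampling of the design space (bounds are illustrative)
n = 300
X = np.column_stack([
    rng.uniform(4.0, 10.0, n),    # die radius [mm]
    rng.uniform(2.0, 8.0, n),     # die corner radius [mm]
    rng.uniform(10.0, 50.0, n),   # blank holder force [kN]
])
y = fe_thickness_variation(X)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
surrogate = MLPRegressor(hidden_layer_sizes=(32, 32, 32), max_iter=5000,
                         random_state=0).fit(X_train, y_train)
print("R^2 on held-out samples:", surrogate.score(X_test, y_test))
```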

Shadow Economy, Corruption and Economic Growth: An Analysis of BRICS Countries

  • NGUYEN, Diep Van;DUONG, My Tien Ha
    • The Journal of Asian Finance, Economics and Business / v.8 no.4 / pp.665-672 / 2021
  • The paper examines the impact of the shadow economy and corruption, along with public expenditure, trade openness, foreign direct investment (FDI), inflation, and tax revenue, on the economic growth of the BRICS countries. Data were collected from the World Bank, Transparency International, and the Heritage Foundation over the 1991-2017 period. The Bayesian linear regression method is used to examine whether the shadow economy, corruption, and the other indicators affect the economic growth of the countries studied. This paper applies the normal prior suggested by Lemoine (2019), while the posterior distribution is simulated using the Markov chain Monte Carlo (MCMC) technique through the Gibbs sampling algorithm. The results indicate that public expenditure and trade openness can enhance the BRICS countries' economic growth, with positive impact probabilities of 75.69% and 67.11%, respectively. Also, FDI, inflation, and tax revenue positively affect this growth, though the probability of a positive effect is ambiguous, ranging from 51.13% to 56.36%. Further, the research's major finding is that the shadow economy and control of corruption have a positive effect on the economic growth of the BRICS countries. Nevertheless, the posterior probabilities of these two factors are 62.23% and 65.25%, respectively, suggesting that their positive effect probability is not high.
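A minimal Gibbs sampler for a Bayesian linear regression with a normal prior, reporting the posterior probability that each coefficient is positive (the kind of statement quoted in the abstract), can be sketched as follows. The simulated data, prior settings, and coefficient values are placeholders, not the BRICS dataset or Lemoine's (2019) prior.

```python
import numpy as np

rng = np.random.default_rng(5)

# toy data standing in for the growth regression (not the BRICS dataset)
n, p = 120, 3
X = np.column_stack([np.ones(n), rng.normal(size=(n, p))])
beta_true = np.array([1.0, 0.4, -0.2, 0.05])
y = X @ beta_true + rng.normal(scale=0.5, size=n)

# Gibbs sampler for beta (normal prior) and sigma^2 (inverse-gamma prior)
tau2 = 100.0                      # prior variance of the coefficients
a0, b0 = 2.0, 1.0                 # inverse-gamma hyperparameters
sigma2 = 1.0
draws = []
XtX, Xty = X.T @ X, X.T @ y
for _ in range(5000):
    # full conditional of beta: multivariate normal
    V = np.linalg.inv(XtX / sigma2 + np.eye(p + 1) / tau2)
    m = V @ (Xty / sigma2)
    beta = rng.multivariate_normal(m, V)
    # full conditional of sigma^2: inverse-gamma
    resid = y - X @ beta
    sigma2 = 1.0 / rng.gamma(a0 + n / 2, 1.0 / (b0 + 0.5 * resid @ resid))
    draws.append(beta)

draws = np.array(draws[500:])     # discard burn-in
for j in range(p + 1):
    print(f"P(beta_{j} > 0 | data) = {np.mean(draws[:, j] > 0):.2%}")
```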

A novel Metropolis-within-Gibbs sampler for Bayesian model updating using modal data based on dynamic reduction

  • Ayan Das;Raj Purohit Kiran;Sahil Bansal
    • Structural Engineering and Mechanics / v.87 no.1 / pp.1-18 / 2023
  • The paper presents a Bayesian finite element (FE) model updating methodology utilizing modal data. The dynamic condensation technique is adopted in this work to reduce the full system model to a smaller model version such that the degrees of freedom (DOFs) in the reduced model correspond to the observed DOFs, which facilitates the model updating procedure without any mode-matching. The present work considers both the most probable value (MPV) and the covariance matrix of the modal parameters as the modal data. Besides, modal data identified from multiple setups are considered in the model updating procedure, in view of the realistic scenario in which a limited number of sensors cannot measure the response of all the DOFs of interest in a large structure. A relationship is established between the modal data and the structural parameters based on the eigensystem equation, through the introduction of additional uncertain parameters in the form of modal frequencies and partial mode shapes. A novel sampling strategy known as the Metropolis-within-Gibbs (MWG) sampler is proposed to sample from the posterior probability density function (PDF). The effectiveness of the proposed approach is demonstrated by considering both simulated and experimental examples.
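Metropolis-within-Gibbs updates one parameter (or block) at a time with a Metropolis step while holding the others fixed. The sketch below targets a toy two-parameter unnormalized log posterior in place of the modal-data likelihood used in the paper; names and step sizes are assumptions.

```python
import numpy as np

rng = np.random.default_rng(6)

def log_post(theta):
    # stand-in unnormalized log posterior over two parameter blocks;
    # in the paper this would come from the modal-data likelihood
    return -0.5 * (theta[0]**2 + (theta[1] - 2.0 * theta[0])**2)

def metropolis_within_gibbs(n_iter=20000, step=0.8):
    """Update one parameter at a time with a Metropolis step,
    conditioning on the current values of the others."""
    theta = np.zeros(2)
    draws = np.empty((n_iter, 2))
    for i in range(n_iter):
        for j in range(len(theta)):
            prop = theta.copy()
            prop[j] += rng.normal(scale=step)          # propose along one coordinate
            if np.log(rng.random()) < log_post(prop) - log_post(theta):
                theta = prop                            # accept the component update
        draws[i] = theta
    return draws

draws = metropolis_within_gibbs()
print(draws.mean(axis=0), draws.std(axis=0))
```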

Bayesian Test of Quasi-Independence in a Sparse Two-Way Contingency Table

  • Kwak, Sang-Gyu;Kim, Dal-Ho
    • Communications for Statistical Applications and Methods / v.19 no.3 / pp.495-500 / 2012
  • We consider a Bayesian test of independence in a two-way contingency table that has some zero cells. To do this, we take a three-stage hierarchical Bayesian model under each hypothesis. For the priors, we use Dirichlet densities to model the marginal and individual cell probabilities. Our method does not require complicated computation such as a Metropolis-Hastings algorithm to draw samples from each posterior density of the parameters. We draw samples using a Gibbs sampler with a grid method. For complicated posterior formulas, we apply Monte Carlo integration and the sampling importance resampling algorithm. We compare the values of the Bayes factor with the results of a chi-square test and the likelihood ratio test.
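The sampling importance resampling (SIR) step mentioned in the abstract can be sketched for a one-dimensional unnormalized posterior: draw from a broad proposal, weight by the ratio of target to proposal, and resample with replacement in proportion to the weights. The target and proposal below are toy choices, not the contingency-table posterior.

```python
import numpy as np

rng = np.random.default_rng(7)

def log_target(theta):
    # stand-in unnormalized log posterior (the complicated density in the paper)
    return -0.5 * (theta - 1.5)**2 / 0.3**2

# 1) draw from a broad proposal distribution
proposal_draws = rng.normal(0.0, 2.0, size=20000)
log_proposal = -0.5 * proposal_draws**2 / 2.0**2

# 2) importance weights = target / proposal (normalized)
log_w = log_target(proposal_draws) - log_proposal
w = np.exp(log_w - log_w.max())
w /= w.sum()

# 3) resample with replacement in proportion to the weights
posterior_draws = rng.choice(proposal_draws, size=5000, replace=True, p=w)
print(posterior_draws.mean(), posterior_draws.std())   # should be close to 1.5 and 0.3
```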