• Title/Summary/Keyword: Monte Carlo techniques


A Sequential Monte Carlo inference for longitudinal data with bluespotted mud hopper data (짱뚱어 자료로 살펴본 장기 시계열 자료의 순차적 몬테 칼로 추론)

  • Choi, Il-Su
    • Journal of the Korea Institute of Information and Communication Engineering / v.9 no.6 / pp.1341-1345 / 2005
  • Sequential Monte Carlo techniques are a set of powerful and versatile simulation-based methods for optimal state estimation in nonlinear, non-Gaussian state-space models. Monte Carlo particle filters can be used adaptively, i.e. so that they estimate the parameters and the signal simultaneously. However, such Sequential Monte Carlo schemes require special particle filtering techniques which suffer from several drawbacks. We consider here an alternative approach combining particle filtering and Sequential Hybrid Monte Carlo. We give some examples of applications in fisheries (bluespotted mud hopper data).
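To make the basic machinery concrete, the sketch below implements a generic bootstrap particle filter for a standard toy nonlinear, non-Gaussian state-space model. The model, noise levels, and multinomial resampling are illustrative assumptions, not the adaptive Sequential Hybrid Monte Carlo scheme of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def bootstrap_particle_filter(y, n_particles=1000, sigma_x=0.5, sigma_y=0.3):
    """Generic bootstrap particle filter for a toy nonlinear state-space model:
        x_t = 0.9*x_{t-1} + 8*cos(1.2*t) + v_t,  v_t ~ N(0, sigma_x^2)
        y_t = x_t**2 / 20 + w_t,                 w_t ~ N(0, sigma_y^2)
    Returns the filtered posterior mean of x_t at each time step."""
    T = len(y)
    particles = rng.normal(0.0, 1.0, n_particles)      # initial particle cloud
    means = np.empty(T)
    for t in range(T):
        # propagate particles through the assumed transition density
        particles = (0.9 * particles + 8.0 * np.cos(1.2 * t)
                     + rng.normal(0.0, sigma_x, n_particles))
        # weight each particle by the likelihood of the new observation
        log_w = -0.5 * ((y[t] - particles**2 / 20.0) / sigma_y) ** 2
        w = np.exp(log_w - log_w.max())
        w /= w.sum()
        means[t] = np.sum(w * particles)
        # multinomial resampling to fight weight degeneracy
        particles = rng.choice(particles, size=n_particles, p=w)
    return means

# usage with placeholder synthetic observations
y_obs = rng.normal(0.0, 1.0, 50) ** 2 / 20.0
print(bootstrap_particle_filter(y_obs)[:5])
```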

A methodology for uncertainty quantification and sensitivity analysis for responses subject to Monte Carlo uncertainty with application to fuel plate characteristics in the ATRC

  • Price, Dean;Maile, Andrew;Peterson-Droogh, Joshua;Blight, Derreck
    • Nuclear Engineering and Technology / v.54 no.3 / pp.790-802 / 2022
  • Large-scale reactor simulation often requires Monte Carlo calculation techniques to estimate important reactor parameters. One drawback of these techniques is that they inevitably introduce some uncertainty into the calculated quantities. The present study performs parametric uncertainty quantification (UQ) and sensitivity analysis (SA) on the Advanced Test Reactor Critical (ATRC) facility housed at Idaho National Laboratory (INL) and addresses some complications caused by Monte Carlo uncertainty when performing these analyses. The UQ/SA approach accounts for Monte Carlo code uncertainty in the computed sensitivities, considers uncertainty from directly measured parameters, and compares results obtained from brute-force Monte Carlo UQ with UQ obtained from a surrogate model. These methodologies are applied to the uncertainty and sensitivity of keff for two sets of uncertain parameters involving fuel plate geometry and fuel plate composition. Results indicate that the less computationally expensive method for uncertainty quantification, which uses a linear surrogate model, provides accurate estimates of keff uncertainty, and that the Monte Carlo uncertainty in calculated keff values can strongly affect the fitted linear model coefficients for parameters with low influence on keff.
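The general workflow can be illustrated with a minimal sketch: sample the uncertain parameters, evaluate a noisy response that stands in for an expensive Monte Carlo keff calculation, fit a linear surrogate by least squares, and compare the uncertainty propagated through the surrogate with the brute-force sample spread. The response function, noise level, and parameter distributions below are assumptions for illustration, not the ATRC model.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical stand-in for an expensive Monte Carlo keff calculation:
# a smooth response plus statistical noise mimicking MC uncertainty.
def keff_mc(x, mc_sigma=2e-4):
    return 1.00 + 0.010 * x[0] - 0.004 * x[1] + rng.normal(0.0, mc_sigma)

n_samples, n_params = 200, 2
X = rng.normal(0.0, 1.0, (n_samples, n_params))   # sampled uncertain parameters (standardized)
y = np.array([keff_mc(x) for x in X])             # "brute-force" MC responses

# Fit a linear surrogate  keff ~ b0 + b.x  by least squares.
A = np.column_stack([np.ones(n_samples), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
b0, b = coef[0], coef[1:]

# Propagate parameter uncertainty through the surrogate analytically
# (independent unit-variance parameters assumed here).
sigma_keff_surrogate = np.sqrt(np.sum(b**2))
sigma_keff_brute = y.std(ddof=1)                  # includes MC noise as well

print(f"surrogate keff std : {sigma_keff_surrogate:.5f}")
print(f"brute-force keff std: {sigma_keff_brute:.5f}")
print(f"fitted sensitivity coefficients: {b}")
```

Rerunning with a larger mc_sigma shows how Monte Carlo noise distorts the fitted coefficient of the low-influence second parameter, which is the effect the paper examines.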

A PRACTICAL LOOK AT MONTE CARLO VARIANCE REDUCTION METHODS IN RADIATION SHIELDING

  • Olsher, Richard H.
    • Nuclear Engineering and Technology / v.38 no.3 / pp.225-230 / 2006
  • With the advent of inexpensive computing power over the past two decades, applications of Monte Carlo radiation transport techniques have proliferated dramatically. At Los Alamos, the Monte Carlo codes MCNP5 and MCNPX are used routinely on personal computer platforms for radiation shielding analysis and dosimetry calculations. These codes feature a rich palette of variance reduction (VR) techniques. The motivation of VR is to exchange user effort for computational efficiency: a few hours of user time can often reduce computational time by several orders of magnitude. Unfortunately, user time can stretch into many hours, as most VR techniques require significant user experience and intervention for proper optimization. The purpose of this paper is to outline VR strategies, tested in practice and optimized for several common radiation shielding tasks, with the hope of reducing user setup time for similar problems. A strategy is defined in this context as a collection of MCNP radiation transport physics options and VR techniques that work synergistically to optimize a particular shielding task. Examples are offered in the areas of source definition, skyshine, streaming, and transmission.
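As a concrete illustration of one common VR building block, the sketch below applies a simple weight window to a single particle: particles above the window are split, particles below it play Russian roulette, and the expected weight is preserved so the estimate stays unbiased. The window bounds and survival weight are illustrative and are not MCNP defaults.

```python
import numpy as np

rng = np.random.default_rng(2)

def apply_weight_window(weight, w_low=0.25, w_high=1.0, w_survive=0.5):
    """Generic weight-window check for one particle.
    Returns the list of particle weights after splitting or roulette."""
    if weight > w_high:
        # splitting: replace the particle by n copies of lower weight
        n = int(np.ceil(weight / w_survive))
        return [weight / n] * n
    if weight < w_low:
        # Russian roulette: survive with probability weight/w_survive at the
        # survival weight, otherwise kill; expected weight is unchanged
        if rng.random() < weight / w_survive:
            return [w_survive]
        return []
    return [weight]

print(apply_weight_window(3.2))   # heavy particle -> split into several copies
print(apply_weight_window(0.1))   # light particle -> killed or promoted to w_survive
```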

Geant 4 Monte Carlo simulation for I-125 brachytherapy

  • Jie Liu;M.E. Medhat;A.M.M. Elsayed
    • Nuclear Engineering and Technology / v.56 no.7 / pp.2516-2523 / 2024
  • This study aims to validate the dosimetric characteristics of a Low Dose Rate (LDR) I-125 source using a Geant4-based Monte Carlo code. According to the recommendation of the American Association of Physicists in Medicine (AAPM) task group report TG-43, the dosimetric parameters of a new brachytherapy source should be verified either experimentally or theoretically before clinical use. Such simulation studies are important because brachytherapy delivers a high dose of radiation to the tumor with only a minimal dose to the surrounding tissues. The brachytherapy example distributed with the GEANT4 Monte Carlo simulation toolkit was modified and adapted, and several updated techniques were developed to facilitate and streamline radiotherapy simulation. The close agreement of the present results with the consensus data and with the results of other MC-based studies is promising. It implies that Geant4-based Monte Carlo simulation has the potential to be used as a reliable, standard simulation code in the field of brachytherapy for verification and treatment-planning purposes.
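In its point-source approximation (and ignoring the anisotropy factor), the TG-43 formalism mentioned above reduces to D-dot(r) = S_K · Λ · (r0/r)² · g(r). The sketch below evaluates this expression with placeholder values; the radial dose function table and dose-rate constant are illustrative assumptions and should be replaced with the AAPM consensus data for a specific I-125 seed, not taken as the paper's Geant4 model.

```python
import numpy as np

# Illustrative TG-43 point-source approximation:
#   D_dot(r) = S_K * Lambda * (r0/r)**2 * g(r)
# where S_K is the air-kerma strength, Lambda the dose-rate constant,
# g(r) the radial dose function, and r0 = 1 cm the reference distance.

def radial_dose_function(r, r_tab, g_tab):
    """Interpolate a tabulated radial dose function g(r)."""
    return np.interp(r, r_tab, g_tab)

def dose_rate_point_source(r, S_K, Lambda, r_tab, g_tab, r0=1.0):
    return S_K * Lambda * (r0 / r) ** 2 * radial_dose_function(r, r_tab, g_tab)

# Placeholder table; real I-125 consensus data should come from TG-43U1.
r_tab = np.array([0.5, 1.0, 2.0, 3.0, 5.0])        # cm
g_tab = np.array([1.05, 1.00, 0.86, 0.72, 0.46])   # dimensionless (illustrative)

# S_K in U, Lambda in cGy h^-1 U^-1 (placeholder value)
print(dose_rate_point_source(r=2.0, S_K=1.0, Lambda=0.965,
                             r_tab=r_tab, g_tab=g_tab))
```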

A Methodology on Treating Uncertainty of LCI Data using Monte Carlo Simulation (몬테카를로 시뮬레이션을 이용한 LCI data 불확실성 처리 방법론)

  • Park Ji-Hyung;Seo Kwang-Kyu
    • Journal of the Korean Society for Precision Engineering / v.21 no.12 / pp.109-118 / 2004
  • Life cycle assessment (LCA) usually involves some uncertainty. These uncertainties generally fall into two categories in the life cycle inventory (LCI): lack of data and data inaccuracy. This paper explores a methodology for dealing with uncertainty due to lack of data in LCI. In order to treat the uncertainty of LCI data, a model for data uncertainty is proposed. The model takes probabilistic curves as inputs and uses Monte Carlo simulation techniques to propagate uncertainty. The probabilistic curves were derived from the results of a survey of an expert network, and Monte Carlo simulation was performed using the derived curves. The results of the Monte Carlo simulation were verified by statistical tests. The proposed approach should serve as a guide to improve data quality and deal with the uncertainty of LCI data in LCA projects.
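The propagation step can be sketched as follows: each LCI input is represented by a probability distribution (lognormal here, a common choice for inventory data, standing in for the survey-derived probabilistic curves), samples are drawn, and the impact total is accumulated. The inputs, distribution parameters, and emission factors below are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical LCI inputs with lognormal uncertainty (median, geometric std. dev.)
inputs = {
    "electricity_kwh": {"median": 12.0, "gsd": 1.2},
    "steel_kg":        {"median": 3.5,  "gsd": 1.5},
    "transport_tkm":   {"median": 40.0, "gsd": 1.3},
}
# Hypothetical emission factors (kg CO2-eq per unit of each input).
factors = {"electricity_kwh": 0.49, "steel_kg": 1.9, "transport_tkm": 0.062}

n = 10_000
totals = np.zeros(n)
for name, p in inputs.items():
    samples = rng.lognormal(np.log(p["median"]), np.log(p["gsd"]), n)
    totals += factors[name] * samples

print(f"mean impact: {totals.mean():.2f} kg CO2-eq")
print(f"95% interval: {np.percentile(totals, [2.5, 97.5])}")
```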

Evaluation of Artificial Intelligence-Based Denoising Methods for Global Illumination

  • Faradounbeh, Soroor Malekmohammadi;Kim, SeongKi
    • Journal of Information Processing Systems / v.17 no.4 / pp.737-753 / 2021
  • As the demand for high-quality rendering for mixed reality, video games, and simulation has increased, global illumination has been actively researched. Monte Carlo path tracing can realize global illumination and produce photorealistic scenes that include critical effects such as color bleeding, caustics, multiple lights, and shadows. If the sampling rate is insufficient, however, the rendered results contain a large amount of noise. The most successful approaches to eliminating or reducing Monte Carlo noise use feature-based filters, which exploit scene characteristics such as the position in world coordinates and the shading normal. In general, these techniques operate on denoised pixels or samples and are computationally expensive, and the main challenge for all of them is to find appropriate weights for every feature while preserving the details of the scene. In this paper, we compare recent algorithms for removing Monte Carlo noise in terms of their performance and quality, and describe their advantages and disadvantages. As far as we know, this study is the first to compare artificial intelligence-based denoising methods for Monte Carlo rendering.
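A toy version of the classical feature-based filtering idea described above (not one of the AI-based methods compared in the paper) is sketched below: each pixel is averaged with its neighbours using weights driven by differences in noisy color, shading normal, and world position.

```python
import numpy as np

def cross_bilateral_denoise(color, normals, positions, radius=3,
                            sigma_c=0.2, sigma_n=0.3, sigma_p=0.5):
    """Toy feature-based filter for Monte Carlo renderings. All inputs are
    H x W x 3 arrays; weights fall off with differences in color and in the
    auxiliary features (shading normal, world position)."""
    H, W, _ = color.shape
    out = np.zeros_like(color)
    for y in range(H):
        for x in range(W):
            y0, y1 = max(0, y - radius), min(H, y + radius + 1)
            x0, x1 = max(0, x - radius), min(W, x + radius + 1)
            dc = color[y0:y1, x0:x1] - color[y, x]
            dn = normals[y0:y1, x0:x1] - normals[y, x]
            dp = positions[y0:y1, x0:x1] - positions[y, x]
            w = np.exp(-(dc**2).sum(-1) / (2 * sigma_c**2)
                       - (dn**2).sum(-1) / (2 * sigma_n**2)
                       - (dp**2).sum(-1) / (2 * sigma_p**2))
            out[y, x] = (w[..., None] * color[y0:y1, x0:x1]).sum((0, 1)) / w.sum()
    return out

# tiny usage example with random placeholder buffers
rng = np.random.default_rng(7)
img = rng.random((8, 8, 3))
print(cross_bilateral_denoise(img, rng.random((8, 8, 3)), rng.random((8, 8, 3))).shape)
```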

Analysis of inconsistent source sampling in Monte Carlo weight-window variance reduction methods

  • Griesheimer, David P.;Sandhu, Virinder S.
    • Nuclear Engineering and Technology / v.49 no.6 / pp.1172-1180 / 2017
  • The application of Monte Carlo (MC) to large-scale fixed-source problems has recently become possible with new hybrid methods that automate generation of parameters for variance reduction techniques. Two common variance reduction techniques, weight windows and source biasing, have been automated and popularized by the consistent adjoint-driven importance sampling (CADIS) method. This method uses the adjoint solution from an inexpensive deterministic calculation to define a consistent set of weight windows and source particles for a subsequent MC calculation. One of the motivations for source consistency is to avoid the splitting or rouletting of particles at birth, which requires computational resources. However, it is not always possible or desirable to implement such consistency, which results in inconsistent source biasing. This paper develops an original framework that mathematically expresses the coupling of the weight window and source biasing techniques, allowing the authors to explore the impact of inconsistent source sampling on the variance of MC results. A numerical experiment supports this new framework and suggests that certain classes of problems may be relatively insensitive to inconsistent source sampling schemes with moderate levels of splitting and rouletting.
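The effect of inconsistent source biasing can be illustrated with a small sketch: particles are drawn from a biased source density, carry birth weights w = p(x)/q(x), and are immediately checked against a weight window, so mismatched weights trigger splitting or roulette at birth. The densities and window bounds are made up for the example and are not the paper's framework.

```python
import numpy as np

rng = np.random.default_rng(4)

def sample_biased_births(n, w_low=0.8, w_high=1.25, w_survive=1.0):
    """Illustrative inconsistent source biasing: the true source p(x) is uniform
    on [0, 10], particles are drawn from a linearly decreasing biased density
    q(x) = 0.15 - 0.01*x, and each birth weight is w = p(x)/q(x). Because the
    birth weights are not matched to the weight window, some particles are
    split or rouletted immediately at birth."""
    u = rng.random(n)
    x = (0.15 - np.sqrt(0.0225 - 0.02 * u)) / 0.01   # inverse-CDF sample from q
    w = 0.1 / (0.15 - 0.01 * x)                      # birth weight p(x)/q(x)

    births = []                                      # (position, weight) after the window check
    for xi, wi in zip(x, w):
        if wi > w_high:                              # split heavy particles
            k = int(np.ceil(wi / w_survive))
            births.extend([(xi, wi / k)] * k)
        elif wi < w_low:                             # Russian roulette for light particles
            if rng.random() < wi / w_survive:
                births.append((xi, w_survive))
        else:
            births.append((xi, wi))
    return births

print(len(sample_biased_births(1000)))   # differs from 1000 because of birth splitting/roulette
```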

Simplification of Monte Carlo Techniques for the Estimation of Expected Benefits in Stochastic Analysis of Multiple Reservoir Systems (저수지군으로부터 기대편익 산정을 위한 Monte Carlo 기법의 간략화)

  • 이광만;고석구
    • Water for future / v.26 no.2 / pp.89-97 / 1993
  • For system benefit optimization that considers risk or reliability in a multiple reservoir system using the Monte Carlo technique, many stochastically generated inflow series have to be used in the system analysis. In this study, the stochastically generated inflow series for multiple reservoir system operation are preprocessed according to the system objectives and operating time periods under consideration. Through this procedure, several representative inflow series with discrete probability levels and operation horizons are selected from the thousands of generated inflows. A deterministic optimization technique is then applied to estimate the power energy of the Han River reservoir system, which comprises five reservoirs in this study. This required far less computation than the original Monte Carlo technique, while the estimated result was almost the same.
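The preprocessing idea can be sketched minimally: generate many synthetic inflow series, rank them by a system-relevant statistic (here the total inflow over the operating horizon), and keep only a few series at chosen probability levels. The generation model and probability levels below are illustrative, not the paper's.

```python
import numpy as np

rng = np.random.default_rng(5)

n_series, horizon = 1000, 12                       # monthly series, one-year horizon
inflows = rng.lognormal(mean=3.0, sigma=0.4, size=(n_series, horizon))

totals = inflows.sum(axis=1)                       # statistic used to rank the series
order = np.argsort(totals)
levels = [0.1, 0.3, 0.5, 0.7, 0.9]                 # discrete probability levels
representatives = inflows[[order[int(p * (n_series - 1))] for p in levels]]

# Each representative series would then be fed to a deterministic optimization
# of the multi-reservoir operation instead of all 1000 generated series.
print(representatives.shape)   # (5, 12)
```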


Application of Variance Reduction Techniques for the Improvement of Monte Carlo Dose Calculation Efficiency (분산 감소 기법에 의한 몬테칼로 선량 계산 효율 평가)

  • Park, Chang-Hyun;Park, Sung-Yong;Park, Dal
    • Progress in Medical Physics / v.14 no.4 / pp.240-248 / 2003
  • The Monte Carlo calculation is the most accurate means of predicting radiation dose, but its accuracy comes at the cost of the time required to produce a statistically meaningful dose distribution. In this study, the effects on calculation time of introducing variance reduction techniques and of increasing computing power were estimated for the Monte Carlo dose calculation of a 6 MV photon beam from the Varian 600 C/D, while maintaining the accuracy of the Monte Carlo results. The EGSnrc-based BEAMnrc code was used to simulate the beam and the EGSnrc-based DOSXYZnrc code to calculate dose distributions. Variance reduction techniques in the codes were used to implement reduced physics, and a computer cluster consisting of ten PCs was built for parallel computing. As a result, computation time was reduced more by the variance reduction techniques than by the increase in computing power. Because reducing the computation time through improvements in computing power alone is not yet sufficient for clinical use of Monte Carlo dose calculation, introducing reduced physics into the Monte Carlo calculation is unavoidable at this point. A more active investigation of existing and new reduced-physics approaches is therefore required.
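Comparisons of this kind are commonly summarized with the Monte Carlo figure of merit, FOM = 1/(R²·T), where R is the relative statistical error of the tally and T the computation time. The sketch below computes it for two hypothetical runs; the numbers are illustrative and are not taken from the paper.

```python
# Figure of merit FOM = 1 / (R^2 * T): higher means the same statistical
# precision is reached in less time.

def figure_of_merit(rel_error, time_minutes):
    return 1.0 / (rel_error**2 * time_minutes)

analog_10_cpus = figure_of_merit(rel_error=0.02, time_minutes=60.0)   # brute force on a cluster (hypothetical)
variance_reduced = figure_of_merit(rel_error=0.02, time_minutes=12.0) # same error with reduced physics/VR (hypothetical)

print(f"FOM, analog on 10 CPUs : {analog_10_cpus:.1f}")
print(f"FOM, variance reduction: {variance_reduced:.1f}")
```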


Structural reliability estimation using Monte Carlo simulation and Pearson's curves

  • Krakovski, Mikhail B.
    • Structural Engineering and Mechanics / v.3 no.3 / pp.201-213 / 1995
  • At present, Level 2 and importance sampling methods are the main tools used to estimate the reliability of structural systems, but applying these techniques to realistic problems sometimes involves certain difficulties. To overcome them, it is suggested to use Monte Carlo simulation in combination with two other techniques, extreme value and tail entropy approximations; an appropriate Pearson's curve is fitted to represent the simulation results. On the basis of this approach, an algorithm and computer program for structural reliability estimation are developed. A number of specially chosen numerical examples are considered with the aim of checking the accuracy of the approach and comparing it with the Level 2 and importance sampling methods. The field of application of the approach is identified.
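For orientation, the sketch below shows crude Monte Carlo reliability estimation for an illustrative limit-state function g(R, S) = R - S (failure when g < 0), together with the first four moments of g that a Pearson curve would be fitted to. The distributions are made up, and the paper's extreme-value, tail-entropy, and Pearson-curve refinements are not reproduced here.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(6)

n = 200_000
R = rng.normal(30.0, 3.0, n)     # resistance (illustrative distribution)
S = rng.normal(20.0, 4.0, n)     # load effect (illustrative distribution)
g = R - S                        # limit-state function; failure when g < 0

p_f = np.mean(g < 0.0)                               # crude MC failure probability
beta = -stats.norm.ppf(p_f) if p_f > 0 else np.inf   # generalized reliability index

# First four moments of g, the quantities a Pearson curve would be fitted to.
moments = (g.mean(), g.std(ddof=1), stats.skew(g), stats.kurtosis(g, fisher=False))
print(f"failure probability: {p_f:.2e}, reliability index: {beta:.2f}")
print("moments (mean, std, skewness, kurtosis):", moments)
```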