• Title/Summary/Keyword: Monte Carlo codes

Search Results: 120

The Analysis and Design of Tree-LDPC codes with EXIT charts (EXIT charts를 이용한 Tree-LDPC 코드의 분석 및 설계)

  • Lee, Sung-Jun;Heo, Jun
    • Proceedings of the IEEK Conference / 2006.06a / pp.1049-1050 / 2006
  • In this paper, we present an analysis of Tree-LDPC codes using EXIT (extrinsic information transfer) charts. Two EXIT chart schemes are compared: one based on a closed-form equation and the other on Monte Carlo simulation. The thresholds obtained by these two schemes match well with the threshold given by the density evolution (DE) scheme. Simulation performance at the obtained thresholds is also shown.

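The abstract above contrasts a closed-form EXIT computation with a Monte Carlo one. As a minimal sketch (not the paper's Tree-LDPC analysis), the extrinsic mutual information of a consistency-condition Gaussian LLR model can be estimated both ways; the noise parameter `sigma` and the sample sizes are illustrative assumptions, and the closed-form route is represented here by direct numerical integration of the same Gaussian density:

```python
import numpy as np

rng = np.random.default_rng(0)

def mi_monte_carlo(sigma, n=200_000):
    # Consistent Gaussian LLR model: L ~ N(sigma^2/2, sigma^2) given x = +1
    llr = rng.normal(sigma**2 / 2, sigma, size=n)
    # I(X; L) = 1 - E[log2(1 + exp(-L))] for equiprobable BPSK input
    return 1.0 - np.mean(np.log2(1.0 + np.exp(-llr)))

def mi_quadrature(sigma):
    # Same expectation evaluated by quadrature over the known LLR density
    l = np.linspace(-40.0, 40.0, 20_001)
    pdf = np.exp(-(l - sigma**2 / 2) ** 2 / (2 * sigma**2)) \
        / np.sqrt(2 * np.pi * sigma**2)
    return 1.0 - np.sum(np.log2(1.0 + np.exp(-l)) * pdf) * (l[1] - l[0])

sigma = 1.5
i_mc = mi_monte_carlo(sigma)   # Monte Carlo estimate of the mutual information
i_cf = mi_quadrature(sigma)    # deterministic evaluation for comparison
```

The two estimates agreeing closely mirrors the paper's observation that the simulation-based and closed-form EXIT schemes give matching thresholds.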

Application of Variance Reduction Techniques for the Improvement of Monte Carlo Dose Calculation Efficiency (분산 감소 기법에 의한 몬테칼로 선량 계산 효율 평가)

  • Park, Chang-Hyun;Park, Sung-Yong;Park, Dal
    • Progress in Medical Physics / v.14 no.4 / pp.240-248 / 2003
  • The Monte Carlo calculation is the most accurate means of predicting radiation dose, but its accuracy comes at the cost of the time required to produce a statistically meaningful dose distribution. In this study, the effects on calculation time of introducing variance reduction techniques and of increasing computing power were estimated for the Monte Carlo dose calculation of a 6 MV photon beam from the Varian 600 C/D, while maintaining the accuracy of the Monte Carlo results. The EGSnrc-based BEAMnrc code was used to simulate the beam and the EGSnrc-based DOSXYZnrc code to calculate dose distributions. Variance reduction techniques in the codes were used to implement reduced physics, and a computer cluster of ten PCs was built for parallel computing. As a result, computation time was reduced more by the variance reduction techniques than by the increase in computing power. Because clinical use of Monte Carlo dose calculation cannot yet be enabled by reducing computation time through improvements in computing power alone, introducing reduced physics into the Monte Carlo calculation is inevitable at this point. Therefore, more active investigation of existing or new reduced-physics approaches is required.

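BEAMnrc/DOSXYZnrc implement specific variance reduction techniques (photon splitting, Russian roulette, and the like); as a generic illustration of the underlying idea, the toy below estimates a rare-event probability by plain (analog) Monte Carlo and by importance sampling, showing the reduced estimator variance at equal sample count. The target probability and shift are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
threshold = 3.0  # estimate P(X > 3) for X ~ N(0, 1), a rare event

# Analog Monte Carlo: very few samples land in the tail
x = rng.standard_normal(n)
hits = (x > threshold).astype(float)
p_naive = hits.mean()
var_naive = hits.var() / n

# Importance sampling: draw from a normal shifted into the tail,
# reweight each sample by the density ratio N(0,1)/N(3,1)
y = rng.normal(threshold, 1.0, size=n)
w = np.exp(-y**2 / 2) / np.exp(-(y - threshold)**2 / 2)
contrib = w * (y > threshold)
p_is = contrib.mean()
var_is = contrib.var() / n
```

Both estimators are unbiased for the same quantity, but the importance-sampled one has a far smaller variance, which is exactly the trade the abstract describes: less computation time for the same statistical accuracy.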

Three-dimensional Self-consistent Particle-in-cell and Monte Carlo Collisional Simulation of DC Magnetron Discharges

  • Kim, Seong-Bong;Chang, Hyon-U;Yoo, Suk-Jae;Oh, Ji-Young;Park, Jang-Sik
    • Proceedings of the Korean Vacuum Society Conference / 2012.02a / pp.526-526 / 2012
  • DC magnetron discharges were studied using three-dimensional self-consistent particle-in-cell and Monte Carlo collisional (PIC-MCC) simulation codes. Two rectangular sputter sources (120 mm × 250 mm and 380 mm × 200 mm target sizes) were used in the simulation modeling. The number of ions incident on the Cu target was obtained as a function of position and simulation time. The target erosion profile was then calculated from the incident ions and the sputtering yields of the Cu target computed with the SRIM code. The maximum ion density in the discharge was about 10^10 cm^-3 owing to the calculation speed limit; this may be one to two orders of magnitude smaller than the real maximum ion density. Nevertheless, the calculated target erosion profiles of the two sputter sources agreed well with the measured profiles, except near the target surface, where the measured erosion width was broader than the simulated one.


A Study on the performance evaluation with TCM and MTCM in the mobile radio environment (이동 무선 환경에서의 TCM 및 MTCM의 성능 비교 평가)

  • 김민호
    • Journal of the Korea Society of Computer and Information / v.5 no.4 / pp.90-95 / 2000
  • To enhance reliability in mobile communication and improve bit-error performance, coding methods are used. In this case, redundancy bits must be added using error-correcting codes such as block or convolutional codes. The redundancy improves reliability but reduces bandwidth efficiency. We have therefore studied coding methods that achieve a good coding gain without changing the data transmission rate in a limited bandwidth. In this paper, we design TCM (trellis-coded modulation), proposed by Ungerboeck, and MTCM (multiple TCM) with multiplicity k=2, proposed by Divsalar, using the optimum encoder. Varying the number of states of the optimum encoder, we compare the performance of TCM and MTCM by Monte Carlo simulation.

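The performance comparison above rests on Monte Carlo bit-error-rate simulation: transmit random bits through a noisy channel, decode, and count errors. A minimal sketch of that method for uncoded BPSK over AWGN (not the paper's TCM/MTCM schemes; the Eb/N0 point and bit count are illustrative) is:

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(2)

def ber_monte_carlo(ebn0_db, n_bits=1_000_000):
    # Uncoded BPSK over AWGN: map bits to ±1, add noise, hard-decide by sign
    ebn0 = 10 ** (ebn0_db / 10)
    bits = rng.integers(0, 2, size=n_bits)
    symbols = 1.0 - 2.0 * bits                      # 0 -> +1, 1 -> -1
    noise = rng.standard_normal(n_bits) / np.sqrt(2 * ebn0)
    decisions = (symbols + noise < 0).astype(int)   # decide 1 if negative
    return np.mean(decisions != bits)

def ber_theory(ebn0_db):
    # Known closed form for BPSK/AWGN, used here only as a sanity check
    ebn0 = 10 ** (ebn0_db / 10)
    return 0.5 * erfc(sqrt(ebn0))

ber_sim = ber_monte_carlo(4.0)   # simulated BER at Eb/N0 = 4 dB
ber_ref = ber_theory(4.0)
```

For coded schemes such as TCM and MTCM no such closed form exists in general, which is why the paper falls back on exactly this counting procedure.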

Nonlinear finite element analysis of reinforced concrete corbels at both deterministic and probabilistic levels

  • Strauss, Alfred;Mordini, Andrea;Bergmeister, Konrad
    • Computers and Concrete / v.3 no.2_3 / pp.123-144 / 2006
  • Reinforced concrete corbels are structural elements widely used in practical engineering. The complex response of these elements is described in design codes only in a simplified manner. These formulations are not sufficient to capture the real behavior, which, however, is an essential prerequisite for the manufacturing of numerous elements. Therefore, a deterministic and probabilistic study has been performed, which is described in this contribution. Real complex structures have been modeled by means of the finite element method, supported primarily by experimental work. The main objective of this study was the detection of uncertainty effects and safety margins not captured by traditional codes. This aim was fulfilled by statistical considerations applied to the investigated structures. The probabilistic study is based on advanced Monte Carlo simulation techniques and sophisticated nonlinear finite element formulations.
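In a probabilistic structural study of this kind, Monte Carlo simulation estimates a failure probability by sampling the random inputs and counting limit-state violations. A toy sketch with a linear limit state g = R − S and hypothetical normal distributions for resistance and load (chosen so the answer can be checked in closed form; the real study uses nonlinear finite element models) is:

```python
import numpy as np
from math import erf, sqrt

rng = np.random.default_rng(3)
n = 1_000_000

# Hypothetical independent normal variables (units, e.g., kN)
r = rng.normal(100.0, 10.0, size=n)   # resistance R
s = rng.normal(60.0, 15.0, size=n)    # load effect S

# Monte Carlo failure probability: fraction of samples with g = R - S < 0
pf_mc = np.mean(r - s < 0)

# For independent normals, exactly Pf = Phi(-beta), beta = mu_M / sigma_M
beta = (100.0 - 60.0) / sqrt(10.0**2 + 15.0**2)
pf_exact = 0.5 * (1 + erf(-beta / sqrt(2)))
```

With a nonlinear finite element model in place of `r - s`, the counting step is identical; only each "sample" becomes a full structural analysis, which is why advanced (variance-reduced) Monte Carlo techniques matter there.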

Bayesian analysis of financial volatilities addressing long-memory, conditional heteroscedasticity and skewed error distribution

  • Oh, Rosy;Shin, Dong Wan;Oh, Man-Suk
    • Communications for Statistical Applications and Methods / v.24 no.5 / pp.507-518 / 2017
  • Volatility plays a crucial role in the theory and applications of asset pricing, optimal portfolio allocation, and risk management. This paper proposes a model combining an autoregressive fractionally integrated moving average (ARFIMA), generalized autoregressive conditional heteroscedasticity (GARCH), and a skewed-t error distribution to accommodate important features of volatility data: long memory, heteroscedasticity, and an asymmetric error distribution. A fully Bayesian approach is proposed to estimate the parameters of the model simultaneously, which yields parameter estimates satisfying the necessary constraints of the model. The approach can be easily implemented using JAGS, a free and user-friendly software package, to generate Markov chain Monte Carlo samples from the joint posterior distribution of the parameters. The method is illustrated using a daily volatility index from the Chicago Board Options Exchange (CBOE). JAGS code for the model specification is provided in the Appendix.
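The paper delegates the Markov chain Monte Carlo sampling to JAGS; as a language-agnostic illustration of what such a sampler does, the sketch below runs a random-walk Metropolis chain on a deliberately tiny toy model (a normal mean with a normal prior, nothing like the ARFIMA-GARCH posterior) where the exact posterior is available for comparison. All data and tuning constants are illustrative:

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy data: 200 draws from N(2, 1); prior mu ~ N(0, 10^2), known unit variance
y = rng.normal(2.0, 1.0, size=200)

def log_post(mu):
    # Log posterior density up to an additive constant
    return -mu**2 / (2 * 10.0**2) - 0.5 * np.sum((y - mu) ** 2)

# Random-walk Metropolis: propose a local move, accept with prob min(1, ratio)
chain = np.empty(20_000)
mu, lp = 0.0, log_post(0.0)
for i in range(chain.size):
    prop = mu + rng.normal(0.0, 0.2)
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        mu, lp = prop, lp_prop
    chain[i] = mu

post_mean = chain[5_000:].mean()  # posterior mean after discarding burn-in

# Conjugacy gives the exact posterior mean for this toy model
exact_mean = y.sum() / (len(y) + 1 / 10.0**2)
```

Gibbs-type samplers like JAGS automate the proposal and acceptance machinery per parameter, which is what makes fitting a constrained multi-parameter model like ARFIMA-GARCH-skewed-t practical.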

Evaluation of accidental eccentricity for buildings by artificial neural networks

  • Badaoui, M.;Chateauneuf, A.;Fournely, E.;Bourahla, N.;Bensaibi, M.
    • Structural Engineering and Mechanics / v.41 no.4 / pp.527-538 / 2012
  • In seismic analyses of structures, an additional eccentricity is introduced to account for oscillations of random and unknown origin. In many codes of practice, torsion about the vertical axis is considered through an empirical accidental-eccentricity formulation. Due to the random nature of structural systems, it is very difficult to evaluate the accidental eccentricity in a deterministic way and to specify its effect on the overall seismic response of structures. The aim of this study is to develop a procedure for evaluating the accidental eccentricity induced by uncertainties in the stiffness and mass of structural members, using neural network techniques coupled with Monte Carlo simulation. The method gives very interesting results for single-story structures. For real structures, it can be used as a tool to determine the accidental eccentricity in seismic vulnerability studies of buildings.

Error Rate and Capacity Analysis for Incremental Hybrid DAF Relaying using Polar Codes

  • Madhusudhanan, Natarajan;Venkateswari, Rajamanickam
    • ETRI Journal / v.40 no.3 / pp.291-302 / 2018
  • The deployment of an incremental hybrid decode-amplify-and-forward relaying scheme is a promising and superior solution for cellular networks to meet ever-growing traffic demands. However, the selection of a suitable relaying protocol based on a signal-to-noise ratio threshold is important for realizing improved quality of service. In this paper, an incremental hybrid relaying protocol is proposed using polar codes. The proposed protocol achieves better performance than existing turbo codes in terms of capacity. Simulation results show that polar codes through an incremental hybrid decode-amplify-and-forward relay can provide a 38% gain when the thresholds γ_th(1) and γ_th(2) are optimal. Further, the channel capacity is improved to 17.5 b/s/Hz and 23 b/s/Hz for 2×2 MIMO and 4×4 MIMO systems, respectively. Monte Carlo simulations are carried out to obtain the optimal solution.

Calculation of the Correction Factors related to the Diameter and Density of the Concrete Core Samples using a Monte Carlo Simulation (몬테카를로 전산해석을 이용한 콘크리트 코어시료의 직경과 밀도에 따른 보정인자 계산)

  • Lee, Kyu-Young;Kang, Bo Sun
    • Journal of the Korean Society of Radiology / v.14 no.5 / pp.503-510 / 2020
  • Concrete is one of the most widely used materials for the shielding structures of nuclear facilities, and it is also the radioactive waste generated in the largest quantity when such facilities are dismantled. Since concrete captures neutrons and generates various radionuclides, radiation measurement and analysis of samples must be carried out before dismantling. An HPGe detector is generally used for the radiation measurement, and correction factors such as the geometrical correction, the self-absorption correction, and the absolute detector efficiency have to be applied to the measured data to determine the exact radioactivity of the sample. Correction factors are normally obtained by measuring a standard source with the same geometry and chemical state as the sample under the same measurement conditions. However, it is very difficult to prepare standard concrete sources, because the various constituent materials and high density of concrete limit its pretreatment. In addition, a concrete sample obtained by core drilling is a volumetric source, which requires a geometric correction for the sample diameter and a self-absorption correction for the sample density. Therefore, in recent years many researchers have calculated effective correction factors using Monte Carlo simulation instead of measuring them with a standard source. In this study we used Geant4, one of the Monte Carlo codes, to calculate the correction factors for various diameters and densities of concrete core samples at the gamma-ray energies emitted by 152Eu and 60Co, the nuclides most commonly produced in radioactive concrete.
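A density-dependent self-absorption correction of the kind computed above is, at heart, a ratio of gamma-ray escape probabilities between a reference condition and the actual sample. Full transport needs a code like Geant4; the toy below reduces the geometry to one dimension (photons born uniformly in depth, travelling straight out) so the Monte Carlo estimate can be checked analytically. The attenuation coefficients and core length are hypothetical, not measured values:

```python
import numpy as np

rng = np.random.default_rng(5)

def escape_fraction(mu, thickness, n=500_000):
    # Photons born uniformly in depth, heading straight for the detector
    # face; survival probability along a path of length d is exp(-mu * d)
    depth = rng.uniform(0.0, thickness, size=n)
    return np.mean(np.exp(-mu * depth))

# Hypothetical linear attenuation coefficients (cm^-1) near 1.3 MeV for
# concrete at a reference density and at the sample's higher density
mu_ref, mu_sample = 0.12, 0.15
t = 5.0  # illustrative core length, cm

# Self-absorption correction factor: ratio of escape probabilities
corr = escape_fraction(mu_ref, t) / escape_fraction(mu_sample, t)
```

Here the analytic escape fraction is (1 − exp(−μt))/(μt), so the ratio can be verified by hand; a real calculation replaces the straight-line 1-D paths with full 3-D transport through the cylindrical core and detector geometry.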

Development of transient Monte Carlo in a fissile system with β-delayed emission from individual precursors using modified open source code OpenMC(TD)

  • J. Romero-Barrientos;F. Molina;J.I. Marquez Damian;M. Zambra;P. Aguilera;F. Lopez-Usquiano;S. Parra
    • Nuclear Engineering and Technology / v.55 no.5 / pp.1593-1603 / 2023
  • In deterministic and Monte Carlo transport codes, β-delayed emission is included using a group structure in which all of the precursors are lumped into 6 groups or families; given the increase in computational power, there is nowadays no reason to keep this structure. Furthermore, there have been recent efforts to compile and evaluate all the available β-delayed neutron emission data and to measure new and improved data on individual precursors. To perform a transient Monte Carlo simulation, data from individual precursors must be implemented in a transport code. This work is the first step towards the development of a tool to explore the effect of individual precursors in a fissile system. Concretely, individual precursor data are included by expanding the capabilities of the open-source Monte Carlo code OpenMC. In the modified code, named Time-Dependent OpenMC or OpenMC(TD), the time dependence related to β-delayed neutron emission is handled by forced decay of precursors and combing of the particle population. Continuous-energy neutron cross-sections were taken from the JEFF-3.1.1 library. For the individual precursors, cumulative yields were taken from JEFF-3.1.1, and delayed-neutron emission probabilities and delayed-neutron spectra were taken from ENDF/B-VIII.0. OpenMC(TD) was tested in a monoenergetic system; in an energy-dependent, unmoderated system where the precursors were treated individually or in a group structure; and in a light-water-moderated, energy-dependent system using 6 groups, 50, and 40 individual precursors. The neutron flux as a function of time was obtained for each of the systems studied. These results show the potential of OpenMC(TD) as a tool to study the impact of individual precursor data on fissile systems, motivating further research on more complex fissile systems.
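The difference between grouped and individual precursor treatments comes down to the distribution of delayed-neutron emission times: each precursor decays exponentially with its own decay constant, and a group lumps several constants into one. A toy sketch of sampling emission times from individual precursors (two hypothetical nuclides with made-up decay constants and relative delayed-neutron fractions; real values come from the JEFF/ENDF evaluations the paper uses, and OpenMC(TD) additionally applies forced decay and population combing for variance control) is:

```python
import numpy as np

rng = np.random.default_rng(6)

# Two hypothetical precursors: decay constants (1/s) and relative
# delayed-neutron fractions. Illustrative numbers only.
lam = np.array([0.0124, 0.305])
frac = np.array([0.4, 0.6])

# Analog sampling: pick a precursor by its fraction, then sample its
# exponential decay time as the delayed-neutron emission time
n = 500_000
which = rng.choice(2, size=n, p=frac)
t_emit = rng.exponential(1.0 / lam[which])

mean_mc = t_emit.mean()
mean_exact = np.sum(frac / lam)  # mixture mean: sum_i f_i / lambda_i
```

With dozens of individual precursors instead of 6 effective groups, the sampled emission-time distribution changes shape, which is precisely the effect on the time-dependent neutron flux that OpenMC(TD) is built to expose.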