Uncertainty and sensitivity analysis


Evaluation of the CNESTEN's TRIGA Mark II research reactor physical parameters with TRIPOLI-4® and MCNP

  • H. Ghninou;A. Gruel;A. Lyoussi;C. Reynard-Carette;C. El Younoussi;B. El Bakkari;Y. Boulaich
    • Nuclear Engineering and Technology
    • /
    • v.55 no.12
    • /
    • pp.4447-4464
    • /
    • 2023
  • This paper focuses on the development of a new computational model of the CNESTEN's TRIGA Mark II research reactor using the 3D continuous-energy Monte Carlo code TRIPOLI-4 (T4). The new model was developed to perform neutronic simulations and determine quantities of interest such as the kinetic parameters of the reactor, control-rod worths, power peaking factors and neutron flux distributions. It is also a key tool for accurately designing new experiments in the TRIGA reactor, analyzing these experiments and carrying out sensitivity and uncertainty (S/U) studies. The geometry and material data of the MCNP reference model were used to build the T4 model, so the differences between the two models are mainly due to the mathematical approaches of the two codes. The study presented in this article is divided into two parts. The first part deals with the development and validation of the T4 model: the results obtained with the T4 model were compared to the existing MCNP reference model and to the experimental results from the Final Safety Analysis Report (FSAR). Different core configurations were simulated to test the model's reliability in predicting the physical parameters of the reactor. As fairly good agreement among the results was found, it seems reasonable to assume that the T4 model can accurately reproduce the MCNP calculated values. The second part is devoted to the S/U studies carried out to quantify the nuclear-data uncertainty in the multiplication factor keff. For that purpose, the T4 model was used to calculate the sensitivity profiles of keff to the nuclear data. The integrated sensitivities were compared to results from previous work with the MCNP and SCALE-6.2 simulation tools; differences of less than 5% were obtained for most of these quantities, except for the C-graphite sensitivities. Moreover, the nuclear-data uncertainties in keff were derived using the COMAC-V2.1 covariance matrix library and the calculated sensitivities. The results show that the total nuclear-data uncertainty in keff is around 585 pcm using COMAC-V2.1. This study also demonstrates that the contribution of zirconium isotopes to the nuclear-data uncertainty in keff is not negligible and should be taken into account when performing S/U analyses.
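
The uncertainty propagation described above is the standard "sandwich rule": with a sensitivity vector S and a covariance matrix M, the relative variance of keff is S-transpose times M times S. A minimal sketch with illustrative numbers, not the paper's COMAC-V2.1 data:

```python
import numpy as np

# Hypothetical sensitivity profile of keff to three nuclear-data
# parameters (relative sensitivities, illustrative values only).
S = np.array([0.45, -0.12, 0.30])

# Hypothetical relative covariance matrix of those parameters
# (a stand-in for a COMAC-style covariance library block).
M = np.array([
    [4.0e-6, 1.0e-6, 0.0],
    [1.0e-6, 9.0e-6, 0.0],
    [0.0,    0.0,    2.5e-6],
])

# Sandwich rule: var(keff)/keff^2 = S^T M S
rel_var = S @ M @ S
rel_std_pcm = np.sqrt(rel_var) * 1e5  # 1 pcm = 1e-5
print(f"nuclear-data uncertainty in keff: {rel_std_pcm:.0f} pcm")
```

In practice S spans thousands of reaction-wise, energy-group-wise entries, but the quadratic form is the same.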

Sensitivity Analyses of Influencing Factors on Vertical Drain with Probabilistic Method (확률론적 해석법에 의한 연직배수 영향인자 민감도 분석)

  • Yoo, Nam-Jae;Jun, Sang-Hyun;Jeong, KiI-Soo;Kim, Dong-Gun
    • Journal of Industrial Technology
    • /
    • v.26 no.B
    • /
    • pp.83-92
    • /
    • 2006
  • A probabilistic analysis model, a reliability-analysis method that treats the input parameters as random variables, was developed to investigate the uncertainty of the dominant factors influencing the degree of consolidation in radial consolidation theories. Based on the developed probabilistic analysis model, a sensitivity study of those factors was performed to find how they affect the degree of consolidation in the vertical drain method. Various radial consolidation theories, proposed by Barron (1948), Hansbo (1979), Yoshikuni (1979) and Onoue (1988), were used for this parametric study with influencing factors such as the size of the smear zone, the reduction ratio of permeability in the smear zone, the discharge capacity, the permeability for horizontal flow and the coefficient of consolidation for horizontal flow. As a result of this sensitivity study, for a given consolidation theory, the contribution of each factor to the degree of consolidation was determined and compared. For a given value of each factor, the sensitivity of the degree of consolidation across the various theories was evaluated, and their applicability and limitations were assessed.
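
Barron's (1948) ideal-drain solution, the baseline of the theories compared above, expresses the average degree of radial consolidation as U_h = 1 - exp(-8*T_h/F(n)). A sketch of its sensitivity to the spacing ratio n = d_e/d_w, using the simplified F(n) ≈ ln(n) - 3/4 (valid for large n, no smear or well resistance) and illustrative inputs:

```python
import math

def degree_of_consolidation(Th, n):
    """Barron (1948) ideal-drain solution (no smear, no well resistance).

    Th : time factor for horizontal flow, Th = ch * t / de**2
    n  : spacing ratio de/dw (drain influence dia. / drain dia.)
    """
    Fn = math.log(n) - 0.75  # simplified F(n) for large n
    return 1.0 - math.exp(-8.0 * Th / Fn)

# Sensitivity of U_h to the spacing ratio at a fixed time factor
for n in (10, 20, 30):
    print(f"n = {n:2d}: U_h = {degree_of_consolidation(0.2, n):.3f}")
```

The smear-zone and well-resistance factors studied in the paper enter as additional terms alongside F(n) in the later theories.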


Error propagation in 2-D self-calibration algorithm (2차원 자가 보정 알고리즘에서의 불확도 전파)

  • 유승봉;김승우
    • Proceedings of the Korean Society of Precision Engineering Conference
    • /
    • 2003.06a
    • /
    • pp.434-437
    • /
    • 2003
  • Evaluation of the patterning accuracy of e-beam lithography machines requires a high-precision inspection system capable of measuring the true xy-locations of fiducial marks generated by the e-beam machine under test. Fiducial marks are fabricated on a single photomask over the entire working area in the form of equally spaced two-dimensional grids. In performing the evaluation, the principles of self-calibration make it possible to determine the deviations of the fiducial marks from their nominal xy-locations precisely, unaffected by the motion errors of the inspection system itself. However, only repeatable motion errors can be eliminated in this way; random motion errors encountered in probing the locations of the fiducial marks are not removed. Even worse, a random error occurring in the measurement of a single mark propagates and affects the determination of the locations of other marks, a phenomenon that ultimately limits the calibration accuracy of e-beam machines. In this paper, we describe an uncertainty analysis made to investigate how random errors affect the final result of self-calibration of e-beam machines when one uses an optical inspection system equipped with high-resolution microscope objectives and a precision xy-stage. The guide to uncertainty analysis recommended by the International Organization for Standardization is faithfully followed, along with the necessary sensitivity analysis. The uncertainty analysis reveals that, among the dominant components of the patterning accuracy of e-beam lithography, the rotationally symmetrical component is most significantly affected by random errors, whose propagation becomes more severe in a cascading manner as the number of fiducial marks increases.
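
The cascading propagation described above can be illustrated with a simplified Monte Carlo sketch: if each mark's deviation is determined relative to the previously calibrated mark, the random probing errors accumulate like a random walk, so the standard deviation grows with the square root of the number of marks. This is a simplification of the actual 2-D self-calibration algorithm, with arbitrary units:

```python
import numpy as np

rng = np.random.default_rng(0)
sigma = 1.0          # std of a single mark-probing error (arbitrary units)
n_marks = 100        # marks along one axis of the grid
n_trials = 20000     # Monte Carlo repetitions

# Each mark's deviation is referenced to the previously calibrated mark,
# so the probing errors accumulate like a random walk along the chain.
errors = rng.normal(0.0, sigma, size=(n_trials, n_marks))
accumulated = np.cumsum(errors, axis=1)

std_last = accumulated[:, -1].std()
print(f"std at mark {n_marks}: {std_last:.2f}  (theory: {np.sqrt(n_marks):.2f})")
```

The sqrt-of-N growth is what makes the random-error contribution increasingly severe as the grid density increases.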


Blast fragility of base-isolated steel moment-resisting buildings

  • Dadkhah, Hamed;Mohebbi, Mohtasham
    • Earthquakes and Structures
    • /
    • v.21 no.5
    • /
    • pp.461-475
    • /
    • 2021
  • Strategic structures are a potential target of the growing threat of terrorist attacks, so their performance under explosion hazard has received attention from researchers in recent years. In this regard, the aim of this study is to evaluate the blast-resistance performance of a lead-rubber bearing (LRB) base isolation system in a probabilistic framework in which uncertainties in the charge weight and standoff distance are taken into account. A sensitivity analysis is first performed to show the effect of explosion uncertainty on the response of base-isolated buildings. The blast fragility curve is then developed for three base-isolated steel moment-resisting buildings with different heights of 4, 8 and 12 stories. The results of the sensitivity analysis show that although the LRB has the capability of reducing the peak response of buildings under explosion hazard, this control system may increase the peak response under some explosion scenarios. This shows the high importance of probabilistic assessment of isolated structures under explosion hazard. The blast fragility analysis shows the effective performance of the LRB in mitigating the probability of failure of the buildings. Therefore, the LRB can be introduced as an effective control system for protecting buildings from explosion hazard when uncertainty effects are considered.
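
The probabilistic treatment of charge weight and standoff distance can be sketched with a toy Monte Carlo failure-probability estimate. The demand model, capacity threshold and distribution parameters below are illustrative stand-ins, not the paper's structural model:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 50000

# Hypothetical lognormal uncertainty in charge weight W (kg) and
# standoff distance R (m), the two explosion-scenario variables.
W = rng.lognormal(mean=np.log(100.0), sigma=0.3, size=n)
R = rng.lognormal(mean=np.log(20.0), sigma=0.2, size=n)

# Toy demand model: response decays with scaled distance Z = R / W**(1/3)
Z = R / W ** (1.0 / 3.0)
demand = 50.0 / Z          # illustrative peak-response measure
capacity = 15.0            # illustrative capacity threshold

p_fail = np.mean(demand > capacity)
print(f"estimated probability of failure: {p_fail:.3f}")
```

A fragility curve is obtained by repeating this estimate over a range of hazard intensities rather than at a single scenario distribution.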

Manning's n Calibration and Sensitivity Analysis using Unsteady Flood Routing Model (부정류 모형을 이용한 하천 조도계수 산정 및 산정오차의 수면곡선에 대한 민감도 분석)

  • Kim, Sun-Min;Jung, Kwan-Sue
    • Proceedings of the Korea Water Resources Association Conference
    • /
    • 2005.05b
    • /
    • pp.324-328
    • /
    • 2005
  • This study aims to figure out the uncertainty relationship between input data and a calibrated parameter in an unsteady hydraulic routing model. The uncertainty appears in the model results as variations of the water surface profile along the channel. Firstly, Manning's n is calibrated through the model with an assumed uncertainty on the input hydrograph. Then, spatially distributed sets of n-values based on the calibrated values are used to obtain the water surface profile of each set. The results show that an error of ±0.002 in Manning's n causes up to ±30 cm of difference in the maximum water surface elevation in the Sumjin River.
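
The order of magnitude of such a sensitivity can be illustrated with Manning's equation for a wide rectangular channel, where normal depth scales as y ∝ n^(3/5) for fixed discharge and slope. This steady-flow simplification, with illustrative q and S values, is not the paper's unsteady routing model:

```python
# Wide rectangular channel: q = (1/n) * y**(5/3) * S**0.5  (per unit width)
# => normal depth y = (q * n / S**0.5) ** (3/5)
q = 5.0       # unit discharge, m^2/s (illustrative)
S = 0.0005    # bed slope (illustrative)

def normal_depth(n):
    return (q * n / S ** 0.5) ** 0.6

base = normal_depth(0.030)
for dn in (-0.002, 0.002):
    y = normal_depth(0.030 + dn)
    print(f"n = {0.030 + dn:.3f}: depth change = {100 * (y - base):+.1f} cm")
```

The exact magnitude of the water-surface shift depends on channel geometry and flow conditions, which is why the paper evaluates it with the full routing model.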


Sensitivity Analysis of Steel Frames Subjected to Progressive Collapse (철골구조물의 연쇄붕괴에 대한 민감도 해석)

  • Park, Jun-Hei;Hong, Su-Min;Kim, Jin-Koo
    • Proceedings of the Computational Structural Engineering Institute Conference
    • /
    • 2008.04a
    • /
    • pp.307-312
    • /
    • 2008
  • Local damage may cause sequential collapse of a structure, which is called progressive collapse. Current progressive collapse analysis is based on the mean values of the design variables. This deterministic approach has low reliability because it does not consider the uncertainty of the variables. In this study the sensitivity of the design variables for progressive collapse of a structure is evaluated by Monte Carlo simulation and a tornado diagram. The analysis results show that the behaviour of the model structures is highly sensitive to variation of the yield force of the beams and the structural damping ratio.
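
A tornado diagram ranks each variable by the swing in the output when that variable moves between its low and high percentiles while the others stay at their means. A minimal sketch with a hypothetical response function, not the paper's structural model:

```python
# Hypothetical response: peak displacement as a function of beam yield
# force Fy, damping ratio xi, and mass m (illustrative units and values).
def peak_response(Fy, xi, m):
    return 100.0 / Fy + 5.0 / xi + 0.01 * m

# (mean, 10th percentile, 90th percentile) for each design variable
variables = {
    "yield force Fy": (1.0, 0.8, 1.2),
    "damping xi":     (0.05, 0.03, 0.07),
    "mass m":         (100.0, 95.0, 105.0),
}
means = {k: v[0] for k, v in variables.items()}

def evaluate(overrides):
    args = {**means, **overrides}
    return peak_response(args["yield force Fy"], args["damping xi"], args["mass m"])

# Swing = |response at high percentile - response at low percentile|
swings = {}
for name, (mu, lo, hi) in variables.items():
    swings[name] = abs(evaluate({name: hi}) - evaluate({name: lo}))

# Sort descending: the widest bar sits on top of the tornado diagram.
for name, swing in sorted(swings.items(), key=lambda kv: -kv[1]):
    print(f"{name:15s} swing = {swing:.2f}")
```

The ranked bars identify which variables dominate the response and therefore deserve probabilistic treatment.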


A Study on the Capital Budgeting under Risk and Uncertainty (위험하(危險下)의 투자결정(投資決定)에 관한 연구(硏究))

  • Lee, Tae-Joo
    • The Korean Journal of Financial Management
    • /
    • v.2 no.1
    • /
    • pp.21-34
    • /
    • 1986
  • The purpose of this study is to analyse the risk and uncertainty involved in capital budgeting, which is executed over long periods and requires massive capital expenditure. Under conditions of risk and uncertainty, the estimates in capital budgeting are random variables rather than known constants. Two approaches have emerged for performing economic analyses that explicitly incorporate risk and uncertainty. One approach is to develop a descriptive model which describes the economic performance of an individual investment alternative. No recommendation would be forthcoming from such a model; rather, the decision-maker would be furnished with descriptive information concerning each alternative, and the final choice among the alternatives would require a separate action. The second approach is to develop a normative model which includes an objective function to be maximized or minimized. The output from the model prescribes the course of action to be taken. Because the normative approach considers the fitness of the criteria for decision-making, it appears more reasonable. However, it is almost impossible to derive individuals' utility functions correctly and easily. Therefore, as long as we recognize the limits of descriptive methods, it is more practical to analyse investment alternatives by sensitivity analysis.
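
Sensitivity analysis of an investment alternative typically recomputes a figure of merit such as net present value while one estimate is varied at a time. A minimal sketch with hypothetical cash flows and discount rates:

```python
# Hypothetical project: initial outlay and yearly cash flows (illustrative).
outlay = 1000.0
cash_flows = [300.0, 400.0, 400.0, 300.0]

def npv(rate):
    """Net present value at the given discount rate."""
    return -outlay + sum(cf / (1 + rate) ** t
                         for t, cf in enumerate(cash_flows, start=1))

# Sensitivity analysis: recompute NPV over a range of discount rates.
for rate in (0.05, 0.10, 0.15, 0.20):
    print(f"rate {rate:.0%}: NPV = {npv(rate):8.2f}")
```

Seeing where the NPV changes sign as an estimate varies tells the decision-maker how robust the alternative is to that estimate's uncertainty.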


Robust concurrent topology optimization of multiscale structure under load position uncertainty

  • Cai, Jinhu;Wang, Chunjie
    • Structural Engineering and Mechanics
    • /
    • v.76 no.4
    • /
    • pp.529-540
    • /
    • 2020
  • Concurrent topology optimization of macrostructure and microstructure has attracted significant interest due to its high structural performance. However, most existing works are carried out under deterministic conditions; the obtained design may be vulnerable or even fail catastrophically when the load position is uncertain. Therefore, it is necessary to take load-position uncertainty into consideration in structural design. This paper presents a computational method for robust concurrent topology optimization that accounts for load-position uncertainty. The weighted sum of the mean and standard deviation of the structural compliance is defined as the objective function, with constraints imposed on both the macro- and micro-scale structure volume fractions. The Bivariate Dimension Reduction method and Gauss-type quadrature (BDRGQ) are used to quantify and propagate the load uncertainty when calculating the objective function. The effective properties of the microstructure are evaluated by the numerical homogenization method. To reduce the computational burden, a decoupled sensitivity analysis method is proposed for the microscale design variables. The bi-directional evolutionary structural optimization (BESO) method is used to obtain black-and-white designs. Several 2D and 3D examples are presented to validate the effectiveness of the proposed robust concurrent topology optimization method.
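
The robust objective described above, a weighted sum of the mean and standard deviation of compliance with the moments obtained by Gauss-type quadrature over the uncertain load position, can be sketched with a toy one-dimensional stand-in for the finite-element compliance (illustrative coefficients and weighting; the real method requires full macro/micro FE analyses):

```python
import numpy as np

# Toy stand-in for the FE compliance as a function of load position x.
def compliance(x):
    return 1.0 + 0.5 * x + 0.2 * x ** 2

# Gauss-Hermite quadrature for a normally distributed load position
# x ~ N(mu, sigma^2): E[f] = sum_i w_i f(mu + sqrt(2)*sigma*t_i) / sqrt(pi)
mu, sigma = 0.0, 0.3
nodes, weights = np.polynomial.hermite.hermgauss(5)

vals = compliance(mu + np.sqrt(2.0) * sigma * nodes)
mean = np.sum(weights * vals) / np.sqrt(np.pi)
second = np.sum(weights * vals ** 2) / np.sqrt(np.pi)
std = np.sqrt(max(second - mean ** 2, 0.0))

alpha = 0.5  # weighting between mean and standard deviation
robust_objective = alpha * mean + (1.0 - alpha) * std
print(f"mean={mean:.4f} std={std:.4f} objective={robust_objective:.4f}")
```

Including the standard deviation in the objective penalizes designs whose performance swings widely as the load position moves, which is the essence of the robust formulation.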

New Development of Methods for Environmental Impact Assessment Facing Uncertainty and Cumulative Environmental Impacts (불확실성과 누적환경영향하에서의 환경영향평가를 위한 방법론의 새로운 개발)

  • Pietsch, Jurgen
    • Journal of Environmental Impact Assessment
    • /
    • v.4 no.3
    • /
    • pp.87-94
    • /
    • 1995
  • At both the international and national levels, such as in the Rio Declaration and the EU's Fifth Environmental Action Plan, governments have committed themselves to the adoption of the precautionary principle (UNCED 1992, CEC 1992). These commitments mean that the existence of uncertainty in appraising policies and proposals for development should be acknowledged. Uncertainties arise both in the prediction of impacts and in the evaluation of their significance, particularly of those cumulative impacts which are individually insignificant but cumulatively damaging. The EC network of EIA experts stated at their last meeting in Athens that indirect effects and the treatment of uncertainty are among the main deficiencies of current EIA practice. Uncertainties in decision-making arise where choices have been made in the development of the policy or proposal, such as the selection of options, the justification for that choice, and the selection of different indicators to comply with different regulatory regimes. It is also likely that a weighting system for evaluating significance will have been used, which may be implicit rather than explicit. Those involved in decision-making may apply different tolerances of uncertainty than members of the public, for instance in the consideration of the worst-case scenario. Possible methods for dealing with these uncertainties include scenarios, sensitivity analysis, presenting points of view, decision analysis, postponing decisions and graphical methods. An understanding of the development of cumulative environmental impacts requires not only ecological but also socio-economic investigations. Since cumulative impacts originate mainly in centres of urban or industrial development, this requires in particular an analysis of the future growth effects that might be induced by certain development impacts. Not least, it is seen as a matter of sustainability to connect this issue with ecological research. The serious attempt to reduce the area of uncertainty in environmental planning is a challenge and an important step towards reliable planning and sustainable development.


Sensitivity studies on a novel nuclear forensics methodology for source reactor-type discrimination of separated weapons grade plutonium

  • Kitcher, Evans D.;Osborn, Jeremy M.;Chirayath, Sunil S.
    • Nuclear Engineering and Technology
    • /
    • v.51 no.5
    • /
    • pp.1355-1364
    • /
    • 2019
  • A recently published nuclear forensics methodology for source discrimination of separated weapons-grade plutonium utilizes intra-element isotope ratios and a maximum likelihood formulation to identify the most likely source reactor-type, fuel burnup and time since irradiation of unknown material. Sensitivity studies performed here on the effects of random measurement error and of the uncertainty in intra-element isotope ratio values show that different intra-element isotope ratios contribute disproportionately to the determination of the reactor parameters. The methodology is robust to individual errors in measured intra-element isotope ratio values, and even more so to uniform systematic errors, due to competing effects on the predictions from the selected intra-element isotope ratio suite. For a unique sample-model pair, simulation uncertainties of up to 28% are acceptable without impeding successful source-reactor discrimination. However, for a generic sample with multiple plausible sources within the reactor library, uncertainties of 7% or less may be required. The results confirm the critical role of accurate reactor core physics, fuel burnup simulations and experimental measurements in the proposed methodology, where increased simulation uncertainty is found to significantly affect the capability to discriminate between the reactors in the library.
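
The maximum-likelihood step of such a methodology can be sketched as follows: each candidate reactor model predicts a set of intra-element isotope ratios, and the source maximizing the Gaussian likelihood of the measured ratios is selected. All names and numbers below are hypothetical, not the paper's reactor library:

```python
# Hypothetical predicted intra-element isotope ratios for three reactor
# types, plus a measured sample with 1-sigma measurement uncertainties.
library = {
    "PWR":  [0.42, 0.061, 1.85],
    "PHWR": [0.38, 0.054, 1.62],
    "FBR":  [0.47, 0.070, 2.10],
}
measured = [0.39, 0.055, 1.65]
sigma    = [0.01, 0.002, 0.05]

def log_likelihood(predicted):
    # Independent Gaussian errors on each isotope ratio (up to a constant)
    return -0.5 * sum(((m - p) / s) ** 2
                      for m, p, s in zip(measured, predicted, sigma))

best = max(library, key=lambda r: log_likelihood(library[r]))
print(f"most likely source reactor-type: {best}")
```

The sensitivity studies in the paper amount to asking how large the measurement and simulation uncertainties (the sigmas and the predicted ratios here) can become before this argmax picks the wrong library entry.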