• Title/Summary/Keyword: Monte Carlo sampling

Calibration and Uncertainty Analysis of Sample-Time Error on High Jitter of Samplers

  • Cho, Chihyun;Lee, Joo-Gwang;Kang, Tae-Weon;Kang, No-Weon
    • Journal of electromagnetic engineering and science
    • /
    • v.18 no.3
    • /
    • pp.169-174
    • /
    • 2018
  • In this paper, we propose an estimation method using multiple in-phase and quadrature (IQ) signals of different frequencies to evaluate the sample-time errors in a sampling oscilloscope. The estimator is implemented with ODRPACK, and a novel iteration scheme is applied to achieve fast convergence without any prior information. A Monte Carlo simulation is conducted to validate the proposed method; it clearly shows that the multiple-IQ approach achieves more accurate results than the conventional method. Finally, the criteria for frequency selection and signal capture time are investigated.
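
A minimal sketch of the fitting machinery named in the abstract: scipy.odr wraps ODRPACK and, unlike ordinary least squares, allows errors in the time axis as well as in the measured voltages. This is not the authors' multiple-IQ estimator; the tone frequency, jitter level, and record length are illustrative assumptions.

```python
import numpy as np
from scipy import odr

rng = np.random.default_rng(0)

f0 = 1.0e9                                  # assumed tone frequency [Hz]
t_nominal = np.arange(200) * 50e-12         # nominal sample times [s]
t_actual = t_nominal + rng.normal(0.0, 2e-12, t_nominal.size)   # sample-time error
v_meas = 0.5 * np.sin(2 * np.pi * f0 * t_actual + 0.3) + rng.normal(0, 5e-3, t_nominal.size)

def sine_model(beta, t):
    """beta = [amplitude, phase, offset]; frequency is assumed known."""
    amp, phase, offset = beta
    return amp * np.sin(2 * np.pi * f0 * t + phase) + offset

# RealData declares uncertainty on both the abscissa (time) and the ordinate (voltage),
# which is what distinguishes orthogonal distance regression from ordinary least squares.
data = odr.RealData(t_nominal, v_meas, sx=2e-12, sy=5e-3)
fit = odr.ODR(data, odr.Model(sine_model), beta0=[0.4, 0.0, 0.0]).run()

print("estimated [amplitude, phase, offset]:", fit.beta)
print("standard errors:", fit.sd_beta)
```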

Geostatistics for Bayesian interpretation of geophysical data

  • Oh Seokhoon;Lee Duk Kee;Yang Junmo;Youn Yong-Hoon
    • Proceedings of the Korean Society of Earth and Exploration Geophysicists Conference
    • /
    • 2003.11a
    • /
    • pp.340-343
    • /
    • 2003
  • This study presents a practical procedure for the Bayesian inversion of geophysical data by Markov chain Monte Carlo (MCMC) sampling and geostatistics. We applied geostatistical techniques to acquire prior model information, and the MCMC method was then adopted to infer the characteristics of the marginal distributions of the model parameters. For the Bayesian inversion of dipole-dipole array resistivity data, we used indicator kriging and simulation techniques to generate cumulative distribution functions from Schlumberger array resistivity data and well-logging data, and obtained prior information by cokriging and simulations from covariogram models. The indicator approach makes it possible to incorporate non-parametric information into the probability density function. We also adopted the MCMC approach, based on Gibbs sampling, to examine the characteristics of the posterior probability density function and the marginal distribution of each parameter. This approach provides an effective way to treat the Bayesian inversion of geophysical data and to reduce non-uniqueness by incorporating various kinds of prior information.
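
The paper's resistivity inversion is far more involved, but the Gibbs-sampling machinery it relies on can be illustrated with a minimal sketch: sampling a bivariate normal posterior whose full conditionals are known in closed form. The correlation rho and chain length below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
rho = 0.8                      # assumed posterior correlation between two model parameters
n_iter, burn_in = 20_000, 2_000

samples = np.empty((n_iter, 2))
x, y = 0.0, 0.0
for i in range(n_iter):
    # Full conditionals of a standard bivariate normal with correlation rho:
    # x | y ~ N(rho * y, 1 - rho^2) and y | x ~ N(rho * x, 1 - rho^2)
    x = rng.normal(rho * y, np.sqrt(1 - rho**2))
    y = rng.normal(rho * x, np.sqrt(1 - rho**2))
    samples[i] = (x, y)

posterior = samples[burn_in:]
print("marginal means:", posterior.mean(axis=0))               # ~ [0, 0]
print("marginal std devs:", posterior.std(axis=0))             # ~ [1, 1]
print("sample correlation:", np.corrcoef(posterior.T)[0, 1])   # ~ rho
```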

A Review of Structural Reliability Analysis Methods (구조신뢰성 해석방법의 고찰)

  • 양영순;서용석
    • Proceedings of the Computational Structural Engineering Institute Conference
    • /
    • 1993.04a
    • /
    • pp.109-116
    • /
    • 1993
  • This paper reviews methods for evaluating the reliability of structures and compares their respective advantages and disadvantages. Taking the Crude Monte Carlo (CMC) method as the reference for assessing the accuracy of each method, the sampling approaches examined are the Importance Sampling (IS) method and the Directional Sampling (DS) method. Among first-order approximation methods, the widely used Rackwitz-Fiessler (RF) method, the 3-parameter method proposed by Chen and Lind (CL), and the Rosenblatt transformation method proposed by Hohenbichler (RT) are compared; among second-order approximation methods, the curvature-fitted paraboloid (CFP) formula proposed by Breitung, the point-fitted paraboloid (PFP) formula proposed by Kiureghian, and a second-order approximation formula (LLF) that obtains the failure probability in the original variable space using the log-likelihood function are compared. The response surface method (RSM), which can be used efficiently when the limit state function is not explicitly known, is also examined. Analyzing the efficiency and, in particular, the applicability of each method through examples shows that the DS method is the most efficient among the sampling methods and the RSM is the most efficient among the approximation methods.
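
A minimal sketch of the Crude Monte Carlo reference method used in the comparison above: estimate a failure probability P_f = P(g(X) <= 0) by brute-force sampling. The limit state function and sample size are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

def limit_state(x1, x2):
    """Hypothetical linear limit state: failure when g <= 0."""
    return 3.0 - x1 - x2   # reliability index beta = 3/sqrt(2) for standard normal inputs

n = 1_000_000
x1, x2 = rng.standard_normal(n), rng.standard_normal(n)
fails = limit_state(x1, x2) <= 0.0

p_f = fails.mean()
cov = np.sqrt((1 - p_f) / (n * p_f))   # coefficient of variation of the CMC estimate
print(f"P_f ~= {p_f:.3e}, estimator c.o.v. ~= {cov:.2%}")
```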

Bayesian Approach for Software Reliability Models (소프트웨어 신뢰모형에 대한 베이지안 접근)

  • Choi, Ki-Heon
    • Journal of the Korean Data and Information Science Society
    • /
    • v.10 no.1
    • /
    • pp.119-133
    • /
    • 1999
  • A Markov chain Monte Carlo method is developed to compute the software reliability model. We consider the computational problem of determining the posterior distribution in Bayesian inference. Metropolis algorithms along with Gibbs sampling are proposed to perform Bayesian inference for the mixed model with record-value statistics, which relaxes the monotonic intensity function assumption. For model determination, we explore the prequential conditional predictive ordinate criterion, which selects the best model, namely the one with the largest posterior likelihood, among models using all possible subsets of the component intensity functions. A numerical example with a simulated data set is given.
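
A minimal random-walk Metropolis sketch (not the paper's mixed record-value model): sample the posterior of a failure intensity given exponential inter-failure times and a flat prior. The synthetic data, proposal scale, and chain length are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)
times = rng.exponential(scale=1 / 0.5, size=30)   # synthetic inter-failure times, true rate = 0.5

def log_post(lam):
    """Log-posterior of the failure rate under a flat prior on lam > 0."""
    if lam <= 0:
        return -np.inf
    return times.size * np.log(lam) - lam * times.sum()

n_iter, step = 20_000, 0.2
chain = np.empty(n_iter)
lam = 1.0
for i in range(n_iter):
    prop = lam + rng.normal(0, step)              # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(lam):
        lam = prop                                # accept
    chain[i] = lam

posterior = chain[5_000:]                         # discard burn-in
print("posterior mean rate:", posterior.mean())
print("95% credible interval:", np.quantile(posterior, [0.025, 0.975]))
```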

Worst Case Sampling Method with Confidence Ellipse for Estimating the Impact of Random Variation on Static Random Access Memory (SRAM)

  • Oh, Sangheon;Jo, Jaesung;Lee, Hyunjae;Lee, Gyo Sub;Park, Jung-Dong;Shin, Changhwan
    • JSTS:Journal of Semiconductor Technology and Science
    • /
    • v.15 no.3
    • /
    • pp.374-380
    • /
    • 2015
  • As semiconductor devices are being scaled down, random variation becomes a critical issue, especially in the case of static random access memory (SRAM). Thus, there is an urgent need for statistical methodologies to analyze the impact of random variations on the SRAM. In this paper, we propose a novel sampling method based on the concept of a confidence ellipse. Results show that the proposed method estimates the SRAM margin metrics in high-sigma regimes more efficiently than the standard Monte Carlo (MC) method.
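
A minimal sketch of sampling candidate points on a confidence ellipse for two correlated Gaussian variation sources (for example, threshold-voltage shifts of two SRAM transistors). The covariance, correlation, and confidence level are illustrative assumptions; the paper's margin metrics are not reproduced here.

```python
import numpy as np
from scipy.stats import chi2

cov = np.array([[1.0, 0.3],
                [0.3, 1.0]]) * (30e-3) ** 2      # assumed covariance of dVth1, dVth2 [V^2]
level = 0.9999                                    # confidence level of the ellipse

# Points on the ellipse boundary: unit circle -> scaled by the chi-square quantile
# (2 degrees of freedom) -> mapped through the Cholesky factor of the covariance.
theta = np.linspace(0.0, 2 * np.pi, 64, endpoint=False)
circle = np.stack([np.cos(theta), np.sin(theta)])             # shape (2, 64)
radius = np.sqrt(chi2.ppf(level, df=2))
ellipse = (np.linalg.cholesky(cov) @ (radius * circle)).T     # shape (64, 2)

# Each row is a "worst-case" candidate to be simulated (e.g., in SPICE) instead of
# running millions of standard Monte Carlo samples.
worst = ellipse[np.argmax(np.abs(ellipse).sum(axis=1))]
print("candidate corner with largest combined shift [V]:", worst)
```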

A Random Sampling Method in Estimating the Mean Areal Precipitation Using Kriging

  • Lee, Sang-Il
    • Korean Journal of Hydrosciences
    • /
    • v.5
    • /
    • pp.45-55
    • /
    • 1994
  • A new method to estimate the mean areal precipitation using kriging is developed. Unlike the conventional approach, the points for the double and quadruple numerical integrations in the kriging equations are selected randomly within the boundary of the area of interest. This feature eliminates the conventional approach's need to divide the area into subareas and calculate the center of each subarea, which in turn makes the developed method more powerful in the case of complex boundaries. The algorithm to select random points within an arbitrary boundary, based on the theory of complex variables, is described. The results of a Monte Carlo simulation show that the error associated with estimation using randomly selected points is inversely proportional to the square root of the number of sampling points.
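
A minimal sketch of the random-point idea: estimate an areal mean by rejection-sampling points inside a boundary (here the unit disk, purely for illustration; the paper handles arbitrary boundaries and kriging weights) and observe the error falling off roughly as 1/sqrt(N).

```python
import numpy as np

rng = np.random.default_rng(4)

def field(x, y):
    """Hypothetical precipitation-like field; its true areal mean over the unit disk is 0.5."""
    return x**2 + y**2

def areal_mean(n_points):
    pts = rng.uniform(-1.0, 1.0, size=(4 * n_points, 2))     # candidates in the bounding square
    inside = pts[(pts**2).sum(axis=1) <= 1.0][:n_points]      # keep the first n_points in the region
    return field(inside[:, 0], inside[:, 1]).mean()

for n in (100, 1_000, 10_000, 100_000):
    err = abs(areal_mean(n) - 0.5)
    print(f"N = {n:>7d}   |error| = {err:.4f}   error * sqrt(N) = {err * np.sqrt(n):.3f}")
```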

A Study on Bayesian Reliability Evaluation of IPM using Simple Information (단순 수명정보를 이용한 IPM의 베이지안 신뢰도 평가 연구)

  • Jo, Dong Cheol;Koo, Jeong Seo
    • Journal of the Korean Society of Safety
    • /
    • v.36 no.2
    • /
    • pp.32-38
    • /
    • 2021
  • This paper suggests a Bayesian approach to evaluating the reliability of an intelligent power module (IPM) when information on the prior distribution is deficient and the data are censored. The approach builds a prior distribution from the lifetime information provided by the manufacturer and compares it against diffuse (vague) prior distributions. To overcome the computational complexity of the Bayesian posterior distribution, it was computed with Gibbs sampling, a Monte Carlo simulation method. As a result, the standard deviation of the distribution obtained with the prior developed from the simple information was smaller than that of the posterior distribution calculated with the diffuse prior. In addition, it showed better error characteristics, in terms of RMSE, than the Kaplan-Meier method.
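
A minimal data-augmentation Gibbs sketch for right-censored exponential lifetimes with a gamma prior on the failure rate. The "manufacturer" prior parameters, censoring time, and sample size are illustrative assumptions, not the paper's IPM data.

```python
import numpy as np

rng = np.random.default_rng(5)

true_rate = 0.02                                  # assumed true failure rate [1/h]
n, censor_at = 40, 60.0
life = rng.exponential(1 / true_rate, n)
observed = np.minimum(life, censor_at)            # right-censored observations
is_censored = life > censor_at

a0, b0 = 4.0, 200.0                               # informative gamma prior (shape, rate), e.g., from vendor MTTF

n_iter = 10_000
rates = np.empty(n_iter)
lam = 0.05                                        # starting value
t_full = observed.copy()
for i in range(n_iter):
    # 1) Impute censored lifetimes: by memorylessness, t | t > c is c + Exp(lam).
    t_full[is_censored] = censor_at + rng.exponential(1 / lam, is_censored.sum())
    # 2) Draw the rate from its gamma full conditional given the completed data.
    lam = rng.gamma(a0 + n, 1.0 / (b0 + t_full.sum()))
    rates[i] = lam

post = rates[2_000:]
print("posterior mean rate:", post.mean(), " (true:", true_rate, ")")
print("posterior std dev  :", post.std())
```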

Self-adaptive sampling for sequential surrogate modeling of time-consuming finite element analysis

  • Jin, Seung-Seop;Jung, Hyung-Jo
    • Smart Structures and Systems
    • /
    • v.17 no.4
    • /
    • pp.611-629
    • /
    • 2016
  • This study presents a new approach to surrogate modeling for time-consuming finite element analysis. A surrogate model is widely used to reduce the computational cost of an iterative computational analysis. Although a variety of methods have been investigated, there are still practical difficulties in surrogate modeling: (1) how to derive an optimal design of experiments (i.e., the number of training samples and their locations); and (2) how to diagnose the surrogate model. To overcome these difficulties, we propose sequential surrogate modeling based on a Gaussian process model (GPM) with self-adaptive sampling. The proposed approach not only enables further sampling to make the GPM more accurate, but also evaluates the model adequacy within a sequential framework. The applicability of the proposed approach is first demonstrated using mathematical test functions. Then, it is applied as a substitute for the iterative finite element analysis in a Monte Carlo simulation for response uncertainty analysis under correlated input uncertainties. In all numerical studies, the GPM was built automatically with minimal user intervention. The proposed approach can be customized for various response surfaces and helps less experienced users save effort.
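
A minimal sketch of sequential, self-adaptive sampling with a Gaussian process surrogate: keep adding the candidate point where the predictive standard deviation is largest until a budget is exhausted. The test function, kernel, and budget are illustrative assumptions, and scikit-learn is used in place of the authors' implementation.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_model(x):
    """Stand-in for a time-consuming finite element analysis."""
    return np.sin(3 * x) + 0.5 * x

rng = np.random.default_rng(6)
candidates = np.linspace(0.0, 5.0, 501).reshape(-1, 1)

X = rng.uniform(0.0, 5.0, size=(4, 1))           # small initial design
y = expensive_model(X).ravel()

for _ in range(10):                               # sequential enrichment budget
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=1.0), normalize_y=True)
    gp.fit(X, y)
    _, std = gp.predict(candidates, return_std=True)
    x_new = candidates[np.argmax(std)].reshape(1, 1)   # most uncertain location
    X = np.vstack([X, x_new])
    y = np.append(y, expensive_model(x_new).ravel())

gp.fit(X, y)                                      # refit on the enriched design
pred, std = gp.predict(candidates, return_std=True)
print("max predictive std after enrichment:", std.max())
print("max |error| on the grid:", np.abs(pred - expensive_model(candidates).ravel()).max())
```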

Structural reliability analysis using temporal deep learning-based model and importance sampling

  • Nguyen, Truong-Thang;Dang, Viet-Hung
    • Structural Engineering and Mechanics
    • /
    • v.84 no.3
    • /
    • pp.323-335
    • /
    • 2022
  • The main idea of the framework is to seamlessly combine a reasonably accurate and fast surrogate model with an importance sampling strategy. Developing a surrogate model for predicting structures' dynamic responses is challenging because it involves high-dimensional inputs and outputs. For this purpose, a novel surrogate model is designed based on cutting-edge deep learning architectures specialized in capturing temporal relationships within time-series data, namely the long short-term memory (LSTM) layer and the Transformer layer. After being properly trained, the surrogate model can be used in place of the finite element method to evaluate structures' responses without requiring any specialized software. Importance sampling is adopted to reduce the number of calculations required to compute the failure probability by drawing more relevant samples near the critical areas. Thanks to the portability of the trained surrogate model, it can be integrated with importance sampling in a straightforward fashion, forming an efficient framework called TTIS, which offers two advantages: fewer calculations are needed, and the computational time of each calculation is significantly reduced. The applicability and efficiency of the proposed approach are demonstrated through three examples of increasing complexity, involving a 1D beam, a 2D frame, and a 3D building structure. The results show that, compared to conventional Monte Carlo simulation, the proposed method provides highly similar reliability results with a reduction of up to four orders of magnitude in computation time.
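
A minimal sketch of importance sampling around the design point, with a cheap closed-form function standing in for the trained surrogate (the paper's LSTM/Transformer model is not reproduced here). The limit state, shift vector, and sample sizes are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)

def surrogate_limit_state(x):
    """Stand-in surrogate: failure when g(x) <= 0."""
    return 4.0 - x[:, 0] - x[:, 1]

mu = np.array([2.0, 2.0])          # approximate design point of this limit state

# Importance sampling: draw from N(mu, I) instead of N(0, I) and reweight.
n_is = 10_000
x_is = rng.standard_normal((n_is, 2)) + mu
weights = np.exp(-x_is @ mu + 0.5 * mu @ mu)       # phi(x) / phi(x - mu) for unit covariance
p_f_is = np.mean((surrogate_limit_state(x_is) <= 0) * weights)

# Crude Monte Carlo reference with many more samples.
n_mc = 2_000_000
x_mc = rng.standard_normal((n_mc, 2))
p_f_mc = np.mean(surrogate_limit_state(x_mc) <= 0)

print(f"importance sampling ({n_is:,} samples): P_f ~= {p_f_is:.3e}")
print(f"crude Monte Carlo ({n_mc:,} samples):   P_f ~= {p_f_mc:.3e}")
```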

A Comparison of Systematic Sampling Designs for Forest Inventory

  • Yim, Jong Su;Kleinn, Christoph;Kim, Sung Ho;Jeong, Jin-Hyun;Shin, Man Yong
    • Journal of Korean Society of Forest Science
    • /
    • v.98 no.2
    • /
    • pp.133-141
    • /
    • 2009
  • This study was conducted to support the selection of a statistically efficient sampling design for forest resource assessments in South Korea. For this objective, different systematic sampling designs were simulated and compared on an artificial forest population built from field sample data and satellite data in Yang-Pyeong County, Korea. Using the k-NN technique, two thematic maps (growing stock and forest cover type per pixel unit) were generated across the test area, with field data (n=191) and Landsat ETM+ imagery as source data. Four sampling designs (systematic sampling, systematic sampling with post-stratification, systematic cluster sampling, and stratified systematic sampling) were evaluated as candidate designs. Error variances were computed by Monte Carlo simulation (k=1,000 repetitions), and sampling error and relative efficiency were then compared. When the objective of an inventory is to obtain estimates for the entire population, systematic cluster sampling was superior to the other sampling designs. If the objective is to obtain estimates for each sub-population, post-stratification gave better estimates; to perform this successfully, the strata of interest must be clearly defined for each field observation unit so that stratification is efficient.
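
A minimal sketch of the Monte Carlo comparison idea: repeatedly draw samples of the same size from an artificial population under two designs and compare the empirical variances of the mean estimator (relative efficiency). The population model, sample size, and number of repetitions are illustrative assumptions, not the Yang-Pyeong k-NN population.

```python
import numpy as np

rng = np.random.default_rng(8)

# Artificial "growing stock" population on a 100 x 100 pixel grid: smooth trend plus noise.
rows, cols = np.meshgrid(np.arange(100), np.arange(100), indexing="ij")
population = 150 + 0.5 * rows + 0.3 * cols + rng.normal(0, 15, (100, 100))
values = population.ravel()
true_mean = values.mean()

k, step = 1_000, 10                      # k Monte Carlo repetitions, 10 x 10 systematic grid -> n = 100
n = (100 // step) ** 2

est_srs = np.empty(k)
est_sys = np.empty(k)
for i in range(k):
    # Simple random sampling without replacement.
    est_srs[i] = rng.choice(values, size=n, replace=False).mean()
    # Systematic sampling on a square grid with a random start cell.
    r0, c0 = rng.integers(0, step, size=2)
    est_sys[i] = population[r0::step, c0::step].mean()

for name, est in (("simple random", est_srs), ("systematic", est_sys)):
    print(f"{name:>14s}: bias = {est.mean() - true_mean:+.3f}, std error = {est.std(ddof=1):.3f}")
print("relative efficiency (Var_SRS / Var_SYS):", est_srs.var(ddof=1) / est_sys.var(ddof=1))
```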