• Title/Summary/Keyword: importance sampling (IS)

A novel reliability analysis method based on Gaussian process classification for structures with discontinuous response

  • Zhang, Yibo;Sun, Zhili;Yan, Yutao;Yu, Zhenliang;Wang, Jian
    • Structural Engineering and Mechanics
    • /
    • v.75 no.6
    • /
    • pp.771-784
    • /
    • 2020
  • Reliability analysis techniques combined with various surrogate models have attracted increasing attention because of their accuracy and efficiency. However, they focus primarily on structures with continuous response, and very little research has been carried out on reliability analysis for structures with discontinuous response. Furthermore, existing adaptive reliability analysis methods based on importance sampling (IS) still have intractable defects when dealing with small failure probabilities, and there is no related research on reliability analysis for structures involving both discontinuous response and small failure probability. Therefore, this paper proposes a novel reliability analysis method, AGPC-IS, for such structures, which combines adaptive Gaussian process classification (GPC) with adaptive-kernel-density-estimation-based IS. In AGPC-IS, an efficient adaptive strategy for the design of experiments (DoE), which accounts for classification uncertainty, sampling uniformity and regional classification accuracy improvement, is developed to improve the accuracy of the Gaussian process classifier. Adaptive kernel density estimation is introduced to construct the quasi-optimal density function of IS. In addition, a novel and more precise stopping criterion is developed from the perspective of the stability of the failure probability estimate. The efficiency, superiority and practicability of AGPC-IS are verified by three examples.
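
The combination described above, importance sampling with a kernel-density proposal fitted to points in the failure region, can be illustrated with a minimal sketch. The limit-state function, the seed samples and the N(0,1) input model below are hypothetical stand-ins, not the paper's AGPC-IS.

```python
# Minimal sketch: importance sampling for a small failure probability, with a
# Gaussian KDE fitted to samples near the failure region as the quasi-optimal
# proposal density. Illustrative only; not the paper's AGPC-IS.
import numpy as np
from scipy.stats import norm, gaussian_kde

rng = np.random.default_rng(0)

def g(x):
    # Hypothetical limit-state function: failure when g(x) < 0.
    return 5.0 - x[0] - x[1]

# Seed samples near the failure region (e.g., found by a preliminary search).
seed = rng.normal(loc=3.0, scale=0.5, size=(2, 200))
kde = gaussian_kde(seed)                 # proposal density q(x)

n = 20_000
x = kde.resample(n)                      # draw from q
p = norm.pdf(x[0]) * norm.pdf(x[1])      # true joint density under iid N(0,1) inputs
q = kde(x)                               # proposal density values
w = p / q                                # importance weights p/q
failed = (g(x) < 0).astype(float)

pf = np.mean(failed * w)                 # IS estimate of P[g(X) < 0]
print(f"estimated failure probability: {pf:.3e}")
```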

Evolution Strategies Based Particle Filters for Nonlinear State Estimation

  • Uosaki, Katsuji;Kimura, Yuuya;Hatanaka, Toshiharu
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 2003.10a
    • /
    • pp.559-564
    • /
    • 2003
  • Recently, particle filters have attracted attention for nonlinear state estimation. They evaluate the posterior probability distribution of the state variable based on observations, using simulation with so-called importance sampling. However, degeneracy of the importance weights deteriorates the filter performance. A new filter, the Evolution Strategies Based Particle Filter, is proposed to circumvent this difficulty and improve performance. Numerical simulation results illustrate the applicability of the proposed idea.
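
For context, the importance weights and the degeneracy the abstract refers to can be seen in a generic bootstrap (SIR) particle filter, where the effective sample size (ESS) is the usual degeneracy diagnostic. This is a minimal sketch on a hypothetical scalar model, not the proposed Evolution Strategies Based Particle Filter.

```python
# Minimal sketch of a bootstrap (SIR) particle filter on a hypothetical scalar
# nonlinear model, showing the importance weights and the effective sample
# size (ESS) commonly used to detect weight degeneracy.
import numpy as np

rng = np.random.default_rng(1)
n_particles, n_steps = 500, 50

def f(x, t):                          # state transition (hypothetical)
    return 0.5 * x + 25 * x / (1 + x**2) + 8 * np.cos(1.2 * t)

def h(x):                             # observation function (hypothetical)
    return x**2 / 20

# Simulate a reference trajectory and noisy observations.
x_true, y_obs = np.zeros(n_steps), np.zeros(n_steps)
x = 0.1
for t in range(n_steps):
    x = f(x, t) + rng.normal(0, np.sqrt(10))
    x_true[t] = x
    y_obs[t] = h(x) + rng.normal(0, 1.0)

# Bootstrap particle filter.
particles = rng.normal(0, 2, n_particles)
for t in range(n_steps):
    particles = f(particles, t) + rng.normal(0, np.sqrt(10), n_particles)
    w = np.exp(-0.5 * (y_obs[t] - h(particles)) ** 2) + 1e-300   # importance weights
    w /= w.sum()
    ess = 1.0 / np.sum(w ** 2)        # effective sample size; low ESS signals degeneracy
    estimate = np.sum(w * particles)  # weighted posterior mean
    particles = particles[rng.choice(n_particles, n_particles, p=w)]  # resample

print("final estimate:", round(estimate, 2),
      "true state:", round(x_true[-1], 2),
      "last ESS:", round(ess, 1))
```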

Evolution Strategies Based Particle Filters for Simultaneous State and Parameter Estimation of Nonlinear Stochastic Models

  • Uosaki, K.;Hatanaka, T.
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 2005.06a
    • /
    • pp.1765-1770
    • /
    • 2005
  • Recently, particle filters have attracted attention for nonlinear state estimation. In this approach, the posterior probability distribution of the state variable is evaluated based on observations, using simulation with so-called importance sampling. We previously proposed a new filter, the Evolution Strategies based particle (ESP) filter, to circumvent the degeneracy of the importance weights that deteriorates filter performance; here we apply it to simultaneous state and parameter estimation of nonlinear state-space models. Results of numerical simulation studies illustrate the applicability of this approach.
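
A common way to set up the simultaneous state and parameter estimation mentioned here is state augmentation: each particle carries the state together with the unknown parameter. The sketch below assumes a toy AR(1)-style model with an unknown coefficient; it is not the ESP filter itself.

```python
# Minimal sketch of state augmentation for joint state/parameter estimation:
# each particle holds (state, parameter), so the same importance-sampling
# update estimates both. The scalar model and its unknown coefficient `a`
# are hypothetical.
import numpy as np

rng = np.random.default_rng(2)
a_true, n_particles, n_steps = 0.8, 1000, 100

# Simulate data from x_t = a*x_{t-1} + noise, observed with noise.
x, ys = 0.0, []
for _ in range(n_steps):
    x = a_true * x + rng.normal(0, 0.5)
    ys.append(x + rng.normal(0, 0.5))

# Particles: column 0 is the state, column 1 is the unknown parameter a.
p = np.column_stack([rng.normal(0, 1, n_particles),
                     rng.uniform(-1, 1, n_particles)])
for y in ys:
    p[:, 0] = p[:, 1] * p[:, 0] + rng.normal(0, 0.5, n_particles)   # propagate state
    p[:, 1] += rng.normal(0, 0.01, n_particles)                     # jitter parameter
    w = np.exp(-0.5 * ((y - p[:, 0]) / 0.5) ** 2) + 1e-300          # importance weights
    w /= w.sum()
    p = p[rng.choice(n_particles, n_particles, p=w)]                # resample

print("estimated a:", round(p[:, 1].mean(), 3), "true a:", a_true)
```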

Serviceability reliability analysis of cable-stayed bridges

  • Cheng, Jin;Xiao, Ru-Cheng
    • Structural Engineering and Mechanics
    • /
    • v.20 no.6
    • /
    • pp.609-630
    • /
    • 2005
  • A reliability analysis method is proposed in this paper that combines the advantages of the response surface method (RSM), the finite element method (FEM), the first-order reliability method (FORM) and the importance sampling updating method. The accuracy and efficiency of the method are demonstrated through several numerical examples. The method is then used to estimate the serviceability reliability of cable-stayed bridges. Effects of geometric nonlinearity and of randomness in loading, material and geometry are considered. The example cable-stayed bridge is the Second Nanjing Bridge, built in China with a main span of 628 m. The results show that cable sag, which is part of the geometric nonlinearity of cable-stayed bridges, has a major effect on their reliability. Finally, the random variables most influential on the reliability of cable-stayed bridges are identified through a sensitivity analysis.
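
The RSM and FEM details are specific to the bridge model, but the FORM-plus-importance-sampling step can be sketched generically: find the design point in standard normal space, then sample from a normal density shifted to that point and reweight. The linear limit state below is a hypothetical stand-in.

```python
# Minimal sketch of combining FORM with importance sampling on a toy limit
# state (not the bridge model): locate the design point, then sample from a
# standard normal shifted to that point and reweight the indicator function.
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

rng = np.random.default_rng(3)

def g(u):                                    # hypothetical limit state in u-space
    return 3.0 - u[0] - 0.5 * u[1]

# FORM: design point = closest point to the origin on the surface g(u) = 0.
res = minimize(lambda u: u @ u, x0=[1.0, 1.0],
               constraints={"type": "eq", "fun": g})
u_star = res.x
beta = np.linalg.norm(u_star)
print("FORM estimate:", norm.cdf(-beta))

# Importance sampling with a standard normal proposal centered at u_star.
n = 50_000
u = rng.normal(size=(n, 2)) + u_star
g_vals = 3.0 - u[:, 0] - 0.5 * u[:, 1]       # vectorized limit state
log_w = -0.5 * np.sum(u**2, axis=1) + 0.5 * np.sum((u - u_star)**2, axis=1)
pf_is = np.mean((g_vals < 0) * np.exp(log_w))
print("IS estimate  :", pf_is)
```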

Two-stage Latin hypercube sampling and its application (이단계 Latin Hypercube 추출법과 그 응용)

  • 임미정;권우주;이주호
    • The Korean Journal of Applied Statistics
    • /
    • v.8 no.2
    • /
    • pp.99-108
    • /
    • 1995
  • When modeling a complicated system with a computer model, it is of vital importance to choose input values efficiently. The Latin hypercube sampling (LHS) proposed by McKay et al. (1979) has been the most widely used method for choosing input values for a computer model. We propose two-stage Latin hypercube sampling (TLHS), an improved version of LHS, for producing input values when estimating the expectation of a function of the output variable. The proposed method is applied to a simulation study of the performance of a printer actuator and is shown to outperform other sampling methods, including LHS, in accuracy.
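
For reference, ordinary one-stage Latin hypercube sampling can be written in a few lines; the two-stage refinement proposed in the paper is not reproduced here, and the toy output function is hypothetical.

```python
# Minimal sketch of ordinary (one-stage) Latin hypercube sampling on [0, 1]^d:
# one point per equal-probability stratum in each dimension, with the strata
# randomly permuted across dimensions.
import numpy as np

def latin_hypercube(n, d, rng):
    samples = np.empty((n, d))
    for j in range(d):
        perm = rng.permutation(n)                   # random stratum order
        samples[:, j] = (perm + rng.random(n)) / n  # one draw per stratum
    return samples

rng = np.random.default_rng(4)
x = latin_hypercube(100, 2, rng)

# Example use: estimate E[f(X)] of a toy output function on uniform inputs.
f = lambda x: np.sin(2 * np.pi * x[:, 0]) * x[:, 1] ** 2
print("LHS estimate of E[f]:", f(x).mean())
```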

Structural reliability estimation using Monte Carlo simulation and Pearson's curves

  • Krakovski, Mikhail B.
    • Structural Engineering and Mechanics
    • /
    • v.3 no.3
    • /
    • pp.201-213
    • /
    • 1995
  • At present, Level 2 and importance sampling methods are the main tools used to estimate the reliability of structural systems. However, applying these techniques to realistic problems sometimes involves certain difficulties. To overcome them, it is suggested to use Monte Carlo simulation in combination with two other techniques, extreme value and tail entropy approximations; an appropriate Pearson's curve is fitted to represent the simulation results. On the basis of this approach, an algorithm and computer program for structural reliability estimation are developed. A number of specially chosen numerical examples are considered with the aim of checking the accuracy of the approach and comparing it with the Level 2 and importance sampling methods. The field of application of the approach is identified.
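
The general workflow, simulating the response, matching moments, and reading a tail probability from a fitted curve, can be sketched as follows. scipy's pearson3 (one member of the Pearson family) stands in for full Pearson-system curve selection, and the limit-state function is a toy example.

```python
# Minimal sketch: crude Monte Carlo on a toy safety margin, then a Pearson
# type III curve matched to the sample mean, standard deviation and skewness,
# with the failure probability read from the fitted curve's lower tail.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
n = 10_000
x1 = rng.normal(10, 1, n)          # hypothetical resistance
x2 = rng.lognormal(1, 0.2, n)      # hypothetical load effect
g = x1 - 2.5 * x2                  # safety margin; failure when g < 0

skew = stats.skew(g)
fitted = stats.pearson3(skew, loc=g.mean(), scale=g.std())

print("direct MC estimate :", np.mean(g < 0))
print("fitted-curve tail  :", fitted.cdf(0.0))
```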

The Reliability-Based Probabilistic Structural Analysis for the Composite Tail Plane Structures (복합재 미익 구조의 신뢰성 기반 확률론적 구조해석)

  • Lee, Seok-Je;Kim, In-Gul
    • Journal of the Korea Institute of Military Science and Technology
    • /
    • v.15 no.1
    • /
    • pp.93-100
    • /
    • 2012
  • In this paper, a deterministic optimal design of a tail plane made of composite materials is carried out under a deterministic loading condition and compared with that of metallic materials. Next, a reliability analysis with five random variables, the loading and the material properties of the unidirectional prepreg, is conducted to examine the probability of failure of the deterministic optimal design. MATLAB programming is used for the reliability analysis, combined with FEA software (COMSOL) for the structural analysis. The laminated composite is treated as an equivalent orthotropic material using classical laminated plate theory. Response surface methodology and an importance sampling technique are adopted to reduce the computational cost while maintaining accuracy in the reliability analysis. As a result, the structural weight of the composite design is lower than that of the metallic one in the deterministic optimal design; however, the probability of failure of the deterministically optimized tail plane structure is too high to be neglected. The sensitivity of each variable is also estimated using probabilistic sensitivity analysis to determine which variables are most critical to failure. The computational cost is considerably reduced when response surface methodology and importance sampling are used. Further work on computationally inexpensive methods for reliability-based design optimization will be necessary.
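
The response-surface idea of replacing the expensive finite element model with a cheap surrogate can be sketched as below; the quadratic limit-state function stands in for the COMSOL analysis, and the random variables are illustrative, not the five variables used in the paper.

```python
# Minimal sketch of the response-surface idea: fit a quadratic surrogate to a
# small design of experiments, then run cheap Monte Carlo on the surrogate
# instead of the expensive FE model. The "expensive" function is a stand-in.
import numpy as np

rng = np.random.default_rng(6)

def expensive_limit_state(x):            # stand-in for the FE analysis
    return 1.5 - x[:, 0] ** 2 / 4 - 0.3 * x[:, 0] * x[:, 1] - 0.2 * x[:, 1]

def quad_features(x):                    # [1, x1, x2, x1^2, x2^2, x1*x2]
    return np.column_stack([np.ones(len(x)), x[:, 0], x[:, 1],
                            x[:, 0] ** 2, x[:, 1] ** 2, x[:, 0] * x[:, 1]])

# Small DoE of "expensive" runs; least-squares fit of the quadratic surface.
doe = rng.normal(0, 1, (30, 2))
coef, *_ = np.linalg.lstsq(quad_features(doe), expensive_limit_state(doe),
                           rcond=None)

# Cheap Monte Carlo on the surrogate only.
x = rng.normal(0, 1, (200_000, 2))
g_hat = quad_features(x) @ coef
print("estimated failure probability:", np.mean(g_hat < 0))
```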

Study on the Effect of Training Data Sampling Strategy on the Accuracy of the Landslide Susceptibility Analysis Using Random Forest Method (Random Forest 기법을 이용한 산사태 취약성 평가 시 훈련 데이터 선택이 결과 정확도에 미치는 영향)

  • Kang, Kyoung-Hee;Park, Hyuck-Jin
    • Economic and Environmental Geology
    • /
    • v.52 no.2
    • /
    • pp.199-212
    • /
    • 2019
  • In machine learning techniques, the sampling strategy for the training data affects the performance of the prediction model, including its generalization ability as well as its prediction accuracy. In landslide susceptibility analysis in particular, the data sampling procedure is an essential step in setting up the training data because the number of non-landslide points is much larger than the number of landslide points. However, previous studies did not consider different sampling methods for the training data; that is, they selected the training data randomly. Therefore, in this study the authors propose several different sampling methods and assess the effect of the training data sampling strategy on landslide susceptibility analysis. Six different scenarios were set up based on the sampling strategies for landslide and non-landslide points. The Random Forest technique was then trained for each of the six scenarios, and the attribute importance of each input variable was evaluated. Subsequently, landslide susceptibility maps were produced using the input variables and their attribute importances. The AUC values of the landslide susceptibility maps obtained from the six sampling strategies showed high prediction rates, ranging from 70% to 80%. This means that the Random Forest technique shows appropriate predictive performance, and that the attribute importances of the input variables obtained from Random Forest can be used as weights for the landslide conditioning factors in the susceptibility analysis. In addition, the results obtained using specific sampling strategies for the training data show higher prediction accuracy than those obtained with the previous random sampling approach.
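
The shape of the experiment, training a random forest under different non-landslide sampling strategies and comparing AUC and attribute importances, might look roughly like the following sketch on synthetic data; it does not use the study's landslide inventory or conditioning factors.

```python
# Minimal sketch: train a random forest under two different non-landslide
# sampling ratios and compare AUC and feature importances. Data are synthetic.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, n_features=8, weights=[0.95, 0.05],
                           random_state=0)          # rare "landslide" class
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

rng = np.random.default_rng(7)
pos = np.where(y_tr == 1)[0]                        # landslide points
neg = np.where(y_tr == 0)[0]                        # non-landslide points

for ratio in (1, 3):                                # non-landslide : landslide
    idx = np.concatenate([pos, rng.choice(neg, ratio * len(pos), replace=False)])
    rf = RandomForestClassifier(n_estimators=200, random_state=0)
    rf.fit(X_tr[idx], y_tr[idx])
    auc = roc_auc_score(y_te, rf.predict_proba(X_te)[:, 1])
    print(f"ratio 1:{ratio}  AUC={auc:.3f}  "
          f"importances={np.round(rf.feature_importances_, 2)}")
```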

Efficient Performance Evaluation Method for Digital Satellite Broadcasting Channels (효율적인 디지틀 위성방송채널 성능평가 기법)

  • 정창봉;김준명;김용섭;황인관
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.25 no.6A
    • /
    • pp.794-801
    • /
    • 2000
  • In this paper, a new, efficient performance evaluation method for digital communication channels is suggested, and its efficiency in terms of simulation run time is verified for a digital satellite broadcasting TV channel. To overcome the difficulties of existing importance sampling (IS) techniques, the new method adopts a discrete probability mass function (PMF), estimated from the measured Nth-order central moments, to characterize the statistical properties of the received signal. From this discrete PMF, obtained with fewer received-signal samples than the IS technique requires, the continuous cumulative probability function and its inverse are estimated accurately by interpolation and extrapolation. The overall channel is then simplified into an encoding block, an inner-channel performance degradation modeling block, which is modeled as a uniform random number generator (URNG) concatenated with the inverse cumulative probability distribution function, and a decoding block. With this simplified channel model, the overall performance evaluation can be completed in a drastically reduced time. Simulation results for the nonlinear digital satellite broadcasting TV channel show the great efficiency of the algorithm in terms of computer run time and demonstrate that the existing problems of IS for nonlinear satellite channels with coding and M-dimensional memory can be completely solved.
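
The core trick, estimating the received-signal distribution from a modest number of samples and then generating further samples through an interpolated inverse CDF driven by a uniform random number generator, can be sketched as follows. The nonlinear "channel" below is a toy stand-in, not the satellite broadcasting channel model.

```python
# Minimal sketch: build an empirical CDF of the channel-distorted signal from
# a modest number of samples, invert it by interpolation, and then generate
# further samples cheaply from a uniform RNG through that inverse CDF.
import numpy as np

rng = np.random.default_rng(8)

# A small batch of "measured" received-signal samples through a toy
# nonlinear channel with additive noise.
tx = rng.choice([-1.0, 1.0], size=5000)
rx = np.tanh(1.2 * tx) + rng.normal(0, 0.3, tx.size)

# Empirical CDF and its inverse via linear interpolation.
xs = np.sort(rx)
cdf = (np.arange(1, xs.size + 1) - 0.5) / xs.size

def inverse_cdf(u):
    return np.interp(u, cdf, xs)

# Cheap generation: uniform RNG -> inverse CDF, no channel simulation needed.
u = rng.random(1_000_000)
synthetic_rx = inverse_cdf(u)
print("measured mean/std :", rx.mean(), rx.std())
print("synthetic mean/std:", synthetic_rx.mean(), synthetic_rx.std())
```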

Comparison of Importance Weights for Regression Model and AHP: A Case of Students' Satisfaction with University (회귀모형과 AHP의 가중치에 대한 비교 연구: 대학생의 학교 만족도를 대상으로)

  • Jong Hun Park
    • Journal of Korean Society of Industrial and Systems Engineering
    • /
    • v.45 no.4
    • /
    • pp.118-126
    • /
    • 2022
  • This study compares AHP (Analytic Hierarchy Process), in which the importance weights are constructed from individual subjective judgments, with a regression model whose importance weights are based on statistical theory, for determining the importance weights of a causal model. The causal model is designed for students' satisfaction with their university, and the SERVQUAL methodology is applied to derive the factors affecting that satisfaction. From the comparison of the importance weights of the regression model and AHP, the following characteristics are observed: 1) the lower the satisfaction with a factor, the higher its AHP importance weight; 2) the AHP importance weight tends to decrease as the standard deviation (or p-value) increases. A second sampling is conducted to double-check these observations. This study empirically confirms that the AHP importance weight is related to the mean and standard deviation (or p-value) of the independent variables, but it cannot reveal exactly what that relationship is. Further research with a long-term perspective is needed to clarify it.
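
The two weighting schemes being compared can be illustrated side by side: AHP weights from the principal eigenvector of a pairwise comparison matrix, and regression-based weights from normalized coefficients. The comparison matrix and survey-like data below are hypothetical.

```python
# Minimal sketch of the two weighting schemes: AHP weights from the principal
# eigenvector of a reciprocal pairwise comparison matrix, versus weights from
# absolute, normalized regression coefficients. All data are hypothetical.
import numpy as np

# AHP: reciprocal pairwise comparison matrix for three factors.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])
eigvals, eigvecs = np.linalg.eig(A)
v = np.abs(np.real(eigvecs[:, np.argmax(np.real(eigvals))]))
ahp_weights = v / v.sum()

# Regression: fit satisfaction on three standardized factors.
rng = np.random.default_rng(9)
X = rng.normal(size=(200, 3))
y = X @ np.array([0.6, 0.3, 0.1]) + rng.normal(0, 0.5, 200)
beta, *_ = np.linalg.lstsq(np.column_stack([np.ones(200), X]), y, rcond=None)
reg_weights = np.abs(beta[1:]) / np.abs(beta[1:]).sum()

print("AHP weights       :", np.round(ahp_weights, 3))
print("regression weights:", np.round(reg_weights, 3))
```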