• Title/Summary/Keyword: sampling methods

Search Results: 3,080

Experimental Study on the Consolidation Characteristics of Kwang-Yang Clay by Large Block Sampling (대형자연시료를 이용한 광양점토의 압밀특성에 관한 실험적 연구)

  • Kim, Jong-Kook;Yu, Seong-Jin;Chae, Young-Su
    • Proceedings of the Korean Geotechnical Society Conference / 2005.10a / pp.429-436 / 2005
  • In this study, the extent of sampling disturbance and its effect on the consolidation characteristics of Kwang-Yang clay were compared through consolidation tests. The influence of sampling disturbance on the consolidation behavior of soft clay was investigated using soil samples obtained by large block sampling and by piston sampling. The experiments show that the consolidation parameters of the large block samples (Pc, Cc, Cv) are considerably larger than those of the piston samples. The large block samples taken with the large-size sampler are also of higher quality than the piston samples, since they suffer less sampling disturbance. Comparing the consolidation parameters across test methods, the large-size consolidation test gave the largest values, and the CRS test proved better suited than the standard consolidation test for determining appropriate parameters.


Estimation Methods for Population Pharmacokinetic Models using Stochastic Sampling Approach (확률적 표본추출 방법을 이용한 집단 약동학 모형의 추정과 검증에 관한 고찰)

  • Kim, Kwang-Hee;Yoon, Jeong-Hwa;Lee, Eun-Kyung
    • The Korean Journal of Applied Statistics / v.28 no.2 / pp.175-188 / 2015
  • This study concerns estimation methods for population pharmacokinetic and pharmacodynamic models. These are nonlinear mixed effects models, and the nonlinearity makes parameter estimation difficult. We examine the theoretical background of the various estimation methods provided by NONMEM, the most widely used software in the pharmacometrics area, focusing on the methods based on stochastic sampling: IMP, IMPMAP, SAEM, and BAYES. The SAEM method showed the best performance, with IMPMAP and BAYES performing slightly worse. The major obstacle to stochastic sampling approaches is the running time needed to reach a solution; we propose obtaining more precise initial values with the ITS method in order to shorten it.
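
As an illustration of the stochastic-sampling idea behind methods such as IMP, here is a crude Monte Carlo EM sketch for a one-compartment PK model. This is not NONMEM itself; the model, the proposal (the current prior), and the names `k_pop`, `omega`, `sigma` are assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def conc(k, t, dose=100.0, vol=10.0):
    """One-compartment model: C(t) = (dose/vol) * exp(-k*t)."""
    return dose / vol * np.exp(-k * t)

# Simulated data: 20 subjects, individual elimination rates around 0.3
t_obs = np.array([0.5, 1.0, 2.0, 4.0, 8.0])
true_k = np.exp(np.log(0.3) + 0.2 * rng.standard_normal(20))
y = conc(true_k[:, None], t_obs) + 0.3 * rng.standard_normal((20, len(t_obs)))

log_k_pop, omega, sigma = np.log(0.5), 0.5, 0.3    # crude initial values
for _ in range(50):                                # EM iterations
    e_mean, e_sq = [], []
    for yi in y:                                   # E-step, subject by subject
        eta = omega * rng.standard_normal(500)     # proposal = current prior
        resid = yi - conc(np.exp(log_k_pop + eta)[:, None], t_obs)
        logw = -0.5 * np.sum(resid**2, axis=1) / sigma**2  # importance log-weights
        w = np.exp(logw - logw.max())
        w /= w.sum()
        e_mean.append(np.sum(w * eta))             # posterior mean of eta_i
        e_sq.append(np.sum(w * eta**2))            # posterior second moment
    log_k_pop += np.mean(e_mean)                   # M-step: shift population mean
    omega = np.sqrt(np.mean(e_sq))                 # M-step: update variability

print(f"estimated k_pop = {np.exp(log_k_pop):.3f} (true value 0.3)")
```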

Experimental Analysis of Equilibrization in Binary Classification for Non-Image Imbalanced Data Using Wasserstein GAN

  • Wang, Zhi-Yong;Kang, Dae-Ki
    • International Journal of Internet, Broadcasting and Communication / v.11 no.4 / pp.37-42 / 2019
  • In this paper, we examine three classic data augmentation methods and two oversampling methods based on generative models. The three classic methods are random sampling (RANDOM), the Synthetic Minority Over-sampling Technique (SMOTE), and Adaptive Synthetic Sampling (ADASYN). The two generative methods use a Conditional Generative Adversarial Network (CGAN) and a Wasserstein Generative Adversarial Network (WGAN). In imbalanced data, the instances are divided into a majority class, which occupies most of the training set, and a minority class, which includes only a few instances. Generative models have an advantage here because they can produce more plausible samples that follow the distribution of the minority class; we include CGAN to compare its augmentation performance with the other methods. The experimental results show that WGAN-based oversampling is more stable than the other approaches (RANDOM, SMOTE, ADASYN, and CGAN), even with very limited training data. However, when the imbalance ratio is too small, the generative approaches cannot outperform the conventional data augmentation techniques. These results suggest directions for future research.
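
For reference, a minimal from-scratch sketch of SMOTE, the interpolation-based oversampler named above; in practice one would usually call a library implementation, and the neighbor count and toy data here are illustrative.

```python
import numpy as np

def smote(X_min, n_new, k=5, rng=None):
    """Generate n_new synthetic minority samples by interpolating between
    each chosen sample and one of its k nearest minority-class neighbors."""
    rng = rng or np.random.default_rng()
    # pairwise distances within the minority class
    d = np.linalg.norm(X_min[:, None, :] - X_min[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)
    nn = np.argsort(d, axis=1)[:, :k]          # k nearest neighbors per sample
    base = rng.integers(0, len(X_min), n_new)  # random base samples
    neigh = nn[base, rng.integers(0, k, n_new)]
    gap = rng.random((n_new, 1))               # interpolation factor in [0, 1)
    return X_min[base] + gap * (X_min[neigh] - X_min[base])

# Usage: grow a toy 2-D minority class of 20 points by 80 synthetic points
X_min = np.random.default_rng(1).normal(size=(20, 2))
X_syn = smote(X_min, n_new=80)
print(X_syn.shape)  # (80, 2)
```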

A Case Study on the Target Sampling Inspection for Improving Outgoing Quality (타겟 샘플링 검사를 통한 출하품질 향상에 관한 사례 연구)

  • Kim, Junse;Lee, Changki;Kim, Kyungnam;Kim, Changwoo;Song, Hyemi;Ahn, Seoungsu;Oh, Jaewon;Jo, Hyunsang;Han, Sangseop
    • Journal of Korean Society for Quality Management / v.49 no.3 / pp.421-431 / 2021
  • Purpose: To improve outgoing quality, this study presents a novel sampling framework based on predictive analytics. Methods: The proposed framework consists of three steps. The first step is variable selection, in which knowledge-based and data-driven approaches are employed to select important variables. The second step is model learning, where supervised classification, anomaly detection, and rule-based methods are considered. The third step is model application, which covers all of the processes needed for real-time prediction: each prediction model classifies a product as either a target sample or a random sample, and intensive quality inspections are then executed on the specified target samples. Results: The inspection data of three Samsung products (mobile, TV, refrigerator) are used to check for functional defects with the proposed method. The results demonstrate that target sampling is more effective and efficient than random sampling. Conclusion: The results show that the proposed method can efficiently detect the products in a lot that are likely to exhibit user-perceived defects. Our study can also guide practitioners on how to detect defective products easily using stratified sampling.
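
A minimal sketch of the scoring step of such a framework, assuming a supervised classifier; the features, defect model, classifier choice, and the 5% target fraction are illustrative assumptions, not the authors' configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)

# Hypothetical historical inspection data: 5 process features, ~2% defect rate
X_hist = rng.normal(size=(5000, 5))
y_hist = (X_hist[:, 0] + 0.5 * X_hist[:, 1] + rng.normal(0, 1, 5000) > 3.0).astype(int)

model = RandomForestClassifier(n_estimators=200, class_weight="balanced",
                               random_state=0)
model.fit(X_hist, y_hist)                 # model learning (step 2)

# Applying the model (step 3): score a new lot and flag the riskiest 5%
X_lot = rng.normal(size=(1000, 5))
risk = model.predict_proba(X_lot)[:, 1]   # estimated defect probability
target_idx = np.argsort(risk)[-int(0.05 * len(X_lot)):]
print(f"{len(target_idx)} products routed to intensive inspection")
```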

A Study on Methods of Quality Check for Digital Basemaps using Statistical Methods for the Quality Control (통계적 품질관리기법을 도입한 수치지도의 검수방법에 관한 연구)

  • 김병국;서현덕
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography / v.17 no.1 / pp.79-86 / 1999
  • In this study, we investigated and proposed effective methods of quality checking for digital basemaps, applying statistical quality control techniques. To improve on the current quality-check procedure (1-stage complete sampling), we proposed 2-stage complete sampling and 2-stage cluster sampling methods. Using simulated data on all delivered digital basemaps, we estimated the error rate and the number of omitted objects along with their variances, which allowed us to determine confidence intervals for both quantities.
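
A minimal sketch of how a 2-stage cluster-sampling estimate of an error rate might look, with map sheets as first-stage clusters; the counts and the simplified variance formula (first-stage term only) are assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stage 1: sample n of N map sheets; Stage 2: inspect m objects per sheet
N, n, m = 200, 10, 50
sheets = rng.integers(0, N, size=n)

# Hypothetical inspection: each sampled object is erroneous (1) or not (0)
errors = rng.random((n, m)) < 0.03          # ~3% true error rate
p_sheet = errors.mean(axis=1)               # per-sheet error rates

p_hat = p_sheet.mean()                      # overall estimated error rate
# Standard error from between-sheet variation (first-stage term dominates)
se = np.sqrt(p_sheet.var(ddof=1) / n)
print(f"error rate = {p_hat:.3f} +/- {1.96 * se:.3f} (95% CI)")
```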


Adaptive kernel method for evaluating structural system reliability

  • Wang, G.S.;Ang, A.H.S.;Lee, J.C.
    • Structural Engineering and Mechanics / v.5 no.2 / pp.115-126 / 1997
  • Importance sampling methods have been developed with the aim of reducing the computational costs inherent in Monte Carlo methods. This study proposes a new algorithm, called the adaptive kernel method, which combines and modifies concepts from adaptive sampling and the simple kernel method to evaluate the structural reliability of time-variant problems. The essence of the resulting algorithm is to select an appropriate starting point from which the importance sampling density can be generated efficiently. Numerical results show that the method is unbiased and substantially more efficient than other methods.
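
A minimal sketch of the underlying importance-sampling idea, with the sampling density recentered near the failure region; the limit-state function and shift point are illustrative and are not the paper's adaptive kernel scheme.

```python
import numpy as np

rng = np.random.default_rng(0)

def g(x):
    """Toy limit-state function in standard normal space: failure when g <= 0."""
    return 4.0 - x[:, 0] - x[:, 1]

n = 20_000
center = np.array([2.0, 2.0])        # sampling density centered near design point
x = center + rng.standard_normal((n, 2))

# Importance weights: standard normal density / shifted sampling density
logw = -0.5 * (x**2).sum(axis=1) + 0.5 * ((x - center)**2).sum(axis=1)
pf = np.mean((g(x) <= 0) * np.exp(logw))

print(f"importance-sampling estimate pf = {pf:.2e}")
# Exact answer: P[X1 + X2 >= 4] = 1 - Phi(4/sqrt(2)) ~ 2.3e-3
```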

Efficient Markov Chain Monte Carlo for Bayesian Analysis of Neural Network Models

  • Paul E. Green;Changha Hwang;Lee, Sangbock
    • Journal of the Korean Statistical Society / v.31 no.1 / pp.63-75 / 2002
  • Most attempts at Bayesian analysis of neural networks involve hierarchical modeling. We believe that similar results can be obtained with simpler models that require less computational effort, as long as appropriate restrictions are placed on parameters to ensure the propriety of posterior distributions. In particular, we adopt a model first introduced by Lee (1999) that utilizes an improper prior for all parameters. Straightforward Gibbs sampling is possible, with the exception of the bias parameters, which are embedded in nonlinear sigmoidal functions. In addition to the problems posed by nonlinearity, the difficulty of sampling directly from the posterior distributions of the bias parameters is compounded by the duplication of hidden nodes, which is a source of multimodality. In this regard, we focus on sampling from the marginal posterior distribution of the bias parameters with Markov chain Monte Carlo methods that combine traditional Metropolis sampling with the slice sampler described by Neal (1997, 2001). The methods are illustrated with data examples that are largely confined to the analysis of nonparametric regression models.
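
A minimal sketch of a Neal-style slice sampler with stepping-out and shrinkage, the kind of univariate update referenced above; the bimodal target stands in for a bias parameter's multimodal marginal posterior and is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_target(x):
    """Unnormalized log density: mixture of two normals (bimodal)."""
    return np.logaddexp(-0.5 * (x - 2) ** 2, -0.5 * (x + 2) ** 2)

def slice_sample(x, w=1.0, max_steps=50):
    """One slice-sampling update for a scalar variable."""
    logy = log_target(x) + np.log(rng.random())   # slice level under the density
    lo = x - w * rng.random()                     # randomly place initial interval
    hi = lo + w
    steps = max_steps
    while log_target(lo) > logy and steps > 0:    # step out to the left
        lo -= w; steps -= 1
    steps = max_steps
    while log_target(hi) > logy and steps > 0:    # step out to the right
        hi += w; steps -= 1
    while True:                                   # sample with shrinkage
        x_new = rng.uniform(lo, hi)
        if log_target(x_new) > logy:
            return x_new
        if x_new < x:
            lo = x_new
        else:
            hi = x_new

xs = [0.0]
for _ in range(5000):
    xs.append(slice_sample(xs[-1]))
print(f"sample mean ~ {np.mean(xs):.2f} (target mean 0)")
```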

Critical Review on Evaporative Loss of Semivolatile Aerosols during Sampling

  • Kim, Seung-Won
    • Journal of Environmental Health Sciences / v.36 no.3 / pp.171-181 / 2010
  • Semivolatile aerosols exist simultaneously as vapor and particles at room temperature, and each phase has different intake and uptake mechanisms. This characteristic requires substantial consideration during exposure assessment of semivolatile aerosols. Some sampling methods for solid particles carry a high possibility of evaporative loss during sampling. Therefore, when establishing a sampling strategy, the factors affecting the phase distribution should be taken into account, including the semivolatile aerosol of interest and the sampling method used. Evaluation of the phase distributions of semivolatile aerosols is also recommended. Metalworking fluids, pesticides, asphalt fumes, diesel exhaust, and environmental tobacco smoke are common health-related semivolatile aerosols in workplaces.

Comparison of Two Time-Homogeneous Poisson Processes Using Inverse-Type Sampling Plans (역샘플링법을 이용한 포와슨과정의 비교)

  • 장중순;임춘우;정유진
    • Journal of Korean Society of Industrial and Systems Engineering / v.11 no.17 / pp.67-80 / 1988
  • This study is concerned with the comparison of two time-homogeneous Poisson processes. Traditionally, tests of the equality of Poisson processes were based on the binomial distribution or its normal approximation, with sampling plans that observe the processes concurrently over a predetermined time interval, possibly different for each process. However, when the intensities of the processes are small, inverse-type sampling plans are more appropriate, since only a few or even no events may be observed within a predetermined interval. This study considers nine inverse-type sampling plans for comparing two Poisson processes. For each sampling plan considered, the critical regions and design parameters are determined so as to guarantee the significance level and the power at given values of the alternative hypothesis. The problem of comparing two Weibull processes is also considered.
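
A minimal sketch of one inverse-type plan: observe each process until r events occur, then compare the recorded waiting times through an F statistic. The event count and intensities are illustrative, and this is only one of the nine plans the paper treats.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

def waiting_time(lam, r):
    """Time until the r-th event of a homogeneous Poisson process."""
    return rng.exponential(1.0 / lam, size=r).sum()

r = 20                       # stop after 20 events in each process
t1 = waiting_time(0.5, r)    # process 1, intensity 0.5
t2 = waiting_time(1.0, r)    # process 2, intensity 1.0

# Since 2*lam*T ~ chi2(2r), T1/T2 ~ F(2r, 2r) under H0: lam1 = lam2
f_stat = t1 / t2
p_value = 2 * min(stats.f.cdf(f_stat, 2 * r, 2 * r),
                  stats.f.sf(f_stat, 2 * r, 2 * r))
print(f"F = {f_stat:.2f}, two-sided p = {p_value:.3f}")
```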


Analysis of inconsistent source sampling in Monte Carlo weight-window variance reduction methods

  • Griesheimer, David P.;Sandhu, Virinder S.
    • Nuclear Engineering and Technology / v.49 no.6 / pp.1172-1180 / 2017
  • The application of Monte Carlo (MC) to large-scale fixed-source problems has recently become possible with new hybrid methods that automate generation of parameters for variance reduction techniques. Two common variance reduction techniques, weight windows and source biasing, have been automated and popularized by the consistent adjoint-driven importance sampling (CADIS) method. This method uses the adjoint solution from an inexpensive deterministic calculation to define a consistent set of weight windows and source particles for a subsequent MC calculation. One of the motivations for source consistency is to avoid the splitting or rouletting of particles at birth, which requires computational resources. However, it is not always possible or desirable to implement such consistency, which results in inconsistent source biasing. This paper develops an original framework that mathematically expresses the coupling of the weight window and source biasing techniques, allowing the authors to explore the impact of inconsistent source sampling on the variance of MC results. A numerical experiment supports this new framework and suggests that certain classes of problems may be relatively insensitive to inconsistent source sampling schemes with moderate levels of splitting and rouletting.
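
A minimal sketch of a weight-window check applied to a single particle's weight, splitting heavy particles and rouletting light ones; the window bounds and survival weight are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def apply_weight_window(weight, w_lo=0.5, w_hi=2.0, w_survive=1.0):
    """Return the list of post-window weights for one incoming particle."""
    if weight > w_hi:                      # split into n near-unit-weight copies
        n = int(np.ceil(weight / w_survive))
        return [weight / n] * n
    if weight < w_lo:                      # Russian roulette
        if rng.random() < weight / w_survive:
            return [w_survive]             # survives with boosted weight
        return []                          # killed
    return [weight]                        # inside the window: unchanged

# Total weight is preserved in expectation, whatever the incoming weight:
for w in (0.1, 1.0, 5.0):
    mean_total = np.mean([sum(apply_weight_window(w)) for _ in range(100_000)])
    print(f"incoming {w:.1f} -> mean outgoing {mean_total:.3f}")
```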