• Title/Summary/Keyword: statistical methods


A Study on Difference-Based Estimation of Variance in Nonparametric Regression Models (비모수 회귀모형의 차분에 기저한 분산의 추정에 대한 고찰)

  • 김종태
    • Communications for Statistical Applications and Methods
    • /
    • v.5 no.1
    • /
    • pp.121-131
    • /
    • 1998
  • The purpose of this paper is to compare and analyze existing estimators that use difference-based methods for estimating the error variance in nonparametric regression models. In particular, we point out problems with the HKT estimator of Hall, Kay, and Titterington (1990), which is based on asymptotically optimal second-order differences, and use simulation to compare the rates of convergence to the parameter for the HKT, GSJS, and Rice estimators. We also establish the consistency and rate of convergence of the GSJS estimator.

  • PDF
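The Rice estimator compared in this abstract has a particularly compact form. Below is a minimal sketch of a first-order difference-based variance estimator in the spirit of Rice (1984), assuming sorted design points and a smooth mean function; it is an illustration of the general idea, not code from the paper:

```python
import numpy as np

def rice_variance(y):
    """First-order difference-based estimator of the error variance in a
    nonparametric regression model y_i = f(x_i) + e_i, in the spirit of
    Rice (1984). Assumes the x_i are sorted and f is smooth, so that
    successive differences cancel the trend and leave mostly noise."""
    y = np.asarray(y, dtype=float)
    d = np.diff(y)                          # y_i - y_{i-1}
    return np.sum(d ** 2) / (2 * (len(y) - 1))

# quick check on simulated data: smooth trend plus noise with variance 0.25
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 500)
y = np.sin(2 * np.pi * x) + rng.normal(0, 0.5, x.size)
print(rice_variance(y))
```

Because the trend contributes only O(n^-2) to each squared difference, the estimate is close to the true error variance even without estimating f.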

Computational Methods for Detection of Multiple Outliers in Nonlinear Regression

  • Myung-Wook Kahng
    • Communications for Statistical Applications and Methods
    • /
    • v.3 no.2
    • /
    • pp.1-11
    • /
    • 1996
  • The detection of multiple outliers in nonlinear regression models can be computationally infeasible. As a compromise, we consider the use of a simulated annealing algorithm, an approximate approach to combinatorial optimization. We show that this method ensures convergence and works well in locating multiple outliers while reducing computational time.

  • PDF
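To make the combinatorial search concrete, here is a toy simulated-annealing sketch for locating a fixed number of outliers, using a linear least-squares fit as a stand-in objective (the paper treats the nonlinear case; the move/accept/cool structure is the generic algorithm, not the authors' implementation):

```python
import numpy as np

def sa_outliers(y, X, k, iters=2000, seed=0):
    """Simulated-annealing search for the k observations whose removal most
    reduces the residual sum of squares of a least-squares fit. Illustrative
    stand-in for the combinatorial search described in the abstract."""
    rng = np.random.default_rng(seed)
    n = len(y)

    def rss(excluded):
        keep = np.setdiff1d(np.arange(n), excluded)
        beta, *_ = np.linalg.lstsq(X[keep], y[keep], rcond=None)
        r = y[keep] - X[keep] @ beta
        return float(r @ r)

    cur = rng.choice(n, size=k, replace=False)
    cur_cost = rss(cur)
    best, best_cost = cur.copy(), cur_cost
    for t in range(iters):
        temp = 1.0 * (1 - t / iters) + 1e-6       # linear cooling schedule
        cand = cur.copy()                         # swap one candidate index
        cand[rng.integers(k)] = rng.choice(np.setdiff1d(np.arange(n), cur))
        c = rss(cand)
        # always accept downhill moves; accept uphill with prob exp(-dC/T)
        if c < cur_cost or rng.random() < np.exp((cur_cost - c) / temp):
            cur, cur_cost = cand, c
            if c < best_cost:
                best, best_cost = cand.copy(), c
    return np.sort(best)

# demo: 40 points on a line, with observations 5 and 20 shifted far off it
rng = np.random.default_rng(1)
x = np.linspace(0, 1, 40)
y = 2 * x + rng.normal(0, 0.05, 40)
y[[5, 20]] += 3.0
X = np.column_stack([np.ones_like(x), x])
print(sa_outliers(y, X, k=2))
```

The occasional uphill acceptance is what lets the search escape masking effects that defeat greedy one-at-a-time deletion.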

P-value calculation methods for semi-partial correlation coefficients

  • Kim, Seongho
    • Communications for Statistical Applications and Methods
    • /
    • v.29 no.3
    • /
    • pp.397-402
    • /
    • 2022
  • The mathematical expression for the p-value calculation of the semi-partial correlation coefficient differs between Kim (2015) and Cohen et al. (2003). These two expressions were compared, and the advantages of the approach of Kim (2015) over that of Cohen et al. (2003) were discussed.
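The two p-value recipes can be contrasted in a few lines. The sketch below uses the formulas as commonly attributed to the cited references (a regression-based t for Cohen et al. (2003) and a t built from the semi-partial correlation itself for Kim (2015)); these attributions are my reading, so verify against the originals before serious use:

```python
import numpy as np
from scipy import stats

def spcor_pvalues(y, x, z):
    """Semi-partial correlation of y with x (z partialled out of x only),
    with two candidate p-value formulas. The exact expressions are sketched
    from the abstract's citations and should be checked against them."""
    n, g = len(y), 1                        # g = number of control variables
    # residualize x on z (with intercept) to isolate the unique part of x
    Z = np.column_stack([np.ones(n), z])
    x_res = x - Z @ np.linalg.lstsq(Z, x, rcond=None)[0]
    sr = np.corrcoef(y, x_res)[0, 1]        # semi-partial (part) correlation
    # full-model R^2 of y on (x, z)
    XZ = np.column_stack([np.ones(n), x, z])
    yhat = XZ @ np.linalg.lstsq(XZ, y, rcond=None)[0]
    r2 = 1 - np.sum((y - yhat) ** 2) / np.sum((y - y.mean()) ** 2)
    df = n - g - 2
    t_cohen = sr * np.sqrt(df) / np.sqrt(1 - r2)     # uses full-model R^2
    t_kim = sr * np.sqrt(df) / np.sqrt(1 - sr ** 2)  # uses sr itself
    p = lambda t: 2 * stats.t.sf(abs(t), df)
    return sr, p(t_cohen), p(t_kim)

rng = np.random.default_rng(2)
z = rng.normal(size=100)
x = z + rng.normal(size=100)
y = 0.5 * x + rng.normal(size=100)
sr, p_cohen, p_kim = spcor_pvalues(y, x, z)
```

Since the squared semi-partial correlation never exceeds the full-model R², the R²-based t is always at least as large in magnitude, so its p-value is never larger.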

Development of a High-Resolution Near-Surface Air Temperature Downscale Model (고해상도 지상 기온 상세화 모델 개발)

  • Lee, Doo-Il;Lee, Sang-Hyun;Jeong, Hyeong-Se;Kim, Yeon-Hee
    • Atmosphere
    • /
    • v.31 no.5
    • /
    • pp.473-488
    • /
    • 2021
  • A new physical/statistical diagnostic downscale model has been developed to improve near-surface air temperature forecasts. The model includes a series of physical and statistical correction methods that account for unresolved topographic and land-use effects as well as statistical bias errors in a low-resolution atmospheric model. Operational temperature forecasts of the Local Data Assimilation and Prediction System (LDAPS) were downscaled at 100 m resolution for three months and used to validate the model's physical and statistical correction methods and to compare its performance with the forecasts of the Korea Meteorological Administration Post-processing (KMAP) system. The validation results showed positive impacts of the unresolved topographic and urban effects (topographic height correction, valley cold-air-pool effect, mountain internal boundary layer formation effect, urban land-use effect) in complex terrain areas. In addition, the statistical bias correction of the LDAPS model was effective in reducing forecast errors of the near-surface temperatures. The new high-resolution downscale model showed better agreement with 584 Korean meteorological monitoring stations than the KMAP, supporting the importance of the new physical and statistical correction methods. The new physical/statistical diagnostic downscale model can be a useful tool for improving near-surface temperature forecasts and diagnostics over complex terrain areas.

Estimating survival distributions for two-stage adaptive treatment strategies: A simulation study

  • Vilakati, Sifiso;Cortese, Giuliana;Dlamini, Thembelihle
    • Communications for Statistical Applications and Methods
    • /
    • v.28 no.5
    • /
    • pp.411-424
    • /
    • 2021
  • Inference following two-stage adaptive designs (also known as two-stage randomization designs) with survival endpoints usually focuses on estimating and comparing survival distributions for the different treatment strategies. The aim is to identify the treatment strategy(ies) that leads to better survival of the patients. The objective of this study was to assess the performance of three commonly cited methods for estimating survival distributions in two-stage randomization designs. We review three non-parametric methods for estimating survival distributions in two-stage adaptive designs and compare their performance using simulation studies. The simulation studies show that the method based on the marginal mean model is badly affected by high censoring rates and low response rates. The other two methods, which are natural extensions of the Nelson-Aalen estimator and the Kaplan-Meier estimator, have similar performance. These two methods yield survival estimates that are less biased and more precise than those of the marginal mean model, even for small sample sizes. The weighted versions of the Nelson-Aalen and Kaplan-Meier estimators are less affected by high censoring rates and low response rates. The bias of the method based on the marginal mean model increases rapidly with the censoring rate compared to the other two methods. We apply the three methods to a leukemia clinical trial dataset and compare the results.
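The weighted estimators in this study share the product-limit form of the plain Kaplan-Meier estimator. A minimal unweighted sketch of that form (the adaptive-design versions add inverse-probability-of-treatment weights, which are omitted here):

```python
import numpy as np

def kaplan_meier(time, event):
    """Plain Kaplan-Meier survival estimate S(t) at each distinct event
    time. event == 1 marks an observed event, 0 marks censoring. The
    weighted versions discussed in the abstract replace the raw counts
    with weighted counts but keep this product-limit update."""
    time = np.asarray(time, dtype=float)
    event = np.asarray(event, dtype=int)
    s, surv, times = 1.0, [], []
    for t in np.unique(time[event == 1]):
        at_risk = np.sum(time >= t)                   # n_i: still at risk
        deaths = np.sum((time == t) & (event == 1))   # d_i: events at t
        s *= 1 - deaths / at_risk                     # product-limit update
        times.append(t)
        surv.append(s)
    return np.array(times), np.array(surv)

# tiny example: events at times 2 and 4, censoring at 3 and 5
t, s = kaplan_meier([2, 3, 4, 5], [1, 0, 1, 0])
print(t, s)   # S(2) = 3/4, S(4) = 3/4 * 1/2 = 0.375
```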

A Study on the Group Sequential Methods for Comparing Survival Distributions in Clinical Trials

  • Jae Won Lee
    • Communications for Statistical Applications and Methods
    • /
    • v.5 no.2
    • /
    • pp.459-475
    • /
    • 1998
  • In many clinical trials, we are interested in comparing the failure time distributions of different treatment groups. For ethical and economic reasons, clinical trials need to be monitored for early dramatic benefits or potential harmful effects. Prior knowledge, evolving knowledge, statistical considerations, medical judgment, and ethical principles are all involved in the decision to terminate a trial early, and thus the monitoring is usually carried out by an independent scientific committee. This paper reviews recently proposed group sequential testing procedures for clinical trials with survival data. Design considerations for such clinical trials are also discussed. This paper compares the characteristics of each of these methods and provides biostatisticians with guidelines for choosing the appropriate group sequential method in a given situation.

  • PDF
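A standard ingredient of such group sequential procedures is an alpha-spending function, which controls how much type-I error may be spent at each interim analysis. A sketch of the Lan-DeMets spending function that approximates O'Brien-Fleming boundaries, shown as a general illustration rather than any specific method from this review (computing the boundaries themselves requires the joint normal law of the sequential test statistics and is omitted):

```python
import numpy as np
from scipy import stats

def obf_spending(t, alpha=0.05):
    """Lan-DeMets alpha-spending function approximating O'Brien-Fleming
    boundaries: cumulative two-sided type-I error spent by information
    fraction t (0 < t <= 1). Spends almost nothing early, so early stopping
    requires very strong evidence."""
    t = np.asarray(t, dtype=float)
    z = stats.norm.ppf(1 - alpha / 2)
    return 2 * (1 - stats.norm.cdf(z / np.sqrt(t)))

# cumulative alpha spent at four equally spaced interim looks
print(np.round(obf_spending([0.25, 0.5, 0.75, 1.0]), 4))
```

At t = 1 the spent error equals the full alpha, so the overall type-I error rate is preserved regardless of how many interim looks are taken.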

Statistical Methods in Non-Inferiority Trials - A Focus on US FDA Guidelines -

  • Kang, Seung-Ho;Wang, So-Young
    • The Korean Journal of Applied Statistics
    • /
    • v.25 no.4
    • /
    • pp.575-587
    • /
    • 2012
  • The effect of a new treatment is typically proven by comparing the new treatment with placebo; as the number of active controls increases, however, the number of non-inferiority trials tends to grow proportionally. In a non-inferiority trial, a new treatment is approved by showing that it is not inferior to an active control; however, both additional assumptions and historical trials are needed to show, through the comparison of the new treatment with the active control in a non-inferiority trial, that the new treatment is more efficacious than a putative placebo. Two different methods of using the historical data are considered: the frequentist principle method and the meta-analytic method. This paper discusses these statistical methods and the different Type I error rates obtained through them.

Nonparametric Tests for 2×2 Cross-Over Design

  • Gee, Kyuhoon;Kim, Dongjae
    • Communications for Statistical Applications and Methods
    • /
    • v.19 no.6
    • /
    • pp.781-791
    • /
    • 2012
  • A 2×2 cross-over design is widely used in clinical trials to compare two drugs or medical treatments. Several statistical methods exist for this design, such as the methods of Hills and Armitage (1979) and Koch (1972). In this paper, we propose a nonparametric test for the 2×2 cross-over design based on a two-sample test suggested by Baumgartner et al. (1998). In addition, a Monte Carlo simulation study is conducted to compare the power of the proposed method with that of previous methods.

How are Bayesian and Non-Parametric Methods Doing a Great Job in RNA-Seq Differential Expression Analysis? : A Review

  • Oh, Sunghee
    • Communications for Statistical Applications and Methods
    • /
    • v.22 no.2
    • /
    • pp.181-199
    • /
    • 2015
  • In its short history, RNA-seq has become a revolutionary tool for directly decoding whole-genome expression profiles, covering differential expression at the gene, transcript, isoform, and exon-specific levels of quantification, as well as genetic and genomic mutations. The RNA-seq technique has been rapidly replacing arrays with sequencing-based experimental platforms, offering advantages such as the identification of alternative splicing and allele-specific expression. The remarkable characteristics of high-throughput large-scale expression profiles in RNA-seq lie in the read-count expression levels, the correlation structure among samples and genes, the number of genes being far larger than the sample size, differing sampling rates, and inevitable systematic RNA-seq biases. In this study, we comprehensively review how robust Bayesian and non-parametric methods outperform classical statistical approaches by explicitly incorporating such intrinsic RNA-seq-specific features through flexible and more appropriate assumptions and distributions in practice.

A Review of Dose Finding Methods and Theory

  • Cheung, Ying Kuen
    • Communications for Statistical Applications and Methods
    • /
    • v.22 no.5
    • /
    • pp.401-413
    • /
    • 2015
  • In this article, we review the statistical methods and theory for dose finding in early phase clinical trials, where the primary objective is to identify an acceptable dose for further clinical investigation. The dose finding literature was initially motivated by applications in phase I clinical trials, in which dose finding is often formulated as a percentile estimation problem. We present some important phase I methods and give an update on new theoretical developments since a recent review by Cheung (2010), with the aim of covering a broader class of dose finding problems and illustrating how the general dose finding theory may be applied to evaluate and improve a method. Specifically, we illustrate theoretical techniques with numerical results in the context of a phase I/II study that uses trinary toxicity/efficacy outcomes as the basis of dose finding.