• Title/Summary/Keyword: statistical approach


Conditional Bootstrap Methods for Censored Survival Data

  • Kim, Ji-Hyun
    • Journal of the Korean Statistical Society
    • /
    • v.24 no.1
    • /
    • pp.197-218
    • /
    • 1995
  • We first consider the random censorship model of survival analysis. Efron (1981) introduced two equivalent bootstrap methods for censored data. We propose a new bootstrap scheme, called Method 3, that acts conditionally on the censoring pattern when making inference about aspects of the unknown life-time distribution F. This article contains (a) a motivation for this refined bootstrap scheme; (b) a proof that the bootstrapped Kaplan-Meier estimator of F formed by Method 3 has the same limiting distribution as the one formed by Efron's approach; (c) a description of and report on simulation studies assessing the small-sample performance of Method 3; (d) an illustration on some Danish data. We also consider the model in which the survival times are censored by death times due to other causes and also by known fixed constants, and propose an appropriate bootstrap method for that model. This bootstrap method is a readily modified version of Method 3.
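
The abstract does not spell out Method 3 itself; as a point of reference, the sketch below implements the plain (unconditional) Efron-style bootstrap of the Kaplan-Meier estimator under random censorship, resampling (time, status) pairs with replacement. The simulated data, time point, and function names are illustrative assumptions, not the paper's setup.

```python
# Illustrative sketch only: Efron-style bootstrap of the Kaplan-Meier estimator
# under random censorship (resampling (time, status) pairs with replacement).
# This is NOT the conditional "Method 3" of the paper.
import numpy as np

def kaplan_meier(times, events):
    """Return (distinct event times, Kaplan-Meier survival estimates)."""
    order = np.argsort(times)
    t, d = times[order], events[order]
    uniq = np.unique(t[d == 1])
    surv, s = [], 1.0
    for u in uniq:
        at_risk = np.sum(t >= u)          # number still at risk just before u
        deaths = np.sum((t == u) & (d == 1))
        s *= 1.0 - deaths / at_risk       # product-limit update
        surv.append(s)
    return uniq, np.array(surv)

rng = np.random.default_rng(1)
n = 100
life = rng.exponential(scale=10.0, size=n)     # true lifetimes
cens = rng.exponential(scale=15.0, size=n)     # censoring times
obs = np.minimum(life, cens)
delta = (life <= cens).astype(int)             # 1 = event observed, 0 = censored

# Bootstrap the KM estimate of S(t0) at a fixed time point t0.
t0, B, stats = 8.0, 500, []
for _ in range(B):
    idx = rng.integers(0, n, size=n)           # resample (time, status) pairs
    ts, ss = kaplan_meier(obs[idx], delta[idx])
    stats.append(ss[ts <= t0][-1] if np.any(ts <= t0) else 1.0)
lo, hi = np.percentile(stats, [2.5, 97.5])
print(f"bootstrap 95% interval for S({t0}): ({lo:.3f}, {hi:.3f})")
```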


CONFIDENCE INTERVALS ON THE AMONG GROUP VARIANCE COMPONENT IN A REGRESSION MODEL WITH AN UNBALANCED ONE-FOLD NESTED ERROR STRUCTURE

  • Park, Dong-Joon
    • Proceedings of the Korean Statistical Society Conference
    • /
    • 2002.11a
    • /
    • pp.141-146
    • /
    • 2002
  • In this article we consider the problem of constructing confidence intervals for a linear regression model with nested error structure. A popular approach is the likelihood-based method employed by PROC MIXED of SAS. In this paper, we examine the ability of MIXED to produce confidence intervals that maintain the stated confidence coefficient. Our results suggest the intervals for the regression coefficients work well, but the intervals for the variance component associated with the primary level cannot be recommended. Accordingly, we propose alternative methods for constructing confidence intervals on the primary level variance component. Computer simulation is used to compare the proposed methods. A numerical example and SAS code are provided to demonstrate the methods.
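
The alternative intervals proposed in the paper are not reproduced here. As a generic point of comparison only, the sketch below fits an unbalanced one-fold nested error regression by REML with statsmodels and forms a parametric-bootstrap interval for the among-group variance component; all simulation settings and names are assumptions.

```python
# Hedged sketch: parametric-bootstrap CI for the among-group variance component
# in y_ij = b0 + b1*x_ij + u_i + e_ij with unbalanced groups.  This is a generic
# illustration, not the intervals proposed in the paper.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
groups = np.repeat(np.arange(15), rng.integers(2, 8, size=15))  # unbalanced design
x = rng.normal(size=groups.size)
u = rng.normal(scale=1.0, size=15)                 # among-group standard deviation = 1
y = 2.0 + 0.5 * x + u[groups] + rng.normal(scale=0.7, size=groups.size)
df = pd.DataFrame({"y": y, "x": x, "g": groups})

fit = smf.mixedlm("y ~ x", df, groups=df["g"]).fit(reml=True)
sigma_u2 = float(fit.cov_re.iloc[0, 0])            # REML estimate of among-group variance

boot = []                                          # parametric bootstrap (may emit
for _ in range(100):                               # convergence warnings on some draws)
    u_b = rng.normal(scale=np.sqrt(sigma_u2), size=15)
    y_b = (fit.fe_params["Intercept"] + fit.fe_params["x"] * x
           + u_b[groups] + rng.normal(scale=np.sqrt(fit.scale), size=groups.size))
    fb = smf.mixedlm("y ~ x", df.assign(y=y_b), groups=df["g"]).fit(reml=True)
    boot.append(float(fb.cov_re.iloc[0, 0]))
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"sigma_u^2 = {sigma_u2:.3f}, 95% bootstrap interval = ({lo:.3f}, {hi:.3f})")
```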


Generalized Partially Double-Index Model: Bootstrapping and Distinguishing Values

  • Yoo, Jae Keun
    • Communications for Statistical Applications and Methods
    • /
    • v.22 no.3
    • /
    • pp.305-312
    • /
    • 2015
  • We extend a generalized partially linear single-index model and newly define a generalized partially double-index model (GPDIM). The philosophy of sufficient dimension reduction is adopted in the GPDIM to estimate the unknown coefficient vectors in the model. Subsequently, various combinations of popular sufficient dimension reduction methods are constructed, with the best combination among many candidates determined through a bootstrapping procedure that measures distances between subspaces. Distinguishing values are newly defined to match the estimates to the corresponding population coefficient vectors. One of the strengths of the proposed model is that it can investigate the appropriateness of the GPDIM over a single-index model. Various numerical studies confirm the proposed approach, and a real data application is presented for illustration purposes.
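
The bootstrap described above ranks candidate estimates by distances between subspaces, but the abstract does not state which distance is used. The sketch below shows one common choice, the Frobenius norm of the difference of the orthogonal projections onto the two column spaces; the matrices and dimensions are hypothetical.

```python
# Hypothetical illustration: distance between the subspaces spanned by two
# estimated coefficient matrices, via their orthogonal projection matrices.
import numpy as np

def proj(B):
    """Orthogonal projection onto the column space of B."""
    Q, _ = np.linalg.qr(B)
    return Q @ Q.T

def subspace_distance(B1, B2):
    """Frobenius norm of the projection difference; zero iff the spans coincide."""
    return np.linalg.norm(proj(B1) - proj(B2), ord="fro")

rng = np.random.default_rng(0)
B_hat = rng.normal(size=(10, 2))                    # e.g., an SDR estimate (p=10, d=2)
B_boot = B_hat + 0.1 * rng.normal(size=(10, 2))     # e.g., a bootstrap replicate
print(subspace_distance(B_hat, B_boot))
```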

Option Pricing with Bounded Expected Loss under Variance-Gamma Processes

  • Song, Seong-Joo;Song, Jong-Woo
    • Communications for Statistical Applications and Methods
    • /
    • v.17 no.4
    • /
    • pp.575-589
    • /
    • 2010
  • Exponential Lévy models have recently become popular for modeling price processes in mathematical finance. Although it is a relatively simple extension of the geometric Brownian motion, it makes the market incomplete, so the option price is not uniquely determined. As an attempt to find an appropriate price for an option, we suppose a situation where a hedger wants to initially invest as little as possible, but wants to have the expected squared loss at the end not exceeding a certain constant. For this, we assume that the underlying price process follows a variance-gamma model and that it converges to a geometric Brownian motion as its quadratic variation converges to a constant. In the limit, we use the mean-variance approach to find the asymptotic minimum investment with the expected squared loss bounded. Some numerical results are also provided.
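
As background for the variance-gamma model mentioned above, the sketch below simulates a variance-gamma log-return and prices a European call by standard risk-neutral Monte Carlo. The parameter values are assumptions, and this is not the bounded-expected-loss hedging approach studied in the paper.

```python
# Illustrative sketch: Monte Carlo pricing of a European call when log-returns
# follow a variance-gamma (VG) process X_t = theta*G_t + sigma*W(G_t), where
# G_t is a gamma time change.  Parameters are assumptions; this is a standard
# risk-neutral MC, not the paper's bounded-expected-loss pricing.
import numpy as np

rng = np.random.default_rng(42)
S0, K, r, T = 100.0, 100.0, 0.03, 1.0
sigma, nu, theta = 0.2, 0.3, -0.1
n_paths = 200_000

# Martingale correction so that E[S_T] = S0 * exp(r*T).
omega = np.log(1.0 - theta * nu - 0.5 * sigma**2 * nu) / nu

G = rng.gamma(shape=T / nu, scale=nu, size=n_paths)       # gamma time change
X = theta * G + sigma * np.sqrt(G) * rng.standard_normal(n_paths)
ST = S0 * np.exp((r + omega) * T + X)

call = np.exp(-r * T) * np.maximum(ST - K, 0.0).mean()
print(f"VG Monte Carlo call price ~ {call:.3f}")
```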

Combining cluster analysis and neural networks for the classification problem

  • Kim, Kyungsup;Han, Ingoo
    • Proceedings of the Korean Operations and Management Science Society Conference
    • /
    • 1996.10a
    • /
    • pp.31-34
    • /
    • 1996
  • Extensive research has compared the performance of neural networks (NN) with that of various statistical techniques for the classification problem. The empirical results of these comparative studies have indicated that neural networks often outperform the traditional statistical techniques. Moreover, there have been efforts to combine various classification methods, especially multivariate discriminant analysis, with neural networks. While these efforts improve performance, they run into the problem of violating the assumptions of multivariate discriminant analysis, namely multivariate normality of the independent variables and equality of the variance-covariance matrices across groups. In contrast, cluster analysis, like neural networks, relaxes these assumptions. We propose a new approach to classification problems that combines cluster analysis with neural networks. The resulting predictions of the composite model are more accurate than those of each individual technique.
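
The abstract does not specify how the two techniques are combined. One simple way to build such a composite, appending k-means cluster memberships as extra inputs to a neural-network classifier with scikit-learn, is sketched below; the dataset and all settings are assumptions, not the authors' model.

```python
# Minimal sketch of one cluster-plus-neural-network composite: k-means cluster
# memberships are appended to the inputs of an MLP classifier.  The paper's
# exact composite model is not specified in the abstract.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

X, y = make_classification(n_samples=1000, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

scaler = StandardScaler().fit(X_tr)
X_tr_s, X_te_s = scaler.transform(X_tr), scaler.transform(X_te)

km = KMeans(n_clusters=5, n_init=10, random_state=0).fit(X_tr_s)

def augment(Xs):
    """Append one-hot cluster memberships as extra features."""
    return np.hstack([Xs, np.eye(km.n_clusters)[km.predict(Xs)]])

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=1000, random_state=0)
clf.fit(augment(X_tr_s), y_tr)
print("test accuracy:", clf.score(augment(X_te_s), y_te))
```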


An Exponential GARCH Approach to the Effect of Impulsiveness of Euro on Indian Stock Market

  • Sahadudheen, I
    • The Journal of Asian Finance, Economics and Business
    • /
    • v.2 no.3
    • /
    • pp.17-22
    • /
    • 2015
  • This paper examines the effect of impulsiveness of the euro on the Indian stock market. To examine the problem, we select rupee-euro exchange rates and the S&P CNX NIFTY and BSE30 SENSEX to represent stock prices. We select the euro as it is considered the second most widely used currency at the international level after the dollar. The data are collected on a daily basis over the period 3-Apr-2007 to 30-Mar-2012. The statistical and time-series properties of each variable are examined using conventional unit root tests such as the ADF and PP tests. Adopting generalized autoregressive conditional heteroskedasticity (GARCH) and exponential GARCH (EGARCH) models, the study suggests a negative relationship between the exchange rate and stock prices in India. Even though India is a major trade partner of the European Union, the study could not find any statistically significant effect of fluctuations in euro-rupee exchange rates on stock prices. The study also reveals that shocks to the exchange rate have a symmetric effect on stock prices and that exchange rate fluctuations have permanent effects on stock price volatility in India.
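
The abstract names the GARCH and EGARCH specifications but not the estimation code. A hedged sketch using the Python `arch` package on simulated daily returns, with a placeholder exchange-rate regressor in the mean equation, is shown below; the data and all parameter values are assumptions, not the study's rupee-euro / NIFTY series.

```python
# Hedged sketch: fitting GARCH(1,1) and EGARCH(1,1,1) models with the Python
# `arch` package on simulated returns.  The exchange-rate regressor and all
# numbers are placeholders, not the study's data.
import numpy as np
import pandas as pd
from arch import arch_model

rng = np.random.default_rng(7)
n = 1250                                      # roughly five years of daily data
fx_ret = rng.normal(scale=0.5, size=n)        # placeholder exchange-rate returns
stock_ret = -0.1 * fx_ret + rng.normal(scale=1.0, size=n)  # placeholder stock returns

y = pd.Series(stock_ret)
x = pd.DataFrame({"fx": fx_ret})

# Least-squares mean equation with the exchange-rate regressor, GARCH/EGARCH variance.
garch = arch_model(y, x=x, mean="LS", vol="GARCH", p=1, q=1).fit(disp="off")
egarch = arch_model(y, x=x, mean="LS", vol="EGARCH", p=1, o=1, q=1).fit(disp="off")

print(garch.params["fx"])                     # sign of the exchange-rate effect
print(egarch.summary())
```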

Semiparametric and Nonparametric Modeling for Matched Studies

  • Kim, In-Young;Cohen, Noah
    • Proceedings of the Korean Statistical Society Conference
    • /
    • 2003.10a
    • /
    • pp.179-182
    • /
    • 2003
  • This study describes a new graphical method for assessing and characterizing effect modification by a matching covariate in matched case-control studies. The method for understanding effect modification is based on a semiparametric varying coefficient model. The method allows for nonparametric relationships between effect modification and other covariates, or can be useful in suggesting parametric models. It can be applied to examine effect modification by any ordered categorical or continuous covariate on which cases have been matched with controls. The method applies to effect modification when causality might be reasonably assumed. An example from veterinary medicine is used to demonstrate our approach. The simulation results show that this method, when based on linear, quadratic and nonparametric effect modification, can be more powerful than both a parametric multiplicative model fit and a fully nonparametric generalized additive model fit.
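
The paper's semiparametric estimator for matched data is not reproduced here. As a simplified illustration of the varying-coefficient idea only, the sketch below lets the exposure log-odds ratio vary as a quadratic in a matching covariate inside an ordinary logistic regression, ignoring the matched structure; all names and data are hypothetical.

```python
# Simplified illustration only: effect modification via a quadratic varying
# coefficient beta(m) = b0 + b1*m + b2*m^2 in an ordinary logistic regression.
# The paper's semiparametric, matched-design estimator is not reproduced.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n = 2000
m = rng.uniform(0, 10, size=n)                # matching covariate (e.g., age)
expo = rng.binomial(1, 0.4, size=n)           # binary exposure
beta_m = -0.5 + 0.2 * m                       # true (linear) effect modification
p = 1 / (1 + np.exp(-(-1.0 + beta_m * expo)))
y = rng.binomial(1, p)

# Design: intercept, m, exposure, exposure*m, exposure*m^2.
X = sm.add_constant(np.column_stack([m, expo, expo * m, expo * m**2]))
fit = sm.Logit(y, X).fit(disp=0)
b = fit.params
grid = np.linspace(0, 10, 5)
print("estimated beta(m) on a grid:", b[2] + b[3] * grid + b[4] * grid**2)
```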


Maximizing the Overlay of Sample Units for Two Stratified Designs by Linear Programming

  • Ryu, Jea-Bok;Kim, Sun-Woong
    • Communications for Statistical Applications and Methods
    • /
    • v.8 no.3
    • /
    • pp.719-729
    • /
    • 2001
  • Overlap maximization is a sampling technique for reducing the costs associated with conducting a survey. It was first studied by Keyfitz (1951). Ernst (1998) presented a remarkable procedure for maximizing the overlap when the sampling units can be selected for two identical stratified designs simultaneously. But that approach involves mimicking the behaviour of a nonlinear function by a linear function and so is less direct, even though the stratification problem for the overlap corresponds directly to a linear programming problem. Furthermore, it uses a controlled selection algorithm that repeatedly requires zero-restricted controlled roundings, which are solutions of capacitated transportation problems. In this paper we suggest a comparatively simple procedure that uses linear programming to maximize the overlap, and we show how this procedure can be implemented in practice.
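
The paper's procedure is not given in the abstract. To illustrate the underlying idea, the sketch below sets up a toy transportation-type linear program with scipy.optimize.linprog: within one stratum, a joint selection distribution with fixed marginals is chosen to maximize the probability of retaining the same unit under the old and new designs. The selection probabilities are made-up numbers.

```python
# Toy illustration of overlap maximization as a linear program: choose joint
# selection probabilities p[i, j] (old unit i, new unit j) with fixed marginals
# that maximize the probability of keeping the same unit.  Not the paper's method.
import numpy as np
from scipy.optimize import linprog

old_p = np.array([0.5, 0.3, 0.2])             # selection probabilities, old design
new_p = np.array([0.4, 0.4, 0.2])             # selection probabilities, new design
n = old_p.size

# Maximize sum_i p[i, i]  <=>  minimize -sum_i p[i, i].
c = -np.eye(n).flatten()

A_eq, b_eq = [], []
for i in range(n):                            # row sums: sum_j p[i, j] = old_p[i]
    row = np.zeros((n, n)); row[i, :] = 1
    A_eq.append(row.flatten()); b_eq.append(old_p[i])
for j in range(n):                            # column sums: sum_i p[i, j] = new_p[j]
    col = np.zeros((n, n)); col[:, j] = 1
    A_eq.append(col.flatten()); b_eq.append(new_p[j])

res = linprog(c, A_eq=np.array(A_eq), b_eq=np.array(b_eq), bounds=(0, None))
print("maximum overlap probability:", -res.fun)   # here equals sum of min(old_p, new_p)
print(res.x.reshape(n, n).round(3))               # optimal joint selection probabilities
```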


Multistage Point and Confidence Interval Estimation of the Shape Parameter of Pareto Distribution

  • Hamdy, H.I.;Son, M.S.;Gharraph, M.K.;Rashad, A.M.
    • Communications for Statistical Applications and Methods
    • /
    • v.10 no.3
    • /
    • pp.1069-1086
    • /
    • 2003
  • This article presents the asymptotic theory of a triple sampling procedure as it pertains to estimating the shape parameter of the Pareto distribution. Both point and confidence interval estimation are considered within the same unified inference framework. We show that this group sampling technique possesses the efficiency of the purely sequential procedures of Anscombe (1953) and Chow and Robbins (1965), while reducing the number of sampling operations by utilizing Stein's (1945) two-stage procedure. The analysis reveals that the technique performs excellently as far as accuracy is concerned. The present problem differs from those considered by many authors in multistage sampling in that the final-stage sample size and the parameter's estimate become highly correlated, and therefore we adopt a different approach.
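
The triple-sampling rule itself is not given in the abstract. As a minimal point of reference, the sketch below shows a Stein-type two-stage procedure for a fixed-width confidence interval for the Pareto shape parameter, using the maximum likelihood estimator with known scale; all parameter values are assumptions.

```python
# Minimal Stein-type two-stage sketch (not the paper's triple-sampling rule):
# fixed-width 95% confidence interval for the Pareto shape parameter alpha,
# with known scale x_m, using the MLE  alpha_hat = n / sum(log(x_i / x_m)).
import numpy as np

rng = np.random.default_rng(11)
alpha_true, x_m = 2.5, 1.0
d, z = 0.25, 1.96                             # desired half-width, normal quantile

def draw(n):
    return x_m * (1 + rng.pareto(alpha_true, size=n))   # Pareto I with scale x_m

def mle(x):
    return x.size / np.log(x / x_m).sum()

# Stage 1: pilot sample to estimate the required sample size.
m = 30
pilot = draw(m)
alpha_pilot = mle(pilot)
N = max(m, int(np.ceil((z * alpha_pilot / d) ** 2)))    # since Var(alpha_hat) ~ alpha^2 / n

# Stage 2: take the remaining observations and form the final interval.
full = np.concatenate([pilot, draw(N - m)]) if N > m else pilot
alpha_hat = mle(full)
print(f"N = {N}, alpha_hat = {alpha_hat:.3f}, "
      f"CI = ({alpha_hat - d:.3f}, {alpha_hat + d:.3f})")
```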

A Goodness-Of-Fit Test for Adaptive Fourier Model in Time Series Data

  • Lee, Hoonja
    • Communications for Statistical Applications and Methods
    • /
    • v.10 no.3
    • /
    • pp.955-969
    • /
    • 2003
  • Classical Fourier analysis, the typical frequency-domain approach, is used to detect periodic trends of sinusoidal shape in time series data. Using a sequence of periodic step functions, this article describes an adaptive Fourier series in which the patterns may take general periodic shapes that include the sinusoidal shape as a special case. The results, which extend both Fourier analysis and Walsh-Fourier analysis, are applied to investigate the shape of the periodic component. Using real data, we compare the goodness-of-fit of the model under two methods: the adaptive Fourier method proposed in this paper and the classical Fourier method.
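
The adaptive step-function basis is not specified in the abstract; only the classical Fourier side of the comparison is sketched below, as a harmonic regression fitted by least squares with a simple goodness-of-fit measure. The data, period, and number of harmonics are assumptions.

```python
# Sketch of the classical Fourier (harmonic regression) fit referred to above;
# the paper's adaptive step-function basis is not reproduced.  Data are simulated.
import numpy as np

rng = np.random.default_rng(5)
n, period = 240, 12                           # e.g., monthly data, annual cycle
t = np.arange(n)
y = (3 * np.sin(2 * np.pi * t / period)
     + 1.5 * np.cos(4 * np.pi * t / period)
     + rng.normal(scale=0.8, size=n))

# Design matrix with the first K harmonics of the assumed period.
K = 3
cols = [np.ones(n)]
for k in range(1, K + 1):
    cols += [np.sin(2 * np.pi * k * t / period), np.cos(2 * np.pi * k * t / period)]
X = np.column_stack(cols)

beta, *_ = np.linalg.lstsq(X, y, rcond=None)
fitted = X @ beta
r2 = 1 - np.sum((y - fitted) ** 2) / np.sum((y - y.mean()) ** 2)
print(f"harmonic regression with {K} harmonics: R^2 = {r2:.3f}")
```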