• Title/Summary/Keyword: Regression estimator


Pliable regression spline estimator using auxiliary variables

  • Oh, Jae-Kwon;Jhong, Jae-Hwan
    • Communications for Statistical Applications and Methods / Vol. 28 No. 5 / pp.537-551 / 2021
  • We study a regression spline estimator that incorporates a few pre-specified auxiliary variables. For implementation, we adapt a coordinate descent algorithm that exploits the structure of the residual sum of squares objective function determined by the B-spline and auxiliary coefficients. We also consider an efficient stepwise knot selection algorithm based on the Bayesian information criterion, which adaptively places knots so that the estimator is smooth where the data allow. Numerical studies using both simulated and real data sets illustrate the proposed method's performance. An R software package, psav, is available.
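The BIC-based stepwise knot selection described in this abstract can be sketched as follows. This is an illustrative toy using a truncated power basis and a plain forward search, not the paper's B-spline implementation; all function names are ours, not the psav API:

```python
import numpy as np

def spline_design(x, knots, degree=3):
    """Truncated power basis for a regression spline (spans the same
    function space as a B-spline basis with the same knots)."""
    cols = [x ** d for d in range(degree + 1)]
    cols += [np.clip(x - t, 0.0, None) ** degree for t in knots]
    return np.stack(cols, axis=1)

def bic_for_knots(x, y, knots, degree=3):
    """BIC of the least squares spline fit with the given knot set."""
    X = spline_design(x, knots, degree)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    rss = float(np.sum((y - X @ beta) ** 2))
    n, p = X.shape
    return n * np.log(rss / n) + p * np.log(n)

def forward_knot_selection(x, y, candidates, degree=3):
    """Greedily add the candidate knot that lowers BIC the most;
    stop when no addition improves the criterion."""
    selected = []
    best = bic_for_knots(x, y, selected, degree)
    improved = True
    while improved:
        improved = False
        remaining = [t for t in candidates if t not in selected]
        if not remaining:
            break
        score, knot = min((bic_for_knots(x, y, sorted(selected + [t]), degree), t)
                          for t in remaining)
        if score < best:
            best, improved = score, True
            selected.append(knot)
    return sorted(selected), best
```

On data with a kink, the forward search places a knot at the kink because the BIC drop from the reduced residual sum of squares dominates the log(n) complexity penalty.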

Stable activation-based regression with localizing property

  • Shin, Jae-Kyung;Jhong, Jae-Hwan;Koo, Ja-Yong
    • Communications for Statistical Applications and Methods / Vol. 28 No. 3 / pp.281-294 / 2021
  • In this paper, we propose an adaptive regression method based on a single-layer neural network structure. We adopt a symmetric activation function as the unit of the structure. The activation function is parametrized, so its form is flexible, and it has a localizing property that is useful for improving the quality of estimation. To obtain a spatially adaptive estimator, we regularize the coefficients of the activation functions via ℓ1-penalization, which removes the activation functions regarded as unnecessary. An efficient coordinate descent algorithm is applied to compute the proposed estimator. To obtain stable estimates, we present an initialization scheme suited to our structure. A model selection procedure based on the Akaike information criterion is described. The simulation results show that the proposed estimator performs favorably relative to existing methods and recovers the local structure of the underlying function from the sample.
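The ℓ1-penalized coordinate descent that removes unnecessary units can be illustrated on the plain lasso. This is a minimal sketch of the same soft-thresholding update; the paper applies it to coefficients of parametrized activation functions rather than raw design columns:

```python
import numpy as np

def soft_threshold(z, g):
    """Soft-thresholding operator, the building block of l1 coordinate descent."""
    return np.sign(z) * np.maximum(np.abs(z) - g, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Cyclic coordinate descent for (1/2n)*||y - X b||^2 + lam*||b||_1.
    Coefficients whose soft-thresholded update is zero are removed exactly."""
    n, p = X.shape
    beta = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            # partial residual with coordinate j excluded
            r = y - X @ beta + X[:, j] * beta[j]
            rho = X[:, j] @ r / n
            beta[j] = soft_threshold(rho, lam) / col_sq[j]
    return beta
```

A sufficiently large penalty drives every coefficient exactly to zero, which is the mechanism by which unneeded units drop out of the fitted structure.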

Approximate Variance of Least Square Estimators for Regression Coefficient under Inclusion Probability Proportional to Size Sampling

  • Kim, Kyu-Seong
    • Communications for Statistical Applications and Methods / Vol. 19 No. 1 / pp.23-32 / 2012
  • This paper deals with the approximate bias and approximate variance of regression coefficient estimators in a finite population. When a fixed-size sample is drawn with inclusion probability proportional to size and the regression coefficient is estimated from the surveyed data by the ordinary least squares estimator and the weighted least squares estimator, we derive approximate expressions for the bias, variance, and mean squared error of the two estimators. To compare their efficiency, we present a necessary and sufficient condition for comparing the variances of the two estimators. A simple example is also introduced for numerical comparison.
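The weighted estimator compared in this abstract can be written down directly as an inverse-inclusion-probability weighted least squares fit. A minimal sketch under the usual design-weighting convention; the function name is ours:

```python
import numpy as np

def wls_pps(X, y, pi):
    """Design-weighted least squares estimate of the regression coefficient
    from a PPS sample: each sampled unit i is weighted by 1/pi[i], where
    pi[i] is its first-order inclusion probability."""
    w = 1.0 / np.asarray(pi)
    XtWX = X.T @ (w[:, None] * X)
    return np.linalg.solve(XtWX, X.T @ (w * y))
```

When all inclusion probabilities are equal the weights cancel and the estimator coincides with ordinary least squares; the bias and variance comparison in the paper concerns the unequal-probability case.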

Optimal Design for Locally Weighted Quasi-Likelihood Response Curve Estimator

  • Park, Dongryeon
    • Communications for Statistical Applications and Methods / Vol. 9 No. 3 / pp.743-752 / 2002
  • The estimation of the response curve is an important problem in quantal bioassay. When estimating the response curve, the design points are determined in advance of the experiment, which naturally raises the question of which design is optimal. As a response curve estimator, the locally weighted quasi-likelihood estimator has several appealing features compared with traditional nonparametric estimators. The optimal design density for the locally weighted quasi-likelihood estimator is derived, and its performance is investigated from both theoretical and empirical points of view.

Weighted Least Absolute Deviation Lasso Estimator

  • Jung, Kang-Mo
    • Communications for Statistical Applications and Methods / Vol. 18 No. 6 / pp.733-739 / 2011
  • The least absolute shrinkage and selection operator (Lasso) method improves the low prediction accuracy and poor interpretability of the ordinary least squares (OLS) estimate through the use of $L_1$ regularization on the regression coefficients. However, the Lasso is not robust to outliers, because it minimizes the sum of squared residual errors. Although the least absolute deviation (LAD) estimator is an alternative to the OLS estimate, it is sensitive to leverage points. We propose a robust Lasso estimator that is not sensitive to outliers, heavy-tailed errors, or leverage points.

Empirical Choice of the Shape Parameter for Robust Support Vector Machines

  • Pak, Ro-Jin
    • Communications for Statistical Applications and Methods / Vol. 15 No. 4 / pp.543-549 / 2008
  • Inspired by the use of a robust loss function in support vector machine regression to control training error and by the idea of robust template matching with M-estimators, Chen (2004) applies M-estimator techniques to Gaussian radial basis functions and forms a new class of robust kernels for support vector machines. We are especially interested in the shape of Huber's M-estimator in this context and propose a way to find the shape parameter of Huber's M-estimating function. For simplicity, only the two-class classification problem is considered.
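Huber's rho and psi functions with shape parameter c are the objects whose shape is tuned here. These are the standard definitions only, not the paper's selection rule for c, which is not reproduced:

```python
import numpy as np

def huber_rho(r, c=1.345):
    """Huber's rho: quadratic for |r| <= c, linear beyond,
    so large residuals contribute at a bounded rate."""
    a = np.abs(r)
    return np.where(a <= c, 0.5 * r ** 2, c * a - 0.5 * c ** 2)

def huber_psi(r, c=1.345):
    """Derivative of rho: the bounded influence function."""
    return np.clip(r, -c, c)
```

Smaller c clips more aggressively (more robustness, less efficiency at the Gaussian model); c = 1.345 is the conventional choice giving about 95% Gaussian efficiency in location estimation.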

A Support Vector Method for the Deconvolution Problem

  • Lee, Sung-Ho
    • Communications for Statistical Applications and Methods / Vol. 17 No. 3 / pp.451-457 / 2010
  • This paper considers the problem of nonparametric deconvolution density estimation when sample observations are contaminated by double exponentially distributed errors. Three different deconvolution density estimators are introduced: a weighted kernel density estimator, a kernel density estimator based on the support vector regression method in an RKHS, and a classical kernel density estimator. The performance of these deconvolution density estimators is compared by means of a simulation study.
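For double exponential (Laplace) errors the deconvoluting kernel has a closed form: with a Gaussian kernel K, the corrected kernel is K(u) - (σ²/h²)K''(u), which follows from the Laplace characteristic function 1/(1 + σ²t²). A minimal sketch of this classical estimator (the function name is ours; the paper's SVR-based variant is not reproduced here):

```python
import numpy as np

def deconv_kde_laplace(x, Y, h, sigma):
    """Deconvoluting kernel density estimate at points x when the data Y
    carry additive Laplace(0, sigma) measurement error, using a Gaussian
    kernel. For Gaussian K, K''(u) = (u^2 - 1) K(u), so the corrected
    kernel is K(u) * (1 + (sigma/h)^2 * (1 - u^2))."""
    u = (np.asarray(x)[:, None] - np.asarray(Y)[None, :]) / h
    phi = np.exp(-0.5 * u ** 2) / np.sqrt(2 * np.pi)
    L = phi * (1.0 + (sigma / h) ** 2 * (1.0 - u ** 2))
    return L.mean(axis=1) / h
```

Setting sigma to zero recovers the ordinary Gaussian kernel density estimator, and the corrected kernel still integrates to one, so the estimate remains a (possibly slightly negative-valued) density approximation.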

ON THEIL'S METHOD IN FUZZY LINEAR REGRESSION MODELS

  • Choi, Seung Hoe;Jung, Hye-Young;Lee, Woo-Joo;Yoon, Jin Hee
    • Communications of the Korean Mathematical Society / Vol. 31 No. 1 / pp.185-198 / 2016
  • Regression analysis is a method of analyzing a regression model to explain the statistical relationship between explanatory and response variables. This paper proposes a fuzzy regression analysis applying Theil's method, which is not sensitive to outliers. The method uses medians of rates of increment based on randomly chosen pairs of the components of the ${\alpha}$-level sets of the fuzzy data in order to estimate the coefficients of the fuzzy regression model. An example and two simulation results are given to show that the fuzzy Theil's estimator is more robust than the fuzzy least squares estimator.
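The crisp version of Theil's estimator that the paper fuzzifies is the median of pairwise slopes. A minimal sketch for ordinary numeric data; the paper applies the same idea to each component of the α-level sets:

```python
import numpy as np
from itertools import combinations

def theil_sen(x, y):
    """Theil's estimator: slope = median of all pairwise slopes,
    intercept = median of y - slope * x. Robust because a few outliers
    cannot move a median of O(n^2) pairwise slopes."""
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i, j in combinations(range(len(x)), 2) if x[j] != x[i]]
    b1 = np.median(slopes)
    b0 = np.median(y - b1 * x)
    return b0, b1
```

A single gross outlier leaves both the slope and intercept medians untouched, which is the robustness property the paper carries over to the fuzzy setting.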

A Comparison Study of Several Robust Regression Estimators under Various Contaminations

  • Kim, Ji-Yeon;Hwang, Jin-Soo;Kim, Jin-Kyung
    • The Korean Journal of Applied Statistics / Vol. 17 No. 3 / pp.475-488 / 2004
  • Regression depth, which applies data depth (known as one of the robust techniques for location estimation) to regression, was proposed by Rousseeuw and Hubert (1999). Other depth-based regression estimators include the simplicial depth and projection-type depth regression estimators. In this paper, the performance of depth-based regression estimators is compared by simulation under various contamination conditions, and they are also compared with the HBR estimator (Chang et al., 1999), a recently proposed estimator with excellent robustness. In experiments in two-dimensional space, the projection-depth-based regression estimator generally showed good results.

An Efficient Mallows-Type One-Step GM-Estimator in Linear Models

  • Song, Moon-Sup;Park, Changsoon;Nam, Ho-Soo
    • Journal of the Korean Statistical Society / Vol. 27 No. 3 / pp.369-383 / 1998
  • This paper deals with a robust regression estimator. We propose an efficient one-step GM-estimator, which has a bounded influence function and a high breakdown point. The main idea of this paper is to use the Mallows-type weights which depend on both the predictor variables and the residuals from a high breakdown initial estimator. The proposed weighting scheme severely downweights the bad leverage points and slightly downweights the good leverage points. Under some regularity conditions, we compute the finite-sample breakdown point and prove the asymptotic normality. Some simulation results and a numerical example are also presented.
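The Mallows idea of separate leverage and residual weights can be sketched as one reweighted least squares step from an initial fit. This is a simplified illustration of the weighting scheme, not the paper's exact Newton-type update; the robust standardization and all names are ours:

```python
import numpy as np

def one_step_gm(X, y, beta0, c=1.345):
    """One weighted least-squares step from a (high-breakdown) initial fit
    beta0. Mallows-type weights: a leverage weight from robustly
    standardized rows of X times a Huber weight on the scaled residuals,
    so bad leverage points are severely downweighted."""
    r = y - X @ beta0
    s = 1.4826 * np.median(np.abs(r - np.median(r)))      # MAD scale
    med = np.median(X, axis=0)
    mad = 1.4826 * np.median(np.abs(X - med), axis=0)
    z = (X - med) / np.where(mad > 0, mad, 1.0)           # constant cols -> 0
    d = np.sqrt((z ** 2).sum(axis=1))
    w_x = np.minimum(1.0, c / np.maximum(d, 1e-12))       # leverage weight
    t = r / max(s, 1e-12)
    w_r = np.minimum(1.0, c / np.maximum(np.abs(t), 1e-12))  # Huber weight
    w = w_x * w_r
    WX = w[:, None] * X
    return np.linalg.solve(X.T @ WX, WX.T @ y)
```

Because every weight stays strictly positive on clean data, the step reproduces the exact fit when the model holds exactly, while observations with huge initial residuals or extreme leverage receive weights near zero.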
