• Title/Summary/Keyword: regression function

Regression Quantile Estimations on Censored Survival Data

  • Shim, Joo-Yong
    • Journal of the Korean Data and Information Science Society / v.13 no.2 / pp.31-38 / 2002
  • In this paper we study regression quantile estimation for multiple survival times that may be censored at each covariate vector. The estimation is based on the empirical distribution functions of the censored times and the sample quantiles of the observed survival times at each covariate vector, and the weighted least squares method is applied to estimate the regression quantile. The estimators are shown to be asymptotically normally distributed under some regularity conditions.

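The following is a minimal, illustrative Python sketch of the weighted least squares step described in the entry above: sample quantiles of the observed survival times are taken at each covariate value and then regressed on the covariate with weights proportional to the group sizes. The censoring adjustment via the empirical distribution functions is omitted, and all function and variable names are hypothetical.

```python
import numpy as np

def regression_quantile_wls(covariates, survival_times, tau=0.5):
    """At each distinct covariate value, take the sample tau-quantile of the
    observed survival times, then fit a straight line to those quantiles by
    weighted least squares with weights proportional to the group sizes."""
    x_levels = np.unique(covariates)
    q = np.array([np.quantile(survival_times[covariates == x], tau) for x in x_levels])
    w = np.array([np.sum(covariates == x) for x in x_levels], dtype=float)

    # weighted least squares for q ~ b0 + b1 * x
    X = np.column_stack([np.ones_like(x_levels), x_levels])
    W = np.diag(w)
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ q)  # (intercept, slope)

# toy data: three covariate levels with several survival times each
rng = np.random.default_rng(0)
x = np.repeat([1.0, 2.0, 3.0], 30)
t = rng.exponential(scale=1.0 + 0.5 * x)
print(regression_quantile_wls(x, t, tau=0.5))
```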

Separate Fuzzy Regression with Crisp Input and Fuzzy Output

  • Yoon, Jin-Hee; Choi, Seung-Hoe
    • Journal of the Korean Data and Information Science Society / v.18 no.2 / pp.301-314 / 2007
  • The aim of this paper is to present a method for constructing a separate fuzzy regression model with crisp input and fuzzy output data, using a best response function for the center and the width of the predicted output. We also introduce the crisp mean and variance of the predicted fuzzy value and give some examples comparing the performance of the proposed fuzzy model with various other fuzzy regression models.

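As a rough illustration of the "separate" construction described above, the sketch below fits one least squares line for the centers and one for the spreads of symmetric triangular fuzzy outputs with crisp inputs. The linear response functions are an assumption made here for illustration; the paper's best response functions and its crisp mean and variance of the predicted fuzzy value are not reproduced.

```python
import numpy as np

def separate_fuzzy_regression(x, center, spread):
    """Fit two independent least squares lines: one for the centers and one
    for the spreads of symmetric triangular fuzzy outputs (crisp input x)."""
    X = np.column_stack([np.ones_like(x), x])
    beta_center, *_ = np.linalg.lstsq(X, center, rcond=None)
    beta_spread, *_ = np.linalg.lstsq(X, spread, rcond=None)
    return beta_center, beta_spread

def predict_fuzzy(x_new, beta_center, beta_spread):
    """Predicted fuzzy value as a (center, spread) pair; the spread is
    clipped at zero so the predicted width stays nonnegative."""
    X = np.column_stack([np.ones_like(x_new), x_new])
    return X @ beta_center, np.clip(X @ beta_spread, 0.0, None)

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
center = np.array([2.1, 3.9, 6.2, 8.1, 9.8])   # centers of the fuzzy outputs
spread = np.array([0.5, 0.6, 0.8, 0.9, 1.1])   # spreads (half-widths)
bc, bs = separate_fuzzy_regression(x, center, spread)
print(predict_fuzzy(np.array([6.0]), bc, bs))
```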

VARIANCE ESTIMATION OF ERROR IN THE REGRESSION MODEL AT A POINT

  • Oh, Jong-Chul
    • Journal of applied mathematics & informatics / v.13 no.1_2 / pp.501-508 / 2003
  • Although estimation of the regression function is important, some work has focused on estimating the variance of the error term in the regression model. Different variance estimators perform well under different conditions, and in many practical situations it is rather hard to assess which conditions are approximately satisfied, so identifying the best variance estimator for the given data is difficult. In this article we suggest the SHM estimator as an alternative to the LS estimator, the common estimator used in parametric multiple regression analysis. Moreover, a combined estimator of the variance, VEM, is suggested. A simulation study shows that VEM performs well in practice.
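
The SHM and VEM estimators are specific to this article; as a point of reference, the sketch below shows only the LS estimator mentioned above, i.e. the usual residual-based variance estimate from a parametric multiple regression. Names and data are illustrative.

```python
import numpy as np

def ls_error_variance(X, y):
    """Usual least squares estimate of the error variance in y = X beta + e:
    residual sum of squares divided by (n - p)."""
    n, p = X.shape
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta
    return resid @ resid / (n - p)

rng = np.random.default_rng(1)
n = 200
X = np.column_stack([np.ones(n), rng.normal(size=(n, 2))])
y = X @ np.array([1.0, 2.0, -1.0]) + rng.normal(scale=0.5, size=n)
print(ls_error_variance(X, y))  # should be close to 0.25
```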

A Study on Sensitivity Analysis in Ridge Regression (능형 회귀에서의 민감도 분석에 관한 연구)

  • Kim, Soon-Kwi
    • Journal of Korean Society for Quality Management / v.19 no.1 / pp.1-15 / 1991
  • In this paper, we discuss and review various measures which have been presented for studying outliers, high-leverage points, and influential observations when ridge regression estimation is adopted. We derive the influence function for ${\underline{\hat{\beta}}}_R$, the ridge regression estimator, and discuss its various finite-sample approximations when ridge regression is postulated. We also study several diagnostic measures such as the Welsch-Kuh distance and Cook's distance.

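The sketch below illustrates ridge estimation together with a Cook's-distance-style influence measure built from the ridge hat matrix. It is an assumed approximation for illustration only, not the finite-sample approximations of the influence function derived in the paper.

```python
import numpy as np

def ridge_fit(X, y, k):
    """Ridge estimator beta_R = (X'X + kI)^{-1} X'y and its hat matrix."""
    p = X.shape[1]
    A = np.linalg.inv(X.T @ X + k * np.eye(p))
    beta = A @ X.T @ y
    H = X @ A @ X.T          # ridge "hat" matrix
    return beta, H

def cook_like_distance(X, y, k):
    """Cook's-distance-style influence measure based on ridge residuals and
    ridge leverages h_ii (an approximation, not the paper's exact formulas)."""
    beta, H = ridge_fit(X, y, k)
    resid = y - X @ beta
    p = X.shape[1]
    s2 = resid @ resid / (len(y) - np.trace(H))
    h = np.diag(H)
    return (resid ** 2 / (p * s2)) * h / (1.0 - h) ** 2

rng = np.random.default_rng(2)
X = rng.normal(size=(50, 3))
y = X @ np.array([1.0, 0.5, -0.5]) + rng.normal(scale=0.3, size=50)
y[0] += 5.0                  # plant an artificial outlier
print(np.argmax(cook_like_distance(X, y, k=1.0)))  # likely flags observation 0
```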

Robustness of Minimum Disparity Estimators in Linear Regression Models

  • Pak, Ro-Jin
    • Journal of the Korean Statistical Society / v.24 no.2 / pp.349-360 / 1995
  • This paper deals with the robustness properties of minimum disparity estimation in linear regression models. The estimators are defined as the statistical quantities which minimize the blended weight Hellinger distance between a weighted kernel density estimator of the residuals and a smoothed model density of the residuals. It is shown that if the weights of the density estimator are appropriately chosen, the estimates of the regression parameters are robust.

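A rough sketch of the minimum disparity idea under simplifying assumptions: an ordinary squared Hellinger distance between a Gaussian kernel density estimate of the residuals and a normal model density is minimized numerically over the coefficients and scale. The blended weight Hellinger distance and the weighted kernel estimator of the paper are not implemented here.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gaussian_kde, norm

def hellinger_disparity(params, X, y, grid):
    """Squared Hellinger distance between a kernel density estimate of the
    residuals and a N(0, sigma^2) model density, evaluated on a fixed grid."""
    *beta, log_sigma = params
    resid = y - X @ np.asarray(beta)
    f_hat = gaussian_kde(resid)(grid)
    f_mod = norm.pdf(grid, scale=np.exp(log_sigma))
    step = grid[1] - grid[0]
    return 0.5 * np.sum((np.sqrt(f_hat) - np.sqrt(f_mod)) ** 2) * step

rng = np.random.default_rng(3)
X = np.column_stack([np.ones(100), rng.normal(size=100)])
y = X @ np.array([1.0, 2.0]) + rng.standard_t(df=3, size=100)  # heavy-tailed errors

grid = np.linspace(-10.0, 10.0, 400)
start = np.array([0.0, 0.0, 0.0])   # (beta0, beta1, log sigma)
fit = minimize(hellinger_disparity, start, args=(X, y, grid), method="Nelder-Mead")
print(fit.x[:2])                    # disparity-based estimates of the coefficients
```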

A Comparison Study on the Error Criteria in Nonparametric Regression Estimators

  • Chung, Sung-S.
    • Journal of the Korean Data and Information Science Society / v.11 no.2 / pp.335-345 / 2000
  • Most contexts use the classical norms on function spaces as error criteria. Since these norms are all based on vertical distances between the curves, they can be quite inappropriate from a visual notion of distance. The visual errors of Marron and Tsybakov (1995) correspond more closely to "what the eye sees". A simulation is performed to compare the performance of regression smoothers in terms of MISE and the visual error. It shows that the visual error can be used as a candidate error criterion in kernel regression estimation.

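The sketch below approximates MISE for a Nadaraya-Watson smoother by Monte Carlo, i.e. the vertical-distance criterion discussed in the entry above; the visual error of Marron and Tsybakov (1995) is not implemented here, and the simulation setup is an assumption for illustration.

```python
import numpy as np

def nadaraya_watson(x_grid, x, y, h):
    """Nadaraya-Watson kernel regression estimate with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x_grid[:, None] - x[None, :]) / h) ** 2)
    return (w @ y) / w.sum(axis=1)

def monte_carlo_mise(m_true, h, n=100, n_rep=200, sigma=0.3, seed=0):
    """Approximate MISE of the smoother by averaging the integrated squared
    error over repeated simulated samples (a vertical-distance criterion)."""
    rng = np.random.default_rng(seed)
    grid = np.linspace(0.0, 1.0, 101)
    ise = []
    for _ in range(n_rep):
        x = rng.uniform(0.0, 1.0, n)
        y = m_true(x) + rng.normal(scale=sigma, size=n)
        err = nadaraya_watson(grid, x, y, h) - m_true(grid)
        ise.append(np.mean(err ** 2))   # Riemann approximation of the integral
    return np.mean(ise)

m = lambda t: np.sin(2 * np.pi * t)
for h in (0.05, 0.1, 0.2):
    print(h, monte_carlo_mise(m, h))
```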

Variable selection in the kernel Cox regression

  • Shim, Joo-Yong
    • Journal of the Korean Data and Information Science Society / v.22 no.4 / pp.795-801 / 2011
  • In machine learning and statistics it is often the case that some variables are not important, while some variables are more important than others. We propose a novel algorithm for selecting such relevant variables in kernel Cox regression. We employ a weighted version of the ANOVA decomposition kernel to choose an optimal subset of relevant variables in the kernel Cox regression. Experimental results are presented which indicate the performance of the proposed method.
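
A minimal sketch of a weighted first-order ANOVA decomposition kernel, the building block named above: a weighted sum of univariate Gaussian kernels, where setting a variable's weight to zero removes it from the model. The Cox partial likelihood step that would tune the weights is not shown, and the code is illustrative only.

```python
import numpy as np

def weighted_anova_kernel(X1, X2, weights, gamma=1.0):
    """First-order ANOVA decomposition kernel: a weighted sum of univariate
    Gaussian (RBF) kernels, one per input variable. A zero weight removes
    the corresponding variable from the kernel, which is the variable
    selection mechanism."""
    K = np.zeros((X1.shape[0], X2.shape[0]))
    for j, w in enumerate(weights):
        d = X1[:, j][:, None] - X2[:, j][None, :]
        K += w * np.exp(-gamma * d ** 2)
    return K

rng = np.random.default_rng(4)
X = rng.normal(size=(5, 3))
print(weighted_anova_kernel(X, X, weights=[1.0, 0.0, 0.5]))  # variable 2 is dropped
```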

Switching Regression Analysis via Fuzzy LS-SVM

  • Hwang, Chang-Ha
    • Journal of the Korean Data and Information Science Society / v.17 no.2 / pp.609-617 / 2006
  • A new fuzzy c-regression algorithm for switching regression analysis is presented, which combines fuzzy c-means clustering and the least squares support vector machine. The algorithm can detect outliers in switching regression models while yielding simultaneous estimates of the associated parameters together with a fuzzy c-partition of the data. It can be employed for model-free nonlinear regression, which does not assume an underlying form of the regression function. We illustrate the new approach with numerical examples that show how it can be used to fit switching regression models to almost all types of mixed data.

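A simplified sketch of the alternation described above: fuzzy c-means style memberships are updated from per-component squared fit errors, and each component is refit by membership-weighted kernel ridge regression, used here as a stand-in for the weighted LS-SVM. The outlier handling and other details of the paper are omitted, and all names are hypothetical.

```python
import numpy as np

def rbf_kernel(a, b, gamma=1.0):
    return np.exp(-gamma * (a[:, None] - b[None, :]) ** 2)

def fuzzy_switching_regression(x, y, c=2, m=2.0, lam=0.1, n_iter=30, seed=0):
    """Alternate (i) fuzzy c-means style membership updates based on squared
    fit errors and (ii) membership-weighted kernel ridge fits per component,
    used here as a stand-in for the weighted LS-SVM step."""
    rng = np.random.default_rng(seed)
    n = len(x)
    U = rng.dirichlet(np.ones(c), size=n)        # n x c fuzzy memberships
    K = rbf_kernel(x, x)
    preds = np.zeros((c, n))
    for _ in range(n_iter):
        for k in range(c):
            W = np.diag(U[:, k] ** m)
            alpha = np.linalg.solve(W @ K + lam * np.eye(n), W @ y)
            preds[k] = K @ alpha
        err = (y[None, :] - preds) ** 2 + 1e-12  # c x n squared fit errors
        U = (1.0 / err.T) ** (1.0 / (m - 1.0))   # smaller error -> larger membership
        U /= U.sum(axis=1, keepdims=True)
    return U, preds

# two overlapping linear regimes, y = x and y = -x, with noise
rng = np.random.default_rng(5)
x = rng.uniform(-2.0, 2.0, 120)
y = np.where(rng.random(120) < 0.5, x, -x) + rng.normal(scale=0.1, size=120)
U, preds = fuzzy_switching_regression(x, y)
print(U[:5].round(2))
```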

Varying coefficient model with errors in variables (가변계수 측정오차 회귀모형)

  • Sohn, Insuk; Shim, Jooyong
    • Journal of the Korean Data and Information Science Society / v.28 no.5 / pp.971-980 / 2017
  • The varying coefficient regression model has gained considerable attention since it can model dynamic changes of the regression coefficients in many scientific regression problems. In this paper we propose a varying coefficient regression model that effectively accounts for errors in both the input and response variables, and that uses the kernel method to estimate the varying coefficients, which are unknown nonlinear functions of the smoothing variables. We provide a generalized cross validation method for choosing the hyperparameters which affect the performance of the proposed model. The proposed method is evaluated through numerical studies.
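
A sketch of kernel estimation of varying coefficients by locally weighted least squares in the smoothing variable; the errors-in-variables correction and the generalized cross validation of the paper are not included, and all names and data are illustrative.

```python
import numpy as np

def varying_coefficient_fit(t_grid, t, X, y, h=0.2):
    """Estimate beta(t) in y_i = X_i' beta(t_i) + e_i by locally weighted
    least squares: at each grid point t0, fit a regression with Gaussian
    kernel weights K((t_i - t0) / h)."""
    betas = []
    for t0 in t_grid:
        w = np.exp(-0.5 * ((t - t0) / h) ** 2)
        W = np.diag(w)
        betas.append(np.linalg.solve(X.T @ W @ X, X.T @ W @ y))
    return np.array(betas)                       # len(t_grid) x p

rng = np.random.default_rng(6)
n = 300
t = rng.uniform(0.0, 1.0, n)                     # smoothing variable
X = np.column_stack([np.ones(n), rng.normal(size=n)])
beta_true = np.column_stack([np.sin(2 * np.pi * t), 1.0 + t])
y = np.sum(X * beta_true, axis=1) + rng.normal(scale=0.1, size=n)
grid = np.linspace(0.05, 0.95, 10)
print(varying_coefficient_fit(grid, t, X, y, h=0.1).round(2))
```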

Screening and Clustering for Time-course Yeast Microarray Gene Expression Data using Gaussian Process Regression (효모 마이크로어레이 유전자 발현데이터에 대한 가우시안 과정 회귀를 이용한 유전자 선별 및 군집화)

  • Kim, Jaehee; Kim, Taehoun
    • The Korean Journal of Applied Statistics / v.26 no.3 / pp.389-399 / 2013
  • This article introduces Gaussian process regression and shows its application to time-course microarray gene expression data. Gene screening for yeast cell cycle microarray expression data is accomplished with a ratio of log marginal likelihoods from Gaussian process regression with a squared exponential covariance kernel. The Gaussian process regression fit for each gene is computed, and the fits for the nine top-ranking genes are shown. With the screened data, Gaussian model-based clustering is performed and silhouette values are calculated to assess cluster validity.
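
A sketch of a screening score in the spirit of the entry above, using scikit-learn: for each gene's time-course profile, the log marginal likelihood of a GP with a squared exponential (RBF) kernel plus noise is compared with that of a noise-only model. The exact ratio used in the paper may differ, and the data below are synthetic.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel, ConstantKernel

def gp_screening_score(times, expression):
    """Log marginal likelihood difference: a GP with a squared exponential
    (RBF) kernel plus noise versus a noise-only model, for one gene's
    time-course expression profile."""
    t = times.reshape(-1, 1)
    gp_signal = GaussianProcessRegressor(
        kernel=ConstantKernel() * RBF() + WhiteKernel(), normalize_y=True)
    gp_noise = GaussianProcessRegressor(kernel=WhiteKernel(), normalize_y=True)
    gp_signal.fit(t, expression)
    gp_noise.fit(t, expression)
    return (gp_signal.log_marginal_likelihood_value_
            - gp_noise.log_marginal_likelihood_value_)

rng = np.random.default_rng(7)
times = np.linspace(0.0, 1.0, 18)
periodic_gene = np.sin(4 * np.pi * times) + rng.normal(scale=0.2, size=18)
flat_gene = rng.normal(scale=0.2, size=18)
print(gp_screening_score(times, periodic_gene))  # large score: keep this gene
print(gp_screening_score(times, flat_gene))      # small score: screen it out
```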