• Title/Summary/Keyword: regression function


Variable Selection Via Penalized Regression

  • Yoon, Young-Joo;Song, Moon-Sup
    • Communications for Statistical Applications and Methods / v.12 no.3 / pp.615-624 / 2005
  • In this paper, we review the variable-selection properties of LASSO and SCAD in penalized regression. To address the weakness of SCAD at high noise levels, we propose a new penalty function, called MSCAD, which relaxes the unbiasedness condition of SCAD. To compare MSCAD with LASSO and SCAD, comparative studies are performed on simulated datasets and on a real dataset. The performance of the penalized regression methods is compared in terms of relative model error and the coefficient estimates. The experimental results show that, as expected, the performance of MSCAD falls between those of LASSO and SCAD.
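
The MSCAD penalty proposed in the paper is not available in standard libraries, so the following minimal sketch only illustrates the penalized-regression variable-selection setting it is compared against, using scikit-learn's LASSO on synthetic data (all names and values below are illustrative assumptions, not the authors' code).

```python
# A minimal LASSO variable-selection sketch (illustrative only; the MSCAD
# penalty proposed in the paper is not implemented here).
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n, p = 100, 8
X = rng.normal(size=(n, p))
true_beta = np.array([3.0, 1.5, 0.0, 0.0, 2.0, 0.0, 0.0, 0.0])  # sparse truth
y = X @ true_beta + rng.normal(scale=1.0, size=n)

# Cross-validated L1 penalty; coefficients shrunk exactly to zero are deselected.
model = LassoCV(cv=5).fit(X, y)
selected = np.flatnonzero(model.coef_ != 0)
print("chosen alpha:", model.alpha_)
print("selected variables:", selected)
```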

Method to Construct Feature Functions of C-CRF Using Regression Tree Analysis (회귀나무 분석을 이용한 C-CRF의 특징함수 구성 방법)

  • Ahn, Gil Seung;Hur, Sun
    • Journal of Korean Institute of Industrial Engineers / v.41 no.4 / pp.338-343 / 2015
  • We suggest a method to construct the feature functions of a continuous conditional random field (C-CRF). Regression tree analysis and similarity analysis are introduced to construct the first and second feature functions of the C-CRF, respectively. Rules extracted from the regression tree are transformed into logic functions: if a rule is true for a data point, the function returns the value of the corresponding leaf node, and zero otherwise. We then build a Euclidean similarity matrix to define neighborhoods, which constitutes the second feature function. Using the two feature functions, we construct a C-CRF model, and an illustrative example is provided.
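
A hedged sketch of the two kinds of feature functions described in the abstract, using scikit-learn; the actual C-CRF construction in the paper is not reproduced, and the data, depth, and kernel choices below are assumptions for illustration.

```python
# Sketch: rule-based feature functions from a regression tree, plus a
# Euclidean similarity matrix defining neighborhoods.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.metrics.pairwise import euclidean_distances

rng = np.random.default_rng(1)
X = rng.normal(size=(50, 3))
y = X[:, 0] * 2.0 + rng.normal(scale=0.3, size=50)

# First feature function: each regression-tree rule acts as a logic function
# that returns its leaf value when the rule fires, and zero otherwise.
tree = DecisionTreeRegressor(max_depth=3).fit(X, y)
leaf_of = tree.apply(X)                      # leaf index each sample falls into
leaf_ids = np.unique(leaf_of)
leaf_value = {lid: tree.tree_.value[lid][0, 0] for lid in leaf_ids}
F1 = np.array([[leaf_value[lid] if leaf_of[i] == lid else 0.0
                for lid in leaf_ids] for i in range(len(X))])

# Second feature function: a Euclidean similarity matrix defining neighborhoods.
D = euclidean_distances(X)
S = np.exp(-D**2)                            # larger for closer points
print(F1.shape, S.shape)
```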

Research on Influence Factors on Pulmonary Functions in Korean-Chinese Children

  • Kim, Dae-Seon;Yu, Seung-Do;Cha, Jung-Hoon;Na, Jin-Gyun
    • Proceedings of the Korean Environmental Health Society Conference / 2003.06a / pp.192-195 / 2003
  • To identify differences between Korean-Chinese and Korean children in the variation of pulmonary function with personal factors (such as age, height, and weight), we performed pulmonary function tests (PFT) and measured personal factors for 200 Korean-Chinese children from two elementary schools in Beijing and Melons city in China. Regression analysis was used to determine which personal factors were significantly correlated with the PFT measures (FVC and FEV1). We compared the regression models from this study with those from other studies of Korean children. As in other studies, we found that the most important variable influencing the PFT measures was height, whereas adding either age or weight to the regression barely increased the accuracy. When the regression models from this study were compared with those from other studies of Korean children, the variation in FVC and FEV1 with height was similar.
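
A minimal sketch of the height-only regression described above, fit with statsmodels on synthetic data; the study's actual PFT measurements, slope, and intercept are not available here, so the values below are assumptions.

```python
# Ordinary least squares regression of FVC on height (synthetic illustration).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
height_cm = rng.uniform(120, 160, size=200)
fvc_l = 0.05 * height_cm - 4.5 + rng.normal(scale=0.25, size=200)  # assumed relationship

X = sm.add_constant(height_cm)        # intercept plus height as the single predictor
fit = sm.OLS(fvc_l, X).fit()
print(fit.params)                     # intercept and height coefficient
print(fit.rsquared)
```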


Geographically weighted least squares-support vector machine

  • Hwang, Changha;Shim, Jooyong
    • Journal of the Korean Data and Information Science Society / v.28 no.1 / pp.227-235 / 2017
  • When the spatial information of each location is given explicitly as coordinates, it is common to use geographically weighted regression, which incorporates the spatial information by assuming that the regression parameters vary across locations. In this paper, we relax the linearity assumption of geographically weighted regression and propose a geographically weighted least squares-support vector machine for estimating the geographically weighted mean, using the basic concepts of kernel machines. A generalized cross-validation function is derived for model selection. Numerical studies with real datasets are conducted to compare the predictive performance of the proposed method with that of other methods for the geographically weighted mean.
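
A hedged sketch of the geographically weighted idea: parameters are estimated locally with weights that decay with distance from a target location. This illustrates plain geographically weighted least squares in NumPy, not the LS-SVM variant proposed in the paper; the coordinates, bandwidth, and kernel are assumed.

```python
# Geographically weighted least squares at a chosen target location.
import numpy as np

rng = np.random.default_rng(3)
coords = rng.uniform(0, 10, size=(200, 2))          # spatial coordinates
x = rng.normal(size=200)
beta_true = 1.0 + 0.2 * coords[:, 0]                # slope varies smoothly in space
y = beta_true * x + rng.normal(scale=0.2, size=200)

def gwr_at(target, bandwidth=1.5):
    """Weighted least squares fit centred at a target location."""
    d = np.linalg.norm(coords - target, axis=1)
    w = np.exp(-(d / bandwidth) ** 2)                # Gaussian spatial kernel
    X = np.column_stack([np.ones_like(x), x])
    W = np.diag(w)
    beta = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
    return beta                                      # local intercept and slope

print(gwr_at(np.array([2.0, 5.0])))
print(gwr_at(np.array([8.0, 5.0])))
```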

Variable Selection Via Penalized Regression

  • Yoon, Young-Joo;Song, Moon-Sup
    • Proceedings of the Korean Statistical Society Conference / 2005.05a / pp.7-12 / 2005
  • In this paper, we review the variable-selection properties of LASSO and SCAD in penalized regression. To address the weakness of SCAD at high noise levels, we propose a new penalty function, called MSCAD, which relaxes the unbiasedness condition of SCAD. To compare MSCAD with LASSO and SCAD, comparative studies are performed on simulated datasets and on a real dataset. The performance of the penalized regression methods is compared in terms of relative model error and the coefficient estimates. The experimental results show that, as expected, the performance of MSCAD falls between those of LASSO and SCAD.


3D Shape Recovery from Image Focus using Gaussian Process Regression (가우시안 프로세스 회귀분석을 이용한 영상초점으로부터의 3차원 형상 재구성)

  • Mahmood, Muhammad Tariq;Choi, Young Kyu
    • Journal of the Semiconductor & Display Technology / v.11 no.3 / pp.19-25 / 2012
  • The accuracy of the Shape From Focus (SFF) technique depends on the quality of the focus measurements, which are computed with a focus measure operator. In this paper, we introduce a new approach to estimating the 3D shape of an object based on Gaussian process regression. First, an initial depth is estimated by applying a conventional focus measure to the image sequence and maximizing it along the optical direction. In the second step, input feature vectors consisting of eigenvalues are computed from the 3D neighborhood around the initial depth. Finally, using these features, a latent function is learned through Gaussian process regression to estimate an accurate depth. The proposed approach takes advantage of the multivariate statistical features and the covariance function. The proposed method is tested on image sequences of various objects, and the experimental results demonstrate the efficacy of the proposed scheme.
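
A minimal Gaussian process regression sketch in the spirit of the depth-refinement step above, using scikit-learn; the focus-measure and eigenvalue feature extraction are not reproduced, and the feature dimensions and kernel are assumptions.

```python
# Gaussian process regression from features to depth, with predictive uncertainty.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

rng = np.random.default_rng(4)
features = rng.normal(size=(120, 3))   # stand-in for the 3D-neighborhood eigenvalue features
depth = features @ np.array([0.8, -0.3, 0.5]) + rng.normal(scale=0.05, size=120)

kernel = RBF(length_scale=1.0) + WhiteKernel(noise_level=0.01)
gpr = GaussianProcessRegressor(kernel=kernel, normalize_y=True).fit(features, depth)

new_features = rng.normal(size=(5, 3))
mean, std = gpr.predict(new_features, return_std=True)  # refined depth with uncertainty
print(mean, std)
```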

Prediction Intervals for LS-SVM Regression using the Bootstrap

  • Shim, Joo-Yong;Hwang, Chang-Ha
    • Journal of the Korean Data and Information Science Society / v.14 no.2 / pp.337-343 / 2003
  • In this paper we present a prediction interval estimation method based on the bootstrap for least squares support vector machine (LS-SVM) regression, which performs even nonlinear regression by constructing a linear regression function in a high-dimensional feature space. The bootstrap is applied to generate resamples for estimating the covariance of the regression parameters, which consist of the optimal bias and the Lagrange multipliers. Experimental results are presented that indicate the performance of the algorithm.
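
A hedged sketch of bootstrapping a kernel regressor. KernelRidge is used as a stand-in for LS-SVM (the two are closely related), and the simple percentile interval below covers the fitted regression value only; the paper's handling of the bias term, Lagrange multipliers, and full prediction intervals is not reproduced.

```python
# Bootstrap percentile interval for a kernel regression fit at a new point.
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(5)
X = np.sort(rng.uniform(-3, 3, size=(80, 1)), axis=0)
y = np.sin(X[:, 0]) + rng.normal(scale=0.2, size=80)
x_new = np.array([[0.5]])

preds = []
for _ in range(500):                                   # bootstrap resamples
    idx = rng.integers(0, len(X), size=len(X))
    model = KernelRidge(kernel="rbf", alpha=0.1, gamma=1.0).fit(X[idx], y[idx])
    preds.append(model.predict(x_new)[0])

lo, hi = np.percentile(preds, [2.5, 97.5])             # percentile interval for the fit
print(f"95% bootstrap interval at x=0.5: ({lo:.3f}, {hi:.3f})")
```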


Weighted LS-SVM Regression for Right Censored Data

  • Kim, Dae-Hak;Jeong, Hyeong-Chul
    • Communications for Statistical Applications and Methods / v.13 no.3 / pp.765-776 / 2006
  • In this paper we propose an estimation method for the regression model when the training data contain randomly right-censored observations. Weighted least squares support vector machine regression is applied to estimate the regression function by incorporating weights assigned to each observation into the optimization problem. Numerical examples are given to show the performance of the proposed estimation method.
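
A hedged sketch of weighting a kernel regression by a censoring indicator. KernelRidge again stands in for LS-SVM, and the crude down-weighting of censored cases below is only a placeholder for the data-dependent weights derived in the paper.

```python
# Weighted kernel regression on right-censored responses (illustrative weights).
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(6)
X = rng.uniform(0, 4, size=(100, 1))
latent = 1.0 + 0.5 * X[:, 0] + rng.normal(scale=0.2, size=100)  # uncensored response
censor_time = rng.exponential(scale=4.0, size=100)
observed = np.minimum(latent, censor_time)                      # right-censored observation
uncensored = latent <= censor_time                              # censoring indicator

weights = np.where(uncensored, 1.0, 0.1)                        # down-weight censored cases
model = KernelRidge(kernel="rbf", alpha=0.5, gamma=0.5).fit(X, observed, sample_weight=weights)
print(model.predict(np.array([[1.0], [3.0]])))
```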

Fuzzy regression using regularization method based on Tanaka's model

  • Hong Dug-Hun;Kim Kyung-Tae
    • Journal of the Korean Institute of Intelligent Systems / v.16 no.4 / pp.499-505 / 2006
  • The regularization approach to regression is easily found in the statistics and information science literature. The regularization technique was introduced as a way of controlling the smoothness properties of a regression function. In this paper, we present a new method to evaluate linear and nonlinear fuzzy regression models based on Tanaka's model, using the idea of regularization. In particular, this method is a very attractive approach to modeling nonlinear fuzzy data.
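
A hedged sketch of Tanaka-style possibilistic linear regression as a linear program (minimize total spread subject to each observation lying inside the fuzzy band), solved with SciPy; the regularization extension proposed in the paper is not included, and the data and h-level are assumptions.

```python
# Tanaka's fuzzy linear regression: triangular fuzzy coefficients with centres a
# and spreads c, chosen to minimise total spread while covering every observation.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(7)
n = 30
x = rng.uniform(0, 10, size=n)
y = 2.0 + 0.8 * x + rng.normal(scale=0.5, size=n)

X = np.column_stack([np.ones(n), x])        # design matrix with intercept
absX = np.abs(X)
h = 0.0                                     # h-level of the fuzzy inclusion

# Decision variables: centre coefficients a (free) followed by spreads c (>= 0).
obj = np.concatenate([np.zeros(2), absX.sum(axis=0)])
A_ub = np.vstack([
    np.hstack([-X, -(1 - h) * absX]),       # y_i <= a'x_i + (1-h) c'|x_i|
    np.hstack([ X, -(1 - h) * absX]),       # y_i >= a'x_i - (1-h) c'|x_i|
])
b_ub = np.concatenate([-y, y])
bounds = [(None, None)] * 2 + [(0, None)] * 2

res = linprog(obj, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
a_centre, c_spread = res.x[:2], res.x[2:]
print("centre coefficients:", a_centre)
print("spreads:", c_spread)
```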

Hidden Truncation Normal Regression

  • Kim, Sungsu
    • Communications for Statistical Applications and Methods / v.19 no.6 / pp.793-798 / 2012
  • In this paper, we propose regression methods based on the likelihood function, assuming Arnold-Beaver Skew Normal (ABSN) errors in a simple linear regression model. We show that the proposed method performs better on an asymmetric data set than the usual regression model with Gaussian errors. The utility of the method is demonstrated through simulations and real data sets.
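
A hedged sketch of likelihood-based regression with skewed errors, using SciPy's Azzalini-type skewnorm as a stand-in for the Arnold-Beaver skew normal distribution assumed in the paper; the data and starting values are illustrative assumptions.

```python
# Maximum likelihood fit of a simple linear regression with skew-normal errors.
import numpy as np
from scipy import stats
from scipy.optimize import minimize

rng = np.random.default_rng(8)
n = 200
x = rng.uniform(0, 5, size=n)
errors = stats.skewnorm.rvs(a=4.0, scale=1.0, size=n, random_state=0)
y = 1.0 + 2.0 * x + errors

def neg_loglik(theta):
    b0, b1, log_scale, shape = theta
    resid = y - b0 - b1 * x
    return -np.sum(stats.skewnorm.logpdf(resid, a=shape, scale=np.exp(log_scale)))

res = minimize(neg_loglik, x0=np.array([0.0, 1.0, 0.0, 1.0]), method="Nelder-Mead")
print("intercept, slope, scale, shape:",
      res.x[0], res.x[1], np.exp(res.x[2]), res.x[3])
```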