Title/Summary/Keyword: Generalized Cross Validation Function

Multiclass LS-SVM ensemble for large data

  • Hwang, Hyungtae
    • Journal of the Korean Data and Information Science Society / v.26 no.6 / pp.1557-1563 / 2015
  • Multiclass classification is typically performed with a voting scheme that combines binary classifications. In this paper we propose a multiclass classification method for large data, which can be regarded as a revised one-vs-all method. The multiclass classification is performed using the hat matrix of a least squares support vector machine (LS-SVM) ensemble, obtained by aggregating individual LS-SVMs trained on subsets of the whole data set. A cross validation function is defined to select the optimal values of the hyperparameters that affect the performance of the proposed multiclass LS-SVM. We derive the generalized cross validation function to reduce the computational burden of the cross validation function. Experimental results are then presented which indicate the performance of the proposed method.
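As a concrete illustration of the kind of shortcut this abstract describes, here is a minimal sketch of a GCV score for a ridge-type kernel smoother; the RBF kernel and the exact form of the hat matrix are illustrative assumptions, not the paper's derivation.

```python
import numpy as np

def rbf_kernel(X, Z, gamma):
    """Gaussian (RBF) kernel matrix between the rows of X and Z."""
    sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq)

def gcv_score(X, y, gamma, lam):
    """GCV score for a ridge-type kernel smoother (hypothetical setup).

    Fitted values are yhat = H y with hat matrix H = K (K + lam I)^{-1};
    GCV(gamma, lam) = n ||(I - H) y||^2 / tr(I - H)^2,
    so hyperparameters can be scored without refitting per held-out point.
    """
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    H = K @ np.linalg.inv(K + lam * np.eye(n))
    resid = y - H @ y
    return n * float(resid @ resid) / (n - np.trace(H)) ** 2
```

In practice one evaluates `gcv_score` over a grid of (gamma, lam) pairs and keeps the minimizer.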

e-SVR using IRWLS Procedure

  • Shim, Joo-Yong
    • Journal of the Korean Data and Information Science Society / v.16 no.4 / pp.1087-1094 / 2005
  • e-insensitive support vector regression (e-SVR) is capable of providing a more complete description of the linear and nonlinear relationships among random variables. In this paper we propose an iterative reweighted least squares (IRWLS) procedure to solve the quadratic optimization problem of e-SVR with a modified loss function. Furthermore, we introduce the generalized approximate cross validation function to select the hyperparameters which affect the performance of e-SVR. Experimental results are then presented which illustrate the performance of the IRWLS procedure for e-SVR.
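The paper's modified loss and exact weighting are not reproduced here; the following is only a sketch of the general IRWLS idea for a linear ε-insensitive fit, assuming the common quadratic-bound weights C/|r_i| outside the tube and zero inside.

```python
import numpy as np

def irwls_eps_fit(X, y, eps=0.1, C=100.0, n_iter=50):
    """IRWLS sketch for linear epsilon-insensitive regression (illustrative).

    Each pass bounds the eps-insensitive loss by a weighted quadratic:
    residuals outside the tube get weight C/|r_i|, those inside get 0,
    and the resulting weighted ridge problem is solved in closed form.
    """
    n, p = X.shape
    Xb = np.hstack([X, np.ones((n, 1))])      # append intercept column
    R = np.eye(p + 1)
    R[p, p] = 1e-8                            # do not penalize the intercept
    beta = np.zeros(p + 1)
    for _ in range(n_iter):
        r = y - Xb @ beta
        outside = np.abs(r) > eps
        if not outside.any():                 # every point inside the tube
            break
        a = np.zeros(n)
        a[outside] = C / np.abs(r[outside])
        A = (Xb.T * a) @ Xb + R               # Xb^T diag(a) Xb + penalty
        beta = np.linalg.solve(A, (Xb.T * a) @ y)
    return beta[:p], beta[p]                  # slope vector w and intercept b
```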

Mixed-effects LS-SVR for longitudinal data

  • Cho, Dae-Hyeon
    • Journal of the Korean Data and Information Science Society / v.21 no.2 / pp.363-369 / 2010
  • In this paper we propose a mixed-effects least squares support vector regression (LS-SVR) for longitudinal data. We add a random-effect term to the optimization function of LS-SVR to incorporate random effects into LS-SVR for analyzing longitudinal data. We also present a model selection method that employs the generalized cross validation function for choosing the hyperparameters which affect the performance of the mixed-effects LS-SVR. A simulated example is provided to indicate the usefulness of the mixed-effects method for analyzing longitudinal data.

Cox proportional hazard model with L1 penalty

  • Hwang, Chang-Ha;Shim, Joo-Yong
    • Journal of the Korean Data and Information Science Society / v.22 no.3 / pp.613-618 / 2011
  • The proposed method is based on a penalized log partial likelihood of the Cox proportional hazard model with an L1 penalty. We use the iteratively reweighted least squares procedure to maximize the L1-penalized log partial likelihood function of the Cox proportional hazard model. It provides efficient computation, including variable selection, and leads to the generalized cross validation function for model selection. Experimental results are then presented to indicate the performance of the proposed procedure.
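The penalized partial likelihood itself is not reproduced here; as a hypothetical stand-in, the same IRWLS trick applied to an L1-penalized least squares problem looks as follows, where each |b_j| is majorized by a quadratic so that every step reduces to a weighted ridge solve.

```python
import numpy as np

def irwls_l1(X, y, lam, n_iter=100, delta=1e-6):
    """IRWLS sketch for L1-penalized least squares (illustrative stand-in).

    Majorizing |b_j| <= b_j^2 / (2 |b_j_old|) + const turns the L1 penalty
    into an adaptive ridge with weights lam / (|b_j_old| + delta), so each
    iteration is a closed-form weighted ridge solve; small coefficients
    are driven toward zero, which is what yields variable selection.
    """
    b = np.linalg.lstsq(X, y, rcond=None)[0]  # start from the OLS fit
    for _ in range(n_iter):
        D = np.diag(lam / (np.abs(b) + delta))
        b = np.linalg.solve(X.T @ X + D, X.T @ y)
    return b
```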

Support vector quantile regression for longitudinal data

  • Hwang, Chang-Ha
    • Journal of the Korean Data and Information Science Society / v.21 no.2 / pp.309-316 / 2010
  • Support vector quantile regression (SVQR) is capable of providing a more complete description of the linear and nonlinear relationships between the response and input variables. In this paper we propose a weighted SVQR for longitudinal data. Furthermore, we introduce the generalized approximate cross validation function to select the hyperparameters which affect the performance of SVQR. Experimental results are then presented which illustrate the performance of the proposed SVQR.
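For reference, the "quantile" behavior comes from replacing squared error with the check (pinball) loss, whose standard form is:

```python
import numpy as np

def check_loss(r, tau):
    """Check (pinball) loss on residuals r = y - f(x); minimizing its sum
    pushes f toward the tau-th conditional quantile of y."""
    r = np.asarray(r, dtype=float)
    return np.where(r >= 0, tau * r, (tau - 1.0) * r)
```

Minimizing the average check loss over a constant recovers the sample tau-quantile, which is why this loss appears in quantile-regression objectives.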

Kernel Machine for Poisson Regression

  • Hwang, Chang-Ha
    • Journal of the Korean Data and Information Science Society / v.18 no.3 / pp.767-772 / 2007
  • A kernel machine is proposed as an estimating procedure for linear and nonlinear Poisson regression, based on the penalized negative log-likelihood. The proposed kernel machine provides an estimate of the mean function of the response variable, where the canonical parameter is related to the input vector in a nonlinear form. A generalized cross validation (GCV) function of MSE type is introduced to determine the hyperparameters which affect the performance of the machine. Experimental results are then presented which indicate the performance of the proposed machine.
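A penalized Newton (IRLS) sketch of such a fit, under the illustrative representer-style assumption eta = K a with penalty (lam/2) a^T K a (the paper's exact formulation may differ):

```python
import numpy as np

def kernel_poisson_fit(K, y, lam, n_iter=50):
    """Penalized IRLS sketch for kernel Poisson regression (illustrative).

    Model: log-mean eta = K a, counts y ~ Poisson(exp(eta)).
    Each Newton step on the penalized negative log-likelihood solves
    (diag(mu) K + lam I) a = diag(mu) z with working response
    z = eta + (y - mu) / mu.
    """
    n = len(y)
    a = np.zeros(n)
    for _ in range(n_iter):
        eta = np.clip(K @ a, -20.0, 20.0)   # clip for numerical safety
        mu = np.exp(eta)
        z = eta + (y - mu) / mu
        a = np.linalg.solve(mu[:, None] * K + lam * np.eye(n), mu * z)
    return a
```

The estimated mean function is then `exp(K @ a)`, which is positive by construction, as a Poisson mean must be.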

Kernel Ridge Regression with Randomly Right Censored Data

  • Shim, Joo-Yong;Seok, Kyung-Ha
    • Communications for Statistical Applications and Methods / v.15 no.2 / pp.205-211 / 2008
  • This paper deals with the estimation of kernel ridge regression when the responses are subject to random right censoring. The iteratively reweighted least squares (IRWLS) procedure is employed to treat censored observations. The hyperparameters of the model, which affect the performance of the proposed procedure, are selected by a generalized cross validation (GCV) function. Experimental results are then presented which indicate the performance of the proposed procedure.

Censored Kernel Ridge Regression

  • Shim, Joo-Yong
    • Journal of the Korean Data and Information Science Society / v.16 no.4 / pp.1045-1052 / 2005
  • This paper deals with the estimation of kernel ridge regression when the responses are subject to random right censoring. The weighted data are formed by redistributing the weights of the censored observations to the uncensored ones; kernel ridge regression can then be carried out on the weighted data. The hyperparameters of the model, which affect the performance of the proposed procedure, are selected by a generalized approximate cross validation (GACV) function. Experimental results are then presented which indicate the performance of the proposed procedure.
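The redistribution step can be sketched with Efron's redistribute-to-the-right construction, which is assumed here for illustration (ties and a censored final observation need conventions not handled in this sketch).

```python
import numpy as np

def redistribute_to_right(t, delta):
    """Redistribute-to-the-right weights for right-censored data (sketch).

    t     : observation times, assumed sorted in increasing order
    delta : censoring indicator, 1 = event observed, 0 = censored

    Start from uniform mass 1/n; each censored point passes its mass
    equally to all later observations, ending with zero weight itself.
    Uncensored points keep whatever mass they have accumulated.
    """
    n = len(t)
    w = np.full(n, 1.0 / n)
    for i in range(n - 1):          # a censored last point keeps its mass
        if delta[i] == 0:
            w[i + 1:] += w[i] / (n - 1 - i)
            w[i] = 0.0
    return w
```

A weighted kernel ridge fit would then use these w_i as case weights, so censored points contribute no residual of their own.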

LS-SVM for large data sets

  • Park, Hongrak;Hwang, Hyungtae;Kim, Byungju
    • Journal of the Korean Data and Information Science Society / v.27 no.2 / pp.549-557 / 2016
  • In this paper we propose a multiclass classification method for large data sets that ensembles least squares support vector machines (LS-SVMs) built on principal components instead of the raw input vectors. We use the revised one-vs-all method for multiclass classification, a voting scheme based on combining several binary classifications. The revised one-vs-all method is performed using the hat matrix of the LS-SVM ensemble, which is obtained by ensembling LS-SVMs trained on random samples from the whole training data. The leave-one-out cross validation (CV) function is used to select the optimal values of the hyperparameters which affect the performance of the multiclass LS-SVM ensemble. We present the generalized cross validation function to reduce the computational burden of the leave-one-out CV function. Experimental results on real data sets are then presented to illustrate the performance of the proposed multiclass LS-SVM ensemble.
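The relation between the two criteria mentioned in this abstract holds for any least-squares linear smoother yhat = H y: the leave-one-out residual for point i is r_i / (1 - H_ii), and GCV simply replaces each leverage H_ii by the average tr(H)/n. A sketch, assuming such a hat matrix is available:

```python
import numpy as np

def loo_cv(H, y):
    """Exact leave-one-out CV score for a linear smoother yhat = H y."""
    r = y - H @ y
    return float(np.mean((r / (1.0 - np.diag(H))) ** 2))

def gcv(H, y):
    """GCV score: each leverage H_ii replaced by the average tr(H)/n."""
    n = len(y)
    r = y - H @ y
    return float(np.mean(r ** 2) / (1.0 - np.trace(H) / n) ** 2)
```

When the leverages are all equal the two scores coincide exactly; otherwise GCV trades the per-point diagonal of H for a single trace, which is the source of the computational saving.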

Sparse kernel classification using IRWLS procedure

  • Kim, Dae-Hak
    • Journal of the Korean Data and Information Science Society / v.20 no.4 / pp.749-755 / 2009
  • Support vector classification (SVC) provides a more complete description of the linear and nonlinear relationships between input vectors and classifiers. In this paper, we propose a sparse kernel classifier that solves the classification optimization problem with a modified hinge loss function and an absolute loss function, which provides efficient computation and sparsity. We also introduce the generalized cross validation function to select the hyperparameters which affect the classification performance of the proposed method. Experimental results are then presented which illustrate the performance of the proposed procedure for classification.
