• Title/Summary/Keyword: Generalized Least Squares Method


ON DIFFERENTIABILITY OF THE MATRIX TRACE OPERATOR AND ITS APPLICATIONS

  • Dulov, E.V.;Andrianova, N.A.
    • Journal of applied mathematics & informatics, v.8 no.1, pp.97-109, 2001
  • This article revisits a "forgotten" and rarely used technique of matrix analysis, introduced in the 1960s-70s and enhanced by the authors. We study the matrix trace operator and its differentiability. This idea generalizes the notion of the scalar derivative to matrix computations. A list of the most common derivatives is given at the end of the article. Additionally, we point out a close connection of this technique with the least squares problem in its classical and generalized cases.
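
The trace-derivative connection mentioned in this abstract can be illustrated numerically. The sketch below (NumPy, not from the article) checks the identity d/dX tr(AX) = Aᵀ by a finite difference, then applies the same matrix calculus to the generalized least squares estimator b = (XᵀWX)⁻¹XᵀWy; the heteroscedastic weights are an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

# Trace-derivative identity: d/dX tr(A X) = A^T.
# Check one entry against a finite difference.
A = rng.standard_normal((3, 3))
X = rng.standard_normal((3, 3))
eps = 1e-6
E = np.zeros_like(X)
E[1, 2] = eps
fd = (np.trace(A @ (X + E)) - np.trace(A @ X)) / eps
assert abs(fd - A.T[1, 2]) < 1e-4

# The same calculus gives the GLS estimator: minimizing
# (y - Xb)^T W (y - Xb) has gradient -2 X^T W (y - Xb) = 0,
# hence b = (X^T W X)^{-1} X^T W y.
n, p = 50, 3
X = rng.standard_normal((n, p))
beta = np.array([1.0, -2.0, 0.5])
w = rng.uniform(0.5, 2.0, n)          # known heteroscedastic weights
y = X @ beta + rng.standard_normal(n) / np.sqrt(w)
W = np.diag(w)
b_gls = np.linalg.solve(X.T @ W @ X, X.T @ W @ y)
print(b_gls)
```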

LS-SVM for large data sets

  • Park, Hongrak;Hwang, Hyungtae;Kim, Byungju
    • Journal of the Korean Data and Information Science Society, v.27 no.2, pp.549-557, 2016
  • In this paper we propose a multiclassification method for large data sets that ensembles least squares support vector machines (LS-SVMs) built on principal components instead of the raw input vector. We use the revised one-vs-all method for multiclassification, a voting scheme based on combining several binary classifications. The revised one-vs-all method is performed using the hat matrix of the LS-SVM ensemble, which is obtained by ensembling LS-SVMs trained on random samples from the whole large training data. The leave-one-out cross validation (CV) function is used to select the optimal values of the hyper-parameters that affect the performance of the multiclass LS-SVM ensemble. We present the generalized cross validation function to reduce the computational burden of the leave-one-out CV function. Experimental results from real data sets illustrate the performance of the proposed multiclass LS-SVM ensemble.
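
The hat-matrix and generalized cross validation machinery recurring in these LS-SVM abstracts can be sketched in a few lines. The code below uses a bias-free kernel-ridge form of LS-SVM regression with an RBF kernel; the data, kernel width, and regularization grid are illustrative assumptions, not the paper's multiclass ensemble.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Gaussian (RBF) kernel matrix between rows of X and Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def lssvm_hat_matrix(K, reg=1.0):
    # LS-SVM without the bias term reduces to kernel ridge:
    # alpha = (K + I/reg)^{-1} y, fitted values yhat = K alpha = H y.
    n = K.shape[0]
    return K @ np.linalg.inv(K + np.eye(n) / reg)

def gcv(y, H):
    # Generalized cross validation: replaces each leave-one-out
    # diagonal entry H_ii by the average tr(H)/n.
    n = len(y)
    resid = y - H @ y
    return n * (resid @ resid) / (n - np.trace(H)) ** 2

rng = np.random.default_rng(1)
X = rng.uniform(-2, 2, (40, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(40)
K = rbf_kernel(X, X)
# pick the regularization level with the smallest GCV score
scores = {reg: gcv(y, lssvm_hat_matrix(K, reg)) for reg in (0.1, 1.0, 10.0)}
best = min(scores, key=scores.get)
print(best, scores[best])
```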

Multiclass LS-SVM ensemble for large data

  • Hwang, Hyungtae
    • Journal of the Korean Data and Information Science Society, v.26 no.6, pp.1557-1563, 2015
  • Multiclass classification is typically performed using voting schemes that combine binary classifications. In this paper we propose a multiclass classification method for large data, which can be regarded as a revised one-vs-all method. Classification is performed using the hat matrix of the least squares support vector machine (LS-SVM) ensemble, obtained by aggregating individual LS-SVMs trained on subsets of the whole large data set. A cross validation function is defined to select the optimal values of the hyperparameters that affect the performance of the proposed multiclass LS-SVM. We derive the generalized cross validation function to reduce the computational burden of the cross validation function. Experimental results are then presented that indicate the performance of the proposed method.

Estimation and variable selection in censored regression model with smoothly clipped absolute deviation penalty

  • Shim, Jooyong;Bae, Jongsig;Seok, Kyungha
    • Journal of the Korean Data and Information Science Society, v.27 no.6, pp.1653-1660, 2016
  • The smoothly clipped absolute deviation (SCAD) penalty is known to satisfy the desirable properties for penalty functions, such as unbiasedness, sparsity and continuity. In this paper, we deal with regression function estimation and variable selection based on the SCAD-penalized censored regression model. We use the local linear approximation and the iteratively reweighted least squares algorithm to solve the SCAD-penalized log-likelihood. The proposed method provides efficient variable selection and regression function estimation. The generalized cross validation function is presented for model selection. Applications of the proposed method are illustrated through simulated data and a real example.
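
The local linear approximation plus iteratively reweighted least squares scheme named in this abstract can be sketched for an ordinary (uncensored) linear model; the censored-likelihood version in the paper adds an imputation layer on top. The λ value, a = 3.7, data, and final threshold below are illustrative assumptions.

```python
import numpy as np

def scad_deriv(b, lam, a=3.7):
    # SCAD penalty derivative p'_lam(|b|) (Fan & Li's form, a = 3.7).
    ab = np.abs(b)
    return lam * np.where(ab <= lam, 1.0,
                          np.maximum(a * lam - ab, 0.0) / ((a - 1) * lam))

def scad_lla_irls(X, y, lam=0.5, n_iter=50, eps=1e-8):
    # Local linear approximation of SCAD + iteratively reweighted
    # least squares: |b_j| ~ b_j^2 / (2|b_j^(k)|) turns each step
    # into a diagonally weighted ridge solve.
    n, p = X.shape
    b = np.linalg.lstsq(X, y, rcond=None)[0]        # OLS start
    for _ in range(n_iter):
        w = scad_deriv(b, lam) / (np.abs(b) + eps)
        b = np.linalg.solve(X.T @ X + n * np.diag(w), X.T @ y)
    b[np.abs(b) < 1e-4] = 0.0                       # hard-threshold tiny coefs
    return b

rng = np.random.default_rng(2)
X = rng.standard_normal((100, 6))
beta = np.array([3.0, 0.0, 0.0, 1.5, 0.0, 0.0])
y = X @ beta + 0.3 * rng.standard_normal(100)
print(scad_lla_irls(X, y))
```

Large coefficients escape shrinkage entirely once they exceed aλ (the derivative vanishes there), which is how SCAD achieves the unbiasedness the abstract mentions.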

Censored varying coefficient regression model using Buckley-James method

  • Shim, Jooyong;Seok, Kyungha
    • Journal of the Korean Data and Information Science Society, v.28 no.5, pp.1167-1177, 2017
  • The censored regression model using the pseudo-response variable proposed by Buckley and James has been one of the most well-known models. Recently, the varying coefficient regression model has received a great deal of attention as an important modeling tool. In this paper we propose a censored varying coefficient regression model using the Buckley-James method, to handle situations where the regression coefficients of the model are not constant but change as the smoothing variables change. By using the formulation of the least squares support vector machine (LS-SVM), the coefficient estimators of the proposed model can be obtained from simple linear equations, and a generalized cross validation function can be easily derived. We evaluate the proposed method and demonstrate its adequacy through simulated and real data sets.
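
The Buckley-James pseudo-response idea works by replacing each censored response with its conditional expectation under a Kaplan-Meier estimate of the residual distribution, then refitting. The sketch below pairs that step with a plain least squares refit (the paper instead plugs it into an LS-SVM varying-coefficient formulation); the data-generating setup is an illustrative assumption.

```python
import numpy as np

def km_survival(t, delta):
    # Kaplan-Meier survival estimate evaluated at the sorted points of t.
    order = np.argsort(t)
    t, delta = t[order], delta[order]
    n = len(t)
    at_risk = n - np.arange(n)
    s = np.cumprod(1.0 - delta / at_risk)
    return t, s

def buckley_james_step(X, y, delta, b):
    # One Buckley-James iteration: replace each censored response by
    # its conditional expectation under the Kaplan-Meier estimate of
    # the residual distribution, then refit by least squares.
    e = y - X @ b                        # residuals at the current fit
    t, s = km_survival(e, delta)
    s_prev = np.concatenate(([1.0], s[:-1]))
    jumps = s_prev - s                   # KM mass at each uncensored residual
    ystar = y.copy()
    for i in np.where(delta == 0)[0]:
        mask = t > e[i]
        denom = jumps[mask].sum()
        if denom > 0:                    # E[e | e > e_i] via KM jumps
            ystar[i] = X[i] @ b + (t[mask] * jumps[mask]).sum() / denom
    return np.linalg.lstsq(X, ystar, rcond=None)[0]

rng = np.random.default_rng(3)
n = 200
X = np.column_stack([np.ones(n), rng.standard_normal(n)])
y_true = X @ np.array([1.0, 2.0]) + 0.5 * rng.standard_normal(n)
c = rng.uniform(0, 6, n)                 # random right-censoring times
delta = (y_true <= c).astype(float)      # 1 = observed, 0 = censored
y = np.minimum(y_true, c)
b = np.linalg.lstsq(X, y, rcond=None)[0]
for _ in range(20):
    b = buckley_james_step(X, y, delta, b)
print(b)
```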

Feature selection in the semivarying coefficient LS-SVR

  • Hwang, Changha;Shim, Jooyong
    • Journal of the Korean Data and Information Science Society, v.28 no.2, pp.461-471, 2017
  • In this paper we propose a feature selection method for identifying important features in the semivarying coefficient model. One important issue in the semivarying coefficient model is how to estimate the parametric and nonparametric components; another is how to identify the important features among the varying and constant effects. We propose a feature selection method that addresses this issue using the generalized cross validation functions of the varying coefficient least squares support vector regression (LS-SVR) and the linear LS-SVR. Numerical studies indicate that the proposed method is quite effective in identifying important features in the varying and constant effects of the semivarying coefficient model.

Quasi-Likelihood Approach for Linear Models with Censored Data

  • Ha, Il-Do;Cho, Geon-Ho
    • Journal of the Korean Data and Information Science Society, v.9 no.2, pp.219-225, 1998
  • The parameters in linear models with censored normal responses are usually estimated by iterative maximum likelihood and iterative least squares methods. The iterative least squares method is simple but has little theoretical justification, while the iterative maximum likelihood estimating equations are derived in a complicated way. In this paper, we justify these methods via Wedderburn (1974)'s quasi-likelihood approach. This provides an explicit justification for the iterative least squares method and, directly, for the iterative maximum likelihood method of estimating the regression coefficients.


Diagnostic Study of Problems under Asymptotically Generalized Least Squares Estimation of Physical Health Model

  • Kim, Jung-Hee
    • Journal of Korean Academy of Nursing, v.29 no.5, pp.1030-1041, 1999
  • This study examined problems observed under the asymptotically generalized least squares (AGLS) estimator in evaluating a structural model of physical health: highly correlated parameter estimates and high standard errors for some parameter estimates. Separate analyses of the endogenous part of the model and of the metric of a latent factor revealed a highly skewed and kurtotic measurement indicator as the focal point of the manifested problems. Since the sample sizes are far below those needed to produce adequate AGLS estimates under the given modeling conditions, the adequacy of the maximum likelihood (ML) estimator was further examined with robust statistics and the bootstrap method. These methods demonstrated that the ML estimates were unbiased and that statistical decisions based on the ML standard errors remained almost the same. Suggestions are made for future studies adopting the structural equation modeling technique, in terms of selecting a reference indicator and adopting statistics corrected for nonnormality.


Variable selection in L1 penalized censored regression

  • Hwang, Chang-Ha;Kim, Mal-Suk;Shim, Joo-Yong
    • Journal of the Korean Data and Information Science Society, v.22 no.5, pp.951-959, 2011
  • The proposed method is based on a penalized censored regression model with an L1 penalty. We use the iteratively reweighted least squares procedure to solve the L1-penalized log-likelihood of the censored regression model. It provides efficient computation of the regression parameters, including variable selection, and leads to the generalized cross validation function for model selection. Numerical results are then presented to indicate the performance of the proposed method.
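
The route from IRLS to a generalized cross validation score for the L1 penalty can be sketched for an uncensored linear model: the local quadratic approximation |b_j| ≈ b_j²/(2|b_j|) makes each step a reweighted ridge solve with a hat matrix whose trace serves as effective degrees of freedom. The λ grid and data below are illustrative assumptions.

```python
import numpy as np

def l1_irls(X, y, lam, n_iter=30, eps=1e-6):
    # Local quadratic approximation of the L1 penalty:
    # |b_j| ~ b_j^2 / (2 |b_j^(k)|), giving reweighted ridge steps.
    b = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(n_iter):
        W = np.diag(lam / (np.abs(b) + eps))
        b = np.linalg.solve(X.T @ X + W, X.T @ y)
    return b

def gcv_l1(X, y, lam, b):
    # GCV with effective dof tr(H), H the IRLS hat matrix at the fit.
    n = len(y)
    W = np.diag(lam / (np.abs(b) + 1e-6))
    H = X @ np.linalg.solve(X.T @ X + W, X.T)
    resid = y - X @ b
    return n * (resid @ resid) / (n - np.trace(H)) ** 2

rng = np.random.default_rng(4)
X = rng.standard_normal((80, 5))
y = X @ np.array([2.0, 0.0, -1.0, 0.0, 0.0]) + 0.3 * rng.standard_normal(80)
scores = {lam: gcv_l1(X, y, lam, l1_irls(X, y, lam)) for lam in (0.1, 1.0, 10.0)}
best = min(scores, key=scores.get)
print(best, l1_irls(X, y, best))
```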

Fixed size LS-SVM for multiclassification problems of large data sets

  • Hwang, Hyung-Tae
    • Journal of the Korean Data and Information Science Society, v.21 no.3, pp.561-567, 2010
  • Multiclassification is typically performed using voting scheme methods based on combining a set of binary classifications. In this paper we use a multiclassification method with the hat matrix of the least squares support vector machine (LS-SVM), which can be regarded as a revised one-against-all method. To tackle multiclass problems for large data, we use the Nyström approximation and the quadratic Renyi entropy, with estimation in the primal space as in fixed size LS-SVM. For the selection of hyperparameters, generalized cross validation techniques are employed. Experimental results are then presented to indicate the performance of the proposed procedure.
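
The Nyström approximation named in this abstract replaces the full n×n kernel matrix by a low-rank product built from m landmark points, K ≈ K_nm K_mm⁺ K_nmᵀ. The paper selects the landmarks via quadratic Renyi entropy; the sketch below uses plain uniform subsampling, which is an assumption, just to show the approximation itself.

```python
import numpy as np

def rbf(X, Z, gamma=0.5):
    # Gaussian (RBF) kernel matrix between rows of X and Z.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

rng = np.random.default_rng(5)
X = rng.standard_normal((500, 2))
m = 50                                   # number of landmark points
idx = rng.choice(len(X), m, replace=False)
Knm = rbf(X, X[idx])                     # n x m cross-kernel
Kmm = rbf(X[idx], X[idx])                # m x m landmark kernel
# Nystrom approximation of the full 500 x 500 kernel matrix
K_approx = Knm @ np.linalg.pinv(Kmm) @ Knm.T
K_full = rbf(X, X)
err = np.linalg.norm(K_full - K_approx) / np.linalg.norm(K_full)
print(err)
```

Because only Knm and Kmm are ever formed, the cost drops from O(n²) storage to O(nm), which is what makes the fixed size LS-SVM tractable on large data.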