• Title/Abstract/Keywords: penalized function

60 search results (processing time: 0.021 s)

A Penalized Principal Components using Probabilistic PCA

  • Park, Chong-Sun; Wang, Morgan
    • Korean Statistical Society: Conference Proceedings / Proceedings of the Korean Statistical Society 2003 Spring Conference / pp.151-156 / 2003
  • A variable selection algorithm for principal component analysis based on a penalized likelihood method is proposed. We adopt the probabilistic principal component idea to obtain a likelihood function for the problem and use the HARD penalty function to force the coefficients of irrelevant variables in each component to zero. Consistency and sparsity of the coefficient estimates are demonstrated on small simulated and illustrative real examples.
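
The sparsity effect of the HARD penalty described above can be sketched numerically: coefficients of weakly contributing variables are set exactly to zero. The following is an illustrative hard-thresholding step applied after ordinary PCA, not the paper's estimator (which maximizes a penalized probabilistic-PCA likelihood); the function name and threshold are hypothetical.

```python
import numpy as np

def hard_threshold_loadings(X, n_components=1, lam=0.3):
    """Sparse-PCA sketch: compute ordinary PCA loadings, then apply a
    HARD-threshold step that zeroes small coefficients, mimicking the
    sparsity a HARD penalty enforces (illustrative only)."""
    Xc = X - X.mean(axis=0)
    # Loadings come from the SVD of the centered data matrix.
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:n_components].T            # p x k loading matrix
    V[np.abs(V) < lam] = 0.0           # HARD penalty: kill small coefficients
    # Re-normalize each surviving component to unit length.
    norms = np.linalg.norm(V, axis=0)
    norms[norms == 0] = 1.0
    return V / norms

rng = np.random.default_rng(0)
# Two informative variables driven by a common factor, two pure-noise variables.
z = rng.normal(size=(200, 1))
X = np.hstack([z + 0.1 * rng.normal(size=(200, 1)),
               z + 0.1 * rng.normal(size=(200, 1)),
               rng.normal(size=(200, 2))])
V = hard_threshold_loadings(X, n_components=1, lam=0.3)
```

On this synthetic data the first component loads almost entirely on the two informative variables, so the noise variables' coefficients fall below the threshold and are zeroed.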


Estimating Parameters in Multivariate Normal Mixtures

  • Ahn, Sung-Mahn; Baik, Sung-Wook
    • Communications for Statistical Applications and Methods / Vol.18 No.3 / pp.357-365 / 2011
  • This paper investigates a penalized likelihood method for estimating the parameters of normal mixtures in multivariate settings with full covariance matrices. The proposed model estimates the number of components by adding a penalty term to the usual likelihood function, yielding a penalized likelihood function. We prove the consistency of the estimator and present simulation results on multivariate normal mixtures in up to eight dimensions.

Mean estimation of small areas using penalized spline mixed-model under informative sampling

  • Chytrasari, Angela N.R.; Kartiko, Sri Haryatmi; Danardono, Danardono
    • Communications for Statistical Applications and Methods / Vol.27 No.3 / pp.349-363 / 2020
  • The penalized spline is a suitable nonparametric approach for estimating the mean model in small areas. However, published applications of the approach under informative sampling are uncommon. We propose a semiparametric mixed model using penalized splines under informative sampling to estimate the mean of a small area. The response variable is explained in terms of the mean model, an informative-sample effect, an area random effect, and unit error. We model the mean by a penalized spline and use a penalized spline function of the inclusion probability to account for the informative-sample effect. We determine the best linear unbiased estimators for the model coefficients and derive restricted maximum likelihood estimators for the variance components. A simulation study shows a decrease in the average absolute bias produced by the proposed model. The root mean square error also decreases, except in some quadratic cases. Using a linear versus a quadratic penalized spline to approximate the function of the inclusion probability yields no significant difference in the distribution of the root mean square error, except for a few smaller samples.
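
The core penalized-spline computation in this abstract reduces to a ridge-type solve in which only the knot coefficients are penalized. Below is a generic sketch with a truncated-linear basis on assumed synthetic data; it is not the paper's small-area mixed model (which additionally includes informative-sample and area random effects), and `pspline_fit` with its arguments is a hypothetical name.

```python
import numpy as np

def pspline_fit(x, y, n_knots=10, lam=1.0):
    """Penalized-spline sketch: minimize ||y - C b||^2 + lam * b' D b,
    where C holds a truncated-linear basis and D penalizes only the
    knot coefficients (not the intercept or linear term)."""
    knots = np.quantile(x, np.linspace(0, 1, n_knots + 2)[1:-1])
    # Design matrix: intercept, linear term, truncated lines (x - k)_+.
    C = np.column_stack([np.ones_like(x), x] +
                        [np.maximum(x - k, 0.0) for k in knots])
    D = np.diag([0.0, 0.0] + [1.0] * n_knots)   # penalize knot terms only
    b = np.linalg.solve(C.T @ C + lam * D, C.T @ y)
    # Return a prediction function built from the same basis.
    return lambda x_new: np.column_stack(
        [np.ones_like(x_new), x_new] +
        [np.maximum(x_new - k, 0.0) for k in knots]) @ b

rng = np.random.default_rng(1)
x = np.sort(rng.uniform(0, 1, 150))
y = np.sin(2 * np.pi * x) + 0.2 * rng.normal(size=150)
f = pspline_fit(x, y, n_knots=10, lam=0.1)
resid = y - f(x)
```

The penalty parameter `lam` trades smoothness against fidelity; in a mixed-model formulation like the paper's it is estimated from variance components rather than fixed by hand.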

Variable Selection Via Penalized Regression

  • Yoon, Young-Joo; Song, Moon-Sup
    • Communications for Statistical Applications and Methods / Vol.12 No.3 / pp.615-624 / 2005
  • In this paper, we review the variable-selection properties of LASSO and SCAD in penalized regression. To remedy the weakness of SCAD at high noise levels, we propose a new penalty function called MSCAD, which relaxes the unbiasedness condition of SCAD. To compare MSCAD with LASSO and SCAD, comparative studies are performed on simulated datasets and on a real dataset. The penalized regression methods are compared in terms of relative model error and coefficient estimates. The experimental results show that the performance of MSCAD falls between those of LASSO and SCAD, as expected.
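
For reference, the two penalty shapes being compared can be written down directly. The sketch below gives the standard LASSO and SCAD penalty functions from Fan and Li (2001); MSCAD itself is not reproduced since its definition belongs to the paper.

```python
import numpy as np

def lasso_penalty(beta, lam):
    """L1 penalty: grows linearly without bound, so large coefficients
    are always shrunk (hence biased)."""
    return lam * np.abs(beta)

def scad_penalty(beta, lam, a=3.7):
    """SCAD penalty of Fan and Li (2001): linear near zero like LASSO,
    a quadratic blend in the middle, and constant beyond a*lam so that
    large coefficients are left unshrunk -- the unbiasedness condition
    that MSCAD relaxes."""
    b = np.abs(beta)
    return np.where(
        b <= lam,
        lam * b,                                            # LASSO-like near 0
        np.where(
            b <= a * lam,
            (2 * a * lam * b - b**2 - lam**2) / (2 * (a - 1)),  # quadratic blend
            lam**2 * (a + 1) / 2,                           # flat for large |beta|
        ),
    )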

VARIABLE SELECTION VIA PENALIZED REGRESSION

  • Yoon, Young-Joo; Song, Moon-Sup
    • Korean Statistical Society: Conference Proceedings / Proceedings of the Korean Statistical Society 2005 Spring Conference / pp.7-12 / 2005
  • In this paper, we review the variable-selection properties of LASSO and SCAD in penalized regression. To remedy the weakness of SCAD at high noise levels, we propose a new penalty function called MSCAD, which relaxes the unbiasedness condition of SCAD. To compare MSCAD with LASSO and SCAD, comparative studies are performed on simulated datasets and on a real dataset. The penalized regression methods are compared in terms of relative model error and coefficient estimates. The experimental results show that the performance of MSCAD falls between those of LASSO and SCAD, as expected.


Variable selection in L1 penalized censored regression

  • Hwang, Chang-Ha; Kim, Mal-Suk; Shi, Joo-Yong
    • Journal of the Korean Data and Information Science Society / Vol.22 No.5 / pp.951-959 / 2011
  • The proposed method is based on a penalized censored regression model with an L1 penalty. We use an iteratively reweighted least squares procedure to optimize the L1-penalized log likelihood function of the censored regression model. It provides efficient computation of the regression parameters, including variable selection, and leads to a generalized cross-validation function for model selection. Numerical results are presented to indicate the performance of the proposed method.
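
The iteratively reweighted least squares idea used here replaces each |β_j| with a local quadratic, so every iteration reduces to a ridge solve. The sketch below applies that scheme to a plain (uncensored) least-squares loss, since reproducing the censored likelihood would require the paper's full setup; all names and settings are illustrative.

```python
import numpy as np

def irls_lasso(X, y, lam=1.0, n_iter=50, eps=1e-8):
    """IRLS sketch for L1-penalized least squares: at each step the
    penalty |b_j| is approximated by the local quadratic
    b_j^2 / (2 |b_j_old| + eps), turning the problem into a weighted
    ridge solve. (The paper applies the same idea to a censored
    regression log likelihood.)"""
    p = X.shape[1]
    b = np.zeros(p) + 1.0              # start away from zero
    for _ in range(n_iter):
        W = np.diag(lam / (np.abs(b) + eps))   # penalty weights from last iterate
        b = np.linalg.solve(X.T @ X + W, X.T @ y)
    b[np.abs(b) < 1e-6] = 0.0          # snap numerically dead coefficients
    return b

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 5))
beta_true = np.array([3.0, -2.0, 0.0, 0.0, 0.0])
y = X @ beta_true + 0.1 * rng.normal(size=100)
b_hat = irls_lasso(X, y, lam=20.0)
```

On this toy problem the three irrelevant coefficients are driven to zero while the two true signals survive with mild shrinkage, which is the variable-selection behavior the abstract describes.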

Cox proportional hazard model with L1 penalty

  • Hwang, Chang-Ha; Shim, Joo-Yong
    • Journal of the Korean Data and Information Science Society / Vol.22 No.3 / pp.613-618 / 2011
  • The proposed method is based on a penalized log partial likelihood of the Cox proportional hazard model with an L1 penalty. We use an iteratively reweighted least squares procedure to optimize the L1-penalized log partial likelihood function of the Cox proportional hazard model. It provides efficient computation, including variable selection, and leads to a generalized cross-validation function for model selection. Experimental results are presented to indicate the performance of the proposed procedure.

Multiclass Support Vector Machines with SCAD

  • Jung, Kang-Mo
    • Communications for Statistical Applications and Methods / Vol.19 No.5 / pp.655-662 / 2012
  • Classification is an important research field in pattern recognition with high-dimensional predictors. The support vector machine (SVM) is a penalized feature selector and classifier; here it is based on the hinge loss function and the non-convex smoothly clipped absolute deviation (SCAD) penalty suggested by Fan and Li (2001). We develop an algorithm for the multiclass SVM with the SCAD penalty function using a local quadratic approximation. For multiclass problems we compare the performance of the developed method with that of the SVM with the $L_1$ and $L_2$ penalty functions.

Kernel Machine for Poisson Regression

  • Hwang, Chang-Ha
    • Journal of the Korean Data and Information Science Society / Vol.18 No.3 / pp.767-772 / 2007
  • A kernel machine is proposed as an estimation procedure for linear and nonlinear Poisson regression, based on the penalized negative log-likelihood. The proposed kernel machine provides an estimate of the mean function of the response variable, where the canonical parameter is related to the input vector in a nonlinear form. A generalized cross-validation (GCV) function of MSE type is introduced to determine the hyperparameters that affect the performance of the machine. Experimental results are presented that indicate the performance of the proposed machine.
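
The penalized negative log-likelihood in this abstract can be sketched for a radial-basis kernel: with log-mean f = Kα, each Newton step is a weighted ridge solve. The following is a hypothetical illustration on synthetic counts, not the paper's machine; in particular, `lam` and `gamma` are fixed by hand here, whereas the paper would choose them by its GCV criterion.

```python
import numpy as np

def kernel_poisson_fit(X, y, lam=1.0, gamma=1.0, n_iter=25):
    """Kernel Poisson regression sketch: model the log-mean as f = K @ alpha
    with an RBF kernel and minimize the penalized negative log-likelihood
    sum(exp(f) - y*f) + lam * alpha' K alpha by Newton's method."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-gamma * sq)                    # RBF Gram matrix
    n = len(y)
    alpha = np.zeros(n)
    for _ in range(n_iter):
        f = K @ alpha
        mu = np.exp(np.clip(f, -20, 20))       # Poisson means (clipped to avoid overflow)
        grad = K @ (mu - y) + 2 * lam * (K @ alpha)
        H = K @ np.diag(mu) @ K + 2 * lam * K + 1e-8 * np.eye(n)
        alpha = alpha - np.linalg.solve(H, grad)   # Newton step = weighted ridge solve
    return K @ alpha                           # fitted log-means

rng = np.random.default_rng(3)
X = np.linspace(0, 4, 80).reshape(-1, 1)
f_true = np.sin(X[:, 0])
y = rng.poisson(np.exp(f_true)).astype(float)
f_hat = kernel_poisson_fit(X, y, lam=0.1, gamma=0.5)
```

The fitted log-means track the true canonical parameter nonlinearly in the input, which is the behavior the abstract attributes to the kernel machine.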


Semiparametric Bayesian Estimation under Structural Measurement Error Model

  • Hwang, Jin-Seub; Kim, Dal-Ho
    • Communications for Statistical Applications and Methods / Vol.17 No.4 / pp.551-560 / 2010
  • This paper considers a Bayesian approach to modeling a flexible regression function under a structural measurement error model. The regression function is modeled by semiparametric regression with penalized splines. Model fitting and parameter estimation are carried out in a hierarchical Bayesian framework using Markov chain Monte Carlo methodology. The performance is compared with that of estimators under the structural measurement error model without a semiparametric component.