• Title/Summary/Keyword: stepwise variable selection

Analysis of Client Propensity in Cyber Counseling Using Bayesian Variable Selection

  • Pi, Su-Young
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.6 no.4
    • /
    • pp.277-281
    • /
    • 2006
  • Cyber counseling, one of the types of consultation best suited to the information society, enables people to reveal their mental agonies and private problems anonymously, since it does not require a face-to-face interview between a counsellor and a client. However, there are few cyber counseling centers that provide high-quality, trustworthy service, although the number of cyber counseling centers has increased sharply. Therefore, this paper is intended to enable an appropriate consultation for each client by analyzing client propensity using Bayesian variable selection. Bayesian variable selection is superior to the stepwise regression method in identifying a regression model. The stepwise regression method, which has generally been used to analyze individual propensity in a linear regression model, is inefficient because its inherent defects make it hard to select a proper model. In this paper, based on the case database of current web-based cyber counseling centers, we analyze clients' propensities using Bayesian variable selection to enable individually targeted counseling and to promote cyber counseling programs.
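
The abstract contrasts Bayesian variable selection with a single greedy stepwise path but does not include code. As a minimal sketch of that contrast, assuming synthetic data rather than the counseling-case database, the snippet below scores every candidate predictor subset with BIC, a common large-sample surrogate for a Bayes-factor comparison; it illustrates the idea only and is not the paper's method.

```python
# Minimal sketch (not the paper's method): score all predictor subsets by BIC,
# a large-sample surrogate for a Bayes-factor comparison, instead of following
# one greedy stepwise path.  The toy data below are an illustrative assumption.
from itertools import combinations

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n, p = 200, 6
X = rng.normal(size=(n, p))
y = 1.5 * X[:, 0] - 2.0 * X[:, 2] + rng.normal(size=n)  # only x0 and x2 matter

best = None
for k in range(p + 1):
    for subset in combinations(range(p), k):
        design = sm.add_constant(X[:, list(subset)]) if subset else np.ones((n, 1))
        fit = sm.OLS(y, design).fit()
        if best is None or fit.bic < best[0]:
            best = (fit.bic, subset)

print("subset with the lowest BIC:", best[1])
```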

Evaluating Variable Selection Techniques for Multivariate Linear Regression (다중선형회귀모형에서의 변수선택기법 평가)

  • Ryu, Nahyeon;Kim, Hyungseok;Kang, Pilsung
    • Journal of Korean Institute of Industrial Engineers
    • /
    • v.42 no.5
    • /
    • pp.314-326
    • /
    • 2016
  • The purpose of variable selection techniques is to select a subset of relevant variables for a particular learning algorithm in order to improve the accuracy and the efficiency of the prediction model. We conduct an empirical analysis to evaluate and compare seven well-known variable selection techniques for the multiple linear regression model, one of the regression models most commonly used in practice. The variable selection techniques we apply are forward selection, backward elimination, stepwise selection, the genetic algorithm (GA), ridge regression, lasso (Least Absolute Shrinkage and Selection Operator), and elastic net. Based on experiments with 49 regression data sets, GA resulted in the lowest error rates, while lasso reduced the number of variables most significantly. In terms of computational efficiency, forward selection, backward elimination, and lasso require less time than the other techniques.
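
As a rough, hedged illustration of the kind of comparison the abstract describes, on a single synthetic dataset rather than the authors' 49-dataset experiment, the sketch below fits lasso, ridge, and elastic net with scikit-learn and counts the surviving coefficients, and runs a greedy forward selector as a sequential relative of stepwise selection; all settings are illustrative assumptions.

```python
# Hedged sketch: shrinkage-based vs. sequential variable selection on one
# synthetic dataset; settings are illustrative, not the paper's experiment.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.feature_selection import SequentialFeatureSelector
from sklearn.linear_model import ElasticNetCV, LassoCV, LinearRegression, RidgeCV

X, y = make_regression(n_samples=300, n_features=20, n_informative=5,
                       noise=10.0, random_state=0)

lasso = LassoCV(cv=5).fit(X, y)
enet = ElasticNetCV(cv=5).fit(X, y)
ridge = RidgeCV().fit(X, y)
print("lasso keeps", int(np.sum(lasso.coef_ != 0)), "of", X.shape[1], "variables")
print("elastic net keeps", int(np.sum(enet.coef_ != 0)), "variables")
print("ridge shrinks but keeps all", int(np.sum(ridge.coef_ != 0)), "variables")

# Greedy forward selection, a sequential relative of stepwise selection.
forward = SequentialFeatureSelector(LinearRegression(), n_features_to_select=5,
                                    direction="forward").fit(X, y)
print("forward selection keeps columns:", np.flatnonzero(forward.get_support()))
```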

Validation Comparison of Credit Rating Models Using Box-Cox Transformation

  • Hong, Chong-Sun;Choi, Jeong-Min
    • Journal of the Korean Data and Information Science Society
    • /
    • v.19 no.3
    • /
    • pp.789-800
    • /
    • 2008
  • Current credit evaluation models based on financial data make use of smoothed estimated default ratios transformed from each financial variable. In this work, some problems of the credit evaluation models developed by financial experts are discussed, and we propose improved credit evaluation models based on the stepwise variable selection method and the Box-Cox transformation of data whose distributions are heavily skewed to the right. After comparing goodness-of-fit tests of these models, the validation of the credit evaluation models using statistical methods such as the stepwise variable selection method and the Box-Cox transformation function is explained.
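
The key preprocessing step named in the abstract is the Box-Cox transformation of right-skewed financial variables. The sketch below shows only that step with SciPy on a synthetic skewed variable; the variable name and the data are assumptions, not the authors' credit data.

```python
# Hedged sketch: Box-Cox transforming a right-skewed, financial-style variable
# before it enters a scoring model; the data here are synthetic.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
debt_ratio = rng.lognormal(mean=0.0, sigma=1.0, size=500)  # heavily right-skewed

transformed, lam = stats.boxcox(debt_ratio)  # requires strictly positive values
print(f"estimated Box-Cox lambda: {lam:.3f}")
print(f"skewness before: {stats.skew(debt_ratio):.2f}, "
      f"after: {stats.skew(transformed):.2f}")
```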

Variable selection in partial linear regression using the least angle regression (부분선형모형에서 LARS를 이용한 변수선택)

  • Seo, Han Son;Yoon, Min;Lee, Hakbae
    • The Korean Journal of Applied Statistics
    • /
    • v.34 no.6
    • /
    • pp.937-944
    • /
    • 2021
  • The problem of selecting variables is addressed in partial linear regression. Model selection for partial linear models is not easy, since it involves nonparametric estimation, such as smoothing parameter selection, as well as estimation of the linear explanatory variables. In this work, several approaches to variable selection are proposed using a fast forward selection algorithm, least angle regression (LARS). The proposed procedures use t-tests, all-possible-regressions comparisons, or a stepwise selection process on the variables selected by LARS. An example based on real data and a simulation study of the performance of the suggested procedures are presented.
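
The forward-selection engine the paper builds on is least angle regression. As a hedged sketch of that piece alone, assuming synthetic data and leaving out the partial linear (nonparametric) component and the follow-up tests, the code below traces the LARS path with scikit-learn and reports the order in which variables enter.

```python
# Hedged sketch: LARS as a fast forward-selection device; only the linear part
# is illustrated here, not the paper's partial linear model or its tests.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import lars_path

X, y = make_regression(n_samples=200, n_features=10, n_informative=3,
                       noise=5.0, random_state=0)

alphas, active, coefs = lars_path(X, y, method="lar")
print("order in which variables enter the LARS path:", active)
print("variables with nonzero coefficients at the last step:",
      np.flatnonzero(coefs[:, -1]))
```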

The correlation and regression analyses based on variable selection for the university evaluation index (대학 평가지표들에 대한 상관분석과 변수선택에 의한 선형모형추정)

  • Song, Pil-Jun;Kim, Jong-Tae
    • Journal of the Korean Data and Information Science Society
    • /
    • v.23 no.3
    • /
    • pp.457-465
    • /
    • 2012
  • The purpose of this study is to analyze the associations among indicators and to find statistical models based on important indicators from 'College Notifier' of the Korea Council for University Education. First, Pearson correlation coefficients are used to find statistically significant correlations. The important indicators are then chosen by variable selection and their coefficients are estimated; backward elimination and stepwise methods are employed as the variable selection methods.
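
The indicator data from 'College Notifier' are not reproduced here; as a minimal sketch of the two steps the abstract names, assuming invented indicator names and synthetic data, the code below runs a Pearson correlation screen and then p-value-based backward elimination with statsmodels.

```python
# Hedged sketch: Pearson correlation screen plus p-value-based backward
# elimination; the indicator names and data are invented for illustration.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(2)
df = pd.DataFrame(rng.normal(size=(150, 4)),
                  columns=["employment_rate", "faculty_ratio",
                           "scholarship_rate", "dropout_rate"])
df["evaluation_score"] = (2.0 * df["employment_rate"]
                          - 1.0 * df["dropout_rate"]
                          + rng.normal(size=150))

print(df.corr(method="pearson")["evaluation_score"])  # Pearson screen

# Backward elimination: drop the least significant predictor until all p < 0.05.
predictors = list(df.columns[:-1])
while predictors:
    fit = sm.OLS(df["evaluation_score"], sm.add_constant(df[predictors])).fit()
    worst = fit.pvalues.drop("const").idxmax()
    if fit.pvalues[worst] < 0.05:
        break
    predictors.remove(worst)
print("retained indicators:", predictors)
```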

Variable Selection for Logistic Regression Model Using Adjusted Coefficients of Determination (수정 결정계수를 사용한 로지스틱 회귀모형에서의 변수선택법)

  • Hong C. S.;Ham J. H.;Kim H. I.
    • The Korean Journal of Applied Statistics
    • /
    • v.18 no.2
    • /
    • pp.435-443
    • /
    • 2005
  • Coefficients of determination in logistic regression analysis are defined by various statistics, and their values are relatively smaller than those for the linear regression model. These coefficients of determination are not generally used to evaluate and diagnose the logistic regression model. Liao and McGee (2003) proposed two adjusted coefficients of determination that are robust to the addition of inappropriate predictors and to variation in sample size. In this work, these adjusted coefficients of determination are applied as a variable selection method for the logistic regression model and compared with the results of other methods such as forward selection, backward elimination, stepwise selection, and the AIC statistic.
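
The adjusted coefficients of determination of Liao and McGee (2003) are not reimplemented here. As a hedged sketch of one of the comparison methods the abstract lists, the code below performs AIC-guided forward selection for a logistic regression with statsmodels; the data and stopping rule are assumptions for illustration.

```python
# Hedged sketch: AIC-guided forward selection for logistic regression, one of
# the comparison methods named in the abstract; the data are synthetic.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
n, p = 400, 6
X = rng.normal(size=(n, p))
prob = 1.0 / (1.0 + np.exp(-(1.2 * X[:, 1] - 0.8 * X[:, 4])))
y = rng.binomial(1, prob)

selected, remaining = [], list(range(p))
current_aic = sm.Logit(y, np.ones((n, 1))).fit(disp=0).aic
while remaining:
    aics = {j: sm.Logit(y, sm.add_constant(X[:, selected + [j]])).fit(disp=0).aic
            for j in remaining}
    best_j = min(aics, key=aics.get)
    if aics[best_j] >= current_aic:
        break  # no remaining candidate improves the AIC
    selected.append(best_j)
    remaining.remove(best_j)
    current_aic = aics[best_j]
print("variables chosen by forward AIC search:", selected)
```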

Selection of markers in the framework of multivariate receiver operating characteristic curve analysis in binary classification

  • Sameera, G;Vishnu, Vardhan R
    • Communications for Statistical Applications and Methods
    • /
    • v.26 no.2
    • /
    • pp.79-89
    • /
    • 2019
  • Classification models pertaining to receiver operating characteristic (ROC) curve analysis have been extended from the univariate to the multivariate setup by linearly combining the available multiple markers. One such classification model is the multivariate ROC curve analysis. However, in a real scenario not all markers contribute, and some may mask the contribution of other markers in classifying the individuals/objects. This paper addresses this issue by developing an algorithm that helps identify the important markers that are significant and true contributors. The proposed variable selection framework is supported by real datasets and a simulation study; it is shown to provide insight into each individual marker's significance in producing a classifier rule/linear combination with a good extent of classification.
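
The authors' selection algorithm for the multivariate ROC setting is not reproduced here. As a loose, hedged illustration of the underlying idea only, the sketch below forms a linear combination of markers with Fisher's linear discriminant, scores it by AUC, and probes each marker's contribution by leaving it out; the markers and data are synthetic assumptions.

```python
# Hedged sketch: score a linear combination of markers by AUC and probe each
# marker's contribution by leaving it out; this is not the authors' algorithm.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=500, n_features=6, n_informative=2,
                           n_redundant=0, random_state=0)

full_score = X @ LinearDiscriminantAnalysis().fit(X, y).coef_.ravel()
print("AUC with all markers:", round(roc_auc_score(y, full_score), 3))

for j in range(X.shape[1]):  # leave one marker out at a time
    keep = [k for k in range(X.shape[1]) if k != j]
    reduced = X[:, keep] @ LinearDiscriminantAnalysis().fit(X[:, keep], y).coef_.ravel()
    print(f"AUC without marker {j}:", round(roc_auc_score(y, reduced), 3))
```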

A study on equating method based on regression analysis (회귀분석에 기초한 균등화 방법에 관한 연구)

  • Cho, Jang-Sik
    • Journal of the Korean Data and Information Science Society
    • /
    • v.21 no.3
    • /
    • pp.513-521
    • /
    • 2010
  • Most universities carry out course evaluations to support performance appraisal for professors. However, course evaluation depends on the characteristics of each class, such as class size, type of lecture, and the evaluators' grade level. As a result, such characteristics of each class lead to serious bias, which makes lecturers distrust the course evaluation results. Hence, we propose an equating method for course evaluation based on regression analysis with stepwise variable selection. We compare the proposed method with the method of Cho et al. (2009) with respect to efficiency, and we give an example to which the method is applied.

Identifying Factors Affecting Dental University Hospitals' Profitability (치과대학병원 수익성에 영향을 미치는 요인 분석)

  • Lee, Ji-Hoon;Kim, Seong-Sik
    • Korea Journal of Hospital Management
    • /
    • v.26 no.2
    • /
    • pp.17-26
    • /
    • 2021
  • Purposes: This study aims to identify factors affecting dental university hospitals' profitability and to understand their recent business conditions. Methodology: Data from 2016 to 2019 were collected from financial statements and public open data of 8 dental university hospitals. For the study, multiple regression with stepwise selection was applied. Findings: First of all, 9 out of 19 independent variables were selected by stepwise selection. As a result of the multiple regression with the selected independent variables and the dependent variable (operating profit margin ratio), the factors affecting hospitals' profitability were the number of dental unit chairs, hospital location, debt ratio, total capital turnover ratio, employment cost rate, material cost rate, management expense rate, and the number of patients per dentist. Practical Implication: To improve dental university hospitals' profitability, hospitals should specifically analyze and manage their costs, such as employment, material, and management costs, and seek effectiveness by maintaining a proper number of patients per dentist.

A Bayes Criterion for Selecting Variables in MDA (MDA에서 판별변수 선택을 위한 베이즈 기준)

  • 김혜중;유희경
    • The Korean Journal of Applied Statistics
    • /
    • v.11 no.2
    • /
    • pp.435-449
    • /
    • 1998
  • In this article we introduce a Bayes criterion for variable selection in multiple discriminant analysis (MDA). The criterion is a default Bayes factor for the comparison of homoscedastic and heteroscedastic models for the multivariate normal means. The default Bayes factor is obtained from a development of the imaginary training sample method introduced by Spiegelhalter and Smith (1982). Based on the criterion, we also provide a test for additional discrimination in MDA. The advantage of the criterion is that it is applicable not only to the optimal subset selection method but also to the stepwise method. Moreover, the criterion can be reduced to that for two-group discriminant analysis. Thus the criterion can be regarded as a unified alternative to the variable selection criteria suggested by various sampling theory approaches. To illustrate the performance of the criterion, a numerical study has been done via a Monte Carlo experiment.
