• Title/Summary/Keyword: SVM regression

Search Result: 251

Support Vector Median Regression

  • Hwang, Chang-Ha
    • Journal of the Korean Data and Information Science Society / v.14 no.1 / pp.67-74 / 2003
  • Median regression analysis has robustness properties that make it an attractive alternative to regression based on the mean. The support vector machine (SVM) is widely used in real-world regression tasks. In this paper, we propose a new SV median regression based on the check function, illustrate how the proposed SVM performs, and compare it with an SVM based on the absolute deviation loss function. (An illustrative sketch of check-function regression follows this entry.)

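The check (pinball) loss reduces to a scaled absolute deviation at the median (tau = 0.5). The sketch below is an illustrative kernel quantile regression fit built around that loss; the function names, RBF kernel, optimizer, and parameter values are assumptions for illustration, not the paper's implementation.

```python
# Illustrative only: kernel regression with the check (pinball) loss
#   rho_tau(r) = tau * r if r >= 0, (tau - 1) * r otherwise; tau = 0.5 gives the median.
import numpy as np
from scipy.optimize import minimize

def rbf_kernel(X1, X2, kg=1.0):
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-kg * d2)

def check_loss(r, tau=0.5):
    return np.where(r >= 0, tau * r, (tau - 1.0) * r)

def fit_kernel_quantile(X, y, tau=0.5, C=10.0, kg=1.0):
    """Minimize C * sum_i rho_tau(y_i - f(x_i)) + 0.5 * alpha' K alpha for
    f(x) = sum_i alpha_i k(x_i, x) + b.  A general-purpose optimizer is used
    for brevity; a dedicated QP/LP solver would normally be preferred."""
    K = rbf_kernel(X, X, kg)
    n = len(y)

    def objective(theta):
        alpha, b = theta[:n], theta[n]
        return C * check_loss(y - (K @ alpha + b), tau).sum() + 0.5 * alpha @ K @ alpha

    theta = minimize(objective, np.zeros(n + 1), method="L-BFGS-B").x
    alpha, b = theta[:n], theta[n]
    return lambda Xnew: rbf_kernel(Xnew, X, kg) @ alpha + b

# toy usage with heavy-tailed noise, where median regression is attractive
rng = np.random.default_rng(0)
X = rng.uniform(-3, 3, size=(80, 1))
y = np.sin(X[:, 0]) + 0.3 * rng.standard_t(df=2, size=80)
f_median = fit_kernel_quantile(X, y, tau=0.5)
```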

Training for Huge Data set with On Line Pruning Regression by LS-SVM

  • Kim, Dae-Hak;Shim, Joo-Yong;Oh, Kwang-Sik
    • Proceedings of the Korean Statistical Society Conference / 2003.10a / pp.137-141 / 2003
  • LS-SVM (least squares support vector machine) is a widely applicable and useful machine learning technique for classification and regression analysis. LS-SVM can be a good substitute for statistical methods, but computational difficulties remain because it requires inverting a matrix whose size grows with the data set. In the modern information society, huge data sets are easily obtained on line or in batch mode. For such huge data sets, we suggest an on-line pruning regression method based on LS-SVM. With a relatively small number of support vectors retained after pruning, we can achieve almost the same performance as regression with the full data set. (An illustrative sketch follows this entry.)

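The sketch below shows the two ingredients the abstract refers to, under assumptions of my own (RBF kernel, keep-largest-|alpha| pruning, data arriving in chunks): the LS-SVM linear system and a simple prune-then-absorb loop. It is a sketch of the general idea, not the authors' algorithm.

```python
# Illustrative only: LS-SVM regression with a simple pruning step over data chunks.
import numpy as np

def rbf_kernel(X1, X2, kg=1.0):
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-kg * d2)

def lssvm_fit(X, y, gam=10.0, kg=1.0):
    """Solve the LS-SVM system  [[0, 1'], [1, K + I/gam]] [b; alpha] = [0; y]."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, kg) + np.eye(n) / gam
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                      # intercept b, coefficients alpha

def prune(X, y, alpha, keep=200):
    """Keep only the 'keep' points with the largest |alpha| (most informative support vectors)."""
    idx = np.argsort(np.abs(alpha))[-keep:]
    return X[idx], y[idx]

def online_pruned_lssvm(chunks, keep=200, gam=10.0, kg=1.0):
    """Absorb data chunk by chunk, pruning support vectors before each new chunk
    so the linear system never grows much beyond keep + chunk size."""
    Xs, ys = chunks[0]
    for Xc, yc in chunks[1:]:
        b, alpha = lssvm_fit(Xs, ys, gam, kg)
        Xs, ys = prune(Xs, ys, alpha, keep)
        Xs, ys = np.vstack([Xs, Xc]), np.concatenate([ys, yc])
    b, alpha = lssvm_fit(Xs, ys, gam, kg)
    return Xs, b, alpha
```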

Weighted Support Vector Machines for Heteroscedastic Regression

  • Park, Hye-Jung;Hwang, Chang-Ha
    • Journal of the Korean Data and Information Science Society / v.17 no.2 / pp.467-474 / 2006
  • In this paper we present a weighted support vector machine (SVM) and a weighted least squares support vector machine (LS-SVM) for prediction in the heteroscedastic regression model. By adding weights to the standard SVM and LS-SVM, better fitting can be achieved when the errors are heteroscedastic. In the numerical studies, we illustrate the prediction performance of the proposed procedure by comparing it with a procedure that combines the standard SVM and LS-SVM with the wild bootstrap for prediction. (An illustrative sketch of error weighting follows this entry.)

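As an illustration of error weighting under heteroscedasticity, the sketch below uses a two-stage weighted LS-SVM: an ordinary fit, a crude kernel-smoothed variance estimate from the squared residuals, and an inverse-variance-weighted refit. The weighting scheme, function names, and parameter values are assumptions, not the authors' procedure.

```python
# Illustrative only: two-stage weighted LS-SVM for heteroscedastic errors.
import numpy as np

def rbf_kernel(X1, X2, kg=1.0):
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-kg * d2)

def lssvm_solve(K, y, gam=10.0, v=None):
    """Weighted LS-SVM system: [[0, 1'], [1, K + diag(1/(gam*v))]] [b; alpha] = [0; y],
    where v_i weights the i-th squared error in the objective."""
    n = len(y)
    v = np.ones(n) if v is None else v
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = A[1:, 0] = 1.0
    A[1:, 1:] = K + np.diag(1.0 / (gam * v))
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]

def weighted_lssvm(X, y, gam=10.0, kg=1.0):
    K = rbf_kernel(X, X, kg)
    b0, a0 = lssvm_solve(K, y, gam)                       # stage 1: ordinary LS-SVM fit
    resid = y - (K @ a0 + b0)
    var_hat = (K @ resid ** 2) / K.sum(axis=1)            # stage 2: smoothed variance estimate
    weights = 1.0 / np.maximum(var_hat, 1e-8)             # inverse-variance error weights
    b, a = lssvm_solve(K, y, gam, v=weights)              # stage 3: weighted refit
    return lambda Xn: rbf_kernel(Xn, X, kg) @ a + b
```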

Deep LS-SVM for regression

  • Hwang, Changha;Shim, Jooyong
    • Journal of the Korean Data and Information Science Society / v.27 no.3 / pp.827-833 / 2016
  • In this paper, we propose a deep least squares support vector machine (LS-SVM) for regression problems, which consists of an input layer and a hidden layer. In the hidden layer, LS-SVMs are trained with the original input variables and perturbed responses. For the final output, the main LS-SVM is trained with the outputs from the hidden-layer LS-SVMs as input variables and the original responses. In contrast to the multilayer neural network (MNN), the LS-SVMs in the deep LS-SVM are trained to minimize a penalized objective function. Thus, the learning dynamics of the deep LS-SVM are entirely different from those of the MNN, in which all weights and biases are trained to minimize one final error function. Compared with MNN approaches, the deep LS-SVM does not use any combination weights but trains all LS-SVMs in the architecture. Experimental results on real datasets illustrate that the deep LS-SVM significantly outperforms state-of-the-art machine learning methods on regression problems. (An illustrative sketch of the two-layer construction follows this entry.)
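
A minimal sketch of the construction described above, with assumed details (RBF kernel, Gaussian response perturbation, five hidden machines): hidden-layer LS-SVMs are fit to the original inputs with perturbed responses, and the main LS-SVM is fit to their outputs with the original responses.

```python
# Illustrative only: a two-layer "deep" LS-SVM as described in the abstract.
import numpy as np

def rbf_kernel(X1, X2, kg=1.0):
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-kg * d2)

def lssvm_fit(X, y, gam=10.0, kg=1.0):
    """Ordinary LS-SVM regression; returns a prediction function."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = A[1:, 0] = 1.0
    A[1:, 1:] = rbf_kernel(X, X, kg) + np.eye(n) / gam
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    b, alpha = sol[0], sol[1:]
    return lambda Xn: rbf_kernel(Xn, X, kg) @ alpha + b

def deep_lssvm_fit(X, y, n_hidden=5, noise=0.1, seed=0):
    rng = np.random.default_rng(seed)
    # hidden layer: LS-SVMs trained on the original inputs with perturbed responses
    hidden = [lssvm_fit(X, y + noise * rng.standard_normal(len(y)))
              for _ in range(n_hidden)]
    # output layer: the main LS-SVM trained on the hidden outputs and the original responses
    H = np.column_stack([f(X) for f in hidden])
    main = lssvm_fit(H, y)
    return lambda Xn: main(np.column_stack([f(Xn) for f in hidden]))
```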

Support Vector Machine for Interval Regression

  • Hong, Dug-Hun;Hwang, Changha
    • Proceedings of the Korean Statistical Society Conference / 2004.11a / pp.67-72 / 2004
  • The support vector machine (SVM) has been very successful in pattern recognition and function estimation problems for crisp data. This paper proposes a new method to evaluate interval linear and nonlinear regression models by combining the possibility and necessity estimation formulation with the principle of SVM. For data sets with crisp inputs and interval outputs, possibility and necessity models have recently been utilized; they are based on a quadratic programming approach, which gives more diverse spread coefficients than a linear programming one. SVM also uses a quadratic programming approach, whose further advantage in interval regression analysis is the ability to integrate both the property of central tendency in least squares and the possibilistic property in fuzzy regression, without being computationally expensive. SVM allows us to perform interval nonlinear regression analysis by constructing an interval linear regression function in a high-dimensional feature space. In particular, SVM is a very attractive approach to modeling nonlinear interval data. The proposed algorithm is a model-free method in the sense that we do not have to assume an underlying model function for the interval nonlinear regression model with crisp inputs and interval output. Experimental results are then presented which indicate the performance of this algorithm. (An illustrative sketch of an inclusion-constrained interval model follows this entry.)

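For reference, the sketch below is a linear-programming interval regression baseline of the kind the abstract contrasts with: a linear interval function whose predicted interval must contain every observed output interval, with the total predicted spread minimized. The inclusion direction, the function name, and all parameter values are assumptions; the paper's SVM-style kernelized quadratic-programming formulation is not reproduced.

```python
# Illustrative only: interval linear regression via linear programming.
import numpy as np
from scipy.optimize import linprog

def interval_regression_lp(X, y_lo, y_hi):
    """Fit Y(x) = c'x_tilde +/- s'|x_tilde| (x_tilde includes an intercept) so that
    [y_lo_i, y_hi_i] lies inside Y(x_i) for every i, minimizing total spread (s >= 0)."""
    n, p = X.shape
    Xt = np.column_stack([np.ones(n), X])           # add intercept column
    A = np.abs(Xt)
    # decision vector: [c (p+1 free), s (p+1 nonnegative)]
    obj = np.concatenate([np.zeros(p + 1), A.sum(axis=0)])
    # lower edge: c'x - s'|x| <= y_lo ;  upper edge rewritten: -(c'x) - s'|x| <= -y_hi
    A_ub = np.vstack([np.hstack([Xt, -A]), np.hstack([-Xt, -A])])
    b_ub = np.concatenate([y_lo, -y_hi])
    bounds = [(None, None)] * (p + 1) + [(0, None)] * (p + 1)
    res = linprog(obj, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:p + 1], res.x[p + 1:]             # centre coefficients c, spreads s

# toy usage: crisp inputs, interval outputs
rng = np.random.default_rng(2)
X = rng.uniform(0, 5, size=(30, 1))
centre = 1.0 + 2.0 * X[:, 0]
y_lo = centre - rng.uniform(0.2, 1.0, 30)
y_hi = centre + rng.uniform(0.2, 1.0, 30)
c, s = interval_regression_lp(X, y_lo, y_hi)
```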

Support Vector Machine for Linear Regression

  • Hwang, Changha;Seok, Kyungha
    • Communications for Statistical Applications and Methods / v.6 no.2 / pp.337-344 / 1999
  • The support vector machine (SVM) is a new and very promising regression and classification technique developed by Vapnik and his group at AT&T Bell Laboratories. This article provides a brief overview of the SVM, focusing on linear regression. We explain from a statistical point of view why the SVM might be attractive and how it compares with other linear regression techniques. Furthermore, we explain model selection based on VC theory. (The standard formulation is restated after this entry.)

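For readers who want the formulation such an overview is built on, the standard linear epsilon-insensitive support vector regression problem introduced by Vapnik can be written as follows (this restatement is for context and is not taken from the article):

```latex
\begin{equation*}
\begin{aligned}
\min_{w,\,b,\,\xi,\,\xi^{*}}\quad & \tfrac{1}{2}\lVert w\rVert^{2} + C\sum_{i=1}^{n}\bigl(\xi_i+\xi_i^{*}\bigr) \\
\text{subject to}\quad & y_i - w^{\top}x_i - b \le \varepsilon + \xi_i, \\
 & w^{\top}x_i + b - y_i \le \varepsilon + \xi_i^{*}, \\
 & \xi_i,\ \xi_i^{*} \ge 0, \qquad i = 1,\dots,n.
\end{aligned}
\end{equation*}
```

This is equivalent to minimizing the ridge penalty plus the epsilon-insensitive loss max(|y_i - w'x_i - b| - epsilon, 0) summed over the data; C and epsilon are the hyperparameters that model selection concerns.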

Bayesian Model Selection for Support Vector Regression using the Evidence Framework

  • Hwang, Chang-Ha;Seok, Kyung-Ha
    • Communications for Statistical Applications and Methods / v.6 no.3 / pp.813-820 / 1999
  • The support vector machine (SVM) is a new and very promising regression and classification technique developed by Vapnik and his group at AT&T Bell Laboratories. In this paper we provide a brief overview of SVM for regression. Furthermore, we describe Bayesian model selection based on MacKay's evidence framework for SVM regression. (An illustration of evidence-based hyperparameter selection follows this entry.)

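As a hedged illustration of the evidence framework, and not of the paper's treatment of SVR (where the epsilon-insensitive likelihood must be approximated), the snippet below selects kernel and noise hyperparameters by maximizing the closed-form Gaussian log evidence of a kernel regression model. All names and grid values are illustrative assumptions.

```python
# Illustrative only: hyperparameter selection by maximizing a Gaussian log evidence.
import numpy as np

def rbf_kernel(X1, X2, kg=1.0):
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-kg * d2)

def log_evidence(X, y, kg, noise_var):
    """log p(y | X, kg, noise_var) for a zero-mean Gaussian-process view of kernel regression."""
    n = len(y)
    C = rbf_kernel(X, X, kg) + noise_var * np.eye(n)
    _, logdet = np.linalg.slogdet(C)
    return -0.5 * (y @ np.linalg.solve(C, y) + logdet + n * np.log(2 * np.pi))

# grid search over hyperparameters, keeping the pair with the largest evidence
rng = np.random.default_rng(3)
X = rng.uniform(-3, 3, size=(60, 1))
y = np.sin(2 * X[:, 0]) + 0.2 * rng.standard_normal(60)
best_kg, best_nv, best_le = max(
    ((kg, nv, log_evidence(X, y, kg, nv))
     for kg in (0.1, 1.0, 10.0) for nv in (0.01, 0.05, 0.2)),
    key=lambda t: t[2])
```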

Fuzzy c-Regression Using Weighted LS-SVM

  • Hwang, Chang-Ha
    • Proceedings of the Korean Data and Information Science Society Conference / 2005.10a / pp.161-169 / 2005
  • In this paper we propose a fuzzy c-regression model based on the weighted least squares support vector machine (LS-SVM), which can be used to detect outliers in the switching regression model while simultaneously yielding estimates of the outputs together with a fuzzy c-partition of the data. It can be applied to nonlinear regression in which the regression function has no explicit form. We illustrate the new algorithm with examples that indicate how it can be used to detect outliers and fit mixed data to nonlinear regression models. (An illustrative sketch of the alternating updates follows this entry.)

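A minimal sketch of the alternation the abstract describes, with assumed details (standard fuzzy c-regression membership update, membership-based error weights, RBF kernel, fixed iteration count); the paper's specific outlier-weighting scheme is not reproduced.

```python
# Illustrative only: fuzzy c-regression with weighted LS-SVM component fits.
import numpy as np

def rbf_kernel(X1, X2, kg=1.0):
    d2 = ((X1[:, None, :] - X2[None, :, :]) ** 2).sum(-1)
    return np.exp(-kg * d2)

def weighted_lssvm(K, y, v, gam=10.0):
    """Weighted LS-SVM: [[0, 1'], [1, K + diag(1/(gam*v))]] [b; alpha] = [0; y]."""
    n = len(y)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = A[1:, 0] = 1.0
    A[1:, 1:] = K + np.diag(1.0 / (gam * v))
    sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
    return sol[0], sol[1:]                                    # b, alpha

def fuzzy_c_lssvm(X, y, c=2, m=2.0, gam=10.0, kg=1.0, iters=20):
    n = len(y)
    K = rbf_kernel(X, X, kg)
    rng = np.random.default_rng(0)
    U = rng.dirichlet(np.ones(c), size=n)                     # memberships, rows sum to 1
    for _ in range(iters):
        # fit one weighted LS-SVM per cluster, weighting errors by membership^m
        fits = [weighted_lssvm(K, y, U[:, k] ** m + 1e-6, gam) for k in range(c)]
        # squared residuals of each point under each component model
        D = np.column_stack([(y - (K @ a + b)) ** 2 for b, a in fits]) + 1e-12
        # standard fuzzy c-regression membership update
        U = (1.0 / D) ** (1.0 / (m - 1.0))
        U /= U.sum(axis=1, keepdims=True)
    return U, fits
```

Points whose memberships stay small for every component (large residuals under all fitted curves) are natural outlier candidates in this kind of scheme.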

Estimating Fuzzy Regression with Crisp Input-Output Using Quadratic Loss Support Vector Machine

  • Hwang, Chang-Ha;Hong, Dug-Hun;Lee, Sang-Bock
    • Proceedings of the Korean Data and Information Science Society Conference / 2004.10a / pp.53-59 / 2004
  • The support vector machine (SVM) approach to regression can be found in the information science literature. SVM implements the regularization technique, which has been introduced as a way of controlling the smoothness properties of the regression function. In this paper, we propose a new estimation method based on a quadratic loss SVM for Tanaka's linear fuzzy regression model, and furthermore propose an estimation method for nonlinear fuzzy regression. This is a very attractive approach to evaluating a nonlinear fuzzy model with crisp input and output data. (Tanaka's fuzzy linear model is restated after this entry.)

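For context, Tanaka's linear fuzzy regression model referenced in the abstract uses symmetric fuzzy coefficients A_j = (a_j, c_j) with center a_j and nonnegative spread c_j (this restatement is background, not the paper's estimation method):

```latex
\begin{equation*}
Y(x) = A_0 + A_1 x_1 + \cdots + A_p x_p,
\qquad A_j = (a_j, c_j),\quad c_j \ge 0,
\end{equation*}
```

so at a crisp input x (with x_0 = 1) the estimated fuzzy output has center sum_j a_j x_j and spread sum_j c_j |x_j|. The paper's contribution is estimating (a, c) with a quadratic-loss SVM and extending the estimation to nonlinear fuzzy regression; that step is not sketched here.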