• Title/Summary/Keyword: Support Vector Machine-Regression


Quadratic Loss Support Vector Interval Regression Machine for Crisp Input-Output Data

  • Hwang, Chang-Ha
    • Journal of the Korean Data and Information Science Society
    • /
    • v.15 no.2
    • /
    • pp.449-455
    • /
    • 2004
  • Support vector machine (SVM) has been very successful in pattern recognition and function estimation problems for crisp data. This paper proposes a new method to evaluate interval regression models for crisp input-output data. The proposed method is based on the quadratic loss SVM, which implements a quadratic programming approach giving more diverse spread coefficients than a linear programming one. The proposed algorithm is a model-free method in the sense that we do not have to assume an underlying model function. Experimental results are then presented which indicate the performance of this algorithm.

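As a rough illustration of the quadratic-loss (least-squares) kernel regression machinery the abstract refers to, the sketch below fits a kernel ridge model to synthetic crisp data with scikit-learn. The paper's interval-regression construction and its spread coefficients are not reproduced; the symmetric band shown is only a placeholder.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

# Synthetic crisp input-output data (stand-in for the paper's examples).
rng = np.random.default_rng(0)
X = np.sort(rng.uniform(-3, 3, size=(80, 1)), axis=0)
y = np.sinc(X).ravel() + rng.normal(scale=0.1, size=80)

# Quadratic (squared) loss with an RBF kernel: the least-squares counterpart
# of SVM regression, solvable in closed form.
model = KernelRidge(kernel="rbf", alpha=0.1, gamma=1.0).fit(X, y)
center = model.predict(X)

# Naive symmetric band from residuals -- NOT the paper's interval method,
# only a placeholder showing where an interval around the center would sit.
spread = np.abs(y - center).max()
lower, upper = center - spread, center + spread
print(f"half-width of the illustrative band: {spread:.3f}")
```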

Estimating Software Development Cost using Support Vector Regression (Support Vector Regression을 이용한 소프트웨어 개발비 예측)

  • Park, Chan-Kyoo
    • Korean Management Science Review
    • /
    • v.23 no.2
    • /
    • pp.75-91
    • /
    • 2006
  • The purpose of this paper is to propose a new software development cost estimation method using SVR (Support Vector Regression). SVR, one of the machine learning techniques, has been attracting much attention for its theoretical clarity and good performance compared with other machine learning techniques. This paper may be the first study in which SVR is applied to the field of software cost estimation. To derive the new method, we analyze historical cost data including both well-known overseas and domestic software projects, and define the cost drivers affecting software cost. Then, the SVR model is trained on the historical data and its estimation accuracy is compared with that of the linear regression model. Experimental results show that the SVR model produces more accurate predictions than the linear regression model.
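
A minimal sketch of the comparison described above, using synthetic stand-in cost drivers; the paper's historical project data and exact accuracy measure are not reproduced, and mean absolute error is assumed here purely for illustration.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import cross_val_score

# Synthetic "cost driver" matrix and effort values standing in for the
# historical project data used in the paper (not reproduced here).
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(100, 5))          # e.g. size, complexity, team factors
y = 200 * X[:, 0] ** 1.2 + 50 * X[:, 1] + rng.normal(scale=10, size=100)

svr = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100, epsilon=1.0))
lin = LinearRegression()

# Compare cross-validated mean absolute error (metric chosen for illustration).
for name, model in [("SVR", svr), ("Linear regression", lin)]:
    mae = -cross_val_score(model, X, y, cv=5,
                           scoring="neg_mean_absolute_error").mean()
    print(f"{name}: MAE = {mae:.2f}")
```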

REGRESSION WITH CENSORED DATA BY LEAST SQUARES SUPPORT VECTOR MACHINE

  • Kim, Dae-Hak;Shim, Joo-Yong;Oh, Kwang-Sik
    • Journal of the Korean Statistical Society
    • /
    • v.33 no.1
    • /
    • pp.25-34
    • /
    • 2004
  • In this paper we propose a prediction method for the regression model with randomly censored observations in the training data set. Least squares support vector machine regression is applied to predict the regression function by incorporating weights, assessed for each observation, into the optimization problem. Numerical examples are given to show the performance of the proposed prediction method.
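
A minimal sketch of where per-observation weights enter a least-squares kernel regression fit, assuming synthetic right-censored data and a placeholder weighting rule; the paper's actual weight assessment is not reproduced.

```python
import numpy as np
from sklearn.kernel_ridge import KernelRidge

rng = np.random.default_rng(2)
X = rng.uniform(0, 4, size=(120, 1))
t = 2.0 + np.sin(X.ravel()) + rng.normal(scale=0.2, size=120)   # true response
c = rng.uniform(1.5, 5.0, size=120)                              # censoring times
y = np.minimum(t, c)                                             # observed value
delta = (t <= c).astype(float)                                   # 1 = uncensored

# Placeholder weights: keep uncensored points at full weight and down-weight
# censored ones. The paper derives data-driven weights; this rule only shows
# where such weights enter the least-squares fit.
w = np.where(delta == 1, 1.0, 0.3)

model = KernelRidge(kernel="rbf", alpha=0.5, gamma=0.5)
model.fit(X, y, sample_weight=w)
print(model.predict(np.array([[1.0], [2.0], [3.0]])))
```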

A Fault Detection of Cyclic Signals Using Support Vector Machine-Regression (Support Vector Machine-Regression을 이용한 주기신호의 이상탐지)

  • Park, Seung-Hwan;Kim, Jun-Seok;Park, Cheong-Sool;Kim, Sung-Shick;Baek, Jun-Geol
    • Journal of Korean Society for Quality Management
    • /
    • v.38 no.3
    • /
    • pp.354-362
    • /
    • 2010
  • This paper presents a non-linear control chart based on support vector machine regression (SVM-R) to improve the accuracy of fault detection for cyclic signals. The proposed algorithm consists of the following two steps. First, the center line of the control chart is constructed using SVM-R. Second, we calculate control limits from variances estimated along lines normal (perpendicular) to the center line. For performance evaluation, we apply the proposed algorithm to industrial data from the chemical vapor deposition process, one of the semiconductor manufacturing processes. The proposed method shows better fault detection performance than other existing methods.
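
A minimal sketch of the two steps on a synthetic cyclic signal, assuming plain vertical residuals for the variance estimate rather than the paper's perpendicular-line construction.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(3)
t = np.linspace(0, 1, 200).reshape(-1, 1)
signal = np.sin(2 * np.pi * 3 * t).ravel() + rng.normal(scale=0.05, size=200)

# Step 1: nonlinear center line of the cyclic signal via SVM regression.
center_model = SVR(kernel="rbf", C=10, epsilon=0.01, gamma=50).fit(t, signal)
center = center_model.predict(t)

# Step 2: control limits from the residual spread. The paper estimates the
# variance along lines normal to the center line; plain vertical residuals
# are used here only to keep the sketch short.
sigma = (signal - center).std()
ucl, lcl = center + 3 * sigma, center - 3 * sigma

# Flag out-of-control points in a new (here: artificially shifted) signal.
faulty = signal + np.where(t.ravel() > 0.8, 0.5, 0.0)
out_of_control = (faulty > ucl) | (faulty < lcl)
print(f"{out_of_control.sum()} points flagged out of control")
```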

A Reliability Prediction Method for Weapon Systems using Support Vector Regression (지지벡터회귀분석을 이용한 무기체계 신뢰도 예측기법)

  • Na, Il-Yong
    • Journal of the Korea Institute of Military Science and Technology
    • /
    • v.16 no.5
    • /
    • pp.675-682
    • /
    • 2013
  • Reliability analysis and prediction of the next failure time are critical to sustaining weapon systems, with respect to scheduled maintenance, spare parts replacement, maintenance interventions, etc. Since 1981, many methodologies derived from various probabilistic and statistical theories have been suggested for this task. Nowadays, many A.I. tools are also used to support these predictions. Support Vector Regression (SVR) is a nonlinear regression technique extended from the support vector machine. SVR can fit data flexibly and has a wide variety of applications. This paper utilizes SVM and SVR combined with time series analysis to predict the next failure time based on historical failure data. A numerical case using failure data from military equipment is presented to demonstrate the performance of the proposed approach. Finally, the proposed approach is shown to be meaningful for predicting the next failure point and for estimating the instantaneous failure rate and MTBF.
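
A minimal sketch of the SVR-plus-time-series idea, assuming synthetic inter-failure times and simple lag features; the paper's military failure data and exact feature construction are not reproduced.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic inter-failure times (hours); stands in for the historical
# failure record of the weapon system, which is not reproduced here.
rng = np.random.default_rng(4)
tbf = 100 + 20 * np.sin(np.arange(40) / 5.0) + rng.normal(scale=5, size=40)

# Time-series framing: predict the next time-between-failures from the
# previous `lags` observations (sliding-window features).
lags = 3
X = np.column_stack([tbf[i:len(tbf) - lags + i] for i in range(lags)])
y = tbf[lags:]

model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=100, epsilon=1.0))
model.fit(X, y)

next_tbf = model.predict(tbf[-lags:].reshape(1, -1))[0]
print(f"predicted next time-between-failures: {next_tbf:.1f} h")
```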

Improvement of rotor flux estimation performance of induction motor using Support Vector Machine $\epsilon$-insensitive Regression Method (Support Vector Machine $\epsilon$-insensitive Regression방법을 이용한 유도전동기의 회전자 자속추정 성능개선)

  • Han, Dong-Chang;Baek, Un-Jae;Kim, Seong-Rak;Park, Ju-Hyeon;Lee, Seok-Gyu;Park, Jeong-Il
    • Proceedings of the KIEE Conference
    • /
    • 2003.11b
    • /
    • pp.43-46
    • /
    • 2003
  • In this paper, a novel rotor flux estimation method for an induction motor using a support vector machine (SVM) is presented. Two well-known flux models, with respect to voltage and current, are necessary to estimate the rotor flux of an induction motor. The theory of the SVM algorithm is based on statistical learning theory. Training the SVM leads to a quadratic programming (QP) problem. The proposed SVM rotor flux estimator guarantees improved performance in the transient and steady state in spite of parameter variations. The validity and usefulness of the proposed algorithm are thoroughly verified through numerical simulation.

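A minimal sketch of the ε-insensitive regression step on a noisy periodic waveform standing in for one flux component; the induction-motor voltage/current flux models themselves are not reproduced here.

```python
import numpy as np
from sklearn.svm import SVR

# A noisy periodic waveform stands in for one rotor-flux component; the
# voltage/current flux models of the induction motor are out of scope here.
rng = np.random.default_rng(5)
t = np.linspace(0, 0.1, 300).reshape(-1, 1)          # 0.1 s window
flux = 0.8 * np.sin(2 * np.pi * 50 * t).ravel()      # 50 Hz component
measured = flux + rng.normal(scale=0.05, size=300)

# epsilon defines the insensitive tube: errors smaller than epsilon are not
# penalized, which helps make the estimator robust to measurement noise.
svr = SVR(kernel="rbf", C=10, epsilon=0.02, gamma=5e4).fit(t, measured)
estimate = svr.predict(t)
print(f"RMS estimation error: {np.sqrt(np.mean((estimate - flux) ** 2)):.4f}")
```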

A concise overview of principal support vector machines and its generalization

  • Jungmin Shin;Seung Jun Shin
    • Communications for Statistical Applications and Methods
    • /
    • v.31 no.2
    • /
    • pp.235-246
    • /
    • 2024
  • In high-dimensional data analysis, sufficient dimension reduction (SDR) has been considered as an attractive tool for reducing the dimensionality of predictors while preserving regression information. The principal support vector machine (PSVM) (Li et al., 2011) offers a unified approach for both linear and nonlinear SDR. This article comprehensively explores a variety of SDR methods based on the PSVM, which we call principal machines (PM) for SDR. The PM achieves SDR by solving a sequence of convex optimizations akin to popular supervised learning methods, such as the support vector machine, logistic regression, and quantile regression, to name a few. This makes the PM straightforward to handle and extend in both theoretical and computational aspects, as we will see throughout this article.
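
A simplified sketch of the linear PSVM idea: dichotomize the response at several cutpoints, fit a linear SVM for each dichotomy, and extract sufficient-dimension-reduction directions from the stacked coefficients. Standard LinearSVC is used in place of the exact PSVM objective of Li et al. (2011), and the data are synthetic.

```python
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.preprocessing import StandardScaler

# Synthetic data whose response depends on X only through one direction beta,
# so the true sufficient-dimension-reduction subspace is one-dimensional.
rng = np.random.default_rng(6)
n, p = 500, 6
X = rng.normal(size=(n, p))
beta = np.array([1.0, -1.0, 0.5, 0.0, 0.0, 0.0])
y = np.tanh(X @ beta) + rng.normal(scale=0.1, size=n)

# Dichotomize y at several quantile cutpoints, fit a linear SVM per cutpoint,
# and take the leading principal direction of the stacked coefficient vectors.
Z = StandardScaler().fit_transform(X)
coefs = []
for q in np.linspace(0.2, 0.8, 7):
    labels = (y > np.quantile(y, q)).astype(int)
    svm = LinearSVC(C=1.0, max_iter=10_000).fit(Z, labels)
    coefs.append(svm.coef_.ravel())

_, _, vt = np.linalg.svd(np.vstack(coefs))
direction = vt[0]
cosine = abs(direction @ beta) / (np.linalg.norm(direction) * np.linalg.norm(beta))
print(f"|cosine| between the estimated and true directions: {cosine:.3f}")
```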

A Recent Development in Support Vector Machine Classification

  • Hong, Dug-Hun;Hwang, Chang-Ha;Na, Eun-Young
    • Proceedings of the Korean Data and Information Science Society Conference
    • /
    • 2002.06a
    • /
    • pp.23-28
    • /
    • 2002
  • Support vector machine (SVM) has been very successful in classification, regression, time series prediction and density estimation. In this paper, we propose SVM for fuzzy data classification.


Generalized Support Vector Quantile Regression (일반화 서포트벡터 분위수회귀에 대한 연구)

  • Lee, Dongju;Choi, Sujin
    • Journal of Korean Society of Industrial and Systems Engineering
    • /
    • v.43 no.4
    • /
    • pp.107-115
    • /
    • 2020
  • Support vector regression (SVR) is devised to solve the regression problem by utilizing the excellent predictive power of the support vector machine. In particular, the ε-insensitive loss function, a loss function often used in SVR, is a function that does not generate penalties if the difference between the actual value and the estimated regression curve is within ε. In most studies, the ε-insensitive loss function is used symmetrically, and it is of interest to determine the value of ε. In SVQR (Support Vector Quantile Regression), the asymmetry of the width of ε and the slope of the penalty were controlled using the parameter p. However, the slope of the penalty is fixed according to the p value that determines the asymmetry of ε. In this study, a new ε-insensitive loss function with parameters p1 and p2 is proposed. A new asymmetric SVR called GSVQR (Generalized Support Vector Quantile Regression), based on the new ε-insensitive loss function, can control the asymmetry of the width of ε and the slope of the penalty using the parameters p1 and p2, respectively. Figures show that the asymmetry of the width of ε and the slope of the penalty are indeed controlled. Finally, through an experiment on a function, the accuracy of the existing symmetric soft-margin SVR, the asymmetric SVQR, and the asymmetric GSVQR is examined, and the characteristics of each are shown through figures.
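
For illustration only, a generic asymmetric ε-insensitive loss with separate tube half-widths and penalty slopes; the exact GSVQR loss and its p1/p2 parameterization are those defined in the paper and are not reproduced here.

```python
import numpy as np

def asymmetric_eps_insensitive(residual, eps_up=0.5, eps_down=0.2,
                               slope_up=1.0, slope_down=3.0):
    """Generic asymmetric epsilon-insensitive loss (illustrative form only).

    Residuals above +eps_up are penalized linearly with slope_up, residuals
    below -eps_down with slope_down, and residuals inside the asymmetric tube
    [-eps_down, +eps_up] incur no penalty. The exact GSVQR loss and its p1/p2
    parameterization are defined in the paper, not reproduced here.
    """
    r = np.asarray(residual, dtype=float)
    return (slope_up * np.maximum(r - eps_up, 0.0)
            + slope_down * np.maximum(-r - eps_down, 0.0))

print(asymmetric_eps_insensitive(np.array([-1.0, -0.1, 0.0, 0.4, 1.0])))
```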

Estimating Fuzzy Regression with Crisp Input-Output Using Quadratic Loss Support Vector Machine

  • Hwang, Chang-Ha;Hong, Dug-Hun;Lee, Sang-Bock
    • Proceedings of the Korean Data and Information Science Society Conference
    • /
    • 2004.10a
    • /
    • pp.53-59
    • /
    • 2004
  • The support vector machine (SVM) approach to regression can be found in the information science literature. SVM implements the regularization technique, which has been introduced as a way of controlling the smoothness properties of the regression function. In this paper, we propose a new estimation method based on the quadratic loss SVM for Tanaka's linear fuzzy regression model, and furthermore propose an estimation method for nonlinear fuzzy regression. This is a very attractive approach for evaluating nonlinear fuzzy models with crisp input and output data.

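For context, Tanaka's linear fuzzy regression model referred to in the abstract is commonly written with symmetric triangular fuzzy coefficients; the paper's quadratic-loss SVM estimation of it is not reproduced here.

```latex
% Tanaka-style fuzzy linear model with symmetric triangular coefficients
% A_j = (a_j, c_j): center a_j and non-negative spread c_j.
\tilde{Y}(\mathbf{x}) = A_0 + A_1 x_1 + \cdots + A_p x_p ,
\qquad
\mathrm{center}\,\tilde{Y}(\mathbf{x}) = a_0 + \sum_{j=1}^{p} a_j x_j ,
\qquad
\mathrm{spread}\,\tilde{Y}(\mathbf{x}) = c_0 + \sum_{j=1}^{p} c_j \lvert x_j \rvert .
```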