• Title/Abstract/Keywords: Quadratic Regression

248 results (processing time 0.025 s)

FUZZY REGRESSION ANALYSIS WITH NON-SYMMETRIC FUZZY COEFFICIENTS BASED ON QUADRATIC PROGRAMMING APPROACH

  • Lee, Haekwan;Hideo Tanaka
    • Korean Institute of Intelligent Systems: Conference Proceedings
    • /
    • Korea Fuzzy Logic and Intelligent Systems Society, 1998: The Third Asian Fuzzy Systems Symposium
    • /
    • pp.63-68
    • /
    • 1998
  • This paper proposes fuzzy regression analysis with non-symmetric fuzzy coefficients. By assuming non-symmetric triangular fuzzy coefficients and applying the quadratic programming formulation, the center of the obtained fuzzy regression model attains more central tendency than the one with symmetric triangular fuzzy coefficients. For a data set composed of crisp inputs and fuzzy outputs, two approximation models, called the upper and lower approximation models, are considered as the regression models. We therefore also propose an integrated quadratic programming problem by which the upper approximation model always includes the lower approximation model at any threshold level, under the assumption that the two approximation models share the same centers. Sensitivities of the weight coefficients in the proposed quadratic programming approaches are investigated through real data.

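A minimal sketch of the possibilistic-regression idea behind this line of work, in the simpler symmetric-triangular special case (toy data; the quadratic objective trades off central tendency against total spread, and the constraints force every observation inside the fuzzy band at level h — variable names are illustrative, not from the paper):

```python
import numpy as np
from scipy.optimize import minimize

# Toy crisp-input data; y holds the (crisp) output centers.
X = np.array([[1.0, 1.0], [1.0, 2.0], [1.0, 3.0], [1.0, 4.0]])  # first column: intercept
y = np.array([2.1, 3.9, 6.2, 7.8])
h = 0.0          # threshold level
n, p = X.shape

def objective(theta):
    c, s = theta[:p], theta[p:]       # coefficient centers and spreads
    # The squared-error term pulls the center line toward the data (central
    # tendency); the spread term keeps the fuzzy band as narrow as possible.
    return np.sum((y - X @ c) ** 2) + np.sum((np.abs(X) @ s) ** 2)

# Inclusion at level h: |y_i - x_i'c| <= (1 - h) * |x_i|'s, written as two
# smooth linear inequalities per observation.
cons = []
for i in range(n):
    cons.append({"type": "ineq",
                 "fun": lambda t, i=i: (1 - h) * (np.abs(X[i]) @ t[p:]) - (y[i] - X[i] @ t[:p])})
    cons.append({"type": "ineq",
                 "fun": lambda t, i=i: (1 - h) * (np.abs(X[i]) @ t[p:]) + (y[i] - X[i] @ t[:p])})

res = minimize(objective, x0=np.ones(2 * p),
               bounds=[(None, None)] * p + [(0, None)] * p,  # spreads >= 0
               constraints=cons, method="SLSQP")
c_hat, s_hat = res.x[:p], res.x[p:]
print("centers:", c_hat, "spreads:", s_hat)
```

The non-symmetric case of the paper replaces each spread by separate left and right spreads with their own inclusion constraints.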

Sensitivity Analysis in Principal Component Regression with Quadratic Approximation

  • Shin, Jae-Kyoung;Chang, Duk-Joon
    • Journal of the Korean Data and Information Science Society
    • /
    • Vol. 14, No. 3
    • /
    • pp.623-630
    • /
    • 2003
  • Recently, Tanaka (1988) derived two influence functions related to the eigenvalue problem $(A-\lambda_s I)v_s=0$ of a real symmetric matrix A and used them for sensitivity analysis in principal component analysis. In this paper, we deal with the perturbation expansions of the same functions up to quadratic terms and discuss their application to sensitivity analysis in principal component regression analysis (PCRA). A numerical example is given to show how the approximation improves with the quadratic term.

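The improvement from a quadratic term can be checked directly with the standard eigenvalue perturbation expansion (a generic sketch, not the paper's influence functions; the matrices are random toys):

```python
import numpy as np

rng = np.random.default_rng(0)
p = 4
A = rng.standard_normal((p, p)); A = (A + A.T) / 2     # symmetric base matrix
A1 = rng.standard_normal((p, p)); A1 = (A1 + A1.T) / 2 # symmetric perturbation
eps = 1e-3

lam, V = np.linalg.eigh(A)   # eigenvalues ascending, orthonormal eigenvectors
s = p - 1                    # track the largest eigenvalue

# First- and second-order perturbation terms for lambda_s of A + eps*A1.
first = V[:, s] @ A1 @ V[:, s]
second = sum((V[:, t] @ A1 @ V[:, s]) ** 2 / (lam[s] - lam[t])
             for t in range(p) if t != s)

approx1 = lam[s] + eps * first            # linear approximation
approx2 = approx1 + eps ** 2 * second     # quadratic approximation
exact = np.linalg.eigvalsh(A + eps * A1)[s]
print("linear error:   ", abs(exact - approx1))
print("quadratic error:", abs(exact - approx2))
```

The quadratic approximation error is O(eps³) rather than O(eps²), which is the kind of improvement the paper's numerical example illustrates.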

Quadratic Loss Support Vector Interval Regression Machine for Crisp Input-Output Data

  • Hwang, Chang-Ha
    • Journal of the Korean Data and Information Science Society
    • /
    • Vol. 15, No. 2
    • /
    • pp.449-455
    • /
    • 2004
  • The support vector machine (SVM) has been very successful in pattern recognition and function estimation problems for crisp data. This paper proposes a new method to evaluate interval regression models for crisp input-output data. The proposed method is based on the quadratic loss SVM, which implements a quadratic programming approach giving more diverse spread coefficients than a linear programming one. The proposed algorithm is a model-free method in the sense that we do not have to assume the underlying model function. Experimental results are then presented which indicate the performance of this algorithm.

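One reason a quadratic (squared) loss is attractive: the SVM dual collapses to a single linear system (the least-squares SVM form) instead of a constrained QP. A toy sketch with an RBF kernel (settings are illustrative, not from the paper):

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0, 1, 20)
y = np.sin(2 * np.pi * x) + 0.05 * rng.standard_normal(20)

gamma_reg, width = 100.0, 0.1
K = np.exp(-(x[:, None] - x[None, :]) ** 2 / (2 * width ** 2))  # RBF Gram matrix

# LS-SVM regression: solve [[0, 1'], [1, K + I/gamma]] [b; alpha] = [0; y].
n = len(x)
A = np.zeros((n + 1, n + 1))
A[0, 1:] = 1.0
A[1:, 0] = 1.0
A[1:, 1:] = K + np.eye(n) / gamma_reg
sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
b, alpha = sol[0], sol[1:]

y_hat = K @ alpha + b
print("train RMSE:", np.sqrt(np.mean((y - y_hat) ** 2)))
```

The interval-regression method of the paper builds spread coefficients on top of this kind of quadratic-loss formulation; the sketch shows only the crisp center fit.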

Estimating Fuzzy Regression with Crisp Input-Output Using Quadratic Loss Support Vector Machine

  • Hwang, Chang-Ha;Hong, Dug-Hun;Lee, Sang-Bok
    • Korean Data and Information Science Society: Conference Proceedings
    • /
    • Korean Data and Information Science Society 2004 Fall Conference
    • /
    • pp.53-59
    • /
    • 2004
  • The support vector machine (SVM) approach to regression can be found in the information science literature. SVM implements the regularization technique, which has been introduced as a way of controlling the smoothness properties of a regression function. In this paper, we propose a new estimation method based on the quadratic loss SVM for Tanaka's linear fuzzy regression model, and furthermore propose an estimation method for nonlinear fuzzy regression. This is a very attractive approach to evaluating a nonlinear fuzzy model with crisp input and output data.


THE USE OF MATHEMATICAL PROGRAMMING FOR LINEAR REGRESSION PROBLEMS

  • Park, Sung-Hyun
    • Journal of the Korean Operations Research and Management Science Society
    • /
    • Vol. 3, No. 1
    • /
    • pp.75-79
    • /
    • 1978
  • The use of three mathematical programming techniques (quadratic programming, integer quadratic programming and linear programming) is discussed for solving some problems in linear regression analysis. When the criterion is the minimization of the sum of squared deviations and the parameters are linearly constrained, the problem may be formulated as a quadratic programming problem. For the selection of variables to find the "best" regression equation in statistics, the technique of integer quadratic programming is proposed and found to be a very useful tool. When the criterion of fitting a linear regression is the minimization of the sum of absolute deviations from the regression function, the problem may be reduced to a linear programming problem and can be solved reasonably well.

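The least-absolute-deviations case mentioned last reduces to a linear program by splitting each residual into nonnegative parts u and v: minimize 1'u + 1'v subject to Xb + u - v = y. A toy sketch (`scipy.optimize.linprog` stands in for a 1978-era LP solver):

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(2)
n, p = 30, 2
X = np.column_stack([np.ones(n), rng.uniform(0, 10, n)])
beta_true = np.array([1.0, 2.0])
y = X @ beta_true + rng.standard_normal(n)

# Decision vector: [beta (free), u (>=0), v (>=0)]; cost only on u and v.
c = np.concatenate([np.zeros(p), np.ones(2 * n)])
A_eq = np.hstack([X, np.eye(n), -np.eye(n)])   # X b + u - v = y
bounds = [(None, None)] * p + [(0, None)] * (2 * n)

res = linprog(c, A_eq=A_eq, b_eq=y, bounds=bounds)
beta_hat = res.x[:p]
print("LAD estimate:", beta_hat)
```

At the optimum, u and v hold the positive and negative parts of the residuals, so the LP objective equals the sum of absolute deviations.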

An efficient algorithm for the non-convex penalized multinomial logistic regression

  • Kwon, Sunghoon;Kim, Dongshin;Lee, Sangin
    • Communications for Statistical Applications and Methods
    • /
    • Vol. 27, No. 1
    • /
    • pp.129-140
    • /
    • 2020
  • In this paper, we introduce an efficient algorithm for non-convex penalized multinomial logistic regression that can be uniformly applied to a class of non-convex penalties. The class includes most non-convex penalties, such as the smoothly clipped absolute deviation, minimax concave and bridge penalties. The algorithm is developed based on the concave-convex procedure and a modified local quadratic approximation algorithm. However, the usual quadratic approximation may slow down computation, since the dimension of the Hessian matrix depends on the number of categories of the output variable. To address this issue, we use a uniform bound of the Hessian matrix in the quadratic approximation. The algorithm is available from the R package ncpen developed by the authors. Numerical studies via simulations and real data sets are provided for illustration.
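The uniform-Hessian-bound idea can be illustrated in the simpler binary-logistic case, where the Hessian X'WX is globally dominated by X'X/4, so one fixed matrix replaces the Hessian in every Newton-like step (a sketch of the general MM device only, not the ncpen implementation, and without the non-convex penalty):

```python
import numpy as np

rng = np.random.default_rng(3)
n, p = 200, 3
X = rng.standard_normal((n, p))
beta_true = np.array([1.0, -2.0, 0.5])
prob = 1 / (1 + np.exp(-X @ beta_true))
y = (rng.uniform(size=n) < prob).astype(float)

# Uniform Hessian bound: X'WX <= X'X/4 for all beta, since w = p(1-p) <= 1/4.
B = X.T @ X / 4.0   # computed once; the data-dependent W never needs updating
beta = np.zeros(p)
for _ in range(500):
    mu = 1 / (1 + np.exp(-X @ beta))
    beta = beta + np.linalg.solve(B, X.T @ (y - mu))   # MM update
print("MLE estimate:", beta)
```

Because B dominates the true Hessian everywhere, each update is guaranteed to increase the log-likelihood; the multinomial version in the paper uses an analogous fixed bound so its dimension no longer grows with the number of categories.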

Support Vector Machine for Interval Regression

  • Hong Dug Hun;Hwang Changha
    • The Korean Statistical Society: Conference Proceedings
    • /
    • The Korean Statistical Society 2004 Conference Proceedings
    • /
    • pp.67-72
    • /
    • 2004
  • The support vector machine (SVM) has been very successful in pattern recognition and function estimation problems for crisp data. This paper proposes a new method to evaluate interval linear and nonlinear regression models, combining the possibility and necessity estimation formulation with the principle of SVM. For data sets with crisp inputs and interval outputs, the possibility and necessity models have recently been utilized; they are based on a quadratic programming approach giving more diverse spread coefficients than a linear programming one. SVM also uses a quadratic programming approach, whose further advantage in interval regression analysis is the ability to integrate both the property of central tendency in least squares and the possibilistic property in fuzzy regression. Moreover, this approach is not computationally expensive. SVM allows us to perform interval nonlinear regression analysis by constructing an interval linear regression function in a high-dimensional feature space. In particular, SVM is a very attractive approach to modeling nonlinear interval data. The proposed algorithm is a model-free method in the sense that we do not have to assume the underlying model function for the interval nonlinear regression model with crisp inputs and interval output. Experimental results are then presented which indicate the performance of this algorithm.


A study on log-density ratio in logistic regression model for binary data

  • Kahng, Myung-Wook
    • Journal of the Korean Data and Information Science Society
    • /
    • Vol. 22, No. 1
    • /
    • pp.107-113
    • /
    • 2011
  • We present methods for studying the log-density ratio, which allow us to select which predictors are needed and how they should be included in the logistic regression model. Under multivariate normal distributional assumptions, we investigate the form of the log-density ratio as a function of many predictors. The linear, quadratic and crossproduct terms are required in general. If the two covariance matrices are equal, then the crossproduct and quadratic terms are not needed. If the variables are uncorrelated, we do not need the crossproduct terms, but we still need the linear and quadratic terms.
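The three cases follow from the log-density ratio of two multivariate normals, log f1(x)/f0(x) = c + b'x - x'Qx/2 with Q = Σ1⁻¹ - Σ0⁻¹ and b = Σ1⁻¹μ1 - Σ0⁻¹μ0: equal covariances give Q = 0 (linear terms only), and uncorrelated variables give a diagonal Q (no crossproduct term). A quick numerical check with illustrative values:

```python
import numpy as np

def log_ratio_terms(m0, S0, m1, S1):
    # Coefficients of the log-density ratio of N(m1, S1) over N(m0, S0).
    Q = np.linalg.inv(S1) - np.linalg.inv(S0)            # quadratic/cross part
    b = np.linalg.inv(S1) @ m1 - np.linalg.inv(S0) @ m0  # linear part
    return Q, b

m0, m1 = np.array([0.0, 0.0]), np.array([1.0, -1.0])

# Equal covariance matrices: Q vanishes, only linear terms remain.
S_common = np.array([[2.0, 0.5], [0.5, 1.0]])
Q_eq, b_eq = log_ratio_terms(m0, S_common, m1, S_common)
print("Q, equal covariances:\n", Q_eq)

# Uncorrelated but unequal variances: Q is diagonal, so quadratic terms
# appear but the crossproduct term does not.
Q_ne, _ = log_ratio_terms(m0, 2 * np.eye(2), m1, np.diag([1.0, 3.0]))
print("Q, uncorrelated case:\n", Q_ne)
```

Here Q_ne works out to diag(1/2, -1/6): nonzero diagonal (quadratic terms needed), zero off-diagonal (no crossproduct term), matching the abstract's cases.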

Quadratic Programming Approach to Pansharpening of Multispectral Images Using a Regression Model

  • Lee, Sang-Hoon
    • Korean Journal of Remote Sensing
    • /
    • Vol. 24, No. 3
    • /
    • pp.257-266
    • /
    • 2008
  • This study presents an approach to synthesizing multispectral images at a higher resolution by exploiting a high-resolution image acquired in the panchromatic modality. The synthesized images should be similar to the multispectral images that would have been observed by the corresponding sensor at the same high resolution. The proposed scheme is designed to reconstruct the multispectral images at the higher resolution with as little color distortion as possible. It uses a second-order regression model to fit panchromatic data to multispectral observations. Based on the regression model, the multispectral images at the higher spatial resolution of the panchromatic image are optimized by quadratic programming. In this study, the new method was applied to IKONOS 1 m panchromatic and 4 m multispectral data, and the results were compared with those of several current approaches. Experimental results demonstrate that the proposed scheme achieves significant improvement over the other methods.
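The regression-model component alone (not the QP reconstruction step) can be sketched on synthetic data: each multispectral band is fitted to the panchromatic intensity with a second-order polynomial (all values here are made up for illustration):

```python
import numpy as np

rng = np.random.default_rng(4)
pan = rng.uniform(0, 1, 500)    # synthetic panchromatic intensities

# Synthetic band generated from a known second-order relation plus noise.
band = 0.2 + 0.9 * pan - 0.3 * pan ** 2 + 0.01 * rng.standard_normal(500)

coef = np.polyfit(pan, band, deg=2)    # returns [c2, c1, c0]
fitted = np.polyval(coef, pan)
print("coefficients (c2, c1, c0):", coef)
print("fit RMSE:", np.sqrt(np.mean((band - fitted) ** 2)))
```

In the paper this fitted relation then supplies the data term of a quadratic program that reconstructs the band at the panchromatic resolution.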

Log-density Ratio with Two Predictors in a Logistic Regression Model

  • Kahng, Myung-Wook;Yoon, Jae-Eun
    • The Korean Journal of Applied Statistics
    • /
    • Vol. 26, No. 1
    • /
    • pp.141-149
    • /
    • 2013
  • In a logistic regression model, if the conditional distributions of the two predictors can both be assumed bivariate normal, the log-density ratio, expressed as a function of the predictors, tells us which terms should be included in the model. When the two bivariate normal distributions have equal variance-covariance matrices, the linear terms alone suffice, with no quadratic or crossproduct terms. If both correlation coefficients are zero, the crossproduct term is unnecessary regardless of the predictor variances. We also examine, via the log-density ratio, other conditions under which the quadratic and crossproduct terms become unnecessary in the logistic regression model.