• Title/Abstract/Keyword: linear regression with constraints

Search results: 22 items (processing time: 0.02 s)

Test for an Outlier in Multivariate Regression with Linear Constraints

  • Kim, Myung-Geun
    • Communications for Statistical Applications and Methods
    • /
    • Vol.9 No.2
    • /
    • pp.473-478
    • /
    • 2002
  • A test for a single outlier in multivariate regression with linear constraints on the regression coefficients, based on a mean shift model, is derived. It is shown that influential observations identified by case deletion in testing linear hypotheses are determined by two types of outliers: mean shift outliers with or without linear constraints. An illustrative example is given.

Bayesian Variable Selection in Linear Regression Models with Inequality Constraints on the Coefficients

  • Oh, Man-Suk
    • The Korean Journal of Applied Statistics
    • /
    • Vol.15 No.1
    • /
    • pp.73-84
    • /
    • 2002
  • Linear regression models with inequality constraints on the coefficients are among the most common in economic modeling, since the sign of a coefficient for a particular explanatory variable is often restricted to be positive or negative, or an ordering is imposed among the coefficients. This paper considers a Bayesian approach to selecting significant explanatory variables in linear regression models under such inequality constraints. Bayesian variable selection requires computing the posterior probability of every possible model; this paper presents a method that computes all of these posterior probabilities simultaneously. Specifically, posterior samples of the parameters of the most general model are obtained via Gibbs sampling, and these samples are then used to compute the posterior probabilities of all possible models. The proposed method is applied to a real data set.
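As a rough illustration of the kind of sampler the abstract above describes (not the authors' implementation; a minimal sketch that assumes nonnegativity constraints on every coefficient and flat/Jeffreys-type priors), each coefficient's full conditional under an inequality constraint is a truncated normal:

```python
import numpy as np
from scipy import stats

def gibbs_constrained_regression(y, X, n_iter=2000, seed=0):
    """Gibbs sampler for y = X beta + e, e ~ N(0, sigma^2 I), under the
    inequality constraint beta_j >= 0 for every coefficient (a simple
    stand-in for the sign/order constraints discussed in the abstract).
    Flat prior on beta over the constrained region; Jeffreys-type prior
    on sigma^2. Each beta_j's full conditional is a truncated normal.
    """
    rng = np.random.default_rng(seed)
    n, k = X.shape
    beta = np.ones(k)
    sigma2 = 1.0
    xtx = np.sum(X ** 2, axis=0)
    draws = np.empty((n_iter, k))
    for it in range(n_iter):
        for j in range(k):
            # partial residual with beta_j removed
            r = y - X @ beta + X[:, j] * beta[j]
            m = X[:, j] @ r / xtx[j]
            s = np.sqrt(sigma2 / xtx[j])
            # truncated-normal draw on [0, inf), standardized bounds
            beta[j] = stats.truncnorm.rvs((0.0 - m) / s, np.inf,
                                          loc=m, scale=s, random_state=rng)
        resid = y - X @ beta
        # sigma^2 | beta ~ InvGamma(n/2, SSE/2)
        sigma2 = 1.0 / rng.gamma(n / 2.0, 2.0 / (resid @ resid))
        draws[it] = beta
    return draws
```

Computing posterior model probabilities from one run on the most general model, as the paper proposes, would be an additional step on top of these draws.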

Testing General Linear Constraints on the Regression Coefficient Vector : A Note

  • Jeong, Ki-Jun
    • Journal of the Korean Statistical Society
    • /
    • Vol.8 No.2
    • /
    • pp.107-109
    • /
    • 1979
  • Consider a linear model with n observations and k explanatory variables: (1) $y = X\beta + u$, $u \sim N(0, \sigma^2 I_n)$. We assume that the model satisfies the ideal conditions. Consider the general linear constraints on the regression coefficient vector: (2) $R\beta = r$, where R and r are known matrices of orders $q \times k$ and $q \times 1$ respectively, and the rank of R is $q$ ($q < k$).

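The hypothesis $R\beta = r$ above is classically tested with an F statistic. The following sketch (not taken from the note itself; a minimal illustration with hypothetical variable names) shows the standard computation:

```python
import numpy as np

def f_test_linear_constraints(y, X, R, r):
    """F-test of H0: R @ beta = r in y = X beta + u, u ~ N(0, sigma^2 I).

    Returns the F statistic, which under H0 follows F(q, n - k),
    where q = number of constraints, n = len(y), k = X.shape[1].
    """
    n, k = X.shape
    q = R.shape[0]
    # unrestricted OLS estimate and residual variance
    beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta_hat
    sigma2_hat = resid @ resid / (n - k)
    # Wald-type quadratic form in the constraint discrepancy
    d = R @ beta_hat - r
    XtX_inv = np.linalg.inv(X.T @ X)
    return d @ np.linalg.solve(R @ XtX_inv @ R.T, d) / (q * sigma2_hat)
```

A large value of the statistic relative to the F(q, n - k) distribution leads to rejection of the constraints.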

A Comparison Study of Multiple Linear Quantile Regression Using Non-crossing Constraints

  • Bang, Sungwan;Shin, Seung Jun
    • The Korean Journal of Applied Statistics
    • /
    • Vol.29 No.5
    • /
    • pp.773-786
    • /
    • 2016
  • Quantile regression provides comprehensive information about the relationship between the response and the predictors by estimating conditional quantile functions of the response. However, when several quantile functions are estimated separately, they may cross one another, and this quantile-crossing phenomenon violates a basic theoretical property of quantiles. This paper reviews representative methods for estimating multiple non-crossing quantile functions, comparing them in terms of their fitting formulations and computational algorithms, and compares their performance through simulation studies and real data analysis.
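The non-crossing idea can be sketched as one joint linear program over the pinball (check) losses, with ordering constraints imposed at the observed design points. This is an illustrative formulation, not necessarily one of the methods compared in the paper:

```python
import numpy as np
from scipy.optimize import linprog

def noncrossing_quantile_fit(y, X, taus):
    """Jointly fit linear quantile regressions for increasing quantile
    levels `taus`, with non-crossing constraints X @ b[j] <= X @ b[j+1]
    imposed at the observed design points. Each fit minimises the
    pinball loss; all fits are cast as one joint linear program.
    """
    n, k = X.shape
    m = len(taus)
    # variables per quantile level: beta (k, free), u_plus (n), u_minus (n)
    blk = k + 2 * n
    c = np.zeros(m * blk)
    A_eq = np.zeros((m * n, m * blk))
    for j, tau in enumerate(taus):
        off = j * blk
        c[off + k: off + k + n] = tau           # cost of u_plus
        c[off + k + n: off + blk] = 1.0 - tau   # cost of u_minus
        # residual split: X beta_j + u_plus - u_minus = y
        A_eq[j * n:(j + 1) * n, off: off + k] = X
        A_eq[j * n:(j + 1) * n, off + k: off + k + n] = np.eye(n)
        A_eq[j * n:(j + 1) * n, off + k + n: off + blk] = -np.eye(n)
    # non-crossing at design points: X beta_j - X beta_{j+1} <= 0
    A_ub = np.zeros(((m - 1) * n, m * blk))
    for j in range(m - 1):
        A_ub[j * n:(j + 1) * n, j * blk: j * blk + k] = X
        A_ub[j * n:(j + 1) * n, (j + 1) * blk: (j + 1) * blk + k] = -X
    bounds = ([(None, None)] * k + [(0, None)] * (2 * n)) * m
    res = linprog(c, A_ub=A_ub, b_ub=np.zeros((m - 1) * n),
                  A_eq=A_eq, b_eq=np.tile(y, m), bounds=bounds,
                  method="highs")
    return res.x.reshape(m, blk)[:, :k]  # one beta row per tau
```

By construction, the fitted quantile surfaces are ordered at every training point, so crossing cannot occur in-sample.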

LIKELIHOOD DISTANCE IN CONSTRAINED REGRESSION

  • Kim, Myung-Geun
    • Journal of Applied Mathematics & Informatics
    • /
    • Vol.25 No.1-2
    • /
    • pp.489-493
    • /
    • 2007
  • Two diagnostic measures based on the likelihood distance for constrained regression with linear constraints on regression coefficients are derived. They are used for identifying influential observations in constrained regression. A numerical example is provided for illustration.

Constrained $L_1$-Estimation in Linear Regression

  • Kim, Bu-Yong
    • Communications for Statistical Applications and Methods
    • /
    • Vol.5 No.3
    • /
    • pp.581-589
    • /
    • 1998
  • An algorithm is proposed for $L_1$-estimation with linear equality and inequality constraints in the linear regression model. The algorithm employs a linear scaling transformation to obtain the optimal solution of the linear programming-type problem, and a special scheme is used to maintain the feasibility of the updated solution at each iteration. The convergence of the proposed algorithm is proved. In addition, updating and orthogonal decomposition techniques are employed to improve computational efficiency and numerical stability.

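The linear-programming formulation behind such constrained $L_1$-estimation can be sketched as follows. This is a minimal version using `scipy.optimize.linprog` rather than the paper's specialized algorithm; auxiliary variables t bound the absolute residuals:

```python
import numpy as np
from scipy.optimize import linprog

def constrained_l1_regression(y, X, A_ineq=None, b_ineq=None):
    """L1 (least absolute deviations) fit of y ~ X beta, optionally
    subject to A_ineq @ beta <= b_ineq, cast as a linear program.

    Decision variables: (beta, t), with t >= |y - X beta| elementwise,
    minimising sum(t).
    """
    n, k = X.shape
    c = np.concatenate([np.zeros(k), np.ones(n)])
    # |y - X beta| <= t  <=>  X beta - t <= y  and  -X beta - t <= -y
    A_ub = np.block([[X, -np.eye(n)], [-X, -np.eye(n)]])
    b_ub = np.concatenate([y, -y])
    if A_ineq is not None:
        pad = np.zeros((A_ineq.shape[0], n))  # constraints ignore t
        A_ub = np.vstack([A_ub, np.hstack([A_ineq, pad])])
        b_ub = np.concatenate([b_ub, b_ineq])
    bounds = [(None, None)] * k + [(0, None)] * n
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:k]
```

Because the objective counts absolute deviations, a single gross outlier leaves the fit essentially untouched, in contrast to least squares.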

Algorithm for the Constrained Chebyshev Estimation in Linear Regression

  • Kim, Bu-yong
    • Communications for Statistical Applications and Methods
    • /
    • Vol.7 No.1
    • /
    • pp.47-54
    • /
    • 2000
  • This article is concerned with an algorithm for Chebyshev estimation with or without linear equality and/or inequality constraints. The algorithm employs a linear scaling transformation scheme to reduce the computational burden that arises when the data set is quite large. The convergence of the proposed algorithm is proved, and updating and orthogonal decomposition techniques are considered to improve computational efficiency and numerical stability.

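Chebyshev (minimax) estimation likewise reduces to a small linear program, with a single auxiliary variable bounding the largest absolute residual. Again, this is a generic `linprog` sketch of the unconstrained case, not the paper's algorithm:

```python
import numpy as np
from scipy.optimize import linprog

def chebyshev_regression(y, X):
    """Chebyshev (minimax / L-infinity) regression: minimise the largest
    absolute residual, max_i |y_i - x_i' beta|, as a linear program with
    variables (beta, t) and constraints -t <= y - X beta <= t.
    """
    n, k = X.shape
    c = np.concatenate([np.zeros(k), [1.0]])  # minimise t
    ones = np.ones((n, 1))
    A_ub = np.block([[X, -ones], [-X, -ones]])
    b_ub = np.concatenate([y, -y])
    bounds = [(None, None)] * k + [(0, None)]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds, method="highs")
    return res.x[:k], res.x[k]  # (beta, minimax residual)
```

At the optimum the residuals equioscillate: the maximum absolute residual is attained, with alternating signs, at several data points.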

Bayesian inference for an ordered multiple linear regression with skew normal errors

  • Jeong, Jeongmun;Chung, Younshik
    • Communications for Statistical Applications and Methods
    • /
    • Vol.27 No.2
    • /
    • pp.189-199
    • /
    • 2020
  • This paper studies a Bayesian ordered multiple linear regression model with skew normal errors. The kind of inherent information available in an applied regression often requires some constraints on the coefficients to be estimated. In addition, the assumption of normality of the errors is sometimes not appropriate for real data. Therefore, to handle such situations more flexibly, we use the skew-normal distribution of Sahu et al. (The Canadian Journal of Statistics, 31, 129-150, 2003) for the error terms, which includes the normal distribution as a special case. For the Bayesian methodology, the Markov chain Monte Carlo method is employed to resolve complicated integration problems. Also, under improper priors, the propriety of the associated posterior density is shown. The proposed Bayesian model is applied to NZAPB's apple data. For model comparison between the skew-normal error model and the normal error model, we use the Bayes factor and the deviance information criterion of Spiegelhalter et al. (Journal of the Royal Statistical Society Series B (Statistical Methodology), 64, 583-639, 2002). We also consider the problem of detecting an influential point concerning skewness using Bayes factors. Finally, concluding remarks are given.

Patch based Semi-supervised Linear Regression for Face Recognition

  • Ding, Yuhua;Liu, Fan;Rui, Ting;Tang, Zhenmin
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • Vol.13 No.8
    • /
    • pp.3962-3980
    • /
    • 2019
  • To deal with single sample face recognition, this paper presents a patch based semi-supervised linear regression (PSLR) algorithm, which draws facial variation information from unlabeled samples. Each facial image is divided into overlapped patches, and a regression model with mapping matrix will be constructed on each patch. Then, we adjust these matrices by mapping unlabeled patches to $[1,1,{\cdots},1]^T$. The solutions of all the mapping matrices are integrated into an overall objective function, which uses ${\ell}_{2,1}$-norm minimization constraints to improve discrimination ability of mapping matrices and reduce the impact of noise. After mapping matrices are computed, we adopt majority-voting strategy to classify the probe samples. To further learn the discrimination information between probe samples and obtain more robust mapping matrices, we also propose a multistage PSLR (MPSLR) algorithm, which iteratively updates the training dataset by adding those reliably labeled probe samples into it. The effectiveness of our approaches is evaluated using three public facial databases. Experimental results prove that our approaches are robust to illumination, expression and occlusion.