• Title/Summary/Keyword: simple linear regression

Mean Lifetime Estimation with Censored Observations

  • Kim, Jin-Heum; Kim, Jee-Hoon
    • Journal of the Korean Statistical Society / Vol. 26, No. 3 / pp. 299-308 / 1997
  • In the simple linear regression model $Y = \alpha_0 + \beta_0 Z + \epsilon$ under right censoring of the response variable, estimation of the mean lifetime E(Y) is an interesting problem. In this paper we propose a method of estimating E(Y) based on observations modified by the arguments of Buckley and James (1979). It is shown that the proposed estimator is consistent, and that the proposed procedure in the simple linear regression case extends naturally to multiple linear regression. Finally, we perform simulation studies to compare the proposed estimator with the estimator introduced by Gill (1983).
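The paper's estimator is regression-based, via Buckley-James modification of the censored responses. As a much simpler, covariate-free illustration of mean-lifetime estimation under right censoring, the Kaplan-Meier mean (the area under the estimated survival curve) can be sketched as follows; this is only an illustrative stand-in, not the paper's method:

```python
import numpy as np

def km_mean(t, event):
    """Kaplan-Meier estimate of the mean lifetime from right-censored data.

    t: observed times; event: 1 if the failure was observed, 0 if censored.
    The mean is the area under the KM survival curve up to the last time.
    """
    order = np.argsort(t)
    t = np.asarray(t, dtype=float)[order]
    event = np.asarray(event)[order]
    n = len(t)
    s = 1.0          # current survival probability
    mean, prev = 0.0, 0.0
    for i in range(n):
        mean += s * (t[i] - prev)     # accumulate area under the curve
        prev = t[i]
        if event[i]:
            s *= 1.0 - 1.0 / (n - i)  # KM drop: n - i subjects still at risk
    return mean

# With no censoring the KM mean equals the sample mean (2.5 here);
# censoring the second observation pushes the estimate upward.
m = km_mean([1, 2, 3, 4], [1, 0, 1, 1])
```

Naively averaging the observed times would treat the censored time as a death time and bias the mean downward; the KM mean corrects for this.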

Performing linear regression with responses calculated using Monte Carlo transport codes

  • Price, Dean; Kochunas, Brendan
    • Nuclear Engineering and Technology / Vol. 54, No. 5 / pp. 1902-1908 / 2022
  • In many of the complex systems modeled in the field of nuclear engineering, it is often useful to use linear regression to examine relationships between model parameters and responses of interest. In cases where the response of interest is calculated by a simulation that uses Monte Carlo methods, there will be some uncertainty in the responses, and reducing this uncertainty increases the time needed for each calculation. This paper discusses how the Monte Carlo error in the response of interest influences the error in the computed linear regression coefficients. A mathematical justification shows that, when performing linear regression in these scenarios, the error in the regression coefficients can be largely independent of the Monte Carlo error in each individual calculation. This holds only if the number of calculations is scaled so that the total time, or amount of work, is constant. An application with a simple pin cell model demonstrates these observations in a practical problem.
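The constant-total-work condition can be illustrated with a small simulation (a hedged sketch using a hypothetical linear model, not the paper's pin cell application): doubling the per-calculation Monte Carlo noise makes each run roughly four times cheaper, so four times as many runs fit in the same budget, and the regression-coefficient error stays comparable.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_slope(n, sigma):
    """OLS slope from n 'Monte Carlo calculations' with response noise sigma."""
    x = np.linspace(0.0, 1.0, n)
    y_true = 2.0 + 3.0 * x                 # hypothetical true model
    y = y_true + rng.normal(0.0, sigma, n)  # per-calculation MC error
    slope, intercept = np.polyfit(x, y, 1)  # OLS fit
    return slope

# Constant total work: sigma doubles (each run ~4x cheaper),
# number of runs quadruples. Both slope estimates have similar error.
b_few_precise = fit_slope(n=100, sigma=0.1)
b_many_noisy = fit_slope(n=400, sigma=0.2)
```

Both estimates land close to the true slope of 3; the standard error of the slope, roughly sigma / (sqrt(n) * sd(x)), is the same in the two settings.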

A Study on Change-Points in Simple Linear Regression

  • 정광모; 한미혜
    • The Korean Journal of Applied Statistics / Vol. 5, No. 1 / pp. 29-39 / 1992
  • In the simple linear regression model, we test the statistical hypothesis that the regression coefficients changed before and after some unknown time point, and we discuss how to estimate the change point. We propose a quadratic-form statistic and examine its approximate distribution and control of the significance level. We also apply the proposed method to an example and compare it with the likelihood ratio test.

A Linearity Test Statistic in a Simple Linear Regression

  • 박천건; 이경은
    • Journal of the Korean Data and Information Science Society / Vol. 25, No. 2 / pp. 305-315 / 2014
  • Traditionally, the linearity between the explanatory variable and the response variable in a simple linear regression model has been easily assessed with a scatter plot. With replicated observations, the lack-of-fit test is commonly used to assess linearity. When there is only one observation per design point, however, testing for linearity is not straightforward. This study proposes a statistic for testing the linearity of a simple linear regression model with a single replicate, and examines its reliability through simulation and an empirical study.

Consistency and Bounds on the Bias of $S^2$ in the Linear Regression Model with Moving Average Disturbances

  • Song, Seuck-Heun
    • Journal of the Korean Statistical Society / Vol. 24, No. 2 / pp. 507-518 / 1995
  • The ordinary least squares based estimate $S^2$ of the disturbance variance is considered in the linear regression model when the disturbances follow a first-order moving-average process. It is shown that $S^2$ is a weakly consistent estimator of the disturbance variance without any restriction on the regressor matrix X. Also, simple exact bounds on the relative bias of $S^2$ are given for finite sample sizes.
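A quick numerical check of this consistency result (a sketch with hypothetical parameters, theta = 0.5 and unit innovation variance): under MA(1) disturbances the usual OLS $S^2$ should approach the true disturbance variance $(1+\theta^2)\sigma_e^2 = 1.25$ as the sample grows, regardless of the regressors.

```python
import numpy as np

rng = np.random.default_rng(1)

n = 5000
x = rng.normal(size=(n, 2))
X = np.column_stack([np.ones(n), x])      # regressor matrix with intercept

# MA(1) disturbances: u_t = e_t + theta * e_{t-1},
# so Var(u_t) = (1 + theta^2) * sigma_e^2.
theta, sigma_e = 0.5, 1.0
e = rng.normal(0.0, sigma_e, n + 1)
u = e[1:] + theta * e[:-1]

y = X @ np.array([1.0, 2.0, -1.0]) + u    # arbitrary true coefficients
beta_hat, *_ = np.linalg.lstsq(X, y, rcond=None)
resid = y - X @ beta_hat
s2 = resid @ resid / (n - X.shape[1])     # the usual OLS S^2

true_var = (1 + theta**2) * sigma_e**2    # = 1.25
```

Even though the OLS residuals are serially correlated, $S^2$ lands near 1.25 here, consistent with the weak-consistency claim.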

Change-Points with Jump in Nonparametric Regression Functions

  • Kim, Jong-Tae
    • Proceedings of the Korean Data and Information Science Society / 2005 Spring Conference / pp. 193-199 / 2005
  • A simple method is proposed to detect the number of change points with jump discontinuities in nonparametric regression functions. The proposed estimators are based on a local linear regression fit, comparing left and right one-sided kernel smoothers. The methodology also yields a test statistic for detecting change points and the direction of the jump discontinuities.

On study for change point regression problems using a difference-based regression model

  • Park, Jong Suk; Park, Chun Gun; Lee, Kyeong Eun
    • Communications for Statistical Applications and Methods / Vol. 26, No. 6 / pp. 539-556 / 2019
  • This paper derives a method for solving change point regression problems, building on the properties of a difference-based intercept estimator first introduced by Park and Kim (Communications in Statistics - Theory and Methods, 2019) for outlier detection in multiple linear regression models. We describe the statistical properties of the difference-based regression model in a piecewise simple linear regression model and then propose an efficient algorithm for change point detection. We illustrate the merits of the proposed method in comparison with several existing methods through simulation studies and real data analysis. The methodology is valuable regardless of the form of the regression lines and the number of change points.

Detection of Change-Points by Local Linear Regression Fit

  • Kim, Jong Tae; Choi, Hyemi; Huh, Jib
    • Communications for Statistical Applications and Methods / Vol. 10, No. 1 / pp. 31-38 / 2003
  • A simple method is proposed to detect the number of change points, and to test the location and size of multiple change points with jump discontinuities, in an otherwise smooth regression model. The proposed estimators are based on a local linear regression fit, comparing left and right one-sided kernel smoothers. The proposed methodology is explained and applied to real and simulated data.
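The left/right comparison idea can be sketched in a few lines. For simplicity, the one-sided local linear fits are replaced here by one-sided local means (an assumption made purely for illustration); a large gap between the two averages flags a candidate jump, and its sign gives the jump direction:

```python
import numpy as np

def jump_statistic(y, h):
    """Right-minus-left moving average at each interior point.

    One-sided local means stand in for the one-sided local linear
    smoothers; |d[i]| is large near a jump, and sign(d[i]) gives
    the jump direction.
    """
    n = len(y)
    d = np.zeros(n)
    for i in range(h, n - h):
        left = y[i - h:i].mean()    # window just before i
        right = y[i:i + h].mean()   # window starting at i
        d[i] = right - left
    return d

# Illustrative signal: a jump of size 5 at index 50.
y = np.concatenate([np.zeros(50), 5.0 + np.zeros(50)])
d = jump_statistic(y, h=5)
i_hat = int(np.argmax(np.abs(d)))   # estimated jump location
```

With noisy data, the statistic d would be compared against a threshold derived from its null distribution, which is the role of the test statistics in these papers; one-sided local linear fits also remove the bias that one-sided means incur on sloped segments.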

Tree-Structured Nonlinear Regression

  • Chang, Young-Jae; Kim, Hyeon-Soo
    • The Korean Journal of Applied Statistics / Vol. 24, No. 5 / pp. 759-768 / 2011
  • Tree algorithms have been widely developed for regression problems. One good feature of a regression tree is its flexibility of fitting, because it can capture the nonlinearity of data well. In particular, data with sudden structural breaks, such as oil prices and exchange rates, can be fitted well with a simple mixture of a few piecewise linear regression models. Since split points are determined by chi-squared statistics related to the residuals from fitting piecewise linear models, and the split variable is chosen by an objective criterion, we can obtain quite a reasonable fit that agrees with the visual interpretation of the data. Piecewise linear regression by a regression tree can serve as a good fitting method and can be applied to datasets with much fluctuation.

Correlation and Simple Linear Regression

  • 박선일; 오태호
    • Journal of Veterinary Clinics / Vol. 27, No. 4 / pp. 427-434 / 2010
  • Correlation is a technique used to measure the strength, or degree of closeness, of the linear association between two quantitative variables; common misuses of this technique are highlighted. Linear regression is a technique used to express a relationship between two continuous variables as a mathematical equation, which can be used for comparison or estimation purposes. Specifically, regression analysis can answer questions such as how much one variable changes for a given change in the other, and how accurately the value of one variable can be predicted from knowledge of the other. Regression does not by itself indicate how good the association is, while correlation provides a measure of how well a least-squares regression line fits the given set of data: the better the correlation, the closer the data points are to the regression line. In this tutorial article, the process of obtaining a linear regression relationship for a given set of bivariate data is described. The least squares method, which obtains the line minimizing the total squared error between the data points and the regression line, is employed and illustrated. The coefficient of determination, the ratio of the explained variation of the dependent variable to the total variation, is described. Finally, the process of calculating confidence and prediction intervals is reviewed and demonstrated.
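The quantities discussed in this tutorial (least-squares line, correlation, coefficient of determination, and a prediction interval) can be computed directly. The data below, the evaluation point x0 = 5.5, and the hard-coded critical value t(0.975, 8) ≈ 2.306 are illustrative assumptions, not values from the article:

```python
import numpy as np

x = np.array([1., 2., 3., 4., 5., 6., 7., 8., 9., 10.])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8, 12.2, 13.9, 16.1, 18.0, 20.2])
n = len(x)

# Least-squares line y = a + b x.
b = np.sum((x - x.mean()) * (y - y.mean())) / np.sum((x - x.mean()) ** 2)
a = y.mean() - b * x.mean()
y_hat = a + b * x

# Correlation and coefficient of determination; for simple linear
# regression, r^2 equals 1 - SSE/SST.
r = np.corrcoef(x, y)[0, 1]
r2 = 1.0 - np.sum((y - y_hat) ** 2) / np.sum((y - y.mean()) ** 2)

# 95% prediction interval for a new observation at x0.
s = np.sqrt(np.sum((y - y_hat) ** 2) / (n - 2))   # residual std. error
t_crit = 2.306                                     # t(0.975, df = n - 2 = 8)
x0 = 5.5
half = t_crit * s * np.sqrt(1 + 1/n + (x0 - x.mean())**2
                            / np.sum((x - x.mean())**2))
pred = (a + b * x0 - half, a + b * x0 + half)
```

A confidence interval for the mean response at x0 uses the same formula without the leading 1 under the square root, which is why it is always narrower than the prediction interval.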