• Title/Summary/Keyword: autocorrelated process


A PC-Based System for Gear Pitch Analysis and Monitoring in Gear Manufacturing Process (기어피치분석 및 공정관측을 위한 PC기반시스템 구축)

  • 김성준;지용수
    • Journal of Korean Society for Quality Management
    • /
    • v.30 no.3
    • /
    • pp.111-119
    • /
    • 2002
  • Gears are essential elements for mechanical power transmission. Geometric precision is the main factor characterizing gear grade and quality. Gear pitch, defined as the distance between two adjacent gear teeth, is one of the crucial measurements. It is well known that variability in gear pitches may cause wear-out and vibration noise. Therefore, keeping pitch errors at a low level plays a key role in assuring gear quality to customers. This paper presents a case study of a computerized system for inspecting pitch errors in a gear machining process. The system consists of a PC and Windows-based programs. Although start and stop are performed manually, measuring and analyzing the pitch data is conducted automatically. Our purpose is to reduce inspection cost and time as well as to increase test reliability. The system's operation is briefly illustrated by example. A strong autocorrelation is sometimes observed in pitch data, so we also discuss a process monitoring scheme that takes autocorrelation into account.
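The autocorrelation the authors observe in pitch data can be checked with a simple diagnostic before choosing a monitoring scheme. A minimal sketch on simulated data (the AR(1) coefficient `phi` is an assumption for illustration, not a value from the paper):

```python
import numpy as np

def lag1_autocorr(x):
    """Sample lag-1 autocorrelation coefficient."""
    x = np.asarray(x, dtype=float)
    xc = x - x.mean()
    return float(xc[:-1] @ xc[1:] / (xc @ xc))

# Simulated pitch-error series following an AR(1) model
rng = np.random.default_rng(0)
phi, n = 0.8, 2000          # assumed coefficient and series length
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

r1 = lag1_autocorr(x)        # lands near phi for a long series
```

A value of `r1` far from zero is the signal that a control chart built on the i.i.d. assumption would be misleading for such data.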

An Effective Control Chart for Monitoring Mean Shift in AR(1) Processes (AR(1) 공정에서의 효과적인 공정평균 관리도)

  • 원경수;강창욱;이배진
    • Journal of Korean Society of Industrial and Systems Engineering
    • /
    • v.24 no.67
    • /
    • pp.27-36
    • /
    • 2001
  • A standard assumption when using a control chart to monitor a process is that the observations from the process output are statistically independent. For many processes, however, the observations are autocorrelated, and this autocorrelation can have a significant effect on the performance of the control chart. In this paper, we consider a combined control chart for monitoring the mean of a process whose observations can be modeled as a first-order autoregressive (AR(1)) process. A combination of a Shewhart chart of the residuals and an EWMA chart of the observations is considered, and the method of combining them is recommended. The performance of the proposed control chart is compared with that of other control charts using simulation.
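The two ingredients of the combined chart can be sketched as follows: one-step-ahead residuals under an assumed AR(1) coefficient (charted Shewhart-style) and an EWMA of the raw observations. The values of `phi` and `lam` here are placeholders, not the paper's recommended design:

```python
import numpy as np

def ar1_residuals(x, phi):
    """One-step-ahead residuals under an assumed AR(1) model."""
    x = np.asarray(x, dtype=float)
    return x[1:] - phi * x[:-1]

def ewma(x, lam=0.2):
    """EWMA statistic of the raw observations."""
    z = np.empty(len(x))
    s = x[0]
    for i, v in enumerate(x):
        s = lam * v + (1 - lam) * s
        z[i] = s
    return z

# Simulated in-control AR(1) observations (phi assumed known)
rng = np.random.default_rng(1)
phi, n = 0.7, 3000
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

res = ar1_residuals(x, phi)
res_c = res - res.mean()
r1_res = float(res_c[:-1] @ res_c[1:] / (res_c @ res_c))  # near 0
z = ewma(x)   # charted against its own EWMA control limits
```

If the AR(1) model is adequate, the residuals are approximately uncorrelated, which is what justifies putting Shewhart limits on them.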


A Bayesian Test for First Order Autocorrelation in Regression Errors : An Application to SPC Approach (회귀모형 오차항의 1차 자기상관에 대한 베이즈 검정법 : SPC 분야에의 응용)

  • Kim, Hea-Jung;Han, Sung-Sil
    • Journal of Korean Society for Quality Management
    • /
    • v.24 no.4
    • /
    • pp.190-206
    • /
    • 1996
  • When measurements are made on units of production in time order, it is reasonable to expect that the measurement errors will sometimes be first-order autocorrelated, and a technique for testing such autocorrelation is required for good control of the production process. Tool-wear processes provide an example for which regression models can be useful in modeling and controlling the process. For the control of such processes, we present a simple method for testing first-order autocorrelation in regression errors. The method is a Bayesian test via the Bayes factor, derived by observing that, in general, a Bayes factor can be written as the product of a quantity called the Savage-Dickey density ratio and a correction factor; both terms are easily estimated by Gibbs sampling. The performance of the method is examined by Monte Carlo simulation. It is noted that the test not only achieves satisfactory power but also eliminates the inconvenience of using the well-known Durbin-Watson test.
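The Durbin-Watson statistic the authors compare against is straightforward to compute: it is near 2 for uncorrelated errors and is pulled toward 0 by positive first-order autocorrelation. A sketch on simulated residuals (this illustrates the classical benchmark, not the paper's Bayes-factor test):

```python
import numpy as np

def durbin_watson(res):
    """Durbin-Watson statistic: ~2 for uncorrelated residuals,
    well below 2 under positive first-order autocorrelation."""
    res = np.asarray(res, dtype=float)
    return float(np.sum(np.diff(res) ** 2) / np.sum(res ** 2))

rng = np.random.default_rng(2)
iid = rng.normal(size=500)            # uncorrelated errors
ar = np.zeros(500)
for t in range(1, 500):               # positively autocorrelated errors
    ar[t] = 0.8 * ar[t - 1] + rng.normal()

dw_iid = durbin_watson(iid)   # near 2
dw_ar = durbin_watson(ar)     # far below 2
```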


A Linear Filtering Method for Statistical Process Control with Autocorrelated Data (자기상관 데이터의 통계적 공정관리를 위한 선형 필터 기법)

  • Jin Chang-Ho;Apley Daniel W.
    • Proceedings of the Korean Operations and Management Science Society Conference
    • /
    • 2006.05a
    • /
    • pp.92-100
    • /
    • 2006
  • In many common control charting situations, the statistic to be charted can be viewed as the output of a linear filter applied to the sequence of process measurement data. In recent work that has generalized this concept, the charted statistic is the output of a general linear filter in impulse response form, and the filter is designed by selecting its impulse response coefficients in order to optimize its average run length performance. In this work, we restrict attention to the class of all second-order linear filters applied to the residuals of a time series model of the process data. We present an algorithm for optimizing the design of the second-order filter that is more computationally efficient and robust than the algorithm for optimizing the general linear filter. We demonstrate that the optimal second-order filter performs almost as well as the optimal general linear filter in many situations. Both methods share a number of interesting characteristics and are tuned to detect any distinct features of the process mean shift, as it manifests itself in the residuals.
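A second-order recursive filter applied to the residuals can be sketched as below. With `a2 = 0` it collapses to the familiar EWMA recursion, which gives a convenient sanity check; the coefficients here are placeholders, not the optimized design the paper derives:

```python
import numpy as np

def second_order_filter(res, b0, a1, a2):
    """Charted statistic y[t] = b0*res[t] + a1*y[t-1] + a2*y[t-2]."""
    y = np.zeros(len(res))
    for t in range(len(res)):
        y[t] = b0 * res[t]
        if t >= 1:
            y[t] += a1 * y[t - 1]
        if t >= 2:
            y[t] += a2 * y[t - 2]
    return y

rng = np.random.default_rng(3)
res = rng.normal(size=200)   # stand-in for time-series-model residuals

# Special case: b0 = lam, a1 = 1 - lam, a2 = 0 reproduces the EWMA
lam = 0.2
y2 = second_order_filter(res, b0=lam, a1=1 - lam, a2=0.0)

z = np.zeros(len(res))       # EWMA computed directly, z[0] = lam*res[0]
z[0] = lam * res[0]
for t in range(1, len(res)):
    z[t] = lam * res[t] + (1 - lam) * z[t - 1]
```

The design problem the paper addresses is choosing `b0`, `a1`, `a2` to minimize out-of-control average run length for a given shift signature.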


A Study on UBM Method Detecting Mean Shift in Autocorrelated Process Control

  • Jun, Sang-Pyo
    • Journal of the Korea Society of Computer and Information
    • /
    • v.25 no.12
    • /
    • pp.187-194
    • /
    • 2020
  • In today's process-oriented industries, such as semiconductor and petrochemical processes, autocorrelation exists between observed data. Common approaches to managing such processes are to group the observations into batches so that the batch means approach independence, or to apply the EWMA (Exponentially Weighted Moving Average) statistic of the observations to an EWMA control chart. In this paper, we propose a method to determine the batch size of the UBM (Unweighted Batch Mean), which is commonly used for managing such observations, choosing the optimal batch size based on the ARL (Average Run Length). We also propose a method to estimate the standard deviation of the process, and an improved control chart for processes in which autocorrelation exists.
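The unweighted batch mean itself is just the average of consecutive non-overlapping blocks; larger batch sizes drive the batch means toward independence. A sketch with an assumed AR(1) process and an arbitrary batch size (the paper's contribution is choosing `b` via the ARL, which is not reproduced here):

```python
import numpy as np

def unweighted_batch_means(x, b):
    """Means of consecutive non-overlapping batches of size b."""
    x = np.asarray(x, dtype=float)
    m = len(x) // b
    return x[:m * b].reshape(m, b).mean(axis=1)

def lag1_autocorr(x):
    xc = np.asarray(x, dtype=float) - np.mean(x)
    return float(xc[:-1] @ xc[1:] / (xc @ xc))

rng = np.random.default_rng(4)
phi, n, b = 0.8, 4000, 20     # assumed coefficient and batch size
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

r_raw = lag1_autocorr(x)                             # strongly autocorrelated
r_ubm = lag1_autocorr(unweighted_batch_means(x, b))  # much weaker
```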

Model Parameter Based Fault Detection for Time-series Data (시계열을 따르는 공정데이터의 모델 모수기반 이상탐지)

  • Park, Si-Jeo;Park, Cheong-Sool;Kim, Sung-Shick;Baek, Jun-Geol
    • Journal of the Korea Society for Simulation
    • /
    • v.20 no.4
    • /
    • pp.67-79
    • /
    • 2011
  • Statistical process control (SPC) assumes that observations follow a particular statistical distribution and are independent of each other. However, time-series data do not always follow a particular distribution and in most cases are autocorrelated; therefore, applying general SPC to time-series processes has limitations. In this study, we propose an MPBC (Model Parameter Based Control-chart) method for fault detection in time-series processes. The MPBC models the process as a time series and determines faults by detecting changes in the model parameters. The process analyzed in this study is assumed to follow an ARMA(p,q) model. The MPBC estimates model parameters using RLS (Recursive Least Squares), and a $K^2$-control chart is used for detecting an out-of-control process. Simulation results support the idea that the proposed method performs better for time-series processes.
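The RLS step can be sketched for the autoregressive part of the model: the recursion tracks the coefficients online, and a sustained change in them is what the control chart would flag. A toy version that recovers an assumed AR(1) coefficient (the paper's $K^2$ chart itself is not reproduced):

```python
import numpy as np

def rls_ar(x, p, lam=0.999, delta=100.0):
    """Recursive least squares estimate of AR(p) coefficients
    with forgetting factor lam."""
    x = np.asarray(x, dtype=float)
    theta = np.zeros(p)               # coefficient estimates
    P = delta * np.eye(p)             # inverse-covariance-like matrix
    for t in range(p, len(x)):
        u = x[t - p:t][::-1]          # regressor [x[t-1], ..., x[t-p]]
        e = x[t] - u @ theta          # one-step prediction error
        k = P @ u / (lam + u @ P @ u) # gain vector
        theta = theta + k * e
        P = (P - np.outer(k, u @ P)) / lam
    return theta

rng = np.random.default_rng(5)
phi, n = 0.7, 3000                    # assumed true AR(1) coefficient
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()

theta = rls_ar(x, p=1)                # estimate lands near phi
```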

A Study on the TQM and 6 Sigma Management - Primarily on service industry - (TQM과 6시그마 경영에 관한 고찰 - 서비스산업을 중심으로)

  • 김동훈;장영준
    • Journal of Korean Society for Quality Management
    • /
    • v.30 no.3
    • /
    • pp.120-138
    • /
    • 2002
  • In order to carry out TQM and 6 Sigma management, the key element is to set a clear goal and mission. Above all, a dramatic change and a great outcome can be achieved only when the methods, incentives, organization, and plans are established under a clear objective. To secure competitiveness against external challenges, several crucial factors are essential: the CEO's will, a systematic process for measurement and management, monitoring to satisfy customers' needs, and aggressive TQM activity that encourages relentless improvement. A positive effort should also be made to cultivate the overall quality culture, for example by training internal experts through an outstanding training program under the CEO's firm leadership. This study examines which features and success factors can arise in a company's culture when the company accepts the theoretical basis and general concepts of TQM and 6 Sigma as management strategies and carries them out to improve quality and customer satisfaction.

An Alternative Method for Assessing Local Spatial Association Among Inter-paired Location Events: Vector Spatial Autocorrelation in Housing Transactions (쌍대위치 이벤트들의 국지적 공간적 연관성을 평가하기 위한 방법론적 연구: 주택거래의 벡터 공간적 자기상관)

  • Lee, Gun-Hak
    • Journal of the Economic Geographical Society of Korea
    • /
    • v.11 no.4
    • /
    • pp.564-579
    • /
    • 2008
  • It is often challenging to evaluate local spatial association among one-dimensional vectors, which generally represent paired-location events in which two points are physically or functionally connected. This is largely because of the complex process underlying such geographic phenomena and partly because of their representational complexity. This paper addresses an alternative way to identify spatially autocorrelated paired-location events (or vectors) at a local scale. In doing so, we propose a statistical algorithm combining univariate point pattern analysis, for evaluating local clustering of origin points, with a similarity measure over the corresponding vectors. To demonstrate practical use of the suggested method, we present an empirical application using transaction data from a local housing market, recorded from 2004 to 2006 in Franklin County, Ohio, in the United States. As a result, several locally characterized groups of similar transactions are identified among a set of vectors representing various local moves associated with the communities defined.
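The idea of combining origin-point clustering with vector similarity can be caricatured as: for each event, average the cosine similarity of its movement vector with the vectors of events whose origins lie nearby. A toy sketch (the radius, the data, and this exact statistic are all illustrative assumptions, not the paper's method):

```python
import numpy as np

def local_vector_similarity(origins, vectors, radius):
    """Mean cosine similarity between each event's vector and the
    vectors of other events whose origins lie within `radius`."""
    origins = np.asarray(origins, dtype=float)
    vectors = np.asarray(vectors, dtype=float)
    unit = vectors / np.linalg.norm(vectors, axis=1, keepdims=True)
    out = np.full(len(origins), np.nan)
    for i in range(len(origins)):
        d = np.linalg.norm(origins - origins[i], axis=1)
        nbr = (d > 0) & (d <= radius)
        if nbr.any():
            out[i] = float((unit[nbr] @ unit[i]).mean())
    return out

origins = [[0, 0], [1, 0], [10, 10]]
vectors = [[1, 0], [1, 0], [0, 1]]   # two parallel moves, one isolated
sim = local_vector_similarity(origins, vectors, radius=2.0)
# sim[0] and sim[1] are 1.0 (parallel neighbors); sim[2] is nan (no neighbors)
```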


Spatial Data Analysis for the U.S. Regional Income Convergence, 1969-1999: A Critical Appraisal of $\beta$-convergence (미국 소득분포의 지역적 수렴에 대한 공간자료 분석(1969∼1999년) - 베타-수렴에 대한 비판적 검토 -)

  • Sang-Il Lee
    • Journal of the Korean Geographical Society
    • /
    • v.39 no.2
    • /
    • pp.212-228
    • /
    • 2004
  • This paper is concerned with an important aspect of regional income convergence, ${\beta}$-convergence, which refers to the negative relationship between regions' initial income levels and their income growth rates over a period of time. The common research framework for ${\beta}$-convergence, based on OLS regression models, has two drawbacks. First, it ignores spatially autocorrelated residuals. Second, it provides no way of exploring spatial heterogeneity across regions in terms of ${\beta}$-convergence. Given that empirical studies on ${\beta}$-convergence need to be informed by spatial data analysis, this paper aims to: (1) provide a critical review of empirical studies on ${\beta}$-convergence from a spatial perspective; and (2) investigate spatio-temporal income dynamics across U.S. labor market areas over the last 30 years (1969-1999) by fitting spatial regression models and applying bivariate ESDA techniques. The major findings are as follows. First, the hypothesis of ${\beta}$-convergence was only partially supported, and the trend varied substantially across sub-periods. Second, a SAR model indicated that the ${\beta}$-coefficient for the entire period was not significant at the 99% confidence level, which may lead to the conclusion that there is no statistical evidence of regional income convergence in the U.S. over the last three decades. Third, the results from bivariate ESDA techniques and a GWR model showed a substantive level of spatial heterogeneity in the catch-up process and suggested possible spatial regimes. The sub-periods also showed substantial spatio-temporal heterogeneity in ${\beta}$-convergence: the catch-up scenario, in a spatial sense, was least pronounced during the 1980s.
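The non-spatial ${\beta}$-convergence regression being critiqued is a plain OLS of growth on initial (log) income, with a negative slope read as convergence. A synthetic sketch (the coefficient and data are invented; the paper's point is that the residuals of such a fit are often spatially autocorrelated, which this baseline ignores):

```python
import numpy as np

rng = np.random.default_rng(6)
n = 300
log_y0 = rng.normal(10.0, 0.5, n)                  # initial log income
beta = -0.02                                       # assumed convergence rate
growth = 0.25 + beta * log_y0 + rng.normal(0, 0.01, n)

# OLS fit: growth = a + b * log_y0
X = np.column_stack([np.ones(n), log_y0])
coef, *_ = np.linalg.lstsq(X, growth, rcond=None)
b_hat = float(coef[1])     # negative slope -> evidence of beta-convergence
```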

On-Line Determination Steady State in Simulation Output (시뮬레이션 출력의 안정상태 온라인 결정에 관한 연구)

  • 이영해;정창식;경규형
    • Proceedings of the Korea Society for Simulation Conference
    • /
    • 1996.05a
    • /
    • pp.1-3
    • /
    • 1996
  • In simulation-based system analysis, automating the experiments is an area of active research and development. In simulations of computer and communication systems, for example, running simulations over a large number of models requires automated experimental control. Unless the experimental procedure (number of replications, run length, data-collection method) is automated, the time and human resources required for simulation experiments grow considerably, and analyzing the output data also becomes difficult. To automate the experimental procedure while analyzing simulation output efficiently, the problem of removing the initial bias that arises in every simulation run must be solved first. Only when the data used for output analysis are collected in steady state, free of initial bias, can the real system be interpreted correctly. In practice, the most important and most difficult problem in simulation output analysis is estimating the steady-state mean of the stochastic process formed by the output data, together with a confidence interval (c.i.) for that mean. The information contained in a confidence interval tells the decision maker how accurately the mean can be estimated. However, because the output data from a single simulation run are generally nonstationary and autocorrelated, traditional statistical techniques cannot be applied directly; simulation output-analysis techniques are used to overcome this. This paper proposes two new techniques for finding the truncation point needed to remove initial bias: one using Euclidean distance (ED), and one using the backpropagation neural network (BNN) algorithm now widely used for pattern classification. Unlike most existing techniques, they require no pilot runs and can determine the truncation point online during a single simulation run. Existing work on truncation points is as follows. Conway's rule takes the current observation as the truncation point if it is neither the maximum nor the minimum of the subsequent data; by the structure of the algorithm, online determination is impossible. Whereas Conway's rule cannot be applied online, the Modified Conway Rule (MCR) can, because it takes the current observation as the truncation point if it is neither the maximum nor the minimum of the preceding data. The Crossings-of-the-Mean Rule (CMR) uses the cumulative mean and counts how many times the observations cross that mean from above or below; the number of crossings must be specified in advance, and in general a chosen crossing count is not applicable across systems. The Cumulative-Mean Rule (CMR2) plots the grand cumulative mean of the output data obtained from several pilot runs and determines the steady-state point visually; because it uses the cumulative mean over several runs, online determination is impossible, and the analyst must decide arbitrarily from the graph. Welch's Method (WM) uses a Brownian bridge statistic, exploiting the property that as n approaches infinity it converges to the Brownian bridge distribution. It forms batches from the simulation output data and uses one batch as a sample; the algorithm is complex, and parameter values must be estimated. The Law-Kelton Method (LKM) is based on regression theory: after the simulation ends, a regression line is fitted to the cumulative-mean data, and the point at which the null hypothesis of zero slope is accepted is taken as the truncation point. Because the data are used in the reverse of the order in which they were collected, after the run has ended, online use is impossible. Welch's Procedure (WP) requires five or more simulation runs, determines the truncation point visually from moving averages of the collected data, and relies on iterative deletion, so online determination is impossible; in addition, a window size for the moving average must be chosen. As this survey shows, existing methods are weak from the standpoint of online truncation-point determination during a single simulation run. Moreover, current commercial simulation software leaves the truncation point to the user's discretion, so it cannot be determined accurately and quantitatively for the system under study. When users set the truncation point arbitrarily, not only is the initial-bias problem hard to handle effectively, but the risk grows of deleting far more data than necessary, or too little to remove the initial bias. Furthermore, most existing methods require pilot runs just to find the truncation point; that is, simulation runs are needed solely to locate the steady-state point, and since these runs are not used in the output analysis, the loss of time is large.
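Of the rules surveyed, the Modified Conway Rule is the simplest to state in code: take as the truncation point the first observation that is strictly between the running minimum and maximum of everything seen before it. A sketch of that reading:

```python
def mcr_truncation_point(stream):
    """Modified Conway Rule: index of the first observation that is
    neither the maximum nor the minimum of the preceding data."""
    lo = hi = None
    for t, v in enumerate(stream):
        if lo is None:            # first observation initializes the range
            lo = hi = v
            continue
        if lo < v < hi:           # strictly inside the range seen so far
            return t
        lo, hi = min(lo, v), max(lo, v)
    return None                   # no truncation point found yet

print(mcr_truncation_point([5, 1, 6, 3]))   # -> 3
```

Because the rule only needs the running extremes, it can run online during a single simulation run, which is the property the paper builds on.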
