• Title/Summary/Keyword: process variance

Search Results: 920

A Control Scheme for a Gradual Drift in the Process Variance

  • Kang, Hunku
    • Journal of Korean Society of Industrial and Systems Engineering
    • /
    • v.23 no.56
    • /
    • pp.83-92
    • /
    • 2000
  • This paper presents a study of control schemes for gradual increases (drifts) in a process variance. A new control chart, the Drifting Variance Control Chart (DVCC), is designed using the likelihood ratio test (LRT), and the ARL performance of the chart is evaluated for different subgroup sizes. The performance of this chart is then compared with popular control schemes for process dispersion, such as the Shewhart $S^2$ chart, the CUSUM chart, and the EWMA chart. Results are presented and discussed. Also included is a sensitivity analysis that investigates how the DVCC performs when applied to a step change in process variance.
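
The ARL comparison described above can be sketched with a small Monte Carlo simulation. The DVCC itself is not reproduced here; a one-sided Shewhart $S^2$ chart under an assumed linear variance drift stands in, and all parameter values (subgroup size, drift rate, false-alarm rate) are illustrative assumptions:

```python
import numpy as np
from scipy.stats import chi2

def run_length_s2(n=5, sigma0=1.0, drift=0.0, alpha=0.0027,
                  max_steps=10_000, rng=None):
    """Run length of a one-sided Shewhart S^2 chart when the process
    variance drifts linearly: sigma_t^2 = sigma0^2 * (1 + drift * t)."""
    rng = np.random.default_rng(rng)
    # upper control limit from the chi-square law of (n-1)S^2/sigma^2
    ucl = sigma0 ** 2 * chi2.ppf(1.0 - alpha, n - 1) / (n - 1)
    for t in range(1, max_steps + 1):
        var_t = sigma0 ** 2 * (1.0 + drift * t)
        sample = rng.normal(0.0, np.sqrt(var_t), size=n)
        if sample.var(ddof=1) > ucl:
            return t
    return max_steps

rng = np.random.default_rng(0)
arl_in_control = np.mean([run_length_s2(drift=0.0, rng=rng) for _ in range(200)])
arl_drift = np.mean([run_length_s2(drift=0.05, rng=rng) for _ in range(200)])
```

Under a drift the chart signals far sooner than in control, which is the kind of ARL contrast the paper tabulates for the DVCC against its competitors.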


Design and efficiency of the variance component model control chart (분산성분모형 관리도의 설계와 효율)

  • Cho, Chan Yang;Park, Changsoon
    • Journal of the Korean Data and Information Science Society
    • /
    • v.28 no.5
    • /
    • pp.981-999
    • /
    • 2017
  • In the standard control chart based on a simple random model, the process variance is estimated without considering the between-sample variance. If between-sample variance exists in the process, the process variance is under-estimated, and the resulting narrower control limits produce an excessive false alarm rate even though the sensitivity of the control chart improves. In this paper, using a variance component model to incorporate the between-sample variance, we set the control limits using both the within- and between-sample variances, and evaluate the efficiency of the control chart in terms of the average run length (ARL). For the most widely used control chart types, the ${\bar{X}}$, EWMA, and CUSUM control charts, we compared two cases: Case I, where the between-sample variance is ignored, and Case II, where it is considered. We also considered the cases where the process parameters are given and where they are estimated. The results showed that the false alarm rate of Case I increases sharply as the between-sample variance increases, while that of Case II remains the same regardless of the size of the between-sample variance, as expected.
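
The Case I vs. Case II contrast can be illustrated under stated assumptions: an ${\bar{X}}$ chart with 3-sigma limits, a simple variance component model $X_{ij} = \mu + b_i + e_{ij}$, known parameters, and illustrative values for the within- and between-sample variances:

```python
import numpy as np

def false_alarm_rates(sigma_w=1.0, sigma_b=0.5, n=4, n_samples=20_000,
                      z=3.0, seed=0):
    """Compare X-bar chart false-alarm rates when the between-sample
    variance is ignored (Case I) vs. included (Case II)."""
    rng = np.random.default_rng(seed)
    # in-control data under the variance component model:
    # X_ij = mu + b_i + e_ij, b_i ~ N(0, sigma_b^2), e_ij ~ N(0, sigma_w^2)
    b = rng.normal(0.0, sigma_b, size=n_samples)
    e = rng.normal(0.0, sigma_w, size=(n_samples, n))
    xbar = b + e.mean(axis=1)
    # Case I limits use only the within-sample variance
    lim1 = z * np.sqrt(sigma_w ** 2 / n)
    # Case II limits use Var(X-bar) = sigma_b^2 + sigma_w^2 / n
    lim2 = z * np.sqrt(sigma_b ** 2 + sigma_w ** 2 / n)
    return (np.abs(xbar) > lim1).mean(), (np.abs(xbar) > lim2).mean()

rate_case1, rate_case2 = false_alarm_rates()
```

With a nonzero between-sample variance, the Case I limits are too narrow and the in-control false alarm rate inflates well above the nominal 0.0027 that the Case II limits maintain.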

Treatability Evaluation of $A_{2}O$ System by Principal Component Analysis (주성분분석에 의한 $A_{2}O$공법의 처리성 평가)

  • 김복현;이재형;이수환;윤조희
    • Journal of Environmental Health Sciences
    • /
    • v.18 no.2
    • /
    • pp.67-74
    • /
    • 1992
  • A lab-scale biological A$_{2}$O system was applied to treat piggery wastewater, which is heavily polluted with organic material and contains relatively high nitrogen and phosphorus compared with other wastewaters. The objective of this study was to investigate the effect of various parameters on the treatability of this system under different operating conditions. The experimental data were analysed using principal component analysis (PCA). The results are summarized as follows: 1. From the varimax-rotated factor loadings for the raw wastewater, factor 1 explained 36.8% of the variance, the cumulative percentage of variance from factor 1 to factor 4 was 81.5%, and these factors were related to BOD, TKN, and BOD loading. 2. In the anaerobic process, factor 1 explained 33.5% of the variance, the cumulative percentage of variance from factor 1 to factor 4 was 81.8%, and these factors were related to PO$_{4}$-P, BOD, DO, and temperature. 3. In the anoxic process, factor 1 explained 30.1% of the variance, the cumulative percentage of variance from factor 1 to factor 4 was 84.3%, and these factors were related to pH, DO, TKN, and temperature. 4. In the aerobic process, factor 1 explained 43.8% of the variance, the cumulative percentage of variance from factor 1 to factor 4 was 81.5%, and these factors were highly related to DO, PO$_{4}$-P, and BOD. 5. The system should be operated below an F/M ratio of 0.30 kg/kg$\cdot$day to keep treatment efficiencies above 90% for BOD and SS, 80% for TKN, and 60% for PO$_{4}$-P. 6. Treatment efficiencies were over 93% for BOD and SS, 81% for TKN, and 60% for PO$_{4}$-P at temperatures above 20$^{\circ}$C.
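
The per-factor and cumulative percentages of variance reported above follow the standard PCA recipe. A minimal sketch on synthetic stand-in data (the varimax rotation step is omitted, and all data values are fabricated purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(42)
# synthetic stand-in for six water-quality variables (e.g. BOD, SS,
# TKN, PO4-P, DO, pH) driven by two latent factors -- fabricated data
latent = rng.normal(size=(120, 2))
loadings = rng.normal(size=(2, 6))
data = latent @ loadings + 0.3 * rng.normal(size=(120, 6))

# PCA on the correlation matrix (i.e. on standardized variables)
corr = np.corrcoef(data, rowvar=False)
eigvals = np.sort(np.linalg.eigvalsh(corr))[::-1]
explained = eigvals / eigvals.sum()     # per-factor share of variance
cumulative = np.cumsum(explained)       # cumulative share of variance
```

`explained[0]` plays the role of the "variance of factor 1" figures in the abstract, and `cumulative[3]` is the cumulative percentage from factor 1 to factor 4.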


A Study of Option Pricing Using Variance Gamma Process (Variance Gamma 과정을 이용한 옵션 가격의 결정 연구)

  • Lee, Hyun-Eui;Song, Seong-Joo
    • The Korean Journal of Applied Statistics
    • /
    • v.25 no.1
    • /
    • pp.55-66
    • /
    • 2012
  • Option pricing models using Lévy processes have been suggested as alternatives to the Black-Scholes model, since empirical studies showed that the Black-Scholes model could not reflect the movement of underlying assets. In this paper, we investigate whether the Variance Gamma model can reflect the movement of underlying assets in the Korean stock market better than the Black-Scholes model. For this purpose, we estimate parameters and perform likelihood ratio tests using KOSPI 200 data, based on the density of the log return and the option pricing formula proposed in Madan et al. (1998). We also calculate some statistics to compare the models and examine through regression analysis whether the volatility smile is corrected. The results show that the option price estimated under the Variance Gamma process is closer to the market price than the Black-Scholes price; however, the Variance Gamma model still cannot resolve the volatility smile phenomenon.
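
The Variance Gamma process of Madan et al. (1998) is Brownian motion with drift evaluated at a gamma time change, and it can be simulated directly from that construction. This sketch uses illustrative parameter values, not the paper's estimated KOSPI 200 parameters:

```python
import numpy as np

def simulate_vg(theta=-0.1, sigma=0.2, nu=0.2, T=1.0, n_steps=252,
                n_paths=5000, seed=1):
    """Simulate Variance Gamma log-return paths by subordinating a
    drifted Brownian motion to a gamma time change: over a step of
    length dt, the gamma increment g has mean dt and variance nu*dt,
    and the VG increment is theta*g + sigma*sqrt(g)*Z."""
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    g = rng.gamma(shape=dt / nu, scale=nu, size=(n_paths, n_steps))
    z = rng.standard_normal((n_paths, n_steps))
    increments = theta * g + sigma * np.sqrt(g) * z
    return increments.sum(axis=1)   # terminal value X_T for each path

x_T = simulate_vg()
```

Since the gamma clock has mean T, the terminal mean is approximately theta*T; the extra parameters theta and nu are what let the VG model capture skewness and excess kurtosis that Black-Scholes cannot.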

Heuristic Process Capability Indices Using Distribution-decomposition Methods (분포분할법을 이용한 휴리스틱 공정능력지수의 비교 분석)

  • Chang, Youngsoon
    • Journal of Korean Society for Quality Management
    • /
    • v.41 no.2
    • /
    • pp.233-248
    • /
    • 2013
  • Purpose: This study develops heuristic process capability indices (PCIs) using distribution-decomposition methods and evaluates their performance. The heuristic methods decompose the variation of a quality characteristic into upper and lower deviations and adjust the value of the PCIs using the decomposed deviations in accordance with the skewness. The weighted variance (WV), new WV (NWV), scaled WV (SWV), and weighted standard deviation (WSD) methods are considered. Methods: The performance of the heuristic PCIs is investigated under varied situations such as various skewed distributions, sample sizes, and specifications. Results: The WV PCI is best under normal populations, the WSD and SWV PCIs are best under slightly skewed populations, and the NWV PCI is best under moderately and highly skewed populations. Conclusion: Comprehensive analysis shows that the NWV method is most adequate for practical use.
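
A minimal sketch of the distribution-decomposition idea: split the sample at its mean and estimate a separate standard deviation from each side. This follows the general WV construction; the exact NWV/SWV/WSD adjustment factors from the literature are not reproduced, and the specification limits are assumptions:

```python
import numpy as np

def decomposed_pci(x, lsl, usl):
    """Heuristic PCI from a distribution decomposition: one-sided
    standard deviations are estimated from the observations above and
    below the mean (doubled sums of squares over the total n, so each
    side matches the ordinary sigma when the distribution is symmetric)."""
    x = np.asarray(x, dtype=float)
    mu, n = x.mean(), x.size
    s_upper = np.sqrt(2.0 * np.sum((x[x >= mu] - mu) ** 2) / n)
    s_lower = np.sqrt(2.0 * np.sum((x[x < mu] - mu) ** 2) / n)
    cpu = (usl - mu) / (3.0 * s_upper)
    cpl = (mu - lsl) / (3.0 * s_lower)
    return min(cpu, cpl)

rng = np.random.default_rng(7)
sym = rng.normal(10.0, 1.0, size=50_000)        # symmetric process
skw = rng.gamma(2.0, 1.0, size=50_000) + 8.0    # right-skewed process
pci_sym = decomposed_pci(sym, lsl=7.0, usl=13.0)
pci_skw = decomposed_pci(skw, lsl=7.0, usl=13.0)
```

For a symmetric process the two one-sided deviations agree and the index reduces to the usual $C_{pk}$-style value; skewness inflates the deviation on the long-tail side and lowers the index accordingly.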

Procedures for Monitoring the Process Mean and Variance with One Control Chart (하나의 관리도로 공정 평균과 분산의 변화를 탐지하는 절차)

  • Jung, Sang-Hyun;Lee, Jae-Heon
    • The Korean Journal of Applied Statistics
    • /
    • v.21 no.3
    • /
    • pp.509-521
    • /
    • 2008
  • Two control charts are usually required to monitor both the process mean and variance. In this paper, we introduce control procedures for jointly monitoring the process mean and variance with one control chart, and investigate the efficiency of the introduced charts by comparing them with a combined scheme of two EWMA charts. Our numerical results show that the GLR chart, the Omnibus EWMA chart, and the Interval chart have good ARL properties for simultaneous changes in the process mean and variance.
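
One way a single statistic can respond to both mean and variance changes is an omnibus-style EWMA of a function of the standardized observation. The statistic |z_t|^alpha below and all parameter values are assumptions in the spirit of such charts, not the paper's exact designs:

```python
import numpy as np

def omnibus_ewma(z, lam=0.1, alpha=2.0):
    """EWMA of |z_t|**alpha: a shift in either the mean or the
    variance of z increases E|z|**alpha, pushing the statistic up."""
    a = np.empty(len(z), dtype=float)
    prev = 1.0  # in control, E[z^2] = 1 when alpha = 2
    for t, zt in enumerate(z):
        prev = lam * abs(zt) ** alpha + (1.0 - lam) * prev
        a[t] = prev
    return a

rng = np.random.default_rng(3)
in_control = omnibus_ewma(rng.standard_normal(2000))
mean_shift = omnibus_ewma(rng.standard_normal(2000) + 1.0)
var_shift = omnibus_ewma(1.5 * rng.standard_normal(2000))
```

A single upper control limit on this statistic then flags mean shifts and variance increases alike, which is the appeal of one-chart joint monitoring.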

Evaluation of Non - Normal Process Capability by Johnson System (존슨 시스템에 의한 비정규 공정능력의 평가)

  • 김진수;김홍준
    • Journal of the Korea Safety Management & Science
    • /
    • v.3 no.3
    • /
    • pp.175-190
    • /
    • 2001
  • We propose a new process capability index, $C_{psk}$(WV), that applies the weighted variance control charting method to non-normally distributed processes. The main idea of the weighted variance method (WVM) is to divide a skewed or asymmetric distribution at its mean into two normal distributions that have the same mean but different standard deviations. In this paper we use an example, a distribution generated from the Johnson family of distributions, to demonstrate how the weighted variance-based process capability indices perform in comparison with two other non-normal methods, namely the Clements and Wright methods. This example shows that the weighted variance-based indices are more consistent than the other two methods in terms of sensitivity to departure of the process mean/median from the target value for non-normal processes. The second method compares the percentage nonconforming computed by the Pearson, Johnson, and Burr systems. This example shows little difference between the Pearson and Burr systems, but the Johnson system underestimates process capability relative to the other two.
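
The WVM split described above, one skewed distribution cut at its mean into two normals sharing that mean but with different standard deviations, can be sketched for estimating the percentage nonconforming. The sample distribution and specification limits are illustrative assumptions:

```python
import numpy as np
from scipy.stats import norm

def wv_nonconforming(x, lsl, usl):
    """Weighted-variance idea: model the upper tail with a normal whose
    sigma comes from observations above the mean, and the lower tail
    with a normal whose sigma comes from observations below it."""
    x = np.asarray(x, dtype=float)
    mu, n = x.mean(), x.size
    s_upper = np.sqrt(2.0 * np.sum((x[x >= mu] - mu) ** 2) / n)
    s_lower = np.sqrt(2.0 * np.sum((x[x < mu] - mu) ** 2) / n)
    p_above = norm.sf(usl, loc=mu, scale=s_upper)   # upper-tail normal
    p_below = norm.cdf(lsl, loc=mu, scale=s_lower)  # lower-tail normal
    return p_above + p_below

rng = np.random.default_rng(11)
sample = rng.lognormal(mean=0.0, sigma=0.4, size=100_000)
p_nc = wv_nonconforming(sample, lsl=0.3, usl=2.5)
```

Because the long right tail gets the larger sigma, the estimate responds to skewness in a way a single-sigma normal model cannot.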


Variance Swap Pricing with a Regime-Switching Market Environment

  • Roh, Kum-Hwan
    • Management Science and Financial Engineering
    • /
    • v.19 no.1
    • /
    • pp.49-52
    • /
    • 2013
  • In this paper we provide a valuation formula for a variance swap with regime switching. A variance swap is a forward contract on variance, the square of the realized volatility of the underlying asset. We assume that the volatility of the underlying asset is governed by a Markov regime-switching process with finitely many states. We find that the proposed model is easy to compute and can be superior to the models currently available.
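
The fair strike of a variance swap is the expected realized variance, which under continuous-time Markov regime-switching volatility reduces to integrating the chain's expected per-regime variance over the swap's life. A sketch under an assumed two-regime generator (illustrative values, not the paper's formula):

```python
import numpy as np
from scipy.linalg import expm

def variance_swap_strike(q, sigma2, p0, T=1.0, n_grid=200):
    """Fair strike (1/T) * integral_0^T E[sigma^2(s)] ds when the
    variance sigma^2(s) is driven by a continuous-time Markov chain
    with generator matrix q started from distribution p0."""
    times = np.linspace(0.0, T, n_grid)
    ev = np.array([p0 @ expm(q * s) @ sigma2 for s in times])
    # trapezoidal rule over the time grid
    return float(np.sum((ev[:-1] + ev[1:]) / 2.0 * np.diff(times)) / T)

q = np.array([[-0.5, 0.5],
              [1.0, -1.0]])           # generator: rows sum to zero
sigma2 = np.array([0.04, 0.36])       # 20% and 60% volatility regimes
p0 = np.array([1.0, 0.0])             # start in the calm regime
strike = variance_swap_strike(q, sigma2, p0, T=2.0)
```

Starting in the calm regime, the strike sits between the calm variance and the chain's long-run average variance, reflecting the expected drift toward the stationary regime mix.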

GARCH-X(1, 1) model allowing a non-linear function of the variance to follow an AR(1) process

  • Didit B Nugroho;Bernadus AA Wicaksono;Lennox Larwuy
    • Communications for Statistical Applications and Methods
    • /
    • v.30 no.2
    • /
    • pp.163-178
    • /
    • 2023
  • The GARCH-X(1, 1) model specifies that the conditional variance follows an AR(1) process and includes a past exogenous variable. This study proposes a new class of models by allowing a more general (non-linear) function of the variance to follow an AR(1) process. The functions applied to the variance equation include the exponential, Tukey's ladder, and Yeo-Johnson transformations. In the framework of normal and Student-t distributions for the return errors, the empirical analysis focuses on two stock index datasets from developed countries (FTSE100 and SP500) over the daily period from January 2000 to December 2020. This study uses 10-minute realized volatility as the exogenous component. The parameters of the considered models are estimated using the adaptive random walk Metropolis method in a Markov chain Monte Carlo algorithm, implemented in Matlab. The 95% highest posterior density intervals show that the three transformations are significant for the GARCH-X(1, 1) model. In general, based on the Akaike information criterion, the GARCH-X(1, 1) model with Student-t return errors and variance transformed by Tukey's ladder function provides the best fit to the data. In forecasting value-at-risk at the 95% confidence level, Christoffersen's independence test suggests that the non-linear models are the most suitable for modeling the return data, especially the model with the Tukey's ladder transformation.
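
One plausible reading of the model, where the transformed conditional variance follows the GARCH-X recursion, can be sketched as below; the Tukey's-ladder exponent, all parameter values, and the realized-volatility proxy are illustrative assumptions, not the paper's specification or estimates:

```python
import numpy as np

def simulate_garch_x(omega=0.05, alpha=0.1, beta=0.8, gamma=0.05,
                     lam=2.0, n=1000, seed=5):
    """Sketch of a GARCH-X(1,1) recursion in which a Tukey's-ladder
    transform g(h) = h**lam of the conditional variance h follows the
    AR(1)-type variance equation with an exogenous term x_{t-1}
    (here a crude stand-in for 10-minute realized volatility)."""
    rng = np.random.default_rng(seed)
    r = np.zeros(n)                        # returns
    h = np.zeros(n)                        # conditional variances
    g_prev, r2_prev, x_prev = 1.0, 0.1, 0.1
    for t in range(n):
        g_t = omega + alpha * r2_prev + beta * g_prev + gamma * x_prev
        h[t] = g_t ** (1.0 / lam)          # invert the ladder transform
        r[t] = np.sqrt(h[t]) * rng.standard_normal()
        g_prev, r2_prev = g_t, r[t] ** 2
        x_prev = abs(r[t])                 # hypothetical RV proxy
    return r, h

returns, variances = simulate_garch_x()
```

Setting lam=1 recovers the ordinary GARCH-X(1,1) variance equation, which is why the transformed versions form a class containing the standard model.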