• Title/Abstract/Keyword: New Weighted Variance

Search results: 21 (processing time: 0.018 seconds)

WEIGHTED POSSIBILISTIC VARIANCE AND MOMENTS OF FUZZY NUMBERS

  • Pasha, E.; Asady, B.; Saeidifar, A.
    • Journal of applied mathematics & informatics / Vol. 26, No. 5-6 / pp. 1169-1183 / 2008
  • In this paper, a method to find the weighted possibilistic variance and moments about the mean value of fuzzy numbers is introduced, via a defuzzification that uses the minimizer of the weighted distance between two fuzzy numbers. In this way, we obtain the nearest weighted point with respect to a fuzzy number; this main result is a new and interesting alternative justification for defining the weighted mean of a fuzzy number. Considering this point and the weighted distance quantity, we introduce the weighted possibilistic mean (WPM) value and the weighted possibilistic variance (WPV) of fuzzy numbers. This paper shows that the WPM is the nearest weighted point to a fuzzy number and that the WPV preserves more of the properties of variance from probability theory, so that the possibilistic moments about the mean of fuzzy numbers can be introduced without difficulty. The moments of fuzzy numbers play an important role in estimating parameters, skewness, and kurtosis in many fuzzy time series models.
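
As a rough numerical illustration of the weighted possibilistic mean and variance described above, here is a minimal sketch assuming the Fullér-Majlender-style level-set definitions with weight w(γ) = 2γ and a triangular fuzzy number (the paper's exact construction via the nearest weighted point may differ):

```python
import numpy as np

def wpm_wpv(core, left, right, w=lambda g: 2 * g, n=10_001):
    """Weighted possibilistic mean/variance of a triangular fuzzy number.

    gamma-level sets: [core - (1 - gamma) * left, core + (1 - gamma) * right].
    """
    g = np.linspace(0.0, 1.0, n)
    a1 = core - (1 - g) * left     # lower endpoint of the gamma-cut
    a2 = core + (1 - g) * right    # upper endpoint of the gamma-cut
    wpm = np.trapz(w(g) * (a1 + a2) / 2, g)          # weighted possibilistic mean
    wpv = np.trapz(w(g) * ((a2 - a1) / 2) ** 2, g)   # weighted possibilistic variance
    return wpm, wpv

# Triangular fuzzy number centered at 5 with left spread 2 and right spread 4:
m, v = wpm_wpv(5.0, 2.0, 4.0)
print(m, v)  # closed forms under w(g) = 2g: 5 + (4 - 2)/6 ≈ 5.333 and (2 + 4)**2/24 = 1.5
```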

분포분할법을 이용한 휴리스틱 공정능력지수의 비교 분석 (Heuristic Process Capability Indices Using Distribution-decomposition Methods)

  • 장영순
    • 품질경영학회지 / Vol. 41, No. 2 / pp. 233-248 / 2013
  • Purpose: This study develops heuristic process capability indices (PCIs) using distribution-decomposition methods and evaluates their performance. The heuristic methods decompose the variation of a quality characteristic into upper and lower deviations and adjust the value of the PCI using the decomposed deviations in accordance with the skewness. The weighted variance (WV), new WV (NWV), scaled WV (SWV), and weighted standard deviation (WSD) methods are considered. Methods: The performance of the heuristic PCIs is investigated under varied situations such as various skewed distributions, sample sizes, and specifications. Results: The WV PCI is the best under normal populations; the WSD and SWV PCIs are the best under low-skewed populations; and the NWV PCI is the best under moderately and highly skewed populations. Conclusion: Comprehensive analysis shows that the NWV method is the most adequate for practical use.
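
To make the decomposition idea concrete, the sketch below computes a skewness-adjusted Cpk using the commonly cited WV and WSD splits, where the upper and lower deviations are scaled by the fraction of observations at or below the mean (the paper's NWV and SWV variants are not reproduced here):

```python
import numpy as np

def decomposed_cpk(x, lsl, usl, method="WV"):
    """Skewness-adjusted Cpk via upper/lower deviations (illustrative only)."""
    mu, sigma = x.mean(), x.std(ddof=1)
    p = np.mean(x <= mu)   # estimated P(X <= mean)
    if method == "WV":     # weighted variance split (Choobineh-Ballard style)
        s_u, s_l = sigma * np.sqrt(2 * p), sigma * np.sqrt(2 * (1 - p))
    elif method == "WSD":  # weighted standard deviation (Bai-Choi style)
        s_u, s_l = 2 * p * sigma, 2 * (1 - p) * sigma
    else:
        raise ValueError(method)
    return min((usl - mu) / (3 * s_u), (mu - lsl) / (3 * s_l))

rng = np.random.default_rng(0)
x = rng.gamma(2.0, 1.0, size=500)  # right-skewed process data
print(decomposed_cpk(x, 0.0, 8.0, "WV"), decomposed_cpk(x, 0.0, 8.0, "WSD"))
```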

존슨 시스템에 의한 비정규 공정능력의 평가 (Evaluation of Non - Normal Process Capability by Johnson System)

  • 김진수; 김홍준
    • 대한안전경영과학회지 / Vol. 3, No. 3 / pp. 175-190 / 2001
  • We propose a new process capability index $C_{psk}$(WV) applying the weighted variance control charting method to non-normally distributed processes. The main idea of the weighted variance method (WVM) is to divide a skewed or asymmetric distribution at its mean into two normal distributions that have the same mean but different standard deviations. In this paper we present an example, a distribution generated from the Johnson family of distributions, to demonstrate how the weighted variance-based process capability indices perform in comparison with two other non-normal methods, namely the Clements and Wright methods. This example shows that the weighted variance-based indices are more consistent than the other two methods in terms of sensitivity to departures of the process mean/median from the target value for non-normal processes. The second method evaluates the percentage nonconforming using the Pearson, Johnson, and Burr systems. This example shows little difference between the Pearson and Burr systems, but the Johnson system underestimates process capability relative to the other two.
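
The comparison setup can be sketched roughly as follows, with hypothetical shape parameters and specification limits; scipy's johnsonsu generates the skewed sample, and a simple percentile index stands in for the full Clements procedure:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
x = stats.johnsonsu(a=-2.0, b=1.5).rvs(size=2000, random_state=rng)  # skewed Johnson data
lsl, usl = -3.0, 15.0  # hypothetical specification limits

mu, sigma = x.mean(), x.std(ddof=1)
p = np.mean(x <= mu)

# Weighted variance (WV) index: split sigma by the mass on each side of the mean.
cpk_wv = min((usl - mu) / (3 * sigma * np.sqrt(2 * p)),
             (mu - lsl) / (3 * sigma * np.sqrt(2 * (1 - p))))

# Percentile-based index in the spirit of Clements' method.
p_lo, med, p_hi = np.percentile(x, [0.135, 50, 99.865])
cpk_pct = min((usl - med) / (p_hi - med), (med - lsl) / (med - p_lo))

print(cpk_wv, cpk_pct)
```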

비정규 공정능력 측도에 관한 연구 (A Study on a Measure for Non-Normal Process Capability)

  • 김홍준; 김진수; 조남호
    • 한국신뢰성학회:학술대회논문집 / 한국신뢰성학회 2001년도 정기학술대회 / pp. 311-319 / 2001
  • All indices that are now in use assume normally distributed data, and any use of these indices on non-normal data results in inaccurate capability measurements. Therefore, $C_{s}$ is proposed, which extends the most useful index to date, the Pearn-Kotz-Johnson $C_{pmk}$, by not only taking into account that the process mean may not lie midway between the specification limits and incorporating a penalty when the mean deviates from its target, but also incorporating a penalty for skewness. We then propose a new process capability index $C_{psk}$(WV) applying the weighted variance control charting method to non-normally distributed processes. The main idea of the weighted variance method (WVM) is to divide a skewed or asymmetric distribution at its mean into two normal distributions that have the same mean but different standard deviations. In this paper we present an example, a distribution generated from the Johnson family of distributions, to demonstrate how the weighted variance-based process capability indices perform in comparison with two other non-normal methods, namely the Clements and Wright methods. This example shows that the weighted variance-based indices are more consistent than the other two methods in terms of sensitivity to departures of the process mean/median from the target value for non-normal processes.
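
Since this entry builds on the Pearn-Kotz-Johnson index, a minimal reference implementation of that baseline may help; the proposed $C_{s}$ and $C_{psk}$(WV) add skewness penalties on top of this and are not reproduced here:

```python
import numpy as np

def c_pmk(x, lsl, usl, target):
    """Pearn-Kotz-Johnson C_pmk: Cpk with a penalty for drift from target."""
    mu = np.mean(x)
    tau = np.sqrt(np.var(x, ddof=1) + (mu - target) ** 2)  # sqrt(sigma^2 + (mu - T)^2)
    return min(usl - mu, mu - lsl) / (3 * tau)

rng = np.random.default_rng(2)
x = rng.normal(10.4, 1.0, size=300)      # process drifted above its target
print(c_pmk(x, 7.0, 13.0, target=10.0))  # penalized relative to plain Cpk
```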

클러스터 시스템의 부하분산 알고리즘의 효율성 비교분석 (An Analysis and Comparison on Efficiency of Load Distribution Algorithm in a Clustered System)

  • 김석찬; 이영
    • 한국정보과학회논문지:컴퓨팅의 실제 및 레터 / Vol. 12, No. 2 / pp. 111-118 / 2006
  • This study analyzes a new load distribution algorithm for clustered systems in comparison with an existing one. The PWLC algorithm senses the system load at each weight-calculation interval, assigns a weight to each server, and distributes the load according to those weights during the next interval. The PWLC and DWRR algorithms were compared in terms of variance and waiting time while varying the weight-calculation interval. If the interval is too short, the system may incur extra overhead in sensing the load; conversely, if the interval is too long, load assignment under the algorithm may become inefficient. The results show that the PWLC algorithm is more efficient than the DWRR algorithm.
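
A toy sketch of the periodic weight-based dispatching idea follows; the abstract does not give PWLC's weight formula, so the weights here are simply made inversely proportional to the measured load (a hypothetical choice for illustration):

```python
import random

def recompute_weights(loads):
    """Weight each server inversely to its measured load (hypothetical rule)."""
    inv = [1.0 / (load + 1e-9) for load in loads]
    total = sum(inv)
    return [w / total for w in inv]

def dispatch(weights):
    """Pick a server for the next request according to the current weights."""
    return random.choices(range(len(weights)), weights=weights, k=1)[0]

loads = [0.8, 0.3, 0.5]             # per-server load measured this interval
weights = recompute_weights(loads)  # refreshed once per weight-calculation interval
counts = [0, 0, 0]
for _ in range(10_000):
    counts[dispatch(weights)] += 1
print(weights, counts)  # lightly loaded servers receive proportionally more requests
```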

A Comparative Study for Several Bayesian Estimators Under Squared Error Loss Function

  • Kim, Yeong-Hwa
    • Journal of the Korean Data and Information Science Society / Vol. 16, No. 2 / pp. 371-382 / 2005
  • This paper compares the performance of some widely used Bayesian estimators, such as the Bayes estimator, the empirical Bayes estimator, the constrained Bayes estimator, and the constrained empirical Bayes estimator, by means of a new measurement under a squared error loss function for the typical normal-normal situation. The proposed measurement is a weighted sum of the precisions of the first and second moments. As a result, one obtains a criterion that depends on the size of the prior variance relative to the population variance.
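
For the normal-normal situation, a compact sketch of the Bayes and Louis-Ghosh-style constrained Bayes estimates is given below; the paper's weighted-precision measurement itself is not reproduced:

```python
import numpy as np

def bayes_and_constrained(x, prior_mean, prior_var, sampling_var):
    """Posterior means, then a spread correction matching posterior expectations."""
    shrink = prior_var / (prior_var + sampling_var)
    post_mean = shrink * x + (1 - shrink) * prior_mean  # Bayes estimates
    post_var = shrink * sampling_var                    # common posterior variance
    m = post_mean.mean()
    spread = np.sum((post_mean - m) ** 2)
    a = np.sqrt(1 + len(x) * post_var / spread)  # Louis-Ghosh expansion factor
    return post_mean, m + a * (post_mean - m)    # (Bayes, constrained Bayes)

rng = np.random.default_rng(3)
theta = rng.normal(0.0, 2.0, size=8)  # true means drawn from the N(0, 4) prior
x = rng.normal(theta, 1.0)            # one observation per mean
b, cb = bayes_and_constrained(x, 0.0, 4.0, 1.0)
print(b.round(2), cb.round(2))  # CB spreads the shrunken Bayes estimates back out
```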

최적 가변 극점 배치 자기동조 제어에 관한 연구 (A study on optimal variable pole assignment self-tuning control)

  • 전종암; 조병선; 박민용; 이상배
    • 제어로봇시스템학회:학술대회논문집 / 제어로봇시스템학회 1988년도 한국자동제어학술회의논문집(국내학술편); 한국전력공사연수원, 서울; 21-22 Oct. 1988 / pp. 246-249 / 1988
  • In this paper, a new design technique that uses a weighted least-squares approach for the solution of the pole assignment problem is presented. This technique may be used to assign some closed-loop poles to locations that reduce the large system input and output variance caused by near pole-zero cancellation. The least-squares approach is also applied to the design of a servo self-tuning controller with an integrator.
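
The weighted least-squares machinery underlying the design can be sketched generically; this is the standard WLS solve, not the paper's specific pole-assignment formulation:

```python
import numpy as np

def weighted_lstsq(A, b, w):
    """Solve min_x ||W^(1/2) (A x - b)||^2 by rescaling rows, then ordinary LS."""
    sw = np.sqrt(w)
    x, *_ = np.linalg.lstsq(A * sw[:, None], b * sw, rcond=None)
    return x

A = np.array([[1.0, 0.5], [1.0, 1.5], [1.0, 3.0]])
b = np.array([1.0, 2.0, 2.5])
w = np.array([1.0, 1.0, 10.0])  # weight the requirement we care most about
print(weighted_lstsq(A, b, w))
```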

Large-Scale Phase Retrieval via Stochastic Reweighted Amplitude Flow

  • Xiao, Zhuolei; Zhang, Yerong; Yang, Jie
    • KSII Transactions on Internet and Information Systems (TIIS) / Vol. 14, No. 11 / pp. 4355-4371 / 2020
  • Phase retrieval, recovering a signal from phaseless measurements, is generally considered an NP-hard problem. This paper adopts an amplitude-based nonconvex optimization cost function to develop a new stochastic gradient algorithm, named stochastic reweighted phase retrieval (SRPR). SRPR is a stochastic gradient iteration algorithm that runs in two stages: first, a truncated-sample stochastic variance reduction algorithm is used to compute an initial estimate; second, a gradient refinement stage continuously updates the amplitude-based stochastic weighted gradient to improve the initial estimate. Because of the stochastic method, each iteration in both stages of SRPR involves only one equation. Therefore, SRPR is simple, scalable, and fast. Compared with state-of-the-art phase retrieval algorithms, simulation results show that SRPR has a faster convergence speed and requires fewer magnitude-only measurements to reconstruct the signal, in both the real- and complex-valued cases.
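
A toy real-valued amplitude-flow iteration is sketched below, with a plain spectral initialization and one measurement per step; SRPR's truncated variance-reduced initialization and reweighting are not reproduced:

```python
import numpy as np

rng = np.random.default_rng(4)
n, m = 64, 512
x_true = rng.normal(size=n)
A = rng.normal(size=(m, n))
b = np.abs(A @ x_true)  # phaseless (magnitude-only) measurements

# Spectral initialization: top eigenvector of (1/m) * sum_i b_i^2 a_i a_i^T.
Y = (A * (b ** 2)[:, None]).T @ A / m
_, V = np.linalg.eigh(Y)
x = V[:, -1] * np.sqrt(np.mean(b ** 2))  # scale to the measured energy

lr = 0.01
for _ in range(30_000):
    i = rng.integers(m)  # one random measurement per iteration
    z = A[i] @ x
    # gradient of (|a_i' x| - b_i)^2 with respect to x (real-valued case)
    x -= lr * (z - b[i] * np.sign(z)) * A[i]

rel = min(np.linalg.norm(x - x_true), np.linalg.norm(x + x_true)) / np.linalg.norm(x_true)
print(rel)  # small on success; x and -x are equally valid (global sign ambiguity)
```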

가중 합치도 Hω와 κ의 새로운 역설 (Weighted Hω and New Paradox of κ)

  • 권나영; 김진곤; 박용규
    • 응용통계연구 / Vol. 22, No. 5 / pp. 1073-1084 / 2009
  • For an $R \times R$ contingency table in which two raters classify each subject into one of R ordinal response categories, we propose a weighted agreement measure $H_{\omega}$ that weights the degree of disagreement, and derive its maximum likelihood estimator and variance. For the $2 \times 2$ table, we newly define and prove the last paradox raised by Feinstein and Cicchetti (1990), raise a new paradox of $\kappa$, and summarize the overall relationship between $\kappa$ and the marginal distributions.
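
For orientation, the classical disagreement-weighted κ over an R × R table can be computed as follows; this is the standard weighted kappa, not the paper's $H_{\omega}$:

```python
import numpy as np

def weighted_kappa(table, weights):
    """Cohen's weighted kappa for an R x R rater-agreement table."""
    p = table / table.sum()
    expected = np.outer(p.sum(axis=1), p.sum(axis=0))  # independence baseline
    d_obs = np.sum(weights * p)         # observed weighted disagreement
    d_exp = np.sum(weights * expected)  # chance-expected weighted disagreement
    return 1 - d_obs / d_exp

R = 3
i, j = np.indices((R, R))
w = (i - j) ** 2  # quadratic disagreement weights; use (i != j) for unweighted kappa
table = np.array([[20.0, 5.0, 1.0], [4.0, 15.0, 6.0], [0.0, 5.0, 14.0]])
print(weighted_kappa(table, w))
```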

Optimal Portfolio Models for an Inefficient Market

  • GINTING, Josep; GINTING, Neshia Wilhelmina; PUTRI, Leonita; NIDAR, Sulaeman Rahman
    • The Journal of Asian Finance, Economics and Business / Vol. 8, No. 2 / pp. 57-64 / 2021
  • This research attempts to formulate a new mean-risk model to replace the Markowitz mean-variance model by altering the risk measurement, using the ARCH variance instead of the original variance. In building the portfolio, the samples used are the closing prices of the Indonesia Composite Stock Index and the Indonesia Composite Bonds Index from 2013 to 2018. This study uses secondary data from the Indonesia Stock Exchange and the Indonesia Bond Pricing Agency. The research found that Markowitz's model is still superior when applied to daily data, while the mean-ARCH model is appropriate for wider-gap data such as monthly observations. The historical return also proved to be a more appropriate benchmark for selecting an optimal portfolio than the risk-free rate in an inefficient market. The findings show that the portfolio combination produced is inefficient due to market inefficiency, indicated by the meager return of the stocks alongside a notable standard deviation. Therefore, the researchers propose replacing the risk-free-rate benchmark with the historical return, which proved more realistic under inefficient market conditions.
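
A minimal sketch of swapping the unconditional variance for a one-step ARCH(1) forecast as the portfolio risk measure; the parameter estimation here is a crude regression of squared residuals on their lag, not the authors' exact procedure, and the return series is a placeholder:

```python
import numpy as np

def arch1_forecast_var(returns):
    """One-step-ahead ARCH(1) variance: sigma^2_{t+1} = w + a * eps_t^2."""
    eps2 = (returns - returns.mean()) ** 2
    a, w = np.polyfit(eps2[:-1], eps2[1:], 1)  # OLS fit of eps_t^2 on eps_{t-1}^2
    return max(w + a * eps2[-1], 0.0)

rng = np.random.default_rng(5)
r = rng.normal(0.0004, 0.01, size=1000)  # placeholder daily return series
print(r.var(ddof=1))                     # Markowitz risk: unconditional variance
print(arch1_forecast_var(r))             # mean-ARCH risk: conditional forecast
```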