• Title/Summary/Keyword: total variation metrics

SECOND ORDER REGULAR VARIATION AND ITS APPLICATIONS TO RATES OF CONVERGENCE IN EXTREME-VALUE DISTRIBUTION

  • Lin, Fuming;Peng, Zuoxiang;Nadarajah, Saralees
    • Bulletin of the Korean Mathematical Society
    • /
    • v.45 no.1
    • /
    • pp.75-93
    • /
    • 2008
  • The rate of convergence of the distribution of order statistics to the corresponding extreme-value distribution may be characterized by the uniform and total variation metrics. de Haan and Resnick [4] derived the convergence rate when the second-order generalized regularly varying function has second-order derivatives. In this paper, based on properties of generalized regular variation and second-order generalized variation, convergence rates of the distribution of the largest order statistic, characterized by the uniform and total variation metrics, are obtained under weaker conditions.
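For context, the two metrics named in the abstract have standard textbook definitions (these are not taken from the paper itself): for distribution functions F and G,

```latex
% Uniform (Kolmogorov) metric between distribution functions
\rho(F,G) \;=\; \sup_{x \in \mathbb{R}} \bigl| F(x) - G(x) \bigr|

% Total variation metric between the induced probability measures
d_{\mathrm{TV}}(F,G) \;=\; \sup_{B \in \mathcal{B}(\mathbb{R})} \bigl| P_F(B) - P_G(B) \bigr|
```

The uniform metric compares the distribution functions pointwise, while total variation takes the supremum over all Borel sets, so convergence in total variation implies convergence in the uniform metric but not conversely.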

A Comparison of the Rudin-Osher-Fatemi Total Variation model and the Nonlocal Means Algorithm

  • Adiya, Enkhbolor;Choi, Heung-Kook
    • Proceedings of the Korea Multimedia Society Conference
    • /
    • 2012.05a
    • /
    • pp.6-9
    • /
    • 2012
  • In this study, we compare two image denoising methods, the Rudin-Osher-Fatemi total variation (TV) model and the nonlocal means (NLM) algorithm, on medical images. To evaluate the methods, we used two well-known measuring metrics. The methods were tested on one CT image, one X-ray image, and three MRI images. Experimental results show that the NLM algorithm can give better results than the ROF TV model, but its computational complexity is high.
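As a rough illustration of the ROF idea (not the authors' implementation), total variation denoising minimizes a fidelity term plus a TV penalty. A minimal 1-D sketch using smoothed TV and plain gradient descent, with a hypothetical weight `lam` and step size:

```python
import numpy as np

def tv(u, eps=1e-6):
    """Smoothed total variation of a 1-D signal."""
    return np.sum(np.sqrt(np.diff(u) ** 2 + eps))

def rof_denoise(f, lam=0.5, step=0.1, iters=200, eps=1e-6):
    """Minimize 0.5*||u - f||^2 + lam*TV(u) by gradient descent (sketch)."""
    u = f.copy()
    for _ in range(iters):
        du = np.diff(u)
        w = du / np.sqrt(du ** 2 + eps)  # derivative of smoothed |du|
        # Gradient of the TV term w.r.t. each sample (interior: w[j-1] - w[j])
        grad_tv = np.concatenate(([-w[0]], w[:-1] - w[1:], [w[-1]]))
        u -= step * ((u - f) + lam * grad_tv)
    return u
```

The actual ROF model is a 2-D continuous formulation usually solved with more sophisticated schemes (e.g. Chambolle's projection algorithm); this sketch only shows the variational trade-off that makes TV denoising edge-preserving.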

An Application Study of Six Sigma in Clinical Chemistry (A Study on the Application of Six Sigma)

  • Chang, Sang Wu;Kim, Nam Yong;Choi, Ho Sung;Park, Yong Won;Chu, Kyung Bok;Yun, Kyeun Young
    • Korean Journal of Clinical Laboratory Science
    • /
    • v.36 no.2
    • /
    • pp.121-126
    • /
    • 2004
  • The primary goal of Six Sigma is to improve patient satisfaction, and thereby profitability, by reducing and eliminating defects. Defects may relate to any aspect of customer satisfaction: high product quality, schedule adherence, cost minimization, process capability indices, defects per unit, and yield. Many Six Sigma metrics can be related mathematically to one another. Literally, six sigma means six standard deviations from the mean or median value. As applied to quality metrics, the term indicates that failures lie at least six standard deviations from the mean or norm, i.e., about 3.4 failures per million opportunities for failure. The objective of Six Sigma quality is to reduce process output variation so that, on a long-term basis (the customer's aggregate experience with our process over time), there will be no more than 3.4 Defects Per Million Opportunities (DPMO). For a process with only one specification limit (upper or lower), this requires six process standard deviations between the mean of the process and the customer's specification limit (hence, Six Sigma). A Six Sigma experiment was applied to 18 clinical chemistry tests: TP, ALB, T.B, ALP, AST, ALT, CL, CK, LD, K, Na, CRE, BUN, T.C, GLU, AML, and CA. In the assessment of process performance, 72.2% of the items fit within the six sigma tolerance limits; total bilirubin fit within the five sigma limits, while chloride and sodium were at three sigma. We are confident that the Six Sigma goal will reduce test variation in the process.
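The 3.4 DPMO figure quoted above comes from the standard normal tail probability combined with the conventional 1.5-sigma allowance for long-term process drift. A minimal check of that arithmetic (standard convention, not from the paper):

```python
import math

def defects_per_million(sigma_level, shift=1.5):
    """Defects per million opportunities for a given sigma level,
    using the conventional 1.5-sigma long-term shift.
    Computes the one-sided normal tail beyond (sigma_level - shift)."""
    z = sigma_level - shift
    tail = 0.5 * math.erfc(z / math.sqrt(2))  # P(Z > z), Z standard normal
    return tail * 1_000_000

# defects_per_million(6.0) -> about 3.4, the figure cited in the abstract
```

Without the 1.5-sigma shift, a true six-sigma tail would be roughly 0.001 DPMO, which is why the shift convention matters when quoting the 3.4 figure.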

A Study of Six Sigma and Total Error Allowable in Chematology Laboratory (A Study on Six Sigma and the Development of the Total Allowable Error Range)

  • Chang, Sang-Wu;Kim, Nam-Yong;Choi, Ho-Sung;Kim, Yong-Whan;Chu, Kyung-Bok;Jung, Hae-Jin;Park, Byong-Ok
    • Korean Journal of Clinical Laboratory Science
    • /
    • v.37 no.2
    • /
    • pp.65-70
    • /
    • 2005
  • The specifications of the CLIA analytical tolerance limits are consistent with the performance goals of Six Sigma Quality Management. Six sigma analysis determines performance quality from bias and precision statistics; it also shows whether a method meets the criteria for six sigma performance. Performance standards calculate allowable total error from several different criteria. Six sigma means six standard deviations from the target or mean value, corresponding to about 3.4 failures per million opportunities for failure. The Sigma Quality Level is an indicator of process centering and process variation relative to the total error allowable. The tolerance specification is replaced by a Total Error specification, a common form of quality specification for a laboratory test. The CLIA criteria for acceptable performance in proficiency testing events are given in the form of an allowable total error, TEa, so there is a published list of TEa specifications for regulated analytes. In terms of TEa, Six Sigma Quality Management sets a precision goal of TEa/6 and an accuracy goal of 1.5 (TEa/6). This concept is based on the proficiency testing specification of target value +/- 3s, with TEa derived from reference intervals, biological variation, and peer-group median and mean surveys. We found rules to calculate TEa as a fraction of a reference interval and from peer-group median and mean surveys. We developed allowable total error from peer-group survey results and the US CLIA 88 rules for 19 clinical chemistry tests: TP, ALB, T.B, ALP, AST, ALT, CL, LD, K, Na, CRE, BUN, T.C, GLU, GGT, CA, phosphorus, UA, and TG. The results were as follows.
Sigma level versus TEa from the peer-group median/mean CV of each item, assessed by process performance, were: TP (6.1σ/9.3%), ALB (6.9σ/11.3%), T.B (3.4σ/25.6%), ALP (6.8σ/31.5%), AST (4.5σ/16.8%), ALT (1.6σ/19.3%), CL (4.6σ/8.4%), LD (11.5σ/20.07%), K (2.5σ/0.39 mmol/L), Na (3.6σ/6.87 mmol/L), CRE (9.9σ/21.8%), BUN (4.3σ/13.3%), UA (5.9σ/11.5%), T.C (2.2σ/10.7%), GLU (4.8σ/10.2%), GGT (7.5σ/27.3%), CA (5.5σ/0.87 mmol/L), IP (8.5σ/13.17%), TG (9.6σ/17.7%). Peer-group survey median CVs in the Korean External Assessment greater than the CLIA criteria were CL (8.45%/5%), BUN (13.3%/9%), CRE (21.8%/15%), T.B (25.6%/20%), and Na (6.87 mmol/L/4 mmol/L). Those less than the criteria were TP (9.3%/10%), AST (16.8%/20%), ALT (19.3%/20%), K (0.39 mmol/L/0.5 mmol/L), UA (11.5%/17%), Ca (0.87 mg/dL/1 mg/L), and TG (17.7%/25%). TEa agreed in 14 of the 17 items (82.35%). We found that the sigma level increases as the allowable total error increases, and we are confident that the goal set for allowable total error affects the evaluation of sigma metrics in the process, if the same process is sustained.
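The per-test sigma levels quoted above are conventionally computed from TEa, bias, and imprecision (CV) using the standard sigma-metric formula; the example numbers below are hypothetical, not taken from the paper's data:

```python
def sigma_metric(tea, bias, cv):
    """Sigma level of an assay: (allowable total error - |bias|) / imprecision.
    All three arguments must be in the same units (e.g. percent, or mmol/L)."""
    return (tea - abs(bias)) / cv

# Hypothetical example: TEa = 10%, bias = 2%, CV = 2%
# sigma_metric(10, 2, 2) -> (10 - 2) / 2 = 4.0 sigma
```

This makes the abstract's closing observation concrete: raising TEa while bias and CV stay fixed mechanically raises the sigma level, even though the underlying process has not changed.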

An Analytic solution for the Hadoop Configuration Combinatorial Puzzle based on General Factorial Design

  • Priya, R. Sathia;Prakash, A. John;Uthariaraj, V. Rhymend
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.16 no.11
    • /
    • pp.3619-3637
    • /
    • 2022
  • Big data analytics offers endless opportunities for operational enhancement by extracting valuable insights from complex, voluminous data. Hadoop is a comprehensive technology suite offering solutions for the large-scale storage and computing needs of Big Data. The performance of Hadoop is closely tied to its configuration settings, which depend on the cluster capacity and the application profile. Since Hadoop has over 190 configuration parameters, tuning them to gain optimal application performance is a daunting challenge. Our approach is to extract a subset of impactful parameters, from which a performance-enhancing sub-optimal configuration is then narrowed down. This paper presents a statistical model to analyze the significance of the effect of Hadoop parameters on a variety of performance metrics. Our model decomposes the total observed performance variation and ascribes it to the main parameters, their interaction effects, and noise factors. The method clearly segregates impactful parameters from the rest. The configuration setting determined by our methodology reduced the job completion time by 22%, resource utilization in terms of memory and CPU by 15% and 12% respectively, the number of killed Maps by 50%, and disk spillage by 23%. The proposed technique can be leveraged to ease the configuration tuning task of any Hadoop cluster despite differences in the underlying infrastructure and the application running on it.
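The decomposition described above (total observed variation split into main effects, interactions, and noise) is the classical factorial-design ANOVA identity. A minimal sketch for two hypothetical factors (say, two Hadoop parameters at `a` and `b` levels, with `r` replicated runs per combination):

```python
import numpy as np

def anova_two_factor(y):
    """Sum-of-squares decomposition for a full two-factor factorial design.
    y has shape (a, b, r): a levels of factor A, b levels of factor B,
    r replicates per cell. Returns (SS_A, SS_B, SS_AB, SS_err, SS_total)."""
    a, b, r = y.shape
    gm = y.mean()                      # grand mean
    mA = y.mean(axis=(1, 2))           # factor-A level means
    mB = y.mean(axis=(0, 2))           # factor-B level means
    cell = y.mean(axis=2)              # per-cell means
    ss_A = b * r * np.sum((mA - gm) ** 2)
    ss_B = a * r * np.sum((mB - gm) ** 2)
    ss_AB = r * np.sum((cell - mA[:, None] - mB[None, :] + gm) ** 2)
    ss_err = np.sum((y - cell[:, :, None]) ** 2)   # within-cell noise
    ss_total = np.sum((y - gm) ** 2)
    return ss_A, ss_B, ss_AB, ss_err, ss_total
```

The identity SS_total = SS_A + SS_B + SS_AB + SS_err holds exactly; comparing each component's mean square against the error mean square is what "clearly segregates impactful parameters from the rest".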

Applications and Assessments of a Multimetric Model to Namyang Reservoir (Application and Assessment of a Multimetric Model in Namyang Reservoir)

  • Han, Jung-Ho;An, Kwang-Guk
    • Korean Journal of Ecology and Environment
    • /
    • v.41 no.2
    • /
    • pp.228-236
    • /
    • 2008
  • The purpose of this study was to evaluate fish metric attributes using a model of Lentic Ecosystem Health Assessment (LEHA) and apply the model to a dataset sampled from six sites of Namyang Reservoir during October 2005~May 2006. The model was composed of 11 metrics, and the metric attributes were made up of physical, chemical, and biological parameters. Trophic composition metrics showed that tolerant species ($M_3$, 80%) and omnivore species ($M_4$, 92%) dominated the fish fauna, indicating biological degradation in the aquatic ecosystem. The metric $M_7$, the relative proportion of exotic species, was also greater than 8% of the total, indicating an ecological disturbance. The average value of the LEHA model was 24.3 (n = 12) in the reservoir, indicating a "poor condition" by the criteria of An and Han (2007). Spatial variation in the model values was low (range: 21~26), and temporal variation occurred due to monsoon rainfall. Electrical conductivity (EC) and the trophic state index of chlorophyll-a [TSI(CHL)] were greater in the premonsoon than in the postmonsoon period.
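Multimetric indices of this kind typically score each metric against thresholds and sum the scores into a condition class. A purely illustrative sketch of that pattern (the 5/3/1 scoring, thresholds, and class cut-offs below are hypothetical placeholders, not the actual LEHA criteria of An and Han (2007)):

```python
def score_metric(value, good, fair, higher_is_better=True):
    """Score one metric 5/3/1 against two thresholds (hypothetical cut-offs)."""
    if not higher_is_better:
        # For metrics where lower is better (e.g. % tolerant species),
        # flip the comparison by negating value and thresholds.
        value, good, fair = -value, -good, -fair
    if value >= good:
        return 5
    return 3 if value >= fair else 1

def health_class(total_score):
    """Map a summed multimetric score to a condition label (illustrative bands)."""
    if total_score >= 45:
        return "good"
    return "fair" if total_score >= 30 else "poor"
```

Under this pattern, a reservoir dominated by tolerant and omnivore species would score 1 on those metrics, pulling the summed index into the lowest band, which mirrors the "poor condition" result reported above.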