• Title/Summary/Keyword: statistical confidence

A study on market-production model building for small bar steels (소봉제품의 시장생산 모형 구축에 관한 연구)

  • 김수홍;유정빈
    • Proceedings of the Korean Operations and Management Science Society Conference
    • /
    • 1996.10a
    • /
    • pp.139-145
    • /
    • 1996
  • Forecasts built on past output data for small bar steels provide important information for deciding future production quantities. In many cases, however, those quantities have been determined mainly by experience (or rules of thumb). In this paper, the historical data for each small bar steel product are analyzed with graphical and statistical forecasting methods, mainly in SAS. Among the quantitative forecasting methods available in SAS, the STEPAR (stepwise autoregressive) method performed best on these data sets, and it was used to forecast the future production quantity of each product. The resulting 95% confidence intervals for the forecasts are very wide (a sketch of such a forecast follows below); to narrow them, a systematic database, an integrated demand-production-inventory management system, and an integrated computer system are required.
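
As a rough illustration of the forecasting step, the sketch below fits a stepwise-selected autoregression to synthetic monthly output data and prints 95% prediction intervals. It only loosely mirrors SAS PROC FORECAST's STEPAR method; the data, lag range, and trend specification are assumptions, not values from the paper.

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg, ar_select_order

rng = np.random.default_rng(0)
# Hypothetical monthly output (tonnes) for one small-bar product.
y = 100 + np.cumsum(rng.normal(0, 5, size=60))

# Stepwise lag selection, loosely mirroring the stepwise autoregression
# idea behind SAS PROC FORECAST METHOD=STEPAR.
sel = ar_select_order(y, maxlag=12, ic="aic", trend="ct")
res = AutoReg(y, lags=sel.ar_lags or 1, trend="ct").fit()

# 12-month forecast with 95% intervals (often very wide, as the paper notes).
pred = res.get_prediction(start=len(y), end=len(y) + 11)
lower, upper = pred.conf_int(alpha=0.05).T
for t, (m, lo, hi) in enumerate(zip(pred.predicted_mean, lower, upper), 1):
    print(f"month +{t:2d}: {m:7.1f}  [{lo:7.1f}, {hi:7.1f}]")
```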

Development of an On-Line Diagnostic Expert System: Algorithmic Sensor Validation (진단 전문가시스템의 개발 : 연산적 센서검증)

  • 김영진
    • Transactions of the Korean Society of Mechanical Engineers
    • /
    • v.18 no.2
    • /
    • pp.323-338
    • /
    • 1994
  • This paper outlines a framework for performing intelligent sensor validation for a diagnostic expert system while reasoning under uncertainty. The emphasis is on the algorithmic preprocessing technique; a companion paper focuses on heuristic post-processing. Sensor validation plays a vital role in the ability of the overall system to correctly determine the state of a plant monitored by imperfect sensors. In particular, several theoretical developments were made in understanding uncertain sensory data from a statistical perspective. Uncertain information in sensor values is represented through probability assignments over three discrete states, "high", "normal", and "low" (sketched below), together with additional sensor confidence measures, in the Algorithmic SV. Upper and lower warning limits, which represent the borderlines for heat rate degradation, are generated from historical learning sets, and the Algorithmic SV maintains a historical database for better reference in future use. All the information generated by the Algorithmic SV initiates a session to differentiate sensor faults from process faults and to draw inferences about system performance. This framework for a diagnostic expert system with sensor validation and reasoning under uncertainty is applied in HEATXPRT™, a data-driven on-line expert system for diagnosing heat rate degradation problems in fossil power plants.
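
The three-state probability assignment can be sketched as below, under the assumed simplification that sensor noise is Gaussian around the reading; the paper's actual uncertainty model and confidence measures are not reproduced here, and the limits and sigma are invented.

```python
from math import erf, sqrt

def norm_cdf(x, mu, sigma):
    """CDF of a normal distribution, via the error function."""
    return 0.5 * (1.0 + erf((x - mu) / (sigma * sqrt(2.0))))

def state_probabilities(reading, low_limit, high_limit, sigma):
    """Probability assignment over the three discrete states, assuming
    Gaussian sensor noise around the reading (an assumption made here,
    not the paper's exact model)."""
    p_low = norm_cdf(low_limit, reading, sigma)
    p_high = 1.0 - norm_cdf(high_limit, reading, sigma)
    return {"low": p_low, "normal": 1.0 - p_low - p_high, "high": p_high}

# Hypothetical heat-rate sensor with warning limits from historical learning sets.
probs = state_probabilities(reading=102.0, low_limit=95.0, high_limit=105.0, sigma=3.0)
confidence = max(probs.values())   # one simple sensor-confidence measure
print(probs, f"confidence={confidence:.2f}")
```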

Estimation of Composite Laminate Design Allowables Using the Statistical Characteristics of Lamina Level Test Data

  • Nam, Kyungmin;Park, Kook Jin;Shin, SangJoon;Kim, Seung Jo;Choi, Ik-Hyeon
    • International Journal of Aeronautical and Space Sciences
    • /
    • v.16 no.3
    • /
    • pp.360-369
    • /
    • 2015
  • A methodology for determining the design allowables of composite laminates by using lamina-level test data and finite element analysis (FEA) is proposed and verified in this paper. An existing method that derives laminate design allowables from a complete set of laminate tests was improved to reduce the number of expensive and time-consuming tests. Input property samples for FEA were generated from the statistical distribution characteristics of the lamina-level test data, and design allowables were derived from repeated FEA of the laminates (a sketch of this Monte Carlo approach follows). To apply and verify the proposed method, Hexcel 8552 IM7 test data were used. For both un-notched and open-hole laminate configurations, the design allowables obtained from the analysis correctly predicted the laminate test data within the confidence interval, demonstrating the potential of the simulation to substitute for laminate tests.
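
A minimal sketch of the Monte Carlo idea, with made-up lamina statistics, a placeholder function standing in for the FEA laminate model, and a standard normal-theory B-basis tolerance factor; none of the numbers come from the Hexcel 8552 IM7 data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Hypothetical lamina-level strength statistics (MPa); real work would use
# the Hexcel 8552 IM7 test data and an FEA laminate model.
lamina_mean, lamina_sd, n_samples = 2800.0, 120.0, 500
lamina_strengths = rng.normal(lamina_mean, lamina_sd, n_samples)

def laminate_strength(lamina_strength):
    # Placeholder for the FEA laminate analysis; a simple knockdown here.
    return 0.6 * lamina_strength

laminate = laminate_strength(lamina_strengths)

# One-sided B-basis tolerance factor (95% confidence that 90% of the
# population exceeds the allowable), assuming normality.
n = len(laminate)
delta = stats.norm.ppf(0.90) * np.sqrt(n)
k_b = stats.nct.ppf(0.95, df=n - 1, nc=delta) / np.sqrt(n)
b_basis = laminate.mean() - k_b * laminate.std(ddof=1)
print(f"B-basis allowable: {b_basis:.1f} MPa")
```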

Estimating the Transmittable Prevalence of Infectious Diseases Using a Back-Calculation Approach

  • Lee, Youngsaeng;Jang, Hyun Gap;Kim, Tae Yoon;Park, Jeong-Soo
    • Communications for Statistical Applications and Methods
    • /
    • v.21 no.6
    • /
    • pp.487-500
    • /
    • 2014
  • A new method to calculate the transmittable prevalence of an epidemic disease is proposed based on a back-calculation formula. We calculated the probabilities of reactivation and of parasitemia, as well as the transmittable prevalence (the number of persons with parasitemia in the incubation period) of malaria in South Korea, using 12 years of incidence data (2001-2012). For this computation, a new probability function of the transmittable condition is obtained. The probability of reactivation is estimated by the least squares method for the back-calculated long-term incubation period. The probability of parasitemia is calculated by a convolution of the survival function of the short-term incubation distribution and the probability of reactivation. Transmittable prevalence is computed by a convolution of the infected numbers and the probabilities of transmission, and confidence intervals are calculated using the parametric bootstrap method (see the sketch below). The proposed method is applicable to other epidemic diseases and other countries where incidence data and a long incubation period are available. We found that the estimated transmittable prevalence in South Korea was concentrated in the summer, peaking at 276 cases in the 31st week, about a 60% reduction from the naive prevalence at the peak. The statistics of transmittable prevalence can be used for malaria prevention programs and to select blood transfusion donors.
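
The convolution and parametric-bootstrap steps might look like the sketch below; the weekly incidence counts and the transmittable-condition curve g(k) are invented for illustration, not the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical weekly incidence and an assumed transmittable-condition
# probability curve g(k) = P(transmittable k weeks after infection).
incidence = rng.poisson(20, size=52).astype(float)
weeks = np.arange(30)
g = 0.8 * np.exp(-weeks / 6.0)

def transmittable_prevalence(inc, g):
    # prevalence(t) = sum_k inc(t - k) * g(k): a discrete convolution.
    return np.convolve(inc, g)[: len(inc)]

prev = transmittable_prevalence(incidence, g)

# Parametric bootstrap: resample incidence as Poisson around the observed
# counts and recompute prevalence to get pointwise 95% intervals.
boot = np.array([transmittable_prevalence(rng.poisson(incidence).astype(float), g)
                 for _ in range(1000)])
lo, hi = np.percentile(boot, [2.5, 97.5], axis=0)
peak = int(prev.argmax())
print(f"peak week {peak}: {prev[peak]:.0f} cases, 95% CI [{lo[peak]:.0f}, {hi[peak]:.0f}]")
```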

Application of Quality Statistical Techniques Based on the Review and the Interpretation of Medical Decision Metrics (의학적 의사결정 지표의 고찰 및 해석에 기초한 품질통계기법의 적용)

  • Choi, Sungwoon
    • Journal of the Korea Safety Management & Science
    • /
    • v.15 no.2
    • /
    • pp.243-253
    • /
    • 2013
  • This paper introduces the application and implementation of medical decision metrics that classify medical decision-making into four different metrics using statistical diagnostic tools such as the confusion matrix, the normal distribution, Bayesian prediction, and the receiver operating characteristic (ROC) curve. The metrics are developed from cross-sectional, cohort, and case-control studies identified by systematic literature review, and are reformulated in terms of type I error, type II error, confidence level, and power of detection. The study proposes implementation strategies for 10 quality improvement activities via 14 medical decision metrics that consider specificity and sensitivity in terms of α and β (illustrated below). Examples of ROC implications are presented, with useful guidelines for implementing continuous quality improvement, not only in variables acceptance sampling in quality control (QC) but also in supplier grading score charts in supply chain management (SCM) quality. This paper is the first to apply and implement medical decision-making tools as quality improvement activities; the proposed models will help quality practitioners enhance process and product quality.
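
The mapping from a confusion matrix to α, β, confidence level, and power of detection can be sketched directly; the screening counts below are hypothetical.

```python
def decision_metrics(tp, fn, fp, tn):
    """Derive the four decision metrics from confusion-matrix counts."""
    sensitivity = tp / (tp + fn)      # power of detection = 1 - beta
    specificity = tn / (tn + fp)      # confidence level = 1 - alpha
    return {
        "sensitivity": sensitivity,
        "specificity": specificity,
        "alpha": 1.0 - specificity,   # type I error (false alarm)
        "beta": 1.0 - sensitivity,    # type II error (missed case)
    }

# Hypothetical screening counts.
print(decision_metrics(tp=90, fn=10, fp=20, tn=180))
```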

A Comparative Study on Productivity of the Single PPM Quality Certification Company by using the Bootstrapped Malmquist Productivity Indices (부트스트랩 맘퀴스트 생산성지수를 이용한 Single PPM 인증기업의 생산성 비교 연구)

  • Song, Gwang-Suk;Yoo, Han-Joo
    • Journal of Korean Society for Quality Management
    • /
    • v.38 no.2
    • /
    • pp.261-275
    • /
    • 2010
  • The purpose of this study is to empirically analyze the productivity change of 10 Single PPM certified companies in three industries (electronics, motor parts, and machinery). Productivity change over time in these Korean small and medium-sized firms is measured by the bootstrapped Malmquist productivity index (MPI). Traditional MPI and data envelopment analysis (DEA) models are not only biased but also lack statistical confidence intervals, which can lead to wrong evaluations of efficiency and productivity scores. In this paper, DEA and the MPI are therefore combined with a bootstrap method to provide statistical inference on the performance of the Single PPM certified companies (a simplified sketch follows). The data cover the period between 2004 and 2007. The results reveal that: 1) the electronics industry showed a productivity gain of 17%, while no comparable effect was found in the other industries (motor parts, machinery); and 2) on average, productivity progressed for 7 DMUs in electronics and 1 DMU in motor parts, and for none in machinery.
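
A much-simplified sketch of the machinery involved: a one-input, one-output CCR efficiency program solved by linear programming, the geometric-mean Malmquist index between two periods, and a plain percentile bootstrap (the paper's approach follows Simar and Wilson's smoothed bootstrap, which this does not reproduce). All data are synthetic.

```python
import numpy as np
from scipy.optimize import linprog

def ccr_eff(x_o, y_o, X_ref, Y_ref):
    """Input-oriented CRS (CCR) efficiency of point (x_o, y_o) against a
    reference technology: min theta s.t. X lam <= theta x_o, Y lam >= y_o."""
    m, k = X_ref.shape
    s = Y_ref.shape[0]
    c = np.zeros(k + 1); c[0] = 1.0                    # variables: [theta, lam]
    A = np.zeros((m + s, k + 1)); b = np.zeros(m + s)
    A[:m, 0] = -x_o;    A[:m, 1:] = X_ref              # X lam - theta x_o <= 0
    A[m:, 1:] = -Y_ref; b[m:] = -y_o                   # -Y lam <= -y_o
    return linprog(c, A_ub=A, b_ub=b, bounds=[(0, None)] * (k + 1)).x[0]

rng = np.random.default_rng(3)
n = 10                                                 # 10 certified firms
X1, Y1 = rng.uniform(5, 10, (1, n)), rng.uniform(3, 9, (1, n))   # period t
X2, Y2 = 0.95 * X1, 1.05 * Y1                          # period t+1 (assumed)

def malmquist(o):
    e11 = ccr_eff(X1[:, o], Y1[:, o], X1, Y1)          # D_t(x_t, y_t)
    e22 = ccr_eff(X2[:, o], Y2[:, o], X2, Y2)          # D_{t+1}(x_{t+1}, y_{t+1})
    e12 = ccr_eff(X2[:, o], Y2[:, o], X1, Y1)          # D_t(x_{t+1}, y_{t+1})
    e21 = ccr_eff(X1[:, o], Y1[:, o], X2, Y2)          # D_{t+1}(x_t, y_t)
    return np.sqrt((e12 / e11) * (e22 / e21))          # MPI > 1: progress

print("MPI:", np.round([malmquist(o) for o in range(n)], 3))

# Percentile-bootstrap interval for one firm's period-t efficiency score.
boot = [ccr_eff(X1[:, 0], Y1[:, 0], X1[:, idx], Y1[:, idx])
        for idx in (rng.integers(0, n, n) for _ in range(500))]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"firm 0 efficiency 95% CI: [{lo:.3f}, {hi:.3f}]")
```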

Sequential patient recruitment monitoring in multi-center clinical trials

  • Kim, Dong-Yun;Han, Sung-Min;Youngblood, Marston Jr.
    • Communications for Statistical Applications and Methods
    • /
    • v.25 no.5
    • /
    • pp.501-512
    • /
    • 2018
  • We propose Sequential Patient Recruitment Monitoring (SPRM), a new monitoring procedure for patient recruitment in a clinical trial. Based on the sequential probability ratio test with Woodroofe's improved stopping boundaries, the method allows continuous monitoring of the enrollment rate and gives an early warning when recruitment is unlikely to achieve the target enrollment. The packet data approach combined with the central limit theorem makes the method robust to the distribution of the recruitment entry pattern. A straightforward application of the counting process framework can be used to estimate the probability of achieving the target enrollment under the assumption that the current trend continues, and the required extension of the recruitment period can be derived for a given confidence level (see the sketch below). SPRM is a new, continuous patient recruitment monitoring tool that provides an opportunity for corrective action in a timely manner. It is suitable for the modern, centralized data management environment and requires minimal effort to maintain. We illustrate the method using real data from two well-known, multicenter, phase III clinical trials.
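
A rough sketch of the two ingredients, with plain Wald SPRT boundaries standing in for Woodroofe's improved ones and a CLT approximation for the probability of reaching the target enrollment; all rates, counts, and targets are invented.

```python
import math

def sprt_poisson(counts, lam0, lam1, alpha=0.05, beta=0.20):
    """Wald SPRT on weekly Poisson enrollment counts: H0 rate lam0 (on track)
    vs H1 rate lam1 < lam0 (lagging). Plain Wald limits shown; the paper
    uses Woodroofe's corrected boundaries."""
    a = math.log(beta / (1 - alpha))       # accept H0 below this
    b = math.log((1 - beta) / alpha)       # warn (accept H1) above this
    llr = 0.0
    for week, k in enumerate(counts, 1):
        llr += k * math.log(lam1 / lam0) - (lam1 - lam0)
        if llr >= b:
            return week, "warning: target enrollment unlikely"
        if llr <= a:
            return week, "on track"
    return len(counts), "continue monitoring"

def prob_reach_target(enrolled, weeks_left, rate, target):
    """Normal (CLT) approximation to P(final enrollment >= target)
    if the current Poisson trend continues."""
    mu = enrolled + rate * weeks_left
    sd = math.sqrt(rate * weeks_left)
    z = (target - 0.5 - mu) / sd
    return 0.5 * (1 - math.erf(z / math.sqrt(2)))

print(sprt_poisson([6, 5, 4, 3, 4, 3, 2], lam0=8.0, lam1=4.0))
print(f"P(reach target) = {prob_reach_target(120, 40, 3.9, 270):.3f}")
```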

Uncertainty in Determination of Menthol from Mentholated Cigarette (담배 중 멘톨 분석에 대한 불확도 측정)

  • 장기철;이운철;백순옥;한상빈
    • Journal of the Korean Society of Tobacco Science
    • /
    • v.22 no.1
    • /
    • pp.91-98
    • /
    • 2000
  • This study was carried out to evaluate the uncertainty in the analysis of menthol content in mentholated cigarettes. Menthol in the sample cigarette was extracted with methanol containing anethole as an internal standard and then analyzed by gas chromatography. The sources of uncertainty tested include the weighing of the sample, the preparation of the extracting solution, the pipetting of the extracting solution into the sample, the preparation of the standard solution, the precision of GC injections for standard and sample solutions, the GC response factor of the standard solution, the reproducibility of the menthol analysis, and the determination of the water content in tobacco. In calculating the uncertainties, Type A uncertainty was evaluated by statistical analysis of a series of observations, and Type B from information in suppliers' catalogues and/or certificates of calibration. The individual sources were then mathematically combined with the uncertainty arising from the assessment of accuracy to give the overall uncertainty (sketched below). The main sources of uncertainty were the reproducibility of the menthol and water determinations, the purity of the menthol reference material used to prepare the standard solution, and the precision of GC injections for the sample solution; errors in sample weighing and volume measurement contributed relatively little. The expanded uncertainty for the mentholated cigarettes, one Korean and one American brand, at the 0.95 level of statistical confidence was ±0.06 and ±0.07 mg/g for menthol contents of 1.89 and 2.32 mg/g, respectively.
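
The Type A/Type B combination into an expanded uncertainty follows the usual GUM root-sum-of-squares pattern, sketched below with a hypothetical uncertainty budget (the paper's actual budget differs).

```python
import math

# Hypothetical relative standard uncertainties for a menthol result
# of 1.89 mg/g; the paper's budget and values are not reproduced.
u_rel = {
    "reproducibility": 0.012,     # Type A: repeated analyses
    "gc_injection": 0.008,        # Type A: injection precision
    "reference_purity": 0.006,    # Type B: certificate of the standard
    "volumetric": 0.003,          # Type B: pipette/flask tolerances
    "moisture": 0.010,            # Type A: water-content determination
}

result = 1.89                                                 # mg/g
u_c = result * math.sqrt(sum(u ** 2 for u in u_rel.values()))  # combined
U = 2 * u_c                                                   # expanded, k=2 (~95%)
print(f"menthol = {result:.2f} +/- {U:.2f} mg/g (k=2)")
```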

GLP Perspectives of Bioequivalence Studies

  • Jeong, Eun-Ju
    • Proceedings of the Korean Society of Toxicology Conference
    • /
    • 2006.11a
    • /
    • pp.80-86
    • /
    • 2006
  • Bioequivalence is a term in pharmacokinetics used to assess the expected in vivo biological equivalence of two proprietary preparations of a drug; bioequivalence studies are usually performed for generic drugs. Two pharmaceutical products are bioequivalent if they are pharmaceutically equivalent and their bioavailabilities after administration of the same molar dose are similar. Bioequivalence is usually assessed by single-dose in vivo studies in healthy volunteers, with the marketed innovator product as the reference. The regulatory definition of bioequivalence is based on statistical analysis of the bioavailability of the reference and test products: in general, two products are evaluated as bioequivalent if the 90% confidence intervals of the relative mean Cmax and AUC of the test to the reference product fall within 80.00% to 125.00% in the fasting state (see the sketch below). The key steps in a bioequivalence study are development and validation of the bioanalytical method, determination of drug concentrations in the biosamples (usually plasma or serum) obtained from volunteers, calculation of the pharmacokinetic parameters, and statistical analysis of those parameters. Although current guidelines and regulations do not require bioequivalence studies to be conducted under good laboratory practice (GLP), the issue of performing them in a GLP environment has emerged on both the regulatory and industry sides; GLP perspectives need to be discussed with respect to achieving quality assurance in bioequivalence studies.
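
The 80.00-125.00% acceptance check on the 90% confidence interval can be sketched on simulated log-normal AUC data as below; a real crossover analysis would use an ANOVA model with sequence and period effects rather than this paired simplification.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
# Hypothetical paired AUC values for 24 subjects (reference vs test).
auc_ref = rng.lognormal(mean=5.0, sigma=0.25, size=24)
auc_test = auc_ref * rng.lognormal(mean=0.02, sigma=0.10, size=24)

d = np.log(auc_test) - np.log(auc_ref)        # within-subject log differences
n = len(d)
se = d.std(ddof=1) / np.sqrt(n)
t90 = stats.t.ppf(0.95, df=n - 1)             # two-sided 90% interval
lo, hi = np.exp(d.mean() - t90 * se), np.exp(d.mean() + t90 * se)

print(f"AUC geometric mean ratio, 90% CI: {100*lo:.2f}% - {100*hi:.2f}%")
print("bioequivalent" if lo >= 0.80 and hi <= 1.25 else "not shown bioequivalent")
```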

A Study on the Statistical Representativeness of Samples taken from Radioactive Soil (방사성 토양폐기물 시료의 통계적 대표성에 관한 연구)

  • Cho Han-Seok;Kim T.K.;Lee K.M.;Ahn S.J.;Shon J.S.
    • Proceedings of the Korean Radioactive Waste Society Conference
    • /
    • 2005.06a
    • /
    • pp.151-157
    • /
    • 2005
  • For regulatory clearance of radioactive soils, a procedure for analyzing radionuclides and radioactivity concentrations is under development. A soil sampling strategy, including random sampling after homogenization and standardization, was set up. Statistical representativeness is considered not only for the sampling strategy but also for the sample size. In this study, the sample size was designed from the confidence interval and error bound of the soil activity, using pilot samples taken according to the sampling strategy (a sketch of such a design follows).
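
Pilot-based sample-size design for a given confidence level and error bound can be sketched with the usual t-based iteration; the pilot standard deviation and bound below are hypothetical, not values from the study.

```python
import math
from scipy import stats

def sample_size(pilot_sd, error_bound, confidence=0.95):
    """Iterate n = (t_{n-1} * s / E)^2 using the pilot standard deviation,
    so the CI half-width for the mean activity stays within the error bound."""
    n = 2
    for _ in range(100):
        t = stats.t.ppf(1 - (1 - confidence) / 2, df=n - 1)
        n_new = math.ceil((t * pilot_sd / error_bound) ** 2)
        if n_new == n:
            return n
        n = max(n_new, 2)
    return n

# Hypothetical pilot results: s = 0.8 Bq/g, target half-width 0.3 Bq/g.
print(sample_size(pilot_sd=0.8, error_bound=0.3))
```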
