• Title/Summary/Keyword: quantitative estimation

Efficiency Estimation of Process Plan Using Tolerance Chart

  • Kim I.H.;Dong Zuomin
    • Korean Journal of Computational Design and Engineering
    • /
    • v.11 no.2
    • /
    • pp.148-155
    • /
    • 2006
  • This paper presents a new method for assessing the efficiency of production process plans using a tolerance chart in order to lower production cost. The tolerance chart is used to predict the accuracy of a part that is to be produced following the process plan, and to carry out a quantitative measurement of the efficiency of the process plan. By comparing the values of design tolerances with their corresponding resultant tolerances calculated using the tolerance chart, a process plan that is incapable of satisfying the design requirements, and the faulty production operations, can be identified. Similarly, a process plan that imposes unnecessarily high accuracy and wasteful production operations can also be identified. For the latter, a quantitative measure of the efficiency of the process plan is introduced: the higher the unnecessary production cost, the poorer the efficiency of the process plan. A coefficient is introduced for measuring process plan efficiency; it incorporates two weighting factors to reflect the difficulty of the manufacturing operations and the number of dimensional tolerances involved. To facilitate the identification of the machining operations and machined surfaces related to unnecessarily tight resultant tolerances caused by the process plan, a rooted tree representation of the tolerance chart is introduced and its use is demonstrated. An example is presented to illustrate the new method. This research introduces a new quantitative process plan evaluation method that may lead to the optimization of process plans.
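As a rough illustration of the idea described above, the sketch below compares design tolerances with resultant tolerances stacked up from a toy tolerance chart, flags dimensions the plan cannot meet, and scores over-tight dimensions with a weighted coefficient. The coefficient form, the weights, and the chart data are illustrative assumptions; the paper's exact formulation is not given in the abstract.

```python
# Hypothetical sketch: compare design tolerances with resultant tolerances
# stacked up from a tolerance chart, flag infeasible dimensions, and score
# over-tight (wasteful) ones with a weighted efficiency coefficient.
# The coefficient and weights below are illustrative assumptions only.

def resultant_tolerance(operation_tolerances):
    """Worst-case stack-up of the operation tolerances contributing to one
    design dimension (a single row of the tolerance chart)."""
    return sum(operation_tolerances)

def evaluate_plan(chart, weights=None):
    """chart: {dimension: (design_tol, [operation tolerances])}."""
    weights = weights or {dim: 1.0 for dim in chart}
    infeasible, over_tight = [], {}
    num, den = 0.0, 0.0
    for dim, (design_tol, op_tols) in chart.items():
        res_tol = resultant_tolerance(op_tols)
        if res_tol > design_tol:
            infeasible.append(dim)                    # plan cannot meet the design
        else:
            over_tight[dim] = design_tol - res_tol    # unused tolerance = waste
        num += weights[dim] * min(res_tol, design_tol)
        den += weights[dim] * design_tol
    efficiency = num / den                            # 1.0 = design tolerances fully used
    return efficiency, infeasible, over_tight

# Toy example: dimension "A" is over-tight, "B" is infeasible.
chart = {"A": (0.10, [0.02, 0.03]), "B": (0.05, [0.04, 0.03])}
print(evaluate_plan(chart, weights={"A": 2.0, "B": 1.0}))
```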

Quantitative Assessment of Joint Roughness Coefficient from Televiewer and Core scan Images (텔레뷰어 및 코어 스캔 이미지를 이용한 절리면 거칠기 계수의 정량적인 평가)

  • Kim, Jung-Yul;Kim, Yoo-Sung
    • Proceedings of the Korean Geotechnical Society Conference
    • /
    • 2005.03a
    • /
    • pp.1205-1210
    • /
    • 2005
  • The behavior of rock mass and of solute (e.g. groundwater, radioactivity) flow in fractured rock can be directly influenced by joint roughness. The characteristics of joint roughness are also a main factor in the rock classifications (e.g. RMR, Q system) usually used in tunnel design. Nevertheless, most JRC estimation has been carried out only by examination with the naked eye. Such JRC estimation lacks objectivity because each investigator judges JRC according to his or her subjective opinion. Therefore, it is desirable that JRC be assessed by a numerical analysis that gives a quantitative value corresponding to the characteristics of a roughness curve. Meanwhile, roughness curves for joint surfaces observed in drill cores have been obtained only along linear profiles. Even when roughness curves are measured on the same joint surface, they can frequently show diverse aspects from the standpoint of roughness characteristics. If roughness curves can be measured along the elliptical circumferences of joint surfaces from core scan images or Televiewer images, they will certainly describe the roughness characteristics of joint surfaces more comprehensively than those measured along linear profiles. This study focuses on (1) automatically extracting roughness curves from core scan or Televiewer images, and (2) improving the accuracy of quantitative JRC assessment using the fractal dimension concept.

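One common way to turn a digitized roughness curve into a quantitative value, as the abstract suggests, is to estimate its fractal dimension. The sketch below uses the divider (compass-walking) method on a synthetic profile; the specific fractal estimator and the JRC correlation used by the authors are not stated in the abstract, so this is illustrative only.

```python
import numpy as np

# Minimal sketch: estimate the fractal dimension of a digitized roughness
# profile with the divider (compass-walking) method. The profile below is
# synthetic; the estimator and any JRC correlation are assumptions.

def divider_fractal_dimension(x, y, rulers):
    """x, y: profile coordinates; rulers: list of ruler lengths to walk with."""
    lengths = []
    for r in rulers:
        i, total = 0, 0.0
        while i < len(x) - 1:
            j = i + 1
            # step forward until the chord reaches the ruler length
            while j < len(x) - 1 and np.hypot(x[j] - x[i], y[j] - y[i]) < r:
                j += 1
            total += np.hypot(x[j] - x[i], y[j] - y[i])
            i = j
        lengths.append(total)
    # measured length scales as L(r) ~ r**(1 - D), so D = 1 - slope in log-log
    slope, _ = np.polyfit(np.log(rulers), np.log(lengths), 1)
    return 1.0 - slope

# Synthetic rough profile for demonstration
x = np.linspace(0.0, 100.0, 2001)
y = np.cumsum(np.random.default_rng(0).normal(0.0, 0.05, x.size))
print(divider_fractal_dimension(x, y, rulers=[1.0, 2.0, 5.0, 10.0]))
```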

Analysis of quantitative high throughput screening data using a robust method for nonlinear mixed effects models

  • Park, Chorong;Lee, Jongga;Lim, Changwon
    • Communications for Statistical Applications and Methods
    • /
    • v.27 no.6
    • /
    • pp.701-714
    • /
    • 2020
  • Quantitative high throughput screening (qHTS) assays are used to assess the toxicity of many chemicals in a short period by analyzing them collectively at several concentrations. The data are routinely analyzed using nonlinear regression models; however, we propose a new method to analyze qHTS data using a nonlinear mixed effects model. qHTS data are generated by repeating the same experiment several times for each chemical; therefore, they can be viewed as repeated measures data and hence analyzed using a nonlinear mixed effects model, which accounts for both intra- and inter-individual variability. Furthermore, we apply a one-step approach incorporating robust estimation methods to estimate the fixed effect parameters and the variance-covariance structure, since outliers or influential observations are not uncommon in qHTS data. The toxicity of chemicals from a qHTS assay is classified based on the significance of a parameter related to the efficacy of the chemicals using the proposed method. We evaluate the performance of the proposed method in terms of power and false discovery rate using simulation studies, comparing it with an existing method. The proposed method is illustrated using a dataset obtained from the National Toxicology Program.
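For context, qHTS concentration-response data are commonly described by a Hill-type nonlinear model. The sketch below fits such a curve to simulated data for a single chemical by ordinary least squares; it illustrates only the underlying dose-response model, not the paper's robust one-step nonlinear mixed effects estimator.

```python
import numpy as np
from scipy.optimize import curve_fit

# Simplified sketch of a Hill-type concentration-response fit for one chemical.
# The paper's method (robust nonlinear mixed effects estimation) is not
# reproduced here; the data and true parameters below are hypothetical.

def hill(log_conc, e0, emax, log_ec50, h):
    """Four-parameter Hill model on log10 concentration."""
    return e0 + (emax - e0) / (1.0 + 10.0 ** (h * (log_ec50 - log_conc)))

rng = np.random.default_rng(1)
log_conc = np.tile(np.linspace(-9, -4, 8), 3)            # 8 concentrations x 3 replicates
true = dict(e0=0.0, emax=-80.0, log_ec50=-6.0, h=1.2)    # hypothetical toxic response
resp = hill(log_conc, **true) + rng.normal(0, 5, log_conc.size)

popt, pcov = curve_fit(hill, log_conc, resp, p0=[0.0, -50.0, -6.5, 1.0])
se = np.sqrt(np.diag(pcov))
print("emax estimate: %.1f +/- %.1f" % (popt[1], se[1]))  # efficacy-related parameter
```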

A Quantitative Evaluation of ${\Delta}K_{eff}$ Estimation Methods Based on Random Loading Crack Growth Data. (랜덤하중하의 피로균열진전 데이터를 이용한 ${\Delta}K_{eff}$ 평가법의 정량적 평가)

  • Koo, Ja-Suk;Song, Ji-Ho;Kang, Jae-Youn
    • Proceedings of the KSME Conference
    • /
    • 2004.11a
    • /
    • pp.208-213
    • /
    • 2004
  • Methods for estimating the effective stress intensity factor range (${\Delta}K_{eff}$) are evaluated using narrow- and wide-band random loading crack growth test data for 2024-T351 aluminum alloy. Three methods of determining $K_{op}$ (visual measurement, the ASTM offset compliance method, and the neural network method proposed by Kang and Song) and three methods of estimating ${\Delta}K_{eff}$ (the conventional method and the $2/\pi_0$ and $2/\pi$ methods proposed by Donald and Paris) are compared quantitatively by using the results of fatigue crack growth life prediction under random loading. For all of the $K_{op}$ determination methods discussed, the $2/\pi_0$ and $2/\pi$ methods of estimating ${\Delta}K_{eff}$ provide better results than the conventional method for both narrow- and wide-band random loading data.

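For reference, the conventional estimate compared in this paper defines the effective range from the crack opening level $K_{op}$, and crack growth life follows from integrating a Paris-type law (assumed here purely for illustration); the $2/\pi_0$ and $2/\pi$ methods reweight the contribution of the closure level, and their exact forms are not reproduced here.

```latex
% Conventional effective stress intensity factor range and the life
% integration used to compare the estimation methods (Paris-type law
% assumed for illustration only).
\Delta K_{\mathrm{eff}} = K_{\max} - K_{op}, \qquad
\frac{da}{dN} = C \left(\Delta K_{\mathrm{eff}}\right)^{m}, \qquad
N_f = \int_{a_0}^{a_f} \frac{da}{C \left(\Delta K_{\mathrm{eff}}(a)\right)^{m}}
```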

Research on the weld quality estimation system using fuzzy expert system (퍼지 전문가 시스템을 활용한 용접 품질 예측 시스템에 관한 연구)

  • 박주용;강병윤;박현철
    • Journal of Ocean Engineering and Technology
    • /
    • v.11 no.1
    • /
    • pp.36-43
    • /
    • 1997
  • Weld bead shape is an important measure for the evaluation of weld quality. Many welding parameters influence the weld bead shape; however, the quantitative relationship between welding parameters and bead shape has not yet been determined because of its high complexity and many unknown factors. A fuzzy expert system is an advanced expert system that uses fuzzy rules and approximate reasoning. It is a very useful tool for welding technology because it can rationally process uncertain and inexact information such as welding information. In this paper, the empirical and qualitative relationships between welding parameters and bead shape are analyzed and represented by fuzzy rules. They are converted into quantitative relationships by means of the approximate reasoning of the fuzzy expert system. The weld bead shape is estimated from the welding parameters using the fuzzy expert system. A comparison between bead values measured in welding experiments and those estimated by the fuzzy expert system shows good consistency.

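A minimal Mamdani-style sketch of the kind of inference the abstract describes is shown below: two hypothetical fuzzy rules map welding current and travel speed to an estimated bead width, with centroid defuzzification. The actual rule base, membership functions, and parameter ranges used in the paper are not given in the abstract.

```python
import numpy as np

# Minimal Mamdani-style fuzzy inference sketch: hypothetical rules map welding
# parameters (current, travel speed) to bead width. All ranges and rules below
# are illustrative assumptions, not the paper's rule base.

def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def estimate_bead_width(current_A, speed_cm_min):
    width = np.linspace(2.0, 12.0, 501)            # output universe [mm]
    # input membership degrees (hypothetical ranges)
    cur_low,  cur_high = tri(current_A, 100, 150, 200), tri(current_A, 150, 250, 300)
    spd_slow, spd_fast = tri(speed_cm_min, 10, 20, 35), tri(speed_cm_min, 20, 40, 60)
    r1 = min(cur_high, spd_slow)                   # rule 1: high current AND slow speed -> wide bead
    r2 = min(cur_low, spd_fast)                    # rule 2: low current AND fast speed -> narrow bead
    agg = np.maximum(np.minimum(r1, tri(width, 7, 10, 12)),
                     np.minimum(r2, tri(width, 2, 4, 6)))
    if agg.sum() == 0.0:
        return None                                # no rule fired
    return float((width * agg).sum() / agg.sum())  # centroid defuzzification

print(estimate_bead_width(current_A=230, speed_cm_min=18))
```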

A Quantitative Model for the Projection of Health Expenditure (의료비 결정요인 분석을 위한 계량적 모형 고안)

  • Kim, Han-Joong;Lee, Young-Doo;Nam, Chung-Mo
    • Journal of Preventive Medicine and Public Health
    • /
    • v.24 no.1 s.33
    • /
    • pp.29-36
    • /
    • 1991
  • Multiple regression analysis using ordinary least squares (OLS) is frequently used for the projection of health expenditure as well as for the identification of factors affecting health care costs. Data for such analyses often have the mixed characteristics of time series and cross-section data. In this case, the parameters obtained by OLS estimation are no longer the best linear unbiased estimators (BLUE), because the data do not satisfy the basic assumptions of regression analysis. This study theoretically examined the statistical problems induced when OLS estimation is applied to time series cross-section data. Then both OLS regression and time series cross-section regression (TSCS regression) were applied to the same empirical data. Finally, the differences in parameters between the two estimations were explained through residual analysis.

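The sketch below illustrates the statistical issue the abstract raises: with pooled time-series cross-section data, ordinary least squares ignores unit-specific effects and can give biased estimates relative to a within (fixed effects) panel estimator. The simulated data and single regressor are hypothetical, and the within estimator stands in for the paper's TSCS regression only as an illustration.

```python
import numpy as np

# Illustration: pooled OLS vs. a within (fixed effects) estimator on simulated
# time-series cross-section data where the regressor is correlated with
# unit-specific effects. All data below are hypothetical.

rng = np.random.default_rng(0)
n_units, n_years, beta = 30, 10, 1.5
unit = np.repeat(np.arange(n_units), n_years)
alpha = rng.normal(0, 3, n_units)                            # unit-specific effects
x = rng.normal(0, 1, n_units * n_years) + 0.8 * alpha[unit]  # x correlated with effects
y = alpha[unit] + beta * x + rng.normal(0, 1, x.size)

# Pooled OLS with an intercept
X = np.column_stack([np.ones_like(x), x])
b_ols = np.linalg.lstsq(X, y, rcond=None)[0][1]

# Within (fixed effects) estimator: demean x and y within each unit
def demean(v):
    means = np.array([v[unit == g].mean() for g in range(n_units)])
    return v - means[unit]

b_fe = np.linalg.lstsq(demean(x)[:, None], demean(y), rcond=None)[0][0]
print("pooled OLS: %.2f   within estimator: %.2f   true: %.2f" % (b_ols, b_fe, beta))
```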

A Study on the Validity of Government Cloud SaaS Service Migration using TCO Approach (TCO 접근방법을 통한 정부클라우드 SaaS 서비스 전환의 타당성에 관한 연구)

  • Yoon, Seong-Jeong;Kim, In-Hwan;Seo, Jung Wook;Kim, Min-Yong
    • Journal of Information Technology Services
    • /
    • v.11 no.4
    • /
    • pp.215-231
    • /
    • 2012
  • It is well known that a changeover to SaaS (Software as a Service) gives several advantages to an organization. One of these advantages is the reduction of the cost of IT resources as well as IT human resources. Another is the reduction of software development workload in informatization initiatives. Nonetheless, it is hard to find cases that quantitatively compare measurements taken before and after the introduction of SaaS. Accordingly, when the Government IDC tries to adopt SaaS, an empirical study of whether SaaS is cost-effective is absolutely necessary. In this study, we focus on variation in the Government's common administrative tasks, processes, and labor costs. Using Man-Month (MM) estimation methods, we verify how much TCO (Total Cost of Ownership) is reduced per year.
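A minimal sketch of the kind of TCO comparison the abstract describes is shown below: annual cost before and after a SaaS changeover, with labour expressed in Man-Months (MM). All cost figures and MM values are illustrative assumptions, not values from the paper.

```python
# Hypothetical sketch of an annual TCO comparison before and after a SaaS
# changeover, with labour estimated in Man-Months (MM). Every figure below
# is an illustrative assumption.

def annual_tco(man_months, cost_per_mm, infra_cost, software_cost):
    """Total cost of ownership per year = labour + infrastructure + software."""
    return man_months * cost_per_mm + infra_cost + software_cost

before = annual_tco(man_months=48, cost_per_mm=7_000_000,   # on-premise operation
                    infra_cost=250_000_000, software_cost=120_000_000)
after = annual_tco(man_months=20, cost_per_mm=7_000_000,    # after SaaS changeover
                   infra_cost=60_000_000, software_cost=180_000_000)

saving = before - after
print(f"annual saving: {saving:,} KRW ({100 * saving / before:.1f}% of previous TCO)")
```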

Inference for exponentiated Weibull distribution under constant stress partially accelerated life tests with multiple censored

  • Nassr, Said G.;Elharoun, Neema M.
    • Communications for Statistical Applications and Methods
    • /
    • v.26 no.2
    • /
    • pp.131-148
    • /
    • 2019
  • Constant stress partially accelerated life tests are studied under the exponentiated Weibull distribution. Based on multiple censoring, the maximum likelihood estimators are determined for the unknown distribution parameters and the acceleration factor. Confidence intervals for the unknown parameters and the acceleration factor are constructed for large sample sizes. However, it is not possible to obtain the Bayes estimates in closed form, so we apply a Markov chain Monte Carlo method to deal with this issue, which permits us to construct credible intervals for the associated parameters. Finally, based on the constant stress partially accelerated life test scheme with the exponentiated Weibull distribution under multiple censoring, an illustrative example and simulation results are used to investigate the maximum likelihood and Bayesian estimates of the unknown parameters.
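As a minimal sketch of the likelihood machinery involved, the code below computes maximum likelihood estimates for the exponentiated Weibull distribution with right-censored observations using SciPy. The paper additionally models a constant-stress acceleration factor under multiple censoring and obtains Bayes estimates via MCMC; none of that is reproduced here, and the simulated data are purely illustrative.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import exponweib

# Minimal sketch: MLE for the exponentiated Weibull distribution with
# right-censored data. The acceleration factor and MCMC-based Bayes estimates
# from the paper are not reproduced; all data below are simulated.

def neg_log_lik(params, t, delta):
    """params = (alpha, beta, scale); delta = 1 for failures, 0 for censored."""
    alpha, beta, scale = params
    if min(alpha, beta, scale) <= 0:
        return np.inf
    logf = exponweib.logpdf(t, alpha, beta, loc=0, scale=scale)   # failure contribution
    logS = exponweib.logsf(t, alpha, beta, loc=0, scale=scale)    # censored contribution
    return -(delta * logf + (1 - delta) * logS).sum()

rng = np.random.default_rng(2)
t_true = exponweib.rvs(1.5, 2.0, loc=0, scale=100.0, size=200, random_state=rng)
censor = rng.uniform(50, 300, size=200)           # random right-censoring times
t = np.minimum(t_true, censor)
delta = (t_true <= censor).astype(float)

fit = minimize(neg_log_lik, x0=[1.0, 1.0, np.median(t)], args=(t, delta),
               method="Nelder-Mead")
print("alpha, beta, scale MLE:", np.round(fit.x, 3))
```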

Construction and Estimation of Web-based Real Time ERP System - A Case Study (웹기반 실시간 ERP 시스템 구축 및 평가-사례연구)

  • Kim, Jae-Saeng;Choi, Sang-Kyoon
    • The Journal of the Korea Contents Association
    • /
    • v.8 no.4
    • /
    • pp.30-38
    • /
    • 2008
  • In the past, smaller enterprises operated separate business automation systems for each office function; at present, they make use of integrated ERP information systems that take advantage of web technology accessible from outside the company. In this research, taking a smaller enterprise as a case study, we constructed an ERP system with real-time processing, so that ERP information can be supplied to customers in real time and business can be handled from outside sites. Based on the constructed system, we verified its efficiency and customer satisfaction, covering the resolution of customer complaints, improvement in business productivity, cost savings, and so on, through various kinds of quantitative estimation.