• Title/Summary/Keyword: component reliability function

Search results: 92

A Method for Reliability Analysis of Armored Fighting Vehicle using RBD based on Integrated Hit Probabilities of Crews and Components (통합 피격 확률 분석을 이용한 RBD 기반의 전차 신뢰도 분석 방법)

  • Hwang, Hun-Gyu;Kang, Ji-Won;Lee, Jang-Se
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.20 no.5
    • /
    • pp.1040-1048
    • /
    • 2016
  • Recently, studies on integrated reliability analysis of combat systems have been progressing actively, with particular emphasis on overcoming the limitations of previous work. In this paper, we propose a technique for calculating an integrated hit probability from the front and side hit probabilities analyzed in previous studies, in order to improve time-effectiveness. We then derive the integrated reliability of each component from the calculated integrated hit probability, and propose a method that applies the reliability block diagram (RBD) technique to analyze the reliability of the whole combat system in terms of function kills. To verify the proposed method, we applied it to an armored fighting vehicle model. The proposed method considers the crew, an element not considered in the previous study, and is expected to enhance both the accuracy and the time-effectiveness of reliability analysis compared with the previous study.
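The RBD composition this abstract describes can be sketched as follows; the block structure and all component reliabilities below are invented for illustration and are not taken from the paper.

```python
# Hypothetical RBD sketch: component reliabilities (assumed already discounted
# by per-component hit probabilities) combined through series/parallel blocks.
def series(rels):
    """Reliability of blocks in series: all must survive."""
    r = 1.0
    for x in rels:
        r *= x
    return r

def parallel(rels):
    """Reliability of redundant blocks: at least one must survive."""
    q = 1.0
    for x in rels:
        q *= (1.0 - x)
    return 1.0 - q

# Invented example: engine and fire control in series, redundant radios in parallel.
engine, fire_control = 0.95, 0.90
radios = parallel([0.80, 0.80])                  # 1 - 0.2 * 0.2 = 0.96
system = series([engine, fire_control, radios])
print(round(system, 4))                          # 0.8208
```

The same two combinators, applied to the vehicle's real block diagram, would yield the function-kill reliability of the whole system.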

Generalizability of Polygraph Test Procedures using Backster ZCT: Changes in reliability as a function of the number of relevant questions, the number of repeated tests, and the number of raters (Backster ZCT를 사용한 폴리그라프 검사절차의 일반화가능도: 관련 질문의 개수, 반복측정 횟수, 채점자의 수에 따른 신뢰도의 변화)

  • Eom, Jin-Sup;Han, Yu-Hwa;Ji, Hyung-Ki;Park, Kwang-Bai
    • Science of Emotion and Sensibility
    • /
    • v.11 no.4
    • /
    • pp.553-564
    • /
    • 2008
  • Generalizability theory was employed to examine how the reliability of the polygraph test is affected by the number of relevant questions, the number of repeated tests (the number of charts), and the number of raters (scorers). The data consisted of the results of polygraph tests administered to 31 crime suspects. The sample was drawn from real polygraph tests based on the Backster ZCT and archived by the Prosecutor's Office of the Republic of Korea. The numerical scores assigned by thirteen raters to the test charts were analyzed to determine the generalizability of the scores. The largest variance component was accounted for by the examinee factor (43.97%), and the residual variance component was 16.84% of the total variance. The variance component due to the interaction between the examinee and chart factors was 12.17%, and the variance component due to the three-way interaction of the examinee, repeated test, and relevant question factors was 10.31%. The generalizability coefficient for the measurement procedure as currently practiced by the Korean Prosecutor's Office was 0.74, which suggests that the current procedure is acceptable. However, measurement procedures combining more than two relevant questions, more than three repeated tests, and more than two raters were generally found to yield generalizability coefficients larger than 0.80. Therefore, such procedures deserve serious consideration as a way to significantly improve the reliability of the polygraph test.


A Study on Failure Mode Analysis of Machining Center (머시닝센터의 고장모드 해석에 관한 연구)

  • Kim, Bong-Suk;Kim, Jong-Soo;Lee, Soo-Hun;Song, Jun-Yeup;Park, Hwa-Young
    • Journal of the Korean Society for Precision Engineering
    • /
    • v.18 no.6
    • /
    • pp.74-79
    • /
    • 2001
  • In this study, a failure mode analysis of a CNC machining center is described. First, the system is decomposed through subsystems into components using part lists and drawings. Component failure rate and failure mode analyses are then performed against a reliability database to identify the weak components of the machining center. The failure probability function of each mechanical part is modeled by a Weibull distribution, and the Kolmogorov-Smirnov test is used to verify the goodness of fit.
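The Weibull-fit-plus-Kolmogorov-Smirnov workflow the abstract describes can be sketched with SciPy; the failure times below are synthetic, not the paper's data.

```python
# Illustrative sketch: fit a Weibull distribution to (synthetic) failure times
# and check goodness of fit with the Kolmogorov-Smirnov test.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Synthetic lifetimes drawn from a Weibull with shape 1.5, scale 1000 h (assumed).
failure_times = stats.weibull_min.rvs(c=1.5, scale=1000.0, size=50, random_state=rng)

# Fit with the location fixed at zero, as is common for lifetime data.
shape, loc, scale = stats.weibull_min.fit(failure_times, floc=0)

# KS statistic against the fitted distribution; a large p-value means the
# Weibull fit is not rejected (note the p-value is optimistic when parameters
# are estimated from the same data).
d_stat, p_value = stats.kstest(failure_times, 'weibull_min', args=(shape, loc, scale))
print(shape, scale, d_stat)
```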


Development of Maintenance Effectiveness Monitoring Program based on Design Characteristics for New Nuclear Power Plant (신규원전의 설계특성 기반 정비효과성감시 프로그램 개발)

  • Yeom, Dong-Un;Hyun, Jin-Woo;Song, Tae-Young
    • Transactions of the Korean Society of Pressure Vessels and Piping
    • /
    • v.8 no.1
    • /
    • pp.25-32
    • /
    • 2012
  • Korea Hydro & Nuclear Power Co. (KHNP) has developed and implemented maintenance effectiveness monitoring (MR) programs for its operating nuclear power plants. An MR program monitors plant performance to improve safety and reliability, and is developed by reflecting the design characteristics of each operating plant. Recently, KHNP built a new nuclear power plant and developed an MR program that establishes an advanced maintenance system by reflecting design characteristics unique to the plant relative to the OPR1000 standard model. The MR program developed in this study therefore differs from the OPR1000 standard model, and its suitability will be verified by evaluating the initial performance of the plant. The safety and reliability of the new plant are expected to improve through the development and implementation of the MR program.

An Application of the HRA Methodology in PSA: A Gas Valve Station (PSA의 인간신뢰도분석 모델의 적용)

  • 제무성
    • Journal of the Korean Society of Safety
    • /
    • v.15 no.4
    • /
    • pp.150-156
    • /
    • 2000
  • In this paper, the human error contributions to system unavailability are calculated and compared with the mechanical failure contributions. The system unavailability is the probability that a system is in the failed state at time t, given that it was in the normal state at time zero. It is a function of human errors committed during maintenance and tests, component failure rates, surveillance test intervals, and allowed outage time. The THERP (Technique for Human Error Rate Prediction), generally called the "HRA handbook", is used here for evaluating human error rates. This method treats the operator as one of the system components, so human reliability is assessed in the same manner as component reliability. Based on the calculation results, the human error contribution to system unavailability is shown to be more important than the mechanical failure contribution in the example system. It is also demonstrated that the method is flexible in that it can be applied to any hazardous facility, such as gas valve stations and chemical process plants.
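A back-of-envelope sketch of the unavailability composition this abstract describes: the mechanical term uses the standard λT/2 approximation for a periodically tested standby component, and every number below is an assumption, not a value from the paper.

```python
# Toy comparison of mechanical vs. human-error contributions to unavailability.
failure_rate = 1e-5        # failures per hour (assumed)
test_interval = 720.0      # hours between surveillance tests (assumed)
human_error_prob = 5e-3    # per-maintenance error probability (assumed, THERP-style)

# Average unavailability from random failures between tests: ~ lambda * T / 2.
u_mechanical = failure_rate * test_interval / 2.0
# A maintenance error that leaves the component disabled until the next test
# contributes its probability directly.
u_human = human_error_prob
u_total = u_mechanical + u_human
print(u_mechanical, u_human, u_total)   # 0.0036, 0.005, 0.0086
```

With these invented numbers the human term dominates, echoing the paper's conclusion for its example system.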


Bayesian Approach for Software Reliability Models (소프트웨어 신뢰모형에 대한 베이지안 접근)

  • Choi, Ki-Heon
    • Journal of the Korean Data and Information Science Society
    • /
    • v.10 no.1
    • /
    • pp.119-133
    • /
    • 1999
  • A Markov chain Monte Carlo method is developed to compute software reliability models. We consider the computational problem of determining the posterior distribution in Bayesian inference. Metropolis algorithms, along with Gibbs sampling, are proposed to perform Bayesian inference for the mixed model with record value statistics, relaxing the monotonic intensity function assumption. For model determination, we explore the prequential conditional predictive ordinate criterion, which selects the model with the largest posterior likelihood among models formed from all possible subsets of the component intensity functions. A numerical example with a simulated data set is given.
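The Metropolis step underlying such MCMC schemes can be sketched on a much simpler toy model (a constant failure rate with a flat prior, not the paper's mixed record-value model); all data and tuning values below are invented.

```python
# Minimal random-walk Metropolis sketch: posterior of a failure rate lambda
# given exponential inter-failure times, flat prior on lambda > 0.
import math, random

random.seed(0)
times = [random.expovariate(0.5) for _ in range(30)]   # synthetic data, true rate 0.5
n, s = len(times), sum(times)

def log_post(lam):
    if lam <= 0:
        return float('-inf')
    return n * math.log(lam) - lam * s   # log-likelihood (flat prior adds a constant)

lam, samples = 1.0, []
for i in range(20000):
    prop = lam + random.gauss(0, 0.1)    # symmetric random-walk proposal
    # Accept with probability min(1, posterior ratio).
    if math.log(random.random()) < log_post(prop) - log_post(lam):
        lam = prop
    if i >= 5000:                        # discard burn-in
        samples.append(lam)

post_mean = sum(samples) / len(samples)
print(round(post_mean, 2))               # close to the MLE n / s
```

In the paper's setting the same accept/reject step runs inside a Gibbs cycle over the mixed model's parameters.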


Study on Algorithm of Micro Surface Roughness Measurement Using Laser Reflectance Light (레이저 반사광을 이용한 미세 표면 거칠기 측정 알고리즘에 관한 연구)

  • Choi, Gyu-Jong;Kim, Hwa-Young;Ahn, Jung-Hwan
    • Transactions of the Korean Society of Mechanical Engineers A
    • /
    • v.32 no.4
    • /
    • pp.347-353
    • /
    • 2008
  • According to light reflectance theory and experiments, reflected light can be decomposed into specular and diffuse components. The specular component appears mainly on smooth surfaces, while the diffuse component dominates on rough surfaces. Each component can therefore be correlated to surface roughness, but neither correlation can represent the whole roughness range seamlessly, because each formulation is valid only within its own roughness region. To solve this problem, new approaches that properly blend the two light components across all regions are proposed in this paper. The first is a weighting function method in which the blending zone and rate can be flexibly adjusted, and the second is a neural network method based on learning from the measurement data. Simulations based on light reflectance theory were conducted to examine performance, and experiments then confirmed the improvement in measurement accuracy and reliability across the whole surface roughness range.
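The weighting-function idea can be sketched generically; the sigmoid weight, the transition parameters, and the two stand-in correlation models below are all invented for illustration, not the paper's formulas.

```python
# Hypothetical blend of two roughness estimates: one valid for smooth surfaces
# (specular-based) and one for rough surfaces (diffuse-based).
import math

def blended_roughness(ra, center=0.4, width=0.1):
    """Weight shifts from the specular-based estimate to the diffuse-based one
    as roughness `ra` crosses `center`; `width` sets the blending zone."""
    w = 1.0 / (1.0 + math.exp((ra - center) / width))  # ~1 on smooth surfaces
    est_specular = ra * 1.02      # stand-in correlation models (invented)
    est_diffuse = ra * 0.98
    return w * est_specular + (1.0 - w) * est_diffuse

print(blended_roughness(0.1), blended_roughness(0.8))
```

Adjusting `center` and `width` changes where and how quickly the estimate hands over from one correlation to the other, which is the flexibility the abstract highlights.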

Two-Layer Approach Using FTA and BBN for Reliability Analysis of Combat Systems (전투 시스템의 신뢰성 분석을 위한 FTA와 BBN을 이용한 2계층 접근에 관한 연구)

  • Kang, Ji-Won;Lee, Jang-Se
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.23 no.3
    • /
    • pp.333-340
    • /
    • 2019
  • A combat system performs a given mission while enduring various threats. Analyzing the reliability of combat systems is important for increasing their ability to perform a given mission. Most previous studies considered no threat or only one threat, and did not analyze all the dependent relationships among components. In this paper, we analyze the loss probability of each function of the combat system and use it to analyze reliability. The proposed method is divided into two layers, a lower layer and an upper layer. In the lower layer, the failure probability of each component is derived using FTA so that various threats are considered. In the upper layer, the loss probability of each function is analyzed using the component failure probabilities derived from the lower layer together with a BBN, in order to account for the dependent relationships among components. Using the proposed method, it is possible to perform an analysis that considers both various threats and the dependencies among components.
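The two-layer idea can be illustrated with a minimal toy network; the structure and all probabilities below are invented, not the paper's model.

```python
# Toy two-layer sketch: the "lower layer" supplies per-component failure
# probabilities (as FTA would), the "upper layer" combines them through a
# conditional dependency (as a BBN would) to get a function-loss probability.
p_power = 0.02                      # P(power supply fails), from the lower layer
p_sensor_given_power_ok = 0.05      # sensor can still fail with power available
p_sensor_given_power_fail = 1.0     # sensor cannot work without power

# The function is lost if the sensor fails; marginalize over the power node.
p_function_loss = (p_power * p_sensor_given_power_fail
                   + (1 - p_power) * p_sensor_given_power_ok)
print(round(p_function_loss, 4))    # 0.069
```

Treating the two components as independent would understate the loss probability; the conditional table is what captures the dependency the upper layer exists to model.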

A new methodology of the development of seismic fragility curves

  • Lee, Young-Joo;Moon, Do-Soo
    • Smart Structures and Systems
    • /
    • v.14 no.5
    • /
    • pp.847-867
    • /
    • 2014
  • There are continuous efforts to mitigate structural losses from earthquakes and manage risk through seismic risk assessment; seismic fragility curves are widely accepted as an essential tool of such efforts. Seismic fragility curves can be classified into four groups based on how they are derived: empirical, judgmental, analytical, and hybrid. Analytical fragility curves are the most widely used and can be further categorized into two subgroups, depending on whether an analytical function or a simulation method is used. Although both methods have shown decent performance for many seismic fragility problems, they often oversimplify the given problems in the reliability or structural analyses owing to their built-in assumptions. In this paper, a new method is proposed for the development of seismic fragility curves. Integration with sophisticated software packages for reliability analysis (FERUM) and structural analysis (ZEUS-NL) allows the new method to obtain more accurate seismic fragility curves at lower computational cost. Because the proposed method performs reliability analysis using the first-order reliability method, it provides component probabilities as well as useful byproducts, and allows further fragility analysis at the system level. The new method was applied to a numerical example of a 2D frame structure, and the results were compared with those obtained by Monte Carlo simulation. The method was found to generate seismic fragility curves more accurately and efficiently. In addition, the effect of system reliability analysis on the development of seismic fragility curves was investigated using the given numerical example, and its necessity was discussed.
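For orientation, a fragility curve in its most common closed form can be sketched as a lognormal CDF in the intensity measure; the median and dispersion below are invented, and this is the simple shape that FORM- or simulation-based methods like the paper's refine, not the paper's own model.

```python
# Lognormal fragility sketch: P(damage-state exceedance | intensity measure im).
import math

def fragility(im, median=0.5, beta=0.4):
    """Standard normal CDF of ln(im / median) / beta (im e.g. PGA in g;
    median and beta are assumed values, not calibrated)."""
    return 0.5 * (1.0 + math.erf(math.log(im / median) / (beta * math.sqrt(2.0))))

for im in (0.2, 0.5, 1.0):
    print(im, round(fragility(im), 3))
```

By construction the exceedance probability is exactly 0.5 at the median intensity and increases monotonically with `im`.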

An Application of Dirichlet Mixture Model for Failure Time Density Estimation to Components of Naval Combat System (디리슈레 혼합모형을 이용한 함정 전투체계 부품의 고장시간 분포 추정)

  • Lee, Jinwhan;Kim, Jung Hun;Jung, BongJoo;Kim, Kyeongtaek
    • Journal of Korean Society of Industrial and Systems Engineering
    • /
    • v.42 no.4
    • /
    • pp.194-202
    • /
    • 2019
  • Reliability analysis of components frequently starts with the data that the manufacturer provides. If enough failure data are collected from field operations, the reliability should be recomputed and updated on the basis of the field failure data. However, when the failure time record for a component contains only a few observations, all statistical methodologies are limited. In this case, where failure records for multiple identical components are available, a valid alternative is to combine all the data from each component into one data set with an adequate sample size and to utilize the useful information in the censored data. The ROK Navy has been operating multiple Patrol Killer Guided missiles (PKGs) for several years. The Korea Multi-Function Control Console (KMFCC) is one of the key components of the PKG combat system. The maintenance record for the KMFCC contains fewer than ten failure observations and a censored datum. This paper proposes a Bayesian approach with a Dirichlet mixture model to estimate the failure time density for the KMFCC. Trend tests for each component record indicated that the null hypothesis that failure occurrence is a renewal process is not rejected. Since the KMFCCs have been functioning under different operating environments, the failure time distribution may be a composition of a number of unknown distributions, i.e., a mixture distribution, rather than a single distribution. The Dirichlet mixture model was coded as a probabilistic program in Python using PyMC3, and the Markov chain Monte Carlo (MCMC) sampling technique employed in PyMC3 estimated the posterior distributions of the parameters. The simulation results revealed that the mixture models provide superior fits to the combined data set over single models.
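Why a mixture fits pooled data from heterogeneous environments better than a single distribution can be shown with a toy likelihood comparison; the data are synthetic and the components are simple exponentials at invented parameters, not the paper's Dirichlet mixture fitted via PyMC3/MCMC.

```python
# Toy sketch: pooled lifetimes from two environments vs. single-distribution fit.
import math, random

random.seed(1)
# Pooled sample: half "harsh" environment (mean life 10 h), half "benign" (1000 h).
data = [random.expovariate(1 / 10) for _ in range(50)] + \
       [random.expovariate(1 / 1000) for _ in range(50)]

def loglik_exp(x, mean):
    """Log-likelihood of a single exponential with the given mean life."""
    lam = 1.0 / mean
    return sum(math.log(lam) - lam * xi for xi in x)

def loglik_mix(x, w, m1, m2):
    """Log-likelihood of a two-component exponential mixture."""
    return sum(math.log(w * math.exp(-xi / m1) / m1
                        + (1 - w) * math.exp(-xi / m2) / m2) for xi in x)

single = loglik_exp(data, sum(data) / len(data))   # single exponential at its MLE mean
mix = loglik_mix(data, 0.5, 10.0, 1000.0)          # mixture at the true components
print(single < mix)                                # True: the mixture fits better
```

The paper's approach goes further by letting MCMC infer the number, weights, and parameters of the components, and by incorporating the censored observation.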