• Title/Summary/Keyword: Software failure reliability model

Bayesian Estimation for Inflection S-shaped Software Reliability Growth Model (변곡 S-형 소프트웨어 신뢰도성장모형의 베이지안 모수추정)

  • Kim, Hee-Soo;Lee, Chong-Hyung;Park, Dong-Ho
    • Journal of Korean Society for Quality Management
    • /
    • v.37 no.4
    • /
    • pp.16-22
    • /
    • 2009
  • The inflection S-shaped software reliability growth model (SRGM) proposed by Ohba (1984) is one of the most commonly used models and has been discussed by many authors. The main purpose of this paper is to estimate the parameters of Ohba's SRGM within the Bayesian framework by applying Markov chain Monte Carlo techniques. While the maximum likelihood estimates for these parameters are well known, the Bayesian method for the inflection S-shaped SRGM has not been discussed in the literature. The proposed methods can be quite flexible depending on the choice of prior distributions for the parameters of interest. We also compare the Bayesian methods with the maximum likelihood method numerically based on real data.
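
The abstract gives no implementation details; the following minimal sketch, under stated assumptions, shows what a Metropolis-Hastings sampler for Ohba's inflection S-shaped SRGM can look like. The mean value function m(t) = a(1 - e^(-bt))/(1 + beta·e^(-bt)), the exponential priors, the log-normal random-walk proposal, and the failure-time data are all assumptions for illustration, not taken from the paper.

```python
import numpy as np

# Inflection S-shaped SRGM (Ohba, 1984): mean value and intensity functions.
def mean_value(t, a, b, beta):
    return a * (1.0 - np.exp(-b * t)) / (1.0 + beta * np.exp(-b * t))

def intensity(t, a, b, beta):
    e = np.exp(-b * t)
    return a * b * (1.0 + beta) * e / (1.0 + beta * e) ** 2

# NHPP log-likelihood for failure times t_1 < ... < t_n observed up to time T.
def log_likelihood(times, T, a, b, beta):
    return np.sum(np.log(intensity(times, a, b, beta))) - mean_value(T, a, b, beta)

# Vague exponential priors keeping all parameters positive (an assumption).
def log_prior(a, b, beta):
    if a <= 0 or b <= 0 or beta <= 0:
        return -np.inf
    return -1e-3 * a - b - 1e-2 * beta

def metropolis(times, T, n_iter=20000, step=0.1, seed=1):
    rng = np.random.default_rng(seed)
    theta = np.array([1.5 * len(times), 0.01, 1.0])      # initial (a, b, beta)
    log_post = log_likelihood(times, T, *theta) + log_prior(*theta)
    samples = []
    for _ in range(n_iter):
        prop = theta * np.exp(step * rng.standard_normal(3))   # positive-valued walk
        lp = log_likelihood(times, T, *prop) + log_prior(*prop)
        # The log-normal proposal needs the Jacobian term sum(log(prop / theta)).
        if np.log(rng.random()) < lp - log_post + np.sum(np.log(prop / theta)):
            theta, log_post = prop, lp
        samples.append(theta.copy())
    return np.array(samples)

# Hypothetical failure times (hours) observed over T = 250 hours of testing.
data = np.array([10., 25., 43., 62., 80., 105., 131., 160., 195., 230.])
posterior = metropolis(data, T=250.0)
print(posterior[len(posterior) // 2:].mean(axis=0))   # crude posterior means of (a, b, beta)
```

The second half of the chain is treated as post-burn-in here; the paper's actual MCMC setup (priors, proposals, diagnostics) may differ.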

The Comparative Study of Software Optimal Release Time Based on Intensity Function property (강도함수 특성에 근거한 소프트웨어 최적 방출시기에 관한 비교 연구)

  • Kim, Hee-Cheul;Park, Hyoung-Keun
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.11 no.4
    • /
    • pp.1239-1247
    • /
    • 2010
  • In this paper, we study the decision problem of determining optimal release policies, that is, when to stop testing a software system in the development phase and transfer it to the user. The release-time model is based on an infinite-failure non-homogeneous Poisson process, which reflects the possibility of introducing new faults when correcting or modifying the software. Gompertz, Pareto, and log-logistic intensity functions, each with distinct properties, are considered. The optimal release policy is the one that minimizes the total average cost of software development and maintenance under the constraint that a software reliability requirement is satisfied. In a numerical example, a trend test is applied to inter-failure time data, the parameters are estimated by maximum likelihood, and the optimal software release time is estimated.
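
The abstract names the cost-versus-reliability trade-off but not the exact cost function. The sketch below uses a commonly cited release-cost form C(T) = c1·m(T) + c2·[m(t_life) - m(T)] + c3·T and a log-logistic-style infinite-failure mean value function m(t) = theta·ln(1 + (t/alpha)^beta); the cost and model parameters are purely hypothetical, and a simple grid search finds the cost-minimizing release time.

```python
import numpy as np

# Hypothetical infinite-failure NHPP mean value function built from a
# log-logistic cumulative hazard: m(t) = theta * ln(1 + (t/alpha)**beta).
def m(t, theta=30.0, alpha=100.0, beta=1.5):
    return theta * np.log1p((t / alpha) ** beta)

# A commonly used release-cost formulation (not necessarily the paper's):
#   c1: cost of fixing a fault during testing
#   c2: cost of fixing a fault after release (c2 > c1)
#   c3: testing cost per unit time, t_life: software life-cycle length
def total_cost(T, c1=1.0, c2=5.0, c3=0.2, t_life=2000.0):
    return c1 * m(T) + c2 * (m(t_life) - m(T)) + c3 * T

# Grid search for the cost-minimizing release time.
grid = np.linspace(1.0, 2000.0, 20000)
costs = total_cost(grid)
T_opt = grid[np.argmin(costs)]
print(f"optimal release time ~ {T_opt:.0f} h, minimum cost ~ {costs.min():.1f}")
```

With these illustrative numbers the minimum falls in the interior of the testing horizon; in the paper the same kind of search would be run with Gompertz, Pareto, and log-logistic intensities fitted to the data.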

A Study on the Attributes of Software Reliability Cost Model with Shape Parameter Change of Type-2 Gumbel Life Distribution (Type-2 Gumbel 수명분포의 형상모수 변화에 따른 소프트웨어 신뢰성 비용모형의 속성에 관한 연구)

  • Yang, Tae-Jin
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.12 no.3
    • /
    • pp.211-217
    • /
    • 2019
  • In this study, we compare and analyze the behavior of a software development cost model as the shape parameter of the Type-2 Gumbel lifetime distribution changes within an NHPP framework. To analyze software failure behavior, the parameters are estimated by maximum likelihood, and the resulting nonlinear equations are solved with the bisection method. Comparing the cost curves for different shape parameter values shows that the larger the shape parameter, the lower the software development cost and the earlier the release time. This study is expected to help software developers explore how development cost varies with the shape parameter and to provide the information needed to characterize the software development cost.
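
The abstract says the MLE equations are solved by bisection but does not show them. As a sketch, assume a finite-failure NHPP whose mean value function is theta·F(t) with the Type-2 Gumbel CDF F(t) = exp(-alpha·t^(-beta)); profiling out theta leaves a single score equation in alpha for each fixed shape beta, which the code below solves by bisection. The data, the profiling step, and the bracketing interval are assumptions, not the paper's actual equations.

```python
import numpy as np

# Hypothetical setup: finite-failure NHPP with Type-2 Gumbel lifetime CDF
# F(t) = exp(-alpha * t**(-beta)); failures t_1..t_n observed up to time T.
# With theta profiled out (theta_hat = n / F(T)), the score equation in the
# scale parameter alpha reduces to the expression below for a fixed shape beta.
def score_alpha(alpha, times, T, beta):
    n = len(times)
    return n / alpha + n * T ** (-beta) - np.sum(times ** (-beta))

# Plain bisection, since the abstract says the nonlinear MLE equations are
# solved with the bisection method.
def bisection(f, lo, hi, tol=1e-10, max_iter=200):
    flo = f(lo)
    for _ in range(max_iter):
        mid = 0.5 * (lo + hi)
        fmid = f(mid)
        if abs(fmid) < tol or hi - lo < tol:
            return mid
        if flo * fmid < 0:
            hi = mid
        else:
            lo, flo = mid, fmid
    return 0.5 * (lo + hi)

# Hypothetical cumulative failure times (hours).
times = np.cumsum(np.array([12.0, 18.0, 9.0, 30.0, 22.0, 15.0, 40.0, 27.0]))
T = times[-1] + 10.0

for beta in (1.0, 1.5, 2.0):                     # study the effect of shape change
    alpha_hat = bisection(lambda a: score_alpha(a, times, T, beta), 1e-6, 1e6)
    theta_hat = len(times) / np.exp(-alpha_hat * T ** (-beta))
    print(f"beta={beta}: alpha_hat={alpha_hat:.4g}, theta_hat={theta_hat:.4g}")
```

Running the solver for several beta values mirrors the paper's comparison of model behavior as the shape parameter changes.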

The Study for Process Capability Analysis of Software Failure Interval Time (소프트웨어 고장 간격 시간에 대한 공정능력분석에 관한 연구)

  • Kim, Hee-Cheul;Shin, Hyun-Cheul
    • Convergence Security Journal
    • /
    • v.7 no.2
    • /
    • pp.49-55
    • /
    • 2007
  • Software failure times reported in the literature exhibit constant, monotonically increasing, or monotonically decreasing patterns. For data analysis with software reliability models, trend analysis tools have been developed; the main methods are the arithmetic mean test and the Laplace trend test. Trend analysis, however, provides only a rough outline of the data, so a finer-grained approach from the quality control side is needed. In this paper, we discuss process capability analysis using process capability indices. Because software failure interval times are nonnegative, a capability analysis of the process distribution based on a Box-Cox transformation is attempted instead of assuming a normal distribution. The software failure time data used for the process capability analysis is the SS3 data set; the results and their practical use are presented in Sections 4 and 5.
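
As a rough illustration of the capability analysis the abstract describes (not the SS3 data or the paper's index values), the sketch below Box-Cox-transforms hypothetical failure interval times and computes Cp and Cpk against equally hypothetical specification limits transformed with the same lambda.

```python
import numpy as np
from scipy import stats

# Hypothetical software failure interval times (hours); nonnegative and skewed.
intervals = np.array([3.0, 7.0, 2.0, 15.0, 9.0, 30.0, 5.0, 12.0, 21.0, 4.0,
                      18.0, 6.0, 26.0, 11.0, 8.0])

# Box-Cox transformation so that a normal-theory capability analysis applies.
transformed, lam = stats.boxcox(intervals)

# Apply the same Box-Cox transform to the (hypothetical) specification limits.
def bc(x, lam):
    return np.log(x) if abs(lam) < 1e-8 else (x ** lam - 1.0) / lam

lsl_raw, usl_raw = 1.0, 40.0
lsl, usl = bc(lsl_raw, lam), bc(usl_raw, lam)

mu, sigma = transformed.mean(), transformed.std(ddof=1)
cp  = (usl - lsl) / (6.0 * sigma)
cpk = min(usl - mu, mu - lsl) / (3.0 * sigma)
print(f"lambda={lam:.3f}, Cp={cp:.3f}, Cpk={cpk:.3f}")
```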

The Comparative Study of Software Optimal Release Time Based on Gamma Exponential and Non-exponential Family Distribution Model (지수 및 비지수족 분포 모형에 근거한 소프트웨어 최적방출시기에 관한 비교 연구)

  • Kim, Hee-Cheul;Shin, Hyun-Cheul
    • Journal of the Korea Society of Computer and Information
    • /
    • v.15 no.5
    • /
    • pp.125-132
    • /
    • 2010
  • The decision problem of determining optimal release policies, that is, when to stop testing a software system in the development phase and transfer it to the user, is studied. The release-time model is based on an infinite-failure non-homogeneous Poisson process, which reflects the possibility of introducing new faults when correcting or modifying the software. The failure life-cycle distributions are taken from exponential and non-exponential families with various intensity functions. The optimal release policy is the one that minimizes the total average cost of software development and maintenance under the constraint that a software reliability requirement is satisfied. In a numerical example, a trend test is applied to inter-failure time data, the parameters are estimated by maximum likelihood, and the optimal software release time is estimated.
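
Alongside the cost criterion, a release time can also be driven by the reliability requirement itself. The sketch below assumes the Musa-Okumoto logarithmic Poisson model (a standard infinite-failure NHPP, chosen here only for illustration) with hypothetical parameters, and scans for the earliest release time T at which the conditional reliability R(x|T) = exp(-[m(T+x) - m(T)]) meets a target R0.

```python
import numpy as np

# Musa-Okumoto logarithmic Poisson mean value function (illustrative parameters).
def m(t, lam0=5.0, theta=0.05):
    return np.log1p(lam0 * theta * t) / theta

# Conditional software reliability for a mission of length x after release at T.
def reliability(T, x=10.0):
    return np.exp(-(m(T + x) - m(T)))

# Smallest release time meeting a reliability requirement R0 (simple scan).
R0 = 0.95
grid = np.linspace(0.0, 5000.0, 500001)
ok = reliability(grid) >= R0
T_req = grid[np.argmax(ok)] if ok.any() else None
print(f"earliest release time meeting R(x|T) >= {R0}: {T_req}")
```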

A Study on the Reliability Prediction about ECM of Packaging Substrate PCB by Using Accelerated Life Test (가속수명시험을 이용한 Packaging Substrate PCB의 ECM에 대한 신뢰성 예측에 관한 연구)

  • Kang, Dae-Joong;Lee, Hwa-Ki
    • Journal of the Korea Safety Management & Science
    • /
    • v.15 no.1
    • /
    • pp.109-120
    • /
    • 2013
  • As the information industry has developed and electronic devices have become smaller, lighter, multifunctional, and faster, the components used in these devices require much higher density and must have fine patterns due to high integration. Diverse reliability problems also arise as user environments become harsher. For these reasons, establishing and securing the reliability of products and components has become a key factor in a company's competitiveness, which makes accelerated testing important for verifying product reliability quickly. Among fine-pattern failure modes, electrochemical migration (ECM) is a degradation of insulation resistance caused by electrochemical reactions, and it is accelerated by bias voltage in high-temperature, high-humidity environments. In this study, an accelerated life test for ECM failure was performed on a fine-pattern substrate with a $20/20{\mu}m$ pattern width/space fabricated by the Semi Additive Process; through this test, the failure mechanism was investigated and the lifetime under the actual user environment was predicted. The accelerated test results were compared and evaluated against candidate life distributions and life-stress relationships using Minitab software, and the acceleration rate was also tested. From the estimated Weibull distribution, the B10 life was estimated at a 95% confidence level from the failure data obtained under each test condition. The life in the actual usage environment was then predicted using a generalized Eyring model, which extends the Arrhenius reaction rate theory to account for both temperature and humidity, and the acceleration factors for the test conditions were calculated.
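
The abstract's pipeline, a Weibull life distribution combined with a temperature-humidity life-stress model, can be sketched as follows. The Peck-style acceleration factor used here is one common special case of the generalized Eyring model, and the Weibull estimates, activation energy Ea, humidity exponent n, and stress levels are hypothetical stand-ins rather than the paper's fitted values.

```python
import numpy as np

K_BOLTZ = 8.617e-5  # Boltzmann constant, eV/K

# Weibull B10 life: the time by which 10% of units are expected to fail.
def weibull_b10(eta, beta):
    return eta * (-np.log(0.90)) ** (1.0 / beta)

# Peck-style temperature-humidity acceleration factor, a common special case
# of the generalized Eyring model (hypothetical Ea and n; not from the paper).
def accel_factor(temp_use_c, rh_use, temp_test_c, rh_test, ea=0.8, n=2.7):
    t_use, t_test = temp_use_c + 273.15, temp_test_c + 273.15
    return ((rh_test / rh_use) ** n *
            np.exp(ea / K_BOLTZ * (1.0 / t_use - 1.0 / t_test)))

# Hypothetical Weibull estimates from an 85C / 85%RH test condition.
eta_test, beta_test = 1200.0, 2.1           # characteristic life (h), shape
b10_test = weibull_b10(eta_test, beta_test)

af = accel_factor(temp_use_c=30.0, rh_use=60.0, temp_test_c=85.0, rh_test=85.0)
print(f"B10 at test condition: {b10_test:.0f} h")
print(f"acceleration factor:   {af:.1f}")
print(f"B10 at use condition:  {b10_test * af:.0f} h")
```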

New paradigm of common cause human behavior error domain in human-software interaction

  • Park, P.;Lee, K.S.
    • Proceedings of the ESK Conference
    • /
    • 1992.10a
    • /
    • pp.84-89
    • /
    • 1992
  • This study develops a cognitive paradigm, including a new model of the common cause human behavior error domain, and analyzes the causal factors and properties of common cause human error characteristics in software engineering. A laboratory study was performed to analyze the common causes of human behavior domain errors in software development and to identify software design factors contributing to common cause effects in common cause failure redundancy. The results and the analytical paradigm developed in this research can be applied to reliability improvement and cost reduction in software development for many applications. The results are also expected to provide training guidelines for software engineers and to support more effective design of ultra-high-reliability software packages.

An Accelerated Life Test for Burnout of Tungsten Filament of Incandescent Lamp (텅스텐 백열전구의 필라멘트 단선에 대한 가속수명시험)

  • 이재국;김진우;신재철;김명수
    • Proceedings of the Korean Reliability Society Conference
    • /
    • 2004.07a
    • /
    • pp.129-137
    • /
    • 2004
  • This paper presents an accelerated life test for burnout of the tungsten filament of an incandescent lamp. Failure analyses of field samples show that the root causes are local heating or hot spots in the filament caused by tungsten evaporation and wire sag. Finite element analysis is performed to evaluate the effect of vibration and impact on burnout, but no points of stress concentration or structural weakness are found in the sample. To estimate the burnout life of the lamp, an accelerated life test is planned using quality function deployment and fractional factorial design, with voltage, vibration, and temperature selected as accelerating variables. A Weibull lifetime distribution and a generalized linear life-stress relationship are assumed and checked through goodness-of-fit tests and a test for a common shape parameter. Using accelerated life testing software, we estimate the common shape parameter of the Weibull distribution, the life-stress relationship, and the acceleration factor.
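
As a rough sketch of the per-stress-level Weibull fitting and acceleration-factor step: the paper uses dedicated ALT software and a generalized linear life-stress model, whereas the snippet below substitutes an inverse power law in voltage purely for illustration, with fabricated burnout times.

```python
import numpy as np
from scipy import stats

# Hypothetical filament burnout times (hours) at two elevated voltage levels.
life_130v = np.array([310., 420., 260., 515., 388., 470., 350., 605.])
life_140v = np.array([150., 210., 120., 260., 190., 230., 170., 300.])

# Fit a Weibull distribution (location fixed at 0) at each stress level.
c1, _, eta1 = stats.weibull_min.fit(life_130v, floc=0)
c2, _, eta2 = stats.weibull_min.fit(life_140v, floc=0)
print(f"shape at 130V: {c1:.2f}, shape at 140V: {c2:.2f}")  # roughly common shape?

# Inverse power law in voltage (a common filament life-stress assumption,
# hypothetical here): eta(V) = A * V**(-n).  Solve for n from the two levels.
n = np.log(eta1 / eta2) / np.log(140.0 / 130.0)
A = eta1 * 130.0 ** n

# Acceleration factor from the 140V test back to nominal 120V operation.
af = (140.0 / 120.0) ** n
print(f"power-law exponent n ~ {n:.1f}, AF(140V -> 120V) ~ {af:.1f}")
print(f"predicted characteristic life at 120V ~ {A * 120.0 ** (-n):.0f} h")
```

A full analysis would formally test whether the shape parameters at the two stress levels can be treated as common before pooling, as the abstract describes.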

The Comparative Study for NHPP Software Reliability Model based on the Property of Learning Effect of Log Linear Shaped Hazard Function (대수 선형 위험함수 학습효과에 근거한 NHPP 신뢰성장 소프트웨어 모형에 관한 비교 연구)

  • Kim, Hee-Cheul;Shin, Hyun-Cheul
    • Convergence Security Journal
    • /
    • v.12 no.3
    • /
    • pp.19-26
    • /
    • 2012
  • In this study, an NHPP software reliability model is examined from the perspective of the learning effects that software managers and testing tools acquire during the testing of software products. A log-linear hazard function is applied within a finite-failure NHPP. Models that account for both the autonomous error-detection factor and the learning factor gained through prior experience are presented and compared, so that the testing manager can set the error-detection factors precisely. The results confirm that models in which the learning factor exceeds the autonomous error-detection factor are generally more efficient. In the failure data analysis, time-between-failures data are used, the parameters are estimated by maximum likelihood, and, after a trend test confirms the suitability of the data, model selection is carried out using the mean squared error and $R^2$ (coefficient of determination).
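
The abstract selects among fitted models using the mean squared error and R^2 between observed and predicted cumulative failures. The sketch below shows that comparison for two generic finite-failure NHPP fits with made-up data and parameter values; the paper's own candidates are log-linear-hazard models with and without the learning factor.

```python
import numpy as np

# Model-selection criteria: mean squared error and coefficient of determination.
def mse_r2(observed, fitted):
    resid = observed - fitted
    ss_res = np.sum(resid ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    return ss_res / len(observed), 1.0 - ss_res / ss_tot

# Hypothetical observed cumulative failure counts at the recorded failure times.
t = np.array([5., 12., 20., 31., 45., 60., 80., 105., 135., 170.])
observed = np.arange(1, len(t) + 1, dtype=float)

# Two illustrative finite-failure NHPP fits (parameter values made up).
m_exp = 12.0 * (1.0 - np.exp(-0.012 * t))                      # exponential (Goel-Okumoto)
m_dss = 14.0 * (1.0 - (1.0 + 0.02 * t) * np.exp(-0.02 * t))    # delayed S-shaped

for name, fitted in [("exponential", m_exp), ("delayed S-shaped", m_dss)]:
    mse, r2 = mse_r2(observed, fitted)
    print(f"{name:18s} MSE={mse:.3f}  R^2={r2:.3f}")
```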

Structural reliability analysis using temporal deep learning-based model and importance sampling

  • Nguyen, Truong-Thang;Dang, Viet-Hung
    • Structural Engineering and Mechanics
    • /
    • v.84 no.3
    • /
    • pp.323-335
    • /
    • 2022
  • The main idea of the framework is to seamlessly combine a reasonably accurate and fast surrogate model with an importance sampling strategy. Developing a surrogate model for predicting structures' dynamic responses is challenging because it involves high-dimensional inputs and outputs. For this purpose, a novel surrogate model is designed based on cutting-edge deep learning architectures specialized for capturing temporal relationships within time-series data, namely the long short-term memory (LSTM) layer and the Transformer layer. After being properly trained, the surrogate model can be used in place of the finite element method to evaluate structures' responses without requiring any specialized software. Importance sampling, in turn, is adopted to reduce the number of calculations required when computing the failure probability by drawing more relevant samples near the critical region. Thanks to the portability of the trained surrogate model, it can be integrated with importance sampling in a straightforward fashion, forming an efficient framework called TTIS that offers two advantages: fewer calculations are needed, and the computational time of each calculation is significantly reduced. The proposed approach's applicability and efficiency are demonstrated through three examples of increasing complexity, involving a 1D beam, a 2D frame, and a 3D building structure. The results show that, compared with conventional Monte Carlo simulation, the proposed method provides very similar reliability results with a reduction of up to four orders of magnitude in time complexity.
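
The importance-sampling half of the framework can be sketched independently of the deep-learning surrogate: the analytic limit-state function g below merely stands in for the trained LSTM/Transformer surrogate, and the Gaussian shift to an assumed design point, the sample sizes, and the toy geometry are illustrative choices, not the paper's.

```python
import numpy as np

# Importance-sampling estimate of a failure probability P[g(X) <= 0], where the
# (here, analytic) limit-state function g stands in for the trained surrogate
# model described in the paper; everything below is an illustrative stand-in.
rng = np.random.default_rng(0)

def g(x):
    # Toy limit state in two standard-normal variables (not the paper's structures).
    return 4.0 - x[:, 0] - x[:, 1]

# Importance density: standard normal shifted to the assumed design point (2, 2).
mu = np.array([2.0, 2.0])
n = 20000
x = rng.standard_normal((n, 2)) + mu

# Likelihood ratio phi(x) / phi_shifted(x) for the Gaussian shift.
log_w = -0.5 * np.sum(x ** 2, axis=1) + 0.5 * np.sum((x - mu) ** 2, axis=1)
pf_is = np.mean((g(x) <= 0.0) * np.exp(log_w))

# Crude Monte Carlo for comparison (needs far more samples for the same accuracy).
x_mc = rng.standard_normal((n, 2))
pf_mc = np.mean(g(x_mc) <= 0.0)

print(f"importance sampling: {pf_is:.2e}   crude MC: {pf_mc:.2e}")
# Exact value for this toy case: Phi(-4/sqrt(2)) ~ 2.3e-3.
```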