• Title/Summary/Keyword: lognormal

Search Result 315

A Vtub-Shaped Hazard Rate Function with Applications to System Safety

  • Pham, Hoang / International Journal of Reliability and Applications / v.3 no.1 / pp.1-16 / 2002
  • In reliability engineering, bathtub-shaped hazard rates play an important role in survival analysis and many other applications. For the bathtub shape, the hazard rate initially decreases from a relatively high value, due to manufacturing defects or infant mortality, to a relatively stable value during the useful-life period, and then slowly increases with the onset of old age or wear-out. In this paper, we present a new two-parameter lifetime distribution function, called the Loglog distribution, with a Vtub-shaped hazard rate function. We illustrate the usefulness of the new Vtub-shaped hazard rate function by evaluating the reliability of several helicopter parts based on data obtained from the maintenance malfunction information reporting system database collected from October 1995 to September 1999. We develop an S-Plus add-in software tool, called Reliability and Safety Assessment (RSA), to calculate reliability measures including the mean time to failure, the mean residual life function, and confidence intervals for the two critical helicopter parts. We use the mean squared error to compare the relative goodness of fit of distribution models including the normal, lognormal, and Weibull within the two data sets. This research indicates that the new Vtub-shaped hazard rate function is worth the extra functional complexity for a better relative fit. Broader validation of this conclusion is needed using other data sets for reliability modeling in a general industrial setting.
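
As a rough illustration of the model comparison described above, the sketch below fits normal, lognormal, and Weibull distributions by maximum likelihood to a synthetic failure-time sample (a placeholder for the helicopter maintenance data, which are not reproduced here) and compares them by the mean squared error between the fitted and empirical CDFs, also reporting the fitted mean time to failure.

```python
import numpy as np
from scipy import stats

# Synthetic failure times standing in for the helicopter part data (assumption).
rng = np.random.default_rng(0)
times = np.sort(rng.weibull(1.5, size=40) * 1000.0)

# Empirical CDF evaluated at the observed failure times.
ecdf = (np.arange(1, len(times) + 1) - 0.5) / len(times)

# Candidate models fitted by maximum likelihood.
fits = {
    "normal":    stats.norm(*stats.norm.fit(times)),
    "lognormal": stats.lognorm(*stats.lognorm.fit(times, floc=0)),
    "weibull":   stats.weibull_min(*stats.weibull_min.fit(times, floc=0)),
}

# Mean squared error between fitted and empirical CDFs as a relative
# goodness-of-fit measure, in the spirit of the abstract.
for name, dist in fits.items():
    mse = np.mean((dist.cdf(times) - ecdf) ** 2)
    mttf = dist.mean()  # mean time to failure under the fitted model
    print(f"{name:9s}  MSE={mse:.5f}  MTTF={mttf:8.1f}")
```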


Probabilities of initiation of response modes of rigid bodies subjected to base excitations

  • Aydin, Kamil / Structural Engineering and Mechanics / v.23 no.5 / pp.505-523 / 2006
  • An unrestrained plane rigid body resting on a horizontal surface which shakes horizontally and vertically may assume one of five modes of response: rest, slide, slide-rock, rock, and free flight. The first four are nontrivial modes of motion. It is important to study which of these responses is initiated from rest since, in most studies, a particular mode of response is simply assumed as the initial mode. Criteria governing the initiation of the modes are first briefly discussed. It is shown that the commencement of the response modes depends on the aspect ratio of the body, the coefficients of static and kinetic friction at the body-base interface, and the magnitude of the maximum base accelerations. Considering the last two factors as random variables, the initiation of the response modes is next studied from a probabilistic point of view. Type 1 extreme value and lognormal distributions are employed for the maximum base excitations and the coefficient of friction, respectively. Analytical expressions for computing the probability of each mode of response are derived. The effects of slenderness ratio, vertical acceleration, and the statistical distributions of the maximum acceleration and the coefficient of friction are shown through numerical results and plots.
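
A minimal sketch of the probabilistic treatment described above, assuming a deliberately simplified initiation criterion (sliding starts when the peak horizontal inertia demand exceeds the static friction capacity, ignoring rocking interaction); the Gumbel and lognormal parameters below are illustrative, not values from the paper.

```python
import numpy as np

rng = np.random.default_rng(1)
g = 9.81
n = 200_000

# Type 1 extreme value (Gumbel) peak horizontal acceleration and lognormal
# static friction coefficient; all parameter values are illustrative assumptions.
a_h = rng.gumbel(loc=0.3 * g, scale=0.05 * g, size=n)       # peak horizontal accel.
mu_s = rng.lognormal(mean=np.log(0.5), sigma=0.1, size=n)   # static friction coeff.
a_v = 0.1 * g                                               # vertical accel. (assumption)

# Simplified initiation criterion (assumption): sliding starts when the
# horizontal inertia force exceeds the available friction resistance.
p_slide = np.mean(a_h > mu_s * (g - a_v))
print(f"P(slide initiates) = {p_slide:.3f}")
```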

Re-estimation of Model Parameters in Growth Curves When Adjusting Market Potential and Time of Maximum Sales (성장곡선 예측 모형의 특성치 보정에 따른 매개변수의 재추정)

  • Park, Ju-Seok; Ko, Young-Hyun; Jun, Chi-Hyuck; Lee, Jae-Hwan; Hong, Seung-Pyo; Moon, Hyung-Don / IE interfaces / v.16 no.1 / pp.103-110 / 2003
  • Growth curves are widely used in forecasting market demand. When only a few data points are available, the estimated model parameters have low confidence. In this case, if expert opinions are available, it is better for predicting future demand to adjust the model parameters using this information. This paper proposes a methodology for re-estimating the model parameters of growth curves when adjusting the market potential and/or the time of maximum sales. We also provide detailed procedures for five growth curves: the Bass, Logistic, Gompertz, Weibull, and Cumulative Lognormal models. Applications to real data are also included.
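
The following sketch illustrates the general idea of re-estimation under an adjusted market potential, using a logistic growth curve and scipy's curve_fit; the data points, the adjusted potential of 1500, and the fixed-parameter refit are illustrative assumptions, not the paper's procedure.

```python
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, m, k, t0):
    """Cumulative logistic growth: m is the market potential, t0 the time of maximum sales."""
    return m / (1.0 + np.exp(-k * (t - t0)))

# A few early observations (placeholders for real cumulative demand data).
t = np.array([1, 2, 3, 4, 5], dtype=float)
y = np.array([12, 30, 70, 140, 230], dtype=float)

# Free fit with only a few points: parameter estimates are unstable.
p_free, _ = curve_fit(logistic, t, y, p0=[1000, 0.8, 6], maxfev=10000)

# Expert opinion adjusts the market potential to 1500 (assumption);
# re-estimate the remaining parameters with m held fixed.
m_adj = 1500.0
p_adj, _ = curve_fit(lambda t, k, t0: logistic(t, m_adj, k, t0),
                     t, y, p0=[0.8, 6], maxfev=10000)

print("free fit (m, k, t0):", np.round(p_free, 3))
print("adjusted (k, t0) with m fixed at", m_adj, ":", np.round(p_adj, 3))
```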

A Cryptography Algorithm using Telescoping Series (망원급수를 이용한 암호화 알고리즘)

  • Choi, Eun Jung; Sakong, Yung; Park, Wang Keun / Journal of Korea Society of Digital Industry and Information Management / v.9 no.4 / pp.103-110 / 2013
  • In the Information Technology era, various remarkable IT technologies such as Big Data are appearing and becoming available as the amount of information increases. The number of counselling cases for violations of personal data protection is also increasing every year, amounting to over 160,000 in 2012. According to the Korean Privacy Act, when unique personal identification information is handled, appropriate measures such as encipherment should be taken. Encipherment technologies are the most basic countermeasures against personal data invasion and a base element of information technology, so various cryptography algorithms exist and are used for encipherment, and studies on safer new cryptography algorithms continue. Cryptography algorithms started from classical substitution ciphers and developed into computationally secure codes of ever greater complexity. Nowadays, various mathematical problems such as integer factorization, square-root extraction, the discrete logarithm, and elliptic curves are adapted to cryptography algorithms; the RSA public-key cryptography algorithm, based on integer factorization, is the most representative one. This paper suggests an algorithm utilizing telescoping series as a safer cryptography algorithm which can maximize complexity. A telescoping series is a type of infinite series which can generate various functions for a given value, the plain text. Among these generated functions, one can be selected as the original equation. Some part of this equation can be defined as a key, and the original equation can then be transformed into the final equation by increasing its complexity through the "FullSimplify" command of the "Mathematica" software.
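
The sketch below only illustrates the telescoping property itself, namely that partial sums of 1/(n(n+1)) = 1/n - 1/(n+1) collapse to a closed form because consecutive terms cancel; it is not the proposed encryption algorithm.

```python
from fractions import Fraction

def telescoping_partial_sum(N):
    """Partial sum of 1/(n(n+1)) = 1/n - 1/(n+1); interior terms cancel pairwise."""
    return sum(Fraction(1, n * (n + 1)) for n in range(1, N + 1))

# Each partial sum collapses to the closed form 1 - 1/(N+1).
for N in (5, 10, 100):
    s = telescoping_partial_sum(N)
    assert s == 1 - Fraction(1, N + 1)
    print(N, s)
```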

A Study on the Trend Change Point of NBUE-property

  • Kim, Dae-Kyung / Communications for Statistical Applications and Methods / v.3 no.2 / pp.275-282 / 1996
  • A life distribution F with survival function $\overline{F} = 1 - F$, finite mean $\mu$, and mean residual life $m(t)$ is said to be NBUE (NWUE) if $m(t) \leq (\geq)\ \mu$ for $t \geq 0$. This NBUE property can equivalently be characterized by the fact that $\varphi(u) \geq (\leq)\ u$ for $0 \leq u \leq 1$, where $\varphi(u)$ is the scaled total-time-on-test transform of F. A generalization of the NBUE property is that there is a value p such that $\varphi(u) \geq u$ for $0 \leq u \leq p$ and $\varphi(u) \leq u$ for $p \leq u \leq 1$, or vice versa. This means that there is a trend change in the NBUE property. In this paper we point out an error in Klefsjo's paper (1988): he erroneously uses the trend change point of the failure rate to calculate the empirical test size and power for the lognormal distribution. We solve for the trend change point of the mean residual lifetime and recalculate the empirical test size and power of Klefsjo (1988) in the no-censoring case.
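
A small sketch of the empirical scaled total-time-on-test transform on which the NBUE characterization above rests; the lognormal sample and the crossing search are illustrative and do not reproduce Klefsjo's test.

```python
import numpy as np

def scaled_ttt(x):
    """Empirical scaled total-time-on-test transform phi_n(i/n), i = 1..n."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    gaps = np.diff(np.concatenate(([0.0], x)))      # X(i) - X(i-1), with X(0) = 0
    ttt = np.cumsum((n - np.arange(n)) * gaps)      # S_i = sum_j (n - j + 1)(X(j) - X(j-1))
    return np.arange(1, n + 1) / n, ttt / ttt[-1]

rng = np.random.default_rng(2)
x = rng.lognormal(mean=0.0, sigma=1.0, size=500)    # illustrative lognormal sample

u, phi = scaled_ttt(x)
d = (phi - u)[:-1]                                  # drop the endpoint, where both equal 1
idx = np.where(np.diff(np.sign(d)) != 0)[0]
print("diagonal crossings (candidate trend change points) near u =", np.round(u[idx], 3))
```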


Design of Median Control Chart for Nonnormally Distributed Processes (비정규분포공정(非正規分布工程)에서 메디안특수관리도(特殊管理圖)의 모형설계(模型設計))

  • Sin, Yong-Baek / Journal of Korean Society for Quality Management / v.15 no.2 / pp.10-19 / 1987
  • Statistical control charts are useful tools to monitor and control manufacturing processes and are widely used in most Korean industries. Many Korean companies, however, do not always obtain the desired results from the traditional Shewhart control charts such as the $\overline{X}$-chart, X-chart, $\widetilde{X}$-chart, etc. This is partly because the quality characteristics of the process are not normally distributed but are skewed, due to intermittent production, small lot sizes, etc. In the Shewhart $\overline{X}$-chart, which is the most widely used chart in Korea, such skewed distributions cause the plotted points to fall below or above the central line or outside the control limits even though no assignable causes can be found. To overcome such shortcomings for nonnormally distributed processes, a distribution-free type of confidence interval based on order statistics can be used. This thesis is concerned with the design of a control chart based on the sample median, which is easy to use in practical situations and whose properties for nonnormal distributions can therefore be easily analyzed. Control limits and central lines are given for the better-known nonnormal distributions, such as the Gamma, Beta, Lognormal, Weibull, Pareto, and Truncated-normal distributions.
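
As a rough illustration of probability-based limits for a median chart under a skewed process, the sketch below simulates subgroup medians from a lognormal distribution and takes the 0.135% and 99.865% percentiles (the usual two-sided 0.27% false-alarm rate) as control limits; the subgroup size and distribution parameters are assumptions, not the thesis's design.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 5                       # subgroup size (assumption)
n_subgroups = 100_000

# In-control process assumed lognormal (skewed), illustrating why symmetric
# Shewhart limits can mislead for nonnormal data.
samples = rng.lognormal(mean=0.0, sigma=0.5, size=(n_subgroups, n))
medians = np.median(samples, axis=1)

# Probability limits for the subgroup median, plus the central line.
lcl, cl, ucl = np.percentile(medians, [0.135, 50, 99.865])
print(f"LCL={lcl:.3f}  CL={cl:.3f}  UCL={ucl:.3f}")
```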


Estimation of Reliability of a System Based on Two Typed Data (두 형태의 데이터를 이용하여 시스템의 신뢰도를 추정하는 방법)

  • Shim, Kyubark; Yim, Jaegeol / Journal of Korea Multimedia Society / v.16 no.3 / pp.336-341 / 2013
  • Reliability analysis of the various forms of data obtained from complicated electronic circuits is a necessary process for guaranteeing the reliability of a system. Reliability assessment of a system starts from the estimation of its failure function. A system can be composed of one item, but in most cases several items are correlated with each other within one system. This study suggests a method for estimating the failure function and reliabilities of infrequent failure events by considering the different forms of data obtained from different systems. The failure function and reliabilities of complex systems composed of two or more items in parallel or in mixed connections can be estimated by further application of the proposed method.
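
A minimal sketch of combining component reliabilities for series, parallel, and mixed connections of the kind mentioned above; the component values are placeholders.

```python
def series(*r):
    """Series system: all components must survive."""
    out = 1.0
    for x in r:
        out *= x
    return out

def parallel(*r):
    """Parallel system: fails only if every component fails."""
    out = 1.0
    for x in r:
        out *= (1.0 - x)
    return 1.0 - out

# Mixed connection: component A in series with a parallel pair (B, C).
rA, rB, rC = 0.95, 0.90, 0.85        # illustrative component reliabilities
print("parallel(B, C)      =", round(parallel(rB, rC), 4))
print("A in series with it =", round(series(rA, parallel(rB, rC)), 4))
```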

Seismic vulnerability assessment of RC buildings according to the 2007 and 2018 Turkish seismic codes

  • Yon, Burak / Earthquakes and Structures / v.18 no.6 / pp.709-718 / 2020
  • Fragility curves are useful tools for estimating the damage probability of buildings under seismic actions. The purpose of this study is to investigate the seismic vulnerability of reinforced concrete (RC) buildings, according to the 2007 and 2018 Turkish seismic codes, using fragility curves. For the numerical analyses, typical five- and seven-storey RC buildings were selected and incremental dynamic analyses (IDA) were performed. To complete the IDAs, eleven earthquake acceleration records multiplied by scaling factors from 0.2g to 0.8g were used. To predict nonlinearity, a distributed hinge model that accounts for the material and geometric nonlinearity of the structural members was used. Damage to the confined concrete and reinforcement bars of structural members was obtained by considering the unit deformation demands of the 2007 Turkish Seismic Code (TSC-2007) and the 2018 Turkey Building Earthquake Code (TBEC-2018). Vulnerability evaluation of these buildings was performed using fragility curves based on the results of the incremental dynamic analyses. The fragility curves were generated, under a lognormal distribution assumption, in terms of the damage levels occurring in the confined concrete and reinforcement bars of the structural members. The fragility curves show that, for the selected buildings, the probability of damage is higher according to TBEC-2018 than according to TSC-2007.
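
The sketch below fits a lognormal fragility curve by maximum likelihood from exceedance counts at the scaling levels mentioned in the abstract (0.2g to 0.8g); the counts themselves are invented for illustration and are not the paper's IDA results.

```python
import numpy as np
from scipy import stats, optimize

# Scale factors used as intensity measure levels (per the abstract) and
# illustrative counts of analyses exceeding a given damage state.
im    = np.array([0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8])     # in g
n_gm  = np.full_like(im, 11)                               # 11 records per level
n_dmg = np.array([0, 1, 3, 6, 8, 10, 11], dtype=float)     # exceedances (made up)

def neg_log_lik(params):
    ln_theta, beta = params
    p = stats.norm.cdf((np.log(im) - ln_theta) / beta)     # lognormal fragility model
    p = np.clip(p, 1e-9, 1 - 1e-9)
    return -np.sum(n_dmg * np.log(p) + (n_gm - n_dmg) * np.log(1 - p))

res = optimize.minimize(neg_log_lik, x0=[np.log(0.5), 0.4], method="Nelder-Mead")
theta, beta = np.exp(res.x[0]), res.x[1]
print(f"median capacity = {theta:.2f} g, lognormal dispersion beta = {beta:.2f}")
```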

Speckle Removal of SAR Imagery Using a Point-Jacobian Iteration MAP Estimation

  • Lee, Sang-Hoon / Korean Journal of Remote Sensing / v.23 no.1 / pp.33-42 / 2007
  • In this paper, an iterative MAP approach using a Bayesian model based on the lognormal distribution for image intensity and a GRF for image texture is proposed for despeckling SAR images that are corrupted by multiplicative speckle noise. When the image intensity is logarithmically transformed, the speckle noise becomes approximately additive Gaussian noise, and it tends to normality much faster than the intensity distribution. MRFs have been used to model spatially correlated and signal-dependent phenomena in speckled SAR images. The MRF is incorporated into digital image analysis by viewing pixel types as states of molecules in a lattice-like physical system defined on a GRF. Because of the MRF-GRF equivalence, the assignment of an energy function to the physical system determines its Gibbs measure, which is used to model molecular interactions. The proposed Point-Jacobian Iterative MAP estimation method was first evaluated using simulation data generated by the Monte Carlo method. The methodology was then applied to data acquired by ESA's ERS satellite over the Nonsan area of the Korean Peninsula. In the extensive experiments of this study, the proposed method demonstrated the capability to reduce speckle noise and estimate the noise-free intensity.
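
A toy sketch of two ingredients mentioned above: the log transform that turns multiplicative speckle into approximately additive noise, and a point-Jacobi style iteration that trades a data term against a neighbourhood smoothness term. It is a simplified quadratic-prior stand-in, not the paper's Bayesian estimator.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy scene and multiplicative speckle (gamma-distributed with unit mean).
clean = np.ones((64, 64)); clean[16:48, 16:48] = 4.0
looks = 4
speckle = rng.gamma(shape=looks, scale=1.0 / looks, size=clean.shape)
observed = clean * speckle

# Log transform: multiplicative speckle becomes approximately additive noise.
y = np.log(observed)

# Point-Jacobi style iteration: each pixel moves toward a weighted average of the
# log-observation and its 4-neighbourhood mean (a quadratic, MRF-like prior).
x = y.copy()
lam = 2.0                      # smoothness weight (illustrative)
for _ in range(50):
    nb = (np.roll(x, 1, 0) + np.roll(x, -1, 0) +
          np.roll(x, 1, 1) + np.roll(x, -1, 1)) / 4.0
    x = (y + lam * nb) / (1.0 + lam)

despeckled = np.exp(x)
print("std inside bright square, noisy:", observed[20:44, 20:44].std().round(3),
      " filtered:", despeckled[20:44, 20:44].std().round(3))
```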

Robust second-order rotatable designs invariably applicable for some lifetime distributions

  • Kim, Jinseog; Das, Rabindra Nath; Singh, Poonam; Lee, Youngjo / Communications for Statistical Applications and Methods / v.28 no.6 / pp.595-610 / 2021
  • Recently, a few articles have derived robust first-order rotatable and D-optimal designs for lifetime responses having gamma, lognormal, Weibull, or exponential distributions, assuming errors that are correlated with different correlation structures such as autocorrelated, intra-class, inter-class, tri-diagonal, and compound symmetry. Practically, a first-order model is an adequate approximation to the true surface only in a small region of the explanatory variables; a second-order model is appropriate for an unknown region or when there is any curvature in the system. The current article extends the ideas of these articles to second-order models. Invariant (free of the above four distributions) and robust (free of the correlation parameter values) second-order rotatable designs are derived for the intra-class and inter-class correlated error structures. Second-order rotatability conditions are derived assuming the response follows a non-normal distribution (any one of the above four distributions) and the errors have a general correlated structure. These conditions are then simplified under the intra-class and inter-class correlated error structures, and second-order rotatable designs are developed under these two structures for responses having any one of the above four distributions. It is shown that the robust second-order rotatable designs depend on the respective error variance-covariance structure but are independent of the correlation parameter values as well as of the four considered lifetime response distributions.
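
For orientation only, the sketch below constructs a standard rotatable central composite design, for which the axial distance alpha = (2^k)^(1/4) gives second-order rotatability under uncorrelated, homoscedastic errors; the paper's robust designs for correlated errors are derived differently.

```python
import itertools
import numpy as np

def rotatable_ccd(k, n_center=4):
    """Central composite design in k coded factors with axial distance
    alpha = (2**k) ** 0.25, the usual condition for second-order rotatability
    under uncorrelated, homoscedastic errors."""
    factorial = np.array(list(itertools.product([-1.0, 1.0], repeat=k)))
    alpha = (2.0 ** k) ** 0.25
    axial = np.vstack([v * alpha * np.eye(k)[i] for i in range(k) for v in (-1.0, 1.0)])
    center = np.zeros((n_center, k))
    return np.vstack([factorial, axial, center])

design = rotatable_ccd(3)
print(design.shape)        # (2**3 + 2*3 + 4, 3) = (18, 3) design points
print(np.round(design, 3))
```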