Interval estimation

Probabilistic assessment on the basis of interval data

  • Thacker, Ben H.;Huyse, Luc J.
    • Structural Engineering and Mechanics
    • /
    • v.25 no.3
    • /
    • pp.331-345
    • /
    • 2007
  • Uncertainties enter a complex analysis from a variety of sources: variability, lack of data, human error, model simplification, and lack of understanding of the underlying physics. However, for many important engineering applications, insufficient data are available to justify the choice of a particular probability density function (PDF). Sometimes the only data available are in the form of interval estimates that represent often conflicting expert opinions. In this paper we demonstrate that Bayesian estimation techniques can successfully be used in applications where only vague interval measurements are available. The proposed approach is intended to fit within the established and widely accepted probabilistic framework. To circumvent the problem of selecting a specific PDF when only little or vague data are available, a hierarchical model over a continuous family of PDFs is used. Classical Bayesian estimation methods are extended to make use of imprecise interval data. Each expert opinion (interval datum) is interpreted as a random interval sample from a parent PDF. Consequently, partial conflict between experts is automatically accounted for through the likelihood function.
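
The interval-data likelihood described above can be illustrated with a minimal sketch: assuming a normal parent PDF with known spread, each expert interval [a, b] contributes the probability mass the PDF assigns to that interval, and a grid posterior for the mean follows from a flat prior. The intervals, grid, and spread below are hypothetical values chosen for illustration only, not the paper's hierarchical model.

```python
import numpy as np
from scipy.stats import norm

# Expert opinions expressed as intervals (hypothetical values).
intervals = [(4.0, 7.0), (5.5, 9.0), (3.0, 6.0)]

# Parent PDF assumed normal with known spread; only the mean is uncertain here.
sigma = 2.0
mu_grid = np.linspace(0.0, 12.0, 1201)          # flat prior over this grid

# Each interval contributes P(a <= X <= b | mu) to the likelihood,
# so partially conflicting intervals are down-weighted rather than rejected.
log_like = np.zeros_like(mu_grid)
for a, b in intervals:
    p = norm.cdf(b, mu_grid, sigma) - norm.cdf(a, mu_grid, sigma)
    log_like += np.log(np.clip(p, 1e-300, None))

post = np.exp(log_like - log_like.max())
post /= np.trapz(post, mu_grid)                  # normalize the posterior

print("posterior mean of mu:", np.trapz(mu_grid * post, mu_grid))
```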

Interval Estimation for a Binomial Proportion Based on Weighted Polya Posterior (이항 비율의 가중 POLYA POSTERIOR 구간추정)

  • Lee Seung-Chun
    • The Korean Journal of Applied Statistics
    • /
    • v.18 no.3
    • /
    • pp.607-615
    • /
    • 2005
  • Recently, the interval estimation of a binomial proportion has been revisited in various literatures, mainly due to the erratic behavior of the coverage probability of the well-known Wald confidence interval. Various alternatives have been proposed. Among them, the Agresti-Coull confidence interval has been recommended by Brown et al. (2001), along with other confidence intervals, for large samples, say n $\ge$ 40. On the other hand, a noninformative Bayesian approach called the Polya posterior often produces statistics with good frequentist properties. In this note, an interval estimator is developed using a weighted Polya posterior. The resulting interval estimator is essentially the Agresti-Coull confidence interval with some improved features. It is shown that the weighted Polya posterior produces an effective interval estimator for small sample sizes and severely skewed binomial distributions.
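
For reference, a minimal sketch of the Agresti-Coull interval that the weighted Polya posterior interval essentially reproduces (standard construction; the counts used below are hypothetical):

```python
import numpy as np
from scipy.stats import norm

def agresti_coull_interval(x, n, level=0.95):
    """Agresti-Coull interval: add z^2/2 pseudo-successes and pseudo-failures,
    then apply the Wald formula to the adjusted proportion."""
    z = norm.ppf(0.5 + level / 2.0)
    n_tilde = n + z**2
    p_tilde = (x + z**2 / 2.0) / n_tilde
    half = z * np.sqrt(p_tilde * (1.0 - p_tilde) / n_tilde)
    return max(0.0, p_tilde - half), min(1.0, p_tilde + half)

# e.g. 3 successes in 20 trials -- a small, skewed case
print(agresti_coull_interval(3, 20))
```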

A Study on Interval Estimation of Technology R&D Investment Value using Black-Scholes Model (블랙-숄즈모형을 이용한 기술 R&D 투자가치 구간추정 연구)

  • Seong, Ung-Hyeon
    • Journal of Korea Technology Innovation Society
    • /
    • v.8 no.1
    • /
    • pp.29-50
    • /
    • 2005
  • Real options provide a new and productive way to view corporate R&D investment decisions. The DCF approach is well established and favored by financial executives, but it is known to systematically underestimate investment value under significant uncertainty. Though real options are not inherent in an R&D investment, they can be used to compute an investment value that includes managerial flexibility as option value. In this paper, we explain how an interval for the option value in the Black-Scholes model can be estimated using simulation. We also present a process framework for interval estimation of the volatility and the investment period. In such a setting, we can obtain an appropriate interval estimate of the expanded investment value.
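
As a rough illustration of how an interval on an input parameter propagates to an interval on the option value, the sketch below evaluates the standard Black-Scholes call formula at the two endpoints of a volatility interval (the call value is monotone in volatility). The project figures are hypothetical, and this is not the simulation procedure described in the paper.

```python
import numpy as np
from scipy.stats import norm

def bs_call(S, K, T, r, sigma):
    """Black-Scholes value of a European call (here read as the value of
    the flexibility to commercialize an R&D project)."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

# Hypothetical project: PV of cash flows S, investment cost K, horizon T.
S, K, T, r = 100.0, 120.0, 3.0, 0.05

# An interval estimate of volatility maps to an interval of option value,
# since the call value is increasing in sigma.
sigma_lo, sigma_hi = 0.25, 0.45
print("option value interval:", (bs_call(S, K, T, r, sigma_lo),
                                 bs_call(S, K, T, r, sigma_hi)))
```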

Mixing matrix estimation method for dual-channel time-frequency overlapped signals based on interval probability

  • Liu, Zhipeng;Li, Lichun;Zheng, Ziru
    • ETRI Journal
    • /
    • v.41 no.5
    • /
    • pp.658-669
    • /
    • 2019
  • For dual-channel time-frequency (TF) overlapped signals with low sparsity in underdetermined blind source separation (UBSS), this paper proposes an effective method based on interval probability to estimate mixing matrices and to broaden the types of mixing matrices that can be handled. First, the detection of TF single-source points (TF-SSP) is used to improve the TF sparsity of each source. Because the ratios of coefficients from different columns of the mixing matrix can be close together, a local peak-detection mechanism based on interval probability (LPIP) is proposed to improve their distinguishability. LPIP uses uniform subintervals to classify the TF coefficient ratios of the detected TF-SSP effectively under a high level of TF overlap among sources, and it greatly reduces TF interference points and redundant signal features to enhance estimation accuracy. Simulation results show that in both noiseless and noisy cases, the proposed method outperforms selected mainstream methods, is robust, and has low algorithmic complexity.
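
The core idea of estimating a dual-channel mixing matrix from coefficient ratios at single-source points can be sketched as follows, assuming sources that are active one at a time and a simple histogram over direction sub-intervals with generic peak picking. The mixing matrix, noise level, and peak-picking rule are illustrative stand-ins, not the LPIP mechanism itself.

```python
import numpy as np
from scipy.signal import find_peaks

rng = np.random.default_rng(0)

# Hypothetical 2x3 mixing matrix with (approximately) unit-norm columns.
A = np.array([[0.9, 0.6, 0.2],
              [0.44, 0.8, 0.98]])
A /= np.linalg.norm(A, axis=0)

# Sparse sources: at most one source active per (time-frequency) point.
n_pts, n_src = 3000, 3
S = np.zeros((n_src, n_pts))
active = rng.integers(n_src, size=n_pts)
S[active, np.arange(n_pts)] = rng.standard_normal(n_pts)
X = A @ S + 0.01 * rng.standard_normal((2, n_pts))   # dual-channel mixture

# At single-source points each column direction shows up as a cluster of
# coefficient ratios; a histogram over uniform sub-intervals exposes the peaks.
theta = np.arctan2(X[1], X[0]) % np.pi               # direction of each point
hist, edges = np.histogram(theta, bins=180, range=(0.0, np.pi))
peaks, props = find_peaks(hist, height=0)            # local maxima of the histogram
top = peaks[np.argsort(props["peak_heights"])[-n_src:]]
est = np.sort((edges[top] + edges[top + 1]) / 2.0)

print("true column angles:", np.sort(np.arctan2(A[1], A[0])))
print("estimated angles:  ", est)
```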

Estimation of the exponentiated half-logistic distribution based on multiply Type-I hybrid censoring

  • Jeon, Young Eun;Kang, Suk-Bok
    • Communications for Statistical Applications and Methods
    • /
    • v.27 no.1
    • /
    • pp.47-64
    • /
    • 2020
  • In this paper, we derive several estimators of the scale parameter of the exponentiated half-logistic distribution based on the multiply Type-I hybrid censoring scheme. We assume that the shape parameter λ is known. We obtain the maximum likelihood estimator of the scale parameter σ; since the likelihood equation cannot be solved explicitly, the scale parameter is estimated by approximating the likelihood function with two different Taylor series expansions. We also obtain Bayes estimators using a prior distribution, under the squared error loss function and the general entropy loss function (shape parameter q = -0.5, 1.0). We also derive interval estimates such as the asymptotic confidence interval, the credible interval, and the highest posterior density interval. Finally, we compare the proposed estimators in terms of mean squared error through Monte Carlo simulation. The average length of the 95% intervals and the corresponding coverage probability are also obtained.
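
A minimal sketch of the likelihood setup, assuming a complete (uncensored) sample rather than the multiply Type-I hybrid censoring scheme of the paper: with the shape parameter λ known, the scale parameter σ of the exponentiated half-logistic distribution can be estimated by numerically maximizing the log-likelihood. The parameter values are hypothetical.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)
lam, sigma_true, n = 2.0, 1.5, 200          # shape lambda known, scale unknown

# Inverse-CDF sampling: F(x) = [(1 - e^{-x/sigma}) / (1 + e^{-x/sigma})]^lambda
u = rng.uniform(size=n)
v = u ** (1.0 / lam)
x = sigma_true * np.log((1.0 + v) / (1.0 - v))

def neg_log_lik(sigma):
    e = np.exp(-x / sigma)
    g = (2.0 / sigma) * e / (1.0 + e) ** 2            # half-logistic density
    G = (1.0 - e) / (1.0 + e)                         # half-logistic CDF
    return -np.sum(np.log(lam) + (lam - 1.0) * np.log(G) + np.log(g))

res = minimize_scalar(neg_log_lik, bounds=(0.1, 10.0), method="bounded")
print("MLE of sigma:", res.x)
```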

Balanced Accuracy and Confidence Probability of Interval Estimates

  • Liu, Yi-Hsin;Stan Lipovetsky;Betty L. Hickman
    • International Journal of Reliability and Applications
    • /
    • v.3 no.1
    • /
    • pp.37-50
    • /
    • 2002
  • Simultaneous estimation of the accuracy and the probability corresponding to a prediction interval is considered in this study. The traditional application of confidence interval forecasting consists of evaluating interval limits for a given significance level. The wider this interval, the higher the probability and the lower the forecast precision. In this paper, a measure of stochastic forecast accuracy is introduced, and a procedure for balanced estimation of both the prediction accuracy and the confidence probability is elaborated. A solution can be obtained via an optimization approach. The suggested method is applied to constructing confidence intervals for parameters estimated by normal and t distributions.
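
The width-versus-coverage tradeoff can be illustrated with a small sketch for a normal-mean interval; the composite score below is an ad hoc stand-in for a balance criterion, not the accuracy measure proposed in the paper, and the sample figures are hypothetical.

```python
import numpy as np
from scipy.stats import norm

# Tradeoff for a normal-mean interval: half-width h = z * s / sqrt(n).
s, n = 2.0, 25
levels = np.linspace(0.50, 0.995, 200)
z = norm.ppf(0.5 + levels / 2.0)
half_width = z * s / np.sqrt(n)

# One (ad hoc) way to balance precision and confidence: minimize the sum of
# the miss probability and a normalized width penalty.
score = (1.0 - levels) + half_width / half_width.max()
best = np.argmin(score)
print("balanced level: %.3f, half-width: %.3f" % (levels[best], half_width[best]))
```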

Support Vector Machine for Interval Regression

  • Hong Dug Hun;Hwang Changha
    • Proceedings of the Korean Statistical Society Conference
    • /
    • 2004.11a
    • /
    • pp.67-72
    • /
    • 2004
  • The support vector machine (SVM) has been very successful in pattern recognition and function estimation problems for crisp data. This paper proposes a new method to evaluate interval linear and nonlinear regression models by combining the possibility and necessity estimation formulation with the principle of SVM. For data sets with crisp inputs and interval outputs, possibility and necessity models have recently been utilized; they are based on a quadratic programming approach, which gives more diverse spread coefficients than a linear programming one. SVM also uses a quadratic programming approach, and a further advantage in interval regression analysis is that it can integrate both the central-tendency property of least squares and the possibilistic property of fuzzy regression, without being computationally expensive. SVM allows us to perform interval nonlinear regression analysis by constructing an interval linear regression function in a high-dimensional feature space, which makes it a very attractive approach for modeling nonlinear interval data. The proposed algorithm is model-free in the sense that we do not have to assume an underlying model function for the interval nonlinear regression model with crisp inputs and interval outputs. Experimental results are presented to indicate the performance of this algorithm.
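
One simple way to see how a kernel SVM can produce interval predictions from crisp inputs is to fit separate support vector regressions to the interval centers and radii, as sketched below with scikit-learn. This is only an illustration of kernelized interval regression, not the possibility/necessity quadratic-programming formulation of the paper, and the data are synthetic.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(2)

# Crisp inputs with interval outputs [lo, hi] (hypothetical data).
X = np.sort(rng.uniform(0.0, 6.0, size=60))[:, None]
center = np.sin(X[:, 0]) + 0.05 * rng.standard_normal(60)
radius = 0.2 + 0.1 * np.abs(np.cos(X[:, 0]))
lo, hi = center - radius, center + radius

# One simple route to nonlinear interval regression: fit one kernel SVR to the
# interval centers and another to the (nonnegative) radii.
svr_c = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X, (lo + hi) / 2.0)
svr_r = SVR(kernel="rbf", C=10.0, epsilon=0.01).fit(X, (hi - lo) / 2.0)

x_new = np.array([[1.0], [3.0], [5.0]])
c_hat = svr_c.predict(x_new)
r_hat = np.maximum(svr_r.predict(x_new), 0.0)
print(np.column_stack([c_hat - r_hat, c_hat + r_hat]))   # predicted intervals
```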

Nonparametric Estimation for Ramp Stress Tests with Stress Bound under Intermittent Inspection (단속적 검사에서 스트레스한계를 가지는 램프스트레스시험을 위한 비모수적 추정)

  • Lee Nak-Young;Ahn Ung-Hwan
    • Journal of Korean Society for Quality Management
    • /
    • v.32 no.4
    • /
    • pp.208-219
    • /
    • 2004
  • This paper considers nonparametric estimation of the lifetime distribution for ramp stress tests with a stress bound under intermittent inspection. The test items are inspected only at specified time points, and so the collected observations are grouped data. Under the cumulative exposure model, two nonparametric methods of estimating the lifetime distribution at the use-condition stress are proposed for the situation in which the time transformation function relating stress to lifetime follows the inverse power law. Each item is initially put on test under ramp stress, and the survivors are then put on test under constant stress, where all failures in an inspection interval are assumed to occur at the midpoint or the endpoint of that interval. The two proposed quantile estimators from grouped data, consisting of the number of items failed in each inspection interval, are numerically compared with the maximum likelihood estimator (MLE) based on the Weibull distribution.
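
A minimal sketch of the grouped-data setting, assuming hypothetical inspection times and failure counts: the empirical CDF is estimated from cumulative failure counts, and the midpoint convention places each interval's failures at the interval midpoint. The stress transformation (cumulative exposure, inverse power law) is omitted here.

```python
import numpy as np

# Grouped (intermittent inspection) data: inspection times and the number of
# items found failed in each interval (hypothetical counts).
inspect_times = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
failures      = np.array([3,    7,    9,    6,    2])
n_items = 40                                      # items on test

# Nonparametric CDF estimate from cumulative failure proportions.
F_end = np.cumsum(failures) / n_items

# Under the midpoint convention, the empirical CDF jumps at interval midpoints;
# under the endpoint convention it jumps at the inspection times themselves.
jump_times_mid = np.concatenate([[inspect_times[0] / 2.0],
                                 (inspect_times[:-1] + inspect_times[1:]) / 2.0])
for t, F in zip(jump_times_mid, F_end):
    print("F(t) ~= %.3f for t >= %.1f" % (F, t))
```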

Confidence intervals for the COVID-19 neutralizing antibody retention rate in the Korean population

  • Apio, Catherine;Kamruzzaman, Md.;Park, Taesung
    • Genomics & Informatics
    • /
    • v.18 no.3
    • /
    • pp.31.1-31.8
    • /
    • 2020
  • The coronavirus disease 2019 (COVID-19), caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), has become a global pandemic. No specific therapeutic agents or vaccines for COVID-19 are available, though several antiviral drugs are under investigation as treatment agents. The use of convalescent plasma transfusions that contain neutralizing antibodies against COVID-19 has become a major focus, which requires mass screening of populations for these antibodies. While several countries have started reporting population-based antibody rates, a simple point estimate may be misinterpreted without proper estimation of the standard error and confidence intervals. In this paper, we review the importance of antibody studies and present 95% confidence intervals for the COVID-19 antibody rate in the Korean population using two recently performed antibody tests in Korea. Due to the sparsity of the data, estimating confidence intervals is a major challenge. Thus, we consider several confidence intervals using asymptotic, exact, and Bayesian estimation methods. We found that the Wald method gives the narrowest interval among the asymptotic methods, the mid p-value method gives the narrowest among the exact methods, and the Jeffreys method gives the narrowest among the Bayesian methods. The most conservative 95% confidence interval estimate indicates that, as of 00:00 on September 15, 2020, at least 32,602 people were infected but not confirmed in Korea.
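
A small sketch of the three families of proportion intervals compared above (Wald as an asymptotic method, Clopper-Pearson as an exact method, and Jeffreys as a Bayesian method); the mid p-value variant is omitted, and the survey counts used below are hypothetical, not the Korean antibody data.

```python
import numpy as np
from scipy.stats import norm, beta

def wald(x, n, level=0.95):
    z = norm.ppf(0.5 + level / 2.0)
    p = x / n
    half = z * np.sqrt(p * (1.0 - p) / n)
    return p - half, p + half

def clopper_pearson(x, n, level=0.95):                # "exact" interval
    a = (1.0 - level) / 2.0
    lo = 0.0 if x == 0 else beta.ppf(a, x, n - x + 1)
    hi = 1.0 if x == n else beta.ppf(1.0 - a, x + 1, n - x)
    return lo, hi

def jeffreys(x, n, level=0.95):                       # Beta(1/2, 1/2) prior
    a = (1.0 - level) / 2.0
    return (beta.ppf(a, x + 0.5, n - x + 0.5),
            beta.ppf(1.0 - a, x + 0.5, n - x + 0.5))

# Hypothetical seroprevalence survey: 1 positive out of 1440 samples.
x, n = 1, 1440
for f in (wald, clopper_pearson, jeffreys):
    print(f.__name__, f(x, n))
```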