• Title/Summary/Keyword: prior probability


Noninformative priors for Pareto distribution

  • Kim, Dal-Ho;Kang, Sang-Gil;Lee, Woo-Dong
    • Journal of the Korean Data and Information Science Society
    • /
    • v.20 no.6
    • /
    • pp.1213-1223
    • /
    • 2009
  • In this paper, we develop noninformative priors for the two-parameter Pareto distribution. Specifically, we derive Jeffreys' prior, the probability matching prior, and the reference prior for the parameter of interest. In our case, the probability matching prior is only a first-order matching prior, and a second-order matching prior does not exist. A simulation study reveals that the matching prior performs better in achieving the target coverage probability. A real example is also considered.

  • PDF
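The matching behaviour described in the abstract can be checked numerically. As a hedged illustration only (not the paper's two-parameter analysis), assume the Pareto scale β is known; the Jeffreys-type prior for the shape α is then π(α) ∝ 1/α, under which the posterior is Gamma(n, Σ log(xᵢ/β)). The sketch below simulates the frequentist coverage of the resulting 95% equal-tailed credible interval; all numbers are hypothetical.

```python
import math
import random

def pareto_sample(alpha, beta, n, rng):
    # Inverse-CDF sampling: X = beta * U^(-1/alpha), U ~ Uniform(0, 1]
    return [beta * (1.0 - rng.random()) ** (-1.0 / alpha) for _ in range(n)]

def credible_interval(xs, beta, level, rng, draws=2000):
    # With scale beta known and prior pi(alpha) ∝ 1/alpha, the posterior of
    # alpha is Gamma(shape=n, rate=T), T = sum(log(x_i / beta)).
    n = len(xs)
    T = sum(math.log(x / beta) for x in xs)
    post = sorted(rng.gammavariate(n, 1.0 / T) for _ in range(draws))
    lo = post[int(draws * (1.0 - level) / 2.0)]
    hi = post[int(draws * (1.0 + level) / 2.0) - 1]
    return lo, hi

rng = random.Random(0)
alpha_true, beta, n, level, reps = 2.0, 1.0, 30, 0.95, 500
hits = 0
for _ in range(reps):
    xs = pareto_sample(alpha_true, beta, n, rng)
    lo, hi = credible_interval(xs, beta, level, rng)
    hits += lo <= alpha_true <= hi
print(hits / reps)  # empirical coverage of the nominal 95% interval
```

Because αT is pivotal (Gamma(n, 1)) in this simplified setup, the empirical coverage should sit close to the nominal level, which is the property the matching-prior literature quantifies.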

Development of Noninformative Priors in the Burr Model

  • Cho, Jang-Sik;Kang, Sang-Gil;Baek, Sung-Uk
    • Journal of the Korean Data and Information Science Society
    • /
    • v.14 no.1
    • /
    • pp.83-92
    • /
    • 2003
  • In this paper, we derive noninformative priors for the ratio of parameters in the Burr model. We obtain Jeffreys' prior, reference prior and second order probability matching prior. Also we prove that the noninformative prior matches the alternative coverage probabilities and a HPD matching prior up to the second order, respectively. Finally, we provide simulated frequentist coverage probabilities under the derived noninformative priors for small and moderate size of samples.

  • PDF

NONINFORMATIVE PRIORS FOR PARETO DISTRIBUTION : REGULAR CASE

  • Kim, Dal-Ho;Lee, Woo-Dong;Kang, Sang-Gil
    • Proceedings of the Korean Data and Information Science Society Conference
    • /
    • 2003.05a
    • /
    • pp.27-37
    • /
    • 2003
  • In this paper, we develop noninformative priors for the two-parameter Pareto distribution. Specifically, we derive Jeffreys' prior, the probability matching prior, and the reference prior for the parameter of interest. In our case, the probability matching prior is only a first-order matching prior, and a second-order matching prior does not exist. A simulation study reveals that the matching prior performs better in achieving the target coverage probability. A real example is also given.

  • PDF

Reference Priors in a Two-Way Mixed-Effects Analysis of Variance Model

  • Chang, In-Hong;Kim, Byung-Hwee
    • Journal of the Korean Data and Information Science Society
    • /
    • v.13 no.2
    • /
    • pp.317-328
    • /
    • 2002
  • We first derive group ordering reference priors in a two-way mixed-effects analysis of variance (ANOVA) model. We show that the posterior distributions are proper and provide marginal posterior distributions under the reference priors. We also examine whether the reference priors satisfy the probability matching criterion. Finally, the reference prior satisfying the probability matching criterion is shown to perform well in terms of the frequentist coverage probability of posterior quantiles.

  • PDF

Effect of Prior Probabilities on the Classification Accuracy under the Condition of Poor Separability

  • Kim, Chang-Jae;Eo, Yang-Dam;Lee, Byoung-Kil
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography
    • /
    • v.26 no.4
    • /
    • pp.333-340
    • /
    • 2008
  • This paper shows that using the prior probabilities of the classes involved improves classification accuracy when the classes are poorly separable. Three experiments are designed with two LiDAR datasets, considering three classes (building, tree, and flat grass area). Random sampling with human interpretation is used to obtain approximate prior probabilities. The experimental results show that Bayesian classification with appropriate prior probabilities improves on classification without priors when the ratio of one class's prior probability to another's differs substantially from 1.0.
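The effect the abstract reports can be sketched with a toy experiment. This is a minimal illustration with hypothetical one-dimensional Gaussian classes, not the paper's LiDAR data: two classes with nearby means (poor separability) and unequal true priors are classified by the MAP rule, once with flat priors and once with the true priors.

```python
import math
import random

def gauss_pdf(x, mu, sigma):
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2.0 * math.pi))

def classify(x, mus, sigmas, priors):
    # MAP rule: choose the class maximizing prior * likelihood
    scores = [p * gauss_pdf(x, m, s) for m, s, p in zip(mus, sigmas, priors)]
    return scores.index(max(scores))

rng = random.Random(0)
mus, sigmas, priors = [0.0, 0.5], [1.0, 1.0], [0.9, 0.1]  # heavily overlapping classes
data = []
for _ in range(5000):
    c = 0 if rng.random() < priors[0] else 1
    data.append((rng.gauss(mus[c], sigmas[c]), c))

acc_flat = sum(classify(x, mus, sigmas, [0.5, 0.5]) == c for x, c in data) / len(data)
acc_prior = sum(classify(x, mus, sigmas, priors) == c for x, c in data) / len(data)
print(acc_flat, acc_prior)
```

When the likelihoods barely separate the classes, the prior term dominates the decision, so the informed classifier shifts the boundary toward the rare class and gains accuracy, mirroring the paper's finding for prior ratios far from 1.0.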

On the Development of Probability Matching Priors for Non-regular Pareto Distribution

  • Lee, Woo Dong;Kang, Sang Gil;Cho, Jang Sik
    • Communications for Statistical Applications and Methods
    • /
    • v.10 no.2
    • /
    • pp.333-339
    • /
    • 2003
  • In this paper, we develop probability matching priors for the parameters of the non-regular Pareto distribution. We prove the propriety of the joint posterior distribution induced by the probability matching priors. Through a simulation study, we show that the proposed probability matching prior matches the coverage probabilities in a frequentist sense. A real data example is given.

DEVELOPING NONINFORMATIVE PRIORS FOR THE FAMILIAL DATA

  • Heo, Jung-Eun;Kim, Yeong-Hwa
    • Journal of the Korean Statistical Society
    • /
    • v.36 no.1
    • /
    • pp.77-91
    • /
    • 2007
  • This paper considers the development of noninformative priors for familial data when the families have an equal number of offspring. Several noninformative priors are derived, including the widely used Jeffreys' prior as well as different reference priors. A simultaneously-marginally-probability-matching prior is also considered, and probability matching priors are derived when the parameter of interest is the inter- or intra-class correlation coefficient. A simulation study implemented via the Gibbs sampler shows that the two-group reference prior has a slight edge over the others in terms of coverage probability.

A Study on Noninformative Priors of Intraclass Correlation Coefficients in Familial Data

  • Jin, Bong-Soo;Kim, Byung-Hwee
    • Communications for Statistical Applications and Methods
    • /
    • v.12 no.2
    • /
    • pp.395-411
    • /
    • 2005
  • In this paper, we develop Jeffreys' prior, the reference prior, and the probability matching priors for the difference of intraclass correlation coefficients in familial data. We prove a sufficient condition for the propriety of the posterior distributions. Using marginal posterior distributions under these noninformative priors, we compare posterior quantiles and frequentist coverage probabilities.

Noninformative Priors for the Difference of Two Quantiles in Exponential Models

  • Kang, Sang-Gil;Kim, Dal-Ho;Lee, Woo-Dong
    • Communications for Statistical Applications and Methods
    • /
    • v.14 no.2
    • /
    • pp.431-442
    • /
    • 2007
  • In this paper, we develop noninformative priors when the parameter of interest is the difference between quantiles of two exponential distributions. We seek first- and second-order probability matching priors, but we prove that a second-order probability matching prior does not exist. It turns out that Jeffreys' prior does not satisfy the first-order matching criterion. The Bayesian credible intervals based on the first-order probability matching prior meet the frequentist target coverage probabilities much better than the intervals based on Jeffreys' prior. A simulation study and a real example are given.
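The quantity of interest here can be made concrete with a short sketch. The paper's matching prior is not reproduced; purely for illustration, the code below uses independent priors π(λ) ∝ 1/λ on the two exponential rates (which, per the abstract, do not achieve first-order matching), exploiting the conjugate Gamma(n, Σx) posterior to simulate the posterior of the difference of medians. All parameter values are hypothetical.

```python
import math
import random

def exp_quantile(p, lam):
    # p-th quantile of Exponential(rate=lam): F^{-1}(p) = -log(1 - p) / lam
    return -math.log(1.0 - p) / lam

rng = random.Random(0)
lam1, lam2, n, p = 1.0, 2.0, 40, 0.5
x = [rng.expovariate(lam1) for _ in range(n)]
y = [rng.expovariate(lam2) for _ in range(n)]

# Under pi(lam) ∝ 1/lam, each rate has posterior Gamma(shape=n, scale=1/sum(data));
# push the rate draws through the quantile map and take the difference.
draws = sorted(
    exp_quantile(p, rng.gammavariate(n, 1.0 / sum(x)))
    - exp_quantile(p, rng.gammavariate(n, 1.0 / sum(y)))
    for _ in range(4000)
)
lo, hi = draws[100], draws[3899]  # equal-tailed 95% credible interval
true_diff = exp_quantile(p, lam1) - exp_quantile(p, lam2)
```

Comparing the long-run coverage of such intervals against the nominal level, across priors, is exactly the kind of study the paper carries out for its matching prior.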

Supervised Classification Using Training Parameters and Prior Probability Generated from VITD - The Case of QuickBird Multispectral Imagery

  • Eo, Yang-Dam;Lee, Gyeong-Wook;Park, Doo-Youl;Park, Wang-Yong;Lee, Chang-No
    • Korean Journal of Remote Sensing
    • /
    • v.24 no.5
    • /
    • pp.517-524
    • /
    • 2008
  • To classify a satellite image into geospatial features of interest, a supervised classifier must be trained to distinguish these features through training sampling. However, even for the same image, different classification results can be produced depending on the operator's experience and expertise during the training process. Users who apply classification results in practice need consistent results as well as improved accuracy. The experiment compares classification results when VITD polygons are used as prior probabilities and training parameters instead of manual sampling. The results show that classification using VITD polygons as prior probabilities achieves the highest accuracy among the methods tested. Training based on unsupervised classification with VITD produced classification results similar to manual training and/or training with prior probabilities.