• Title/Summary/Keyword: posterior probabilities


Bayesian Multiple Comparisons for Normal Variances

  • Kim, Hea-Jung / Journal of the Korean Statistical Society / v.29 no.2 / pp.155-168 / 2000
  • For the multiple comparison problem (MCP) of k normal population variances, we suggest a Bayesian method for calculating posterior probabilities for the various hypotheses of equality among the population variances. This leads to a simple method for obtaining pairwise comparisons of variances in a statistical experiment with a partition on the parameter space induced by equality and inequality relationships among the variances. The method rests on the fact that certain features of the hierarchical nonparametric family of Dirichlet process priors make it amenable to solving the MCP and to estimating the posterior probabilities by posterior simulation, namely Gibbs sampling. Two examples illustrate the method. For these examples, the method is straightforward to specify distributionally and to implement computationally, with output readily adapted for the required comparisons.
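The pairwise variance-equality calculation this abstract describes can be illustrated with a much simpler parametric stand-in: a minimal Python sketch that computes the posterior probability of H0: equal variances for two zero-mean normal samples from closed-form marginal likelihoods under inverse-gamma priors. The conjugate shortcut, the prior settings a = b = 2, and the equal prior odds are illustrative assumptions; the paper itself uses Dirichlet process priors and Gibbs sampling.

```python
import math

def log_marginal(S, n, a=2.0, b=2.0):
    """Log marginal likelihood of zero-mean normal data with sum of
    squares S and sample size n, under an InverseGamma(a, b) prior
    on the variance."""
    return (-0.5 * n * math.log(2 * math.pi)
            + a * math.log(b) - math.lgamma(a)
            + math.lgamma(a + 0.5 * n)
            - (a + 0.5 * n) * math.log(b + 0.5 * S))

def prob_equal_variances(x, y):
    """Posterior probability of H0: var_x = var_y versus H1: var_x != var_y,
    assuming equal prior odds and known zero means."""
    Sx, Sy = sum(v * v for v in x), sum(v * v for v in y)
    log_m0 = log_marginal(Sx + Sy, len(x) + len(y))               # one shared variance
    log_m1 = log_marginal(Sx, len(x)) + log_marginal(Sy, len(y))  # separate variances
    # Bayes' rule on the two marginal likelihoods with prior odds 1:1.
    return 1.0 / (1.0 + math.exp(log_m1 - log_m0))
```

Samples with similar spread push the probability above 1/2; a large scale difference drives it toward 0.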


Nonparametric Bayesian Multiple Comparisons for Geometric Populations

  • Ali, M. Masoom; Cho, J.S.; Begum, Munni / Journal of the Korean Data and Information Science Society / v.16 no.4 / pp.1129-1140 / 2005
  • A nonparametric Bayesian method for calculating posterior probabilities for the multiple comparison problem on the parameters of several geometric populations is presented. Bayesian multiple comparisons under two different prior/likelihood combinations were studied by Gopalan and Berry (1998) using Dirichlet process priors. In this paper, we follow the same approach to calculate posterior probabilities for the various hypotheses in a statistical experiment with a partition on the parameter space induced by equality and inequality relationships among the parameters of several geometric populations. This also leads to a simple method for obtaining pairwise comparisons of the success probabilities. The Gibbs sampling technique is used to evaluate the posterior probabilities of all possible hypotheses, which are analytically intractable. A numerical example illustrates the procedure.
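The Gibbs scheme this abstract refers to can be sketched in miniature: each geometric population gets a cluster label, the success probabilities are integrated out under a conjugate Beta base measure, and the fraction of sweeps in which two populations share a label estimates the posterior probability of their equality. The Beta(1, 1) base measure, the concentration alpha = 1, and the failure-count parameterization of the geometric are illustrative assumptions, not the paper's exact setup.

```python
import math
import random

def log_beta(a, b):
    return math.lgamma(a) + math.lgamma(b) - math.lgamma(a + b)

def log_pred(n, s, n0, s0, a=1.0, b=1.0):
    """Log predictive of a geometric sample (n observations, s total
    failures) given cluster totals (n0, s0), with the Beta(a, b) base
    measure integrated out."""
    return log_beta(a + n0 + n, b + s0 + s) - log_beta(a + n0, b + s0)

def gibbs_equality_probs(data, alpha=1.0, iters=400, seed=1):
    """Gibbs sweeps over cluster labels of k geometric populations under a
    Dirichlet-process-style prior; returns, for each pair of populations,
    the posterior probability that they share one success probability."""
    rng = random.Random(seed)
    k = len(data)
    stats = [(len(x), sum(x)) for x in data]  # (n_i, total failures)
    z = list(range(k))                        # start with all populations apart
    hits = {(i, j): 0 for i in range(k) for j in range(i + 1, k)}
    for _ in range(iters):
        for i in range(k):
            n_i, s_i = stats[i]
            others = [z[j] for j in range(k) if j != i]
            cands, logw = [], []
            for c in sorted(set(others)):     # join an occupied cluster...
                n0 = sum(stats[j][0] for j in range(k) if j != i and z[j] == c)
                s0 = sum(stats[j][1] for j in range(k) if j != i and z[j] == c)
                cands.append(c)
                logw.append(math.log(others.count(c))
                            + log_pred(n_i, s_i, n0, s0))
            cands.append(max(others) + 1)     # ...or open a new one
            logw.append(math.log(alpha) + log_pred(n_i, s_i, 0, 0))
            m = max(logw)
            w = [math.exp(v - m) for v in logw]
            r = rng.random() * sum(w)
            z[i] = cands[-1]
            for c, wi in zip(cands, w):
                r -= wi
                if r <= 0.0:
                    z[i] = c
                    break
        for (i, j) in hits:
            if z[i] == z[j]:
                hits[(i, j)] += 1
    return {p: h / iters for p, h in hits.items()}
```

With two populations whose observed failure rates match and one that differs sharply, the sampler clusters the first two together in most sweeps.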


Semiparametric Bayesian multiple comparisons for Poisson Populations

  • Cho, Jang Sik; Kim, Dal Ho; Kang, Sang Gil / Communications for Statistical Applications and Methods / v.8 no.2 / pp.427-434 / 2001
  • In this paper, we consider the nonparametric Bayesian approach to the multiple comparisons problem for I Poisson populations using Dirichlet process priors. We describe a Gibbs sampling algorithm for calculating the posterior probabilities of the hypotheses via Markov chain Monte Carlo. We also provide a numerical example to illustrate the developed technique.


A study on classification accuracy improvements using orthogonal summation of posterior probabilities (사후확률 결합에 의한 분류정확도 향상에 관한 연구)

  • 정재준 / Spatial Information Research / v.12 no.1 / pp.111-125 / 2004
  • Improving classification accuracy is a central issue in satellite image classification. Since multiple images of the same area are often available, there is a need for research on improving classification accuracy using multiple data sets. In this study, the orthogonal summation of Dempster-Shafer theory (the theory of evidence) is proposed as a method for classifying multiple images, with posterior probabilities and classification uncertainty used in the calculation. The accuracy of the proposed method is higher than that of the conventional classification methods, maximum likelihood classification (MLC) of each data set and MLC of the merged data sets, as verified through statistical tests of mean differences.
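The orthogonal summation used in this study is Dempster's rule of combination. A minimal Python sketch of the rule over set-valued focal elements follows; the land-cover labels in the usage note are made-up examples, not the study's classes.

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination (orthogonal summation) for two mass
    functions keyed by frozenset focal elements; mass falling on empty
    intersections (the conflict) is renormalized away."""
    combined, conflict = {}, 0.0
    for A, w1 in m1.items():
        for B, w2 in m2.items():
            inter = A & B
            if inter:
                combined[inter] = combined.get(inter, 0.0) + w1 * w2
            else:
                conflict += w1 * w2
    if conflict >= 1.0:
        raise ValueError("total conflict: sources are incompatible")
    return {A: w / (1.0 - conflict) for A, w in combined.items()}
```

For example, combining m1 = {water: 0.6, forest: 0.3, Θ: 0.1} with m2 = {water: 0.5, forest: 0.4, Θ: 0.1} gives a conflict of 0.39 and a combined water mass of 0.41/0.61 ≈ 0.672, sharper than either source alone.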


Bayesian Variable Selection in Linear Regression Models with Inequality Constraints on the Coefficients (제한조건이 있는 선형회귀 모형에서의 베이지안 변수선택)

  • 오만숙 / The Korean Journal of Applied Statistics / v.15 no.1 / pp.73-84 / 2002
  • Linear regression models with inequality constraints on the coefficients are frequently used in economic models because of sign or order constraints on the coefficients. In this paper, we propose a Bayesian approach to selecting significant explanatory variables in linear regression models with inequality constraints on the coefficients. Bayesian variable selection requires computing the posterior probability of each candidate model. We propose a method that computes all the necessary posterior model probabilities simultaneously. Specifically, we obtain posterior samples from the most general model via the Gibbs sampling algorithm (Gelfand and Smith, 1990) and compute the posterior probabilities from these samples. A real example illustrates the method.
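The first ingredient of the approach, Gibbs sampling from a regression model with sign constraints, can be sketched as follows: each coefficient's full conditional is a normal truncated to the constraint region. The known error variance, the flat (truncated) prior, and the simple rejection step for the truncated normal are simplifying assumptions for illustration; the paper's device for recovering all model probabilities from one run is not reproduced here.

```python
import random

def gibbs_constrained_regression(X, y, sigma2=0.25, iters=3000, seed=7):
    """Gibbs sampler for y = X beta + eps with a flat prior truncated to
    beta_j >= 0; each full conditional is a truncated normal, drawn here
    by rejection from the untruncated conditional."""
    rng = random.Random(seed)
    n, p = len(X), len(X[0])
    beta = [0.0] * p
    draws = []
    for it in range(iters):
        for j in range(p):
            # partial residuals excluding coefficient j
            r = [y[i] - sum(X[i][k] * beta[k] for k in range(p) if k != j)
                 for i in range(n)]
            sxx = sum(X[i][j] ** 2 for i in range(n))
            mean = sum(X[i][j] * r[i] for i in range(n)) / sxx
            sd = (sigma2 / sxx) ** 0.5
            draw = -1.0
            while draw < 0.0:                 # rejection: truncate at zero
                draw = rng.gauss(mean, sd)
            beta[j] = draw
        if it >= iters // 2:                  # discard burn-in
            draws.append(list(beta))
    return [sum(d[j] for d in draws) / len(draws) for j in range(p)]
```

On data generated with positive coefficients, the posterior means recover the truth while every draw respects the sign constraint.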

Bayesian Analysis for Burr-Type X Strength-Stress Model

  • Kang, Sang-Gil; Ko, Jeong-Hwan; Lee, Woo-Dong / Proceedings of the Korea Society for Industrial Systems Conference / 1999.05a / pp.191-197 / 1999
  • In this paper, we develop noninformative priors for estimating the reliability of a stress-strength system under the Burr-type X distribution. A class of priors is found by matching the coverage probabilities of one-sided Bayesian credible intervals with the corresponding frequentist coverage probabilities. It turns out that the reference prior is a first-order matching prior. The propriety of the posterior under the matching prior is established. Frequentist coverage probabilities are reported for small samples.
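The quantity being estimated, the stress-strength reliability R = P(X < Y), has a simple closed form for the Burr-type X family F(x) = (1 − exp(−x²))^θ with a common scale: R = θ_Y / (θ_X + θ_Y). A short Monte Carlo sketch checks this; the shape values and sample size are illustrative, and none of the paper's prior-matching machinery is involved.

```python
import math
import random

def rburrx(theta, rng):
    """Inverse-CDF draw from the Burr-type X distribution
    F(x) = (1 - exp(-x^2))^theta, x > 0."""
    u = rng.random()
    return math.sqrt(-math.log(1.0 - u ** (1.0 / theta)))

def reliability_mc(theta_x, theta_y, n=200000, seed=3):
    """Monte Carlo estimate of the reliability R = P(X < Y) for
    Burr-type X stress X and strength Y."""
    rng = random.Random(seed)
    hits = sum(rburrx(theta_x, rng) < rburrx(theta_y, rng) for _ in range(n))
    return hits / n

# Closed form for this family: R = theta_y / (theta_x + theta_y),
# since P(X < Y) reduces to a Beta-type integral in (1 - exp(-y^2)).
```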


Variational Expectation-Maximization Algorithm in Posterior Distribution of a Latent Dirichlet Allocation Model for Research Topic Analysis

  • Kim, Jong Nam / Journal of Korea Multimedia Society / v.23 no.7 / pp.883-890 / 2020
  • In this paper, we propose a variational expectation-maximization algorithm that computes posterior probabilities for a Latent Dirichlet Allocation (LDA) model. The algorithm approximates the intractable posterior distribution of a document-term matrix generated from a corpus of 50 papers. It approximates the posterior by searching for a local optimum of a lower bound on the true posterior: it maximizes the lower bound on the log-likelihood by minimizing the relative entropy (KL divergence) between the approximate and the true posterior distribution. The experimental results indicate that documents clustered to image classification and segmentation are correlated at 0.79, while those clustered to object detection and image segmentation are highly correlated at 0.96. The proposed variational inference algorithm runs efficiently, faster than Gibbs sampling, at a computational time of 0.029 s.
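The identity underlying the lower-bound argument, log p(x) = ELBO + KL(q ‖ p(z | x)), so that maximizing the ELBO is exactly minimizing the KL divergence to the true posterior, can be verified numerically on a toy discrete latent variable. The joint and variational values below are arbitrary illustrative numbers.

```python
import math

def elbo_gap_demo(joint, q):
    """For a discrete joint p(x0, z) over a latent z and a normalized
    variational q(z), return (log p(x0), ELBO, KL(q || p(z | x0)))
    and thereby verify log p(x0) = ELBO + KL."""
    px = sum(joint)                                 # evidence p(x0)
    post = [j / px for j in joint]                  # true posterior p(z | x0)
    elbo = sum(qz * (math.log(jz) - math.log(qz))   # E_q[log p(x0, z) - log q(z)]
               for qz, jz in zip(q, joint))
    kl = sum(qz * math.log(qz / pz) for qz, pz in zip(q, post))
    return math.log(px), elbo, kl
```

Because the KL term is nonnegative, the ELBO can never exceed the log evidence, which is why pushing it upward tightens the approximation.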

Generative probabilistic model with Dirichlet prior distribution for similarity analysis of research topic

  • Milyahilu, John; Kim, Jong Nam / Journal of Korea Multimedia Society / v.23 no.4 / pp.595-602 / 2020
  • We propose a generative probabilistic model with a Dirichlet prior distribution for topic modeling and text-similarity analysis. It assigns a topic to each document and calculates text correlation between documents within a corpus. It also provides the posterior probabilities assigned to each topic of a document based on the prior distribution in the corpus. We then present a Gibbs sampling algorithm for inference about the posterior distribution and compute text correlation among 50 abstracts from papers published by IEEE. We also conduct supervised learning to set a benchmark against which the performance of LDA (Latent Dirichlet Allocation) is judged. The experiments show that the accuracy of topic assignment to a given document is 76% for LDA. The results for supervised learning show an accuracy of 61%, a precision of 93%, and an F1-score of 96%. A discussion of the experimental results gives a thorough justification based on probabilities, distributions, evaluation metrics, and correlation coefficients with respect to topic assignment.
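The Gibbs sampling inference mentioned here is, for LDA, usually the collapsed sampler: topic proportions and word distributions are integrated out, and each token's topic is resampled from counts. A compact sketch follows; the hyperparameters, the tiny synthetic corpus, and the point estimate taken from the final state are all illustrative choices, not the paper's configuration.

```python
import random

def lda_gibbs(docs, V, K=2, alpha=0.1, beta=0.1, iters=300, seed=0):
    """Collapsed Gibbs sampler for LDA. docs are lists of word ids in
    [0, V); returns per-document topic proportions from the final state."""
    rng = random.Random(seed)
    ndk = [[0] * K for _ in docs]          # document-topic counts
    nkw = [[0] * V for _ in range(K)]      # topic-word counts
    nk = [0] * K                           # topic totals
    z = []
    for d, doc in enumerate(docs):         # random initial assignments
        zd = []
        for w in doc:
            k = rng.randrange(K)
            zd.append(k)
            ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
        z.append(zd)
    for _ in range(iters):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                k = z[d][i]                # remove the current assignment
                ndk[d][k] -= 1; nkw[k][w] -= 1; nk[k] -= 1
                # full conditional p(z = t | everything else)
                wts = [(ndk[d][t] + alpha) * (nkw[t][w] + beta)
                       / (nk[t] + V * beta) for t in range(K)]
                r = rng.random() * sum(wts)
                k = K - 1
                for t, wt in enumerate(wts):
                    r -= wt
                    if r <= 0.0:
                        k = t
                        break
                z[d][i] = k
                ndk[d][k] += 1; nkw[k][w] += 1; nk[k] += 1
    return [[(c + alpha) / (len(doc) + K * alpha) for c in ndk[d]]
            for d, doc in enumerate(docs)]
```

On a corpus with two disjoint vocabularies, the sampler concentrates each document on one topic, which is the behavior the correlation analysis above relies on.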

A Novel Posterior Probability Estimation Method for Multi-label Naive Bayes Classification

  • Kim, Hae-Cheon; Lee, Jaesung / Journal of the Korea Society of Computer and Information / v.23 no.6 / pp.1-7 / 2018
  • Multi-label classification finds the multiple labels associated with an input pattern. It can be achieved by extending conventional single-label classification; common extension techniques are binary relevance, label powerset, and classifier chains. However, most extended multi-label naive Bayes classifiers cannot accurately estimate posterior probabilities because they do not reflect label dependency, and the remaining extended classifiers estimate the posterior probability unstably, depending on the label selection order. To estimate posterior probabilities well, we propose a new estimation method that efficiently reflects the correlations among all labels. Experiments confirm that the extended multi-label naive Bayes classifier using the proposed method achieves higher accuracy than existing multi-label naive Bayes classifiers.
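The baseline this abstract starts from, binary relevance with naive Bayes, fits one Bernoulli naive Bayes model per label and reports an independent posterior P(label | x) for each; it is precisely this per-label independence that ignores label dependency. A minimal sketch of that baseline (not the paper's proposed method), with illustrative Laplace smoothing:

```python
import math

def train_bnb(X, Y, smooth=1.0):
    """Binary relevance with Bernoulli naive Bayes: one model per label,
    each returning an independent posterior P(label | x)."""
    n, d, L = len(X), len(X[0]), len(Y[0])
    models = []
    for l in range(L):
        pos = [i for i in range(n) if Y[i][l] == 1]
        neg = [i for i in range(n) if Y[i][l] == 0]
        prior = (len(pos) + smooth) / (n + 2 * smooth)
        # per-feature Bernoulli parameters with Laplace smoothing
        p1 = [(sum(X[i][j] for i in pos) + smooth) / (len(pos) + 2 * smooth)
              for j in range(d)]
        p0 = [(sum(X[i][j] for i in neg) + smooth) / (len(neg) + 2 * smooth)
              for j in range(d)]
        models.append((prior, p1, p0))
    return models

def posteriors(models, x):
    """Posterior probability of each label for input x via Bayes' rule."""
    out = []
    for prior, p1, p0 in models:
        l1 = math.log(prior) + sum(
            math.log(p1[j] if xj else 1 - p1[j]) for j, xj in enumerate(x))
        l0 = math.log(1 - prior) + sum(
            math.log(p0[j] if xj else 1 - p0[j]) for j, xj in enumerate(x))
        out.append(1.0 / (1.0 + math.exp(l0 - l1)))
    return out
```

Because each label's posterior is computed in isolation, correlated labels get no benefit from one another, which is the weakness the proposed method addresses.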

A Bayes Rule for Determining the Number of Common Factors in Oblique Factor Model

  • Kim, Hea-Jung / Journal of the Korean Statistical Society / v.29 no.1 / pp.95-108 / 2000
  • Consider the oblique factor model $X = \Lambda f + \varepsilon$, with defining relation $\Sigma = \Lambda\Phi\Lambda' + \Psi$. This paper is concerned with suggesting an optimal Bayes criterion for determining the number of common factors in the model, i.e., the dimension of the vector f. The use of the marginal likelihood as a method for calculating the posterior probability of each model with a given dimension is developed under a generalized conjugate prior. Then, based on an appropriate loss function, a Bayes rule is developed from the posterior probabilities. It is shown that the approach is straightforward to specify distributionally and to implement computationally, with output readily adapted for constructing the required criterion.
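The last step of such a criterion, turning per-dimension marginal likelihoods into posterior model probabilities and picking the Bayes-optimal dimension under 0-1 loss, is mechanical and can be sketched directly. The log marginal likelihood values in the usage note are invented for illustration; computing them for the factor model is the substantive work of the paper.

```python
import math

def posterior_model_probs(log_marginals, log_priors=None):
    """Posterior probability of each candidate model (here, each candidate
    number of factors) from its log marginal likelihood; under 0-1 loss
    the Bayes rule selects the argmax."""
    if log_priors is None:
        log_priors = [0.0] * len(log_marginals)   # uniform over dimensions
    lp = [m + p for m, p in zip(log_marginals, log_priors)]
    mx = max(lp)                                  # log-sum-exp stabilization
    w = [math.exp(v - mx) for v in lp]
    s = sum(w)
    return [v / s for v in w]
```

For instance, with hypothetical log marginal likelihoods [-102.3, -98.7, -101.1] for one, two, and three factors, the two-factor model receives nearly all the posterior mass and is selected.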
