• Title/Summary/Keyword: Gibbs priors


A Bayesian uncertainty analysis for nonignorable nonresponse in two-way contingency table

  • Woo, Namkyo; Kim, Dal Ho
    • Journal of the Korean Data and Information Science Society / v.26 no.6 / pp.1547-1555 / 2015
  • We study the problem of nonignorable nonresponse in a two-way contingency table in which one or two categories may be missing. We describe a nonignorable nonresponse model for the analysis of two-way categorical tables. One approach to analyzing such data is to construct several tables (one complete and the others incomplete); the incomplete tables contain nonidentifiable parameters. We describe a hierarchical Bayesian model for two-way categorical data and perform a Bayesian uncertainty analysis by placing priors on the nonidentifiable parameters, rather than a sensitivity analysis over them. To reduce the effect of the nonidentifiable parameters, we project them to a lower-dimensional space and allow the reduced set of parameters to share a common distribution. We use the griddy Gibbs sampler to fit our models and compute the DIC and BPP for model diagnostics. We illustrate our method on NHANES III data to obtain finite population proportions.
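
The griddy Gibbs sampler mentioned in this abstract draws from a full conditional known only up to a normalizing constant by tabulating it on a grid and inverting the resulting empirical CDF. A minimal sketch on a toy target (the grid, bounds, and Gaussian target are illustrative assumptions, not the paper's contingency-table model):

```python
import numpy as np

rng = np.random.default_rng(0)

def griddy_gibbs_draw(log_cond, grid, rng):
    """Draw from a univariate full conditional known only up to a constant:
    tabulate it on a grid, build the empirical CDF, and invert it."""
    logp = np.array([log_cond(g) for g in grid])
    p = np.exp(logp - logp.max())        # stabilise before normalising
    cdf = np.cumsum(p)
    cdf /= cdf[-1]
    return grid[np.searchsorted(cdf, rng.uniform())]

# toy conditional: an unnormalised N(0.3, 0.1^2) density restricted to [0, 1],
# e.g. a bounded cell-probability parameter
grid = np.linspace(0.0, 1.0, 501)
log_target = lambda x: -0.5 * ((x - 0.3) / 0.1) ** 2
draws = np.array([griddy_gibbs_draw(log_target, grid, rng) for _ in range(2000)])
print(draws.mean())
```

Within a full sampler, each such draw would be one coordinate update in the Gibbs sweep, with the other parameters held at their current values.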

USE OF TRAINING DATA TO ESTIMATE THE SMOOTHING PARAMETER FOR BAYESIAN IMAGE RECONSTRUCTION

  • Lee, Soo-Jin
    • Journal of the Korean Geophysical Society / v.4 no.3 / pp.175-182 / 2001
  • We consider the problem of determining smoothing parameters of Gibbs priors for Bayesian methods used in the medical imaging application of emission tomographic reconstruction. We address a simple smoothing prior (membrane) whose global hyperparameter (the smoothing parameter) controls the bias/variance tradeoff of the solution. We base our maximum-likelihood (ML) estimates of hyperparameters on observed training data, and argue the motivation for this approach. Good results are obtained with a simple ML estimate of the smoothing parameter for the membrane prior.

Bayesian Multiple Change-Point Estimation of Multivariate Mean Vectors for Small Data

  • Cheon, Sooyoung; Yu, Wenxing
    • The Korean Journal of Applied Statistics / v.25 no.6 / pp.999-1008 / 2012
  • A Bayesian multiple change-point model for small data is proposed for multivariate means, extending the univariate case of Cheon and Yu (2012). The proposed model assumes data from a multivariate noncentral $t$-distribution and conjugate priors for the distributional parameters. We apply a Metropolis-Hastings-within-Gibbs sampling algorithm to the proposed model to detect multiple change-points. The performance of the proposed algorithm is investigated on simulated data and on a real dataset, bivariate Hanwoo fat content data.
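
Metropolis-Hastings-within-Gibbs replaces a non-standard full conditional with a single Metropolis update inside each Gibbs sweep. A toy two-block sketch (a normal mean with a conjugate step and a log-precision with a random-walk MH step; the flat priors here are illustrative assumptions, not the paper's conjugate priors):

```python
import numpy as np

rng = np.random.default_rng(2)

def mh_step(x, log_cond, step, rng):
    """One random-walk Metropolis update, used inside a Gibbs sweep when a
    full conditional has no standard form."""
    prop = x + step * rng.normal()
    if np.log(rng.uniform()) < log_cond(prop) - log_cond(x):
        return prop
    return x

# toy two-block sampler for N(mu, 1/tau) data
data = rng.normal(1.0, 0.5, size=200)
mu, log_tau = 0.0, 0.0
mus = []
for _ in range(3000):
    tau = np.exp(log_tau)
    # Gibbs block: mu | tau, data is conjugate normal (flat prior on mu)
    mu = rng.normal(data.mean(), 1.0 / np.sqrt(tau * data.size))

    def log_cond(lt):  # likelihood in the log-precision, flat prior on log tau
        return 0.5 * data.size * lt - 0.5 * np.exp(lt) * ((data - mu) ** 2).sum()

    # MH block: log-precision | mu, data
    log_tau = mh_step(log_tau, log_cond, 0.2, rng)
    mus.append(mu)
print(np.mean(mus[500:]))
```

The appeal of the scheme is that only the awkward coordinate pays the Metropolis accept/reject cost; every conjugate block is still sampled exactly.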

Use of Training Data to Estimate the Smoothing Parameter for Bayesian Image Reconstruction

  • Lee, Soo-Jin
    • The Journal of Engineering Research / v.4 no.1 / pp.47-54 / 2002
  • We consider the problem of determining smoothing parameters of Gibbs priors for Bayesian methods used in the medical imaging application of emission tomographic reconstruction. We address a simple smoothing prior (membrane) whose global hyperparameter (the smoothing parameter) controls the bias/variance tradeoff of the solution. We base our maximum-likelihood (ML) estimates of hyperparameters on observed training data, and argue the motivation for this approach. Good results are obtained with a simple ML estimate of the smoothing parameter for the membrane prior.

A Bayesian Approach for Accelerated Failure Time Model with Skewed Normal Error

  • Kim, Chansoo
    • Communications for Statistical Applications and Methods / v.10 no.2 / pp.268-275 / 2003
  • We consider the Bayesian accelerated failure time model. The error distribution is assigned a skewed normal distribution, which includes the normal distribution as a special case. For noninformative priors on the regression coefficients, we show the propriety of the posterior distribution. A Markov chain Monte Carlo algorithm (the Gibbs sampler) is used to obtain a predictive distribution for a future observation and Bayes estimates of the regression coefficients.
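
The sense in which the skewed normal family nests the normal is direct: Azzalini's density f(z) = 2φ(z)Φ(αz) reduces to the standard normal density at shape α = 0. A quick stdlib-only check:

```python
import math

def skew_normal_pdf(z, alpha):
    """Azzalini skew-normal density f(z) = 2 * phi(z) * Phi(alpha * z).
    At alpha = 0, Phi(0) = 1/2 and the factor of 2 cancels, leaving N(0, 1)."""
    phi = math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)
    big_phi = 0.5 * (1.0 + math.erf(alpha * z / math.sqrt(2.0)))
    return 2.0 * phi * big_phi

# alpha = 0 recovers N(0,1); positive alpha skews mass to the right
print(skew_normal_pdf(0.0, 0.0))   # 1/sqrt(2*pi) ≈ 0.3989
```

This nesting is what lets a prior on α shrink the error model toward ordinary normality when the data show little skew.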

Bayesian Interval Estimation of Tobit Regression Model (토빗회귀모형에서 베이지안 구간추정)

  • Lee, Seung-Chun; Choi, Byung Su
    • The Korean Journal of Applied Statistics / v.26 no.5 / pp.737-746 / 2013
  • The Bayesian method can be applied successfully to the estimation of the censored regression model introduced by Tobin (1958). The Bayes estimates show improvements over the maximum likelihood estimate; however, the performance of Bayesian interval estimation is questionable. In the Bayesian paradigm, the prior distribution usually reflects personal beliefs about the parameters, and such subjective priors typically yield interval estimators with poor frequentist properties; an objective noninformative prior, on the other hand, often yields a Bayesian procedure with good frequentist properties. We examine the frequentist properties of noninformative priors for the Tobit regression model.
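
A standard way to run a Gibbs sampler for the Tobit model is data augmentation in the style of Chib (1992): impute the latent responses of censored cases from a truncated normal, then update the regression parameters conjugately. A minimal intercept-only sketch with the error variance fixed at 1 (both simplifications, and the flat prior, are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(3)

# Tobit data: latent y* ~ N(mu, 1), we observe y = max(y*, 0)
n, mu_true = 500, 0.5
y = np.maximum(rng.normal(mu_true, 1.0, n), 0.0)
cens = y == 0.0

def draw_trunc_upper(mu, rng):
    """N(mu, 1) truncated to (-inf, 0]; naive rejection, fine for moderate mu."""
    while True:
        x = rng.normal(mu, 1.0)
        if x <= 0.0:
            return x

mu, draws = 0.0, []
for _ in range(2000):
    # data augmentation: impute the latent responses of censored cases
    y_star = y.copy()
    y_star[cens] = [draw_trunc_upper(mu, rng) for _ in range(int(cens.sum()))]
    # conjugate update for mu under a flat prior (error variance fixed at 1)
    mu = rng.normal(y_star.mean(), 1.0 / np.sqrt(n))
    draws.append(mu)
print(np.mean(draws[500:]))
```

With the latent responses filled in, every update is a standard normal-model step, which is what makes the augmented sampler so simple.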

Robust Bayesian meta analysis (로버스트 베이지안 메타분석)

  • Choi, Seong-Mi; Kim, Dal-Ho; Shin, Im-Hee; Kim, Ho-Gak; Kim, Sang-Gyung
    • Journal of the Korean Data and Information Science Society / v.22 no.3 / pp.459-466 / 2011
  • This article addresses robust Bayesian modeling for meta analysis, which derives a general conclusion by combining independently performed individual studies. Specifically, we propose hierarchical Bayesian models with unknown variances for meta analysis under priors that are scale mixtures of normals, and thus have tails heavier than the normal's. For the numerical analysis, we use the Gibbs sampler to calculate Bayesian estimators and illustrate the proposed methods using actual data.
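
The scale-mixture-of-normals construction can be seen directly: a Student-t prior arises by mixing a normal over a Gamma-distributed precision, and the mixing is what produces the heavier tails. A small demonstration (the degrees of freedom and sample size are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(4)

# Student-t as a scale mixture of normals:
# lambda ~ Gamma(nu/2, rate nu/2),  x | lambda ~ N(0, 1/lambda)  =>  x ~ t_nu
nu = 3.0
lam = rng.gamma(nu / 2.0, 2.0 / nu, size=100_000)   # numpy takes shape, scale
x = rng.normal(0.0, 1.0 / np.sqrt(lam))
z = rng.normal(0.0, 1.0, size=100_000)

# the mixture puts far more mass beyond 3 standard units than the normal does
print((np.abs(x) > 3).mean(), (np.abs(z) > 3).mean())
```

In a Gibbs sampler, the mixing precisions become extra latent variables with conjugate updates, so the heavy-tailed prior costs little beyond one more block in the sweep.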

A Study on the Ordered Subsets Expectation Maximization Reconstruction Method Using Gibbs Priors for Emission Computed Tomography (Gibbs 선행치를 사용한 배열된부분집합 기대값최대화 방출단층영상 재구성방법에 관한 연구)

  • Im, K. C.; Choi, Y.; Kim, J. H.; Lee, S. J.; Woo, S. K.; Seo, H. K.; Lee, K. H.; Kim, S. E.; Choe, Y. S.; Park, C. C.; Kim, B. T.
    • Journal of Biomedical Engineering Research / v.21 no.5 / pp.441-448 / 2000
  • The maximum likelihood expectation maximization (MLEM) method for emission tomographic reconstruction forms images from a statistical model of the acquisition process. MLEM has many advantages over the widely used filtered backprojection method, but it can become unstable as the number of iterations grows, and its reconstruction times are long. To address these drawbacks, this paper implements OSEM-MAP (maximum a posteriori), which adds a Gibbs prior, either membrane (MM) or thin plate (TP), to the ordered subsets expectation maximization (OSEM) algorithm, itself a substantially faster variant, with the aim of improving the stability of the algorithm and the quality of the reconstructed images. In the experiments, the projection data were partitioned into 16 subsets to accelerate convergence, and the algorithms were compared by the squared error of reconstructions of software phantoms (a monkey brain autoradiograph and a mathematical cardiac-thorax phantom). To assess practical applicability, real projection data acquired on a PET scanner with a physical phantom were also used.
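
The OSEM iteration applies one multiplicative EM-style correction per subset of projections, cycling through the subsets within each outer iteration. A minimal sketch on a random toy system, without the MAP (Gibbs prior) term (the system size and subset count are illustrative):

```python
import numpy as np

rng = np.random.default_rng(5)

# toy system: A maps a 16-pixel image to 64 detector bins; counts are Poisson
m, n = 64, 16
A = rng.uniform(0.0, 1.0, (m, n))
x_true = rng.uniform(0.5, 2.0, n)
y = rng.poisson(A @ x_true).astype(float)

def osem(A, y, n_subsets=4, n_iter=20):
    """Ordered-subsets EM: one MLEM-style multiplicative update per subset,
    cycling through all subsets in each outer iteration (no prior term here)."""
    x = np.ones(A.shape[1])
    subsets = np.array_split(np.arange(A.shape[0]), n_subsets)
    for _ in range(n_iter):
        for s in subsets:
            As, ys = A[s], y[s]
            ratio = ys / np.maximum(As @ x, 1e-12)
            x *= (As.T @ ratio) / np.maximum(As.sum(axis=0), 1e-12)
    return x

x_hat = osem(A, y)
resid = np.mean((A @ x_hat - y) ** 2)      # projection-space fit
print(resid)
```

Because each subset pass already moves the whole image, one OSEM iteration does roughly the work of `n_subsets` MLEM iterations, which is the source of the speedup the abstract describes; the MAP variant would insert the Gibbs-prior gradient into each update.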

Statistical Methods for Tomographic Image Reconstruction in Nuclear Medicine (핵의학 단층영상 재구성을 위한 통계학적 방법)

  • Lee, Soo-Jin
    • Nuclear Medicine and Molecular Imaging / v.42 no.2 / pp.118-126 / 2008
  • Statistical image reconstruction methods have played an important role in emission computed tomography (ECT) since they accurately model the statistical noise associated with gamma-ray projection data. Although the use of statistical methods in clinical practice was difficult in the early days due to high per-iteration costs and large numbers of iterations, the development of fast algorithms and dramatically improved computer speeds are now making them practical. Some statistical methods are indeed commonly available from nuclear medicine equipment suppliers. In this paper, we first describe the mathematical background for statistical reconstruction methods, which includes the assumptions underlying the Poisson statistical model, maximum likelihood and maximum a posteriori approaches, and prior models in the context of a Bayesian framework. We then review recent progress in developing fast iterative algorithms.

Bayesian Analysis of Software Reliability Growth Model with Negative Binomial Information (음이항분포 정보를 가진 베이지안 소프트웨어 신뢰도 성장모형에 관한 연구)

  • Kim, Hui-Cheol;Park, Jong-Gu;Lee, Byeong-Su
    • The Transactions of the Korea Information Processing Society
    • /
    • v.7 no.3
    • /
    • pp.852-861
    • /
    • 2000
  • Software reliability growth models are used in the testing stages of software development to model the error content and the time intervals between software failures. In this paper, using a negative binomial prior for the number of faults and a gamma prior for the error rate, we develop Bayesian inference and model selection methods for the Jelinski-Moranda, Goel-Okumoto, and Schick-Wolverton software reliability models. For model selection, we explored the sum of relative errors, the Braun statistic, and the median variation. In the Bayesian computation, we avoid multiple integration by using Gibbs sampling, a Markov chain Monte Carlo method, to compute the posterior distribution. Bayesian inference and model selection are studied using simulated data.
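
In the Jelinski-Moranda model, the inter-failure time after i faults have been found is exponential with rate φ(N − i), so a two-block Gibbs sampler can alternate a conjugate Gamma update for the failure rate φ with a gridded draw for the discrete fault count N. A sketch using a flat prior on N over a finite grid (the paper uses a negative binomial prior instead; the Gamma hyperparameters and grid bound are assumptions):

```python
import numpy as np

rng = np.random.default_rng(6)

# Jelinski-Moranda: t_i ~ Exp(phi * (N - i)), i = 0..n-1 faults already found
n_obs, N_true, phi_true = 20, 30, 0.05
t = rng.exponential(1.0 / (phi_true * (N_true - np.arange(n_obs))))

a, b = 1.0, 1.0                   # Gamma(a, b) prior on phi (assumed)
N_grid = np.arange(n_obs, 101)    # flat prior on N over this grid (assumed)
i = np.arange(n_obs)
phi, Ns = 0.1, []
for _ in range(3000):
    # N | phi, t: discrete full conditional evaluated over the whole grid
    remaining = N_grid[:, None] - i[None, :]            # (grid, n_obs)
    logp = np.log(remaining).sum(axis=1) - phi * (remaining * t).sum(axis=1)
    p = np.exp(logp - logp.max())
    N = rng.choice(N_grid, p=p / p.sum())
    # phi | N, t: conjugate Gamma update
    phi = rng.gamma(a + n_obs, 1.0 / (b + ((N - i) * t).sum()))
    Ns.append(N)
post_N = np.mean(Ns[500:])
print(post_N)
```

Alternating these two exact conditional draws is what lets the posterior over (N, φ) be explored without the multiple integration the abstract mentions.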
