• Title/Abstract/Keywords: Metropolis

Search results: 323 items

Detecting the Influential Observation Using Intrinsic Bayes Factors

  • Chung, Younshik
    • Journal of the Korean Statistical Society
    • /
    • Vol. 29, No. 1
    • /
    • pp.81-94
    • /
    • 2000
  • For the balanced variance component model, the intraclass correlation coefficient is sometimes the parameter of interest. When there is little prior information about this parameter, the reference prior (Berger and Bernardo, 1992) is widely used. Pettit and Young (1990) considered a measure of the effect of a single observation on a logarithmic Bayes factor. Under such a reference prior, however, the Bayes factor depends on a ratio of unspecified constants. To avoid this problem, influence diagnostic measures based on the intrinsic Bayes factor (Berger and Pericchi, 1996) are presented. Finally, a simulated dataset is provided that illustrates the methodology with appropriate simulation-based computational formulas. To overcome the difficult Bayesian computation, MCMC methods such as the Gibbs sampler (Gelfand and Smith, 1990) and the Metropolis algorithm are employed (a schematic Metropolis update is sketched below).

  • PDF
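
The abstract above names the Metropolis algorithm (alongside the Gibbs sampler) as its computational tool. As a rough, generic illustration only, the Python sketch below implements a random-walk Metropolis sampler for an arbitrary log-posterior; the toy standard-normal target, step size, and chain length are placeholder assumptions, not the paper's variance-component model.

```python
import numpy as np

def random_walk_metropolis(log_post, theta0, n_iter=5000, step=0.5, seed=0):
    """Generic random-walk Metropolis sampler (illustrative sketch)."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    lp = log_post(theta)
    chain = np.empty((n_iter, theta.size))
    for i in range(n_iter):
        prop = theta + step * rng.standard_normal(theta.size)  # symmetric proposal
        lp_prop = log_post(prop)
        # Metropolis acceptance: accept with probability min(1, posterior ratio)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        chain[i] = theta
    return chain

# Toy target: standard normal log-density (placeholder for the model's posterior)
samples = random_walk_metropolis(lambda t: -0.5 * np.sum(t ** 2), theta0=[0.0])
print(samples[1000:].mean(), samples[1000:].std())
```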

Bayesian Parameter Estimation of the Four-Parameter Gamma Distribution

  • Oh, Mi-Ra;Kim, Kyung-Sook;Cho, Wan-Hyun;Son, Young-Sook
    • Communications for Statistical Applications and Methods
    • /
    • Vol. 14, No. 1
    • /
    • pp.255-266
    • /
    • 2007
  • A Bayesian estimation of the four-parameter gamma distribution is considered under a noninformative prior. The Bayesian estimators are obtained by Gibbs sampling. The shape and power parameters are generated in the Gibbs sampler using the adaptive rejection sampling algorithm of Gilks and Wild (1992), while the location parameter is generated using the adaptive rejection Metropolis sampling algorithm of Gilks, Best and Tan (1995). Finally, simulation results are presented (a schematic Gibbs sweep is sketched after this entry).
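
To illustrate the sampler structure described in the abstract above, here is a minimal Metropolis-within-Gibbs sweep in Python: each parameter is updated in turn from its full conditional. The univariate random-walk Metropolis updates are a simplified stand-in for the adaptive rejection (ARS) and adaptive rejection Metropolis (ARMS) steps of the paper, and the independent-normal toy posterior is an assumption for demonstration only.

```python
import numpy as np

def gibbs_with_univariate_updates(log_post, theta0, n_iter=3000, step=0.3, seed=1):
    """Gibbs-style sweep: update one parameter at a time from its full conditional.
    Each conditional draw here is a univariate random-walk Metropolis step, used as
    a simplified stand-in for the ARS/ARMS updates described in the abstract."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    chain = np.empty((n_iter, theta.size))
    for i in range(n_iter):
        for j in range(theta.size):              # one coordinate per full conditional
            prop = theta.copy()
            prop[j] += step * rng.standard_normal()
            if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
                theta = prop
        chain[i] = theta
    return chain

# Toy posterior: independent normals standing in for (shape, power, scale, location)
chain = gibbs_with_univariate_updates(lambda t: -0.5 * np.sum(t ** 2), np.zeros(4))
print(chain[500:].mean(axis=0))
```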

Optimal Design of Truss Structures by Rescaled Simulated Annealing

  • Park, Jungsun;Ryu, Miran
    • Journal of Mechanical Science and Technology
    • /
    • Vol. 18, No. 9
    • /
    • pp.1512-1518
    • /
    • 2004
  • Rescaled Simulated Annealing (RSA) has been adapted to solve combinatorial optimization problems when the available computational resources are limited. Simulated Annealing (SA) is one of the most popular combinatorial optimization algorithms because it is convenient to use and has good asymptotic convergence to optimal solutions. However, SA converges too slowly in many problems. RSA was introduced by extending the Metropolis procedure in SA: the candidate state's energy is rescaled before the Metropolis criterion is applied. The rescaling accelerates convergence to optimal solutions by reducing transitions from high-energy local minima. In this paper, structural optimization examples using RSA are provided. Truss structures whose design variables are discrete or continuous are optimized under stress and displacement constraints. The optimization results from RSA are compared with those from classical SA, and the comparison shows that the number of optimization iterations can be effectively reduced by RSA (a rough annealing sketch follows this entry).
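
A rough sketch of the annealing loop described above: candidate energies are passed through a rescaling function before the Metropolis criterion is applied. The sqrt-based rescaling, the cooling schedule, and the quadratic toy objective are all placeholder assumptions; they are not the paper's transformation or a truss model.

```python
import numpy as np

def rescaled_annealing(energy, x0, n_iter=20000, T0=1.0, cooling=0.9995,
                       e_target=0.0, seed=2):
    """Simulated annealing in which candidate energies are rescaled before the
    Metropolis test is applied (illustrative sketch only)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    e, T = energy(x), T0

    def rescale(en):
        # Placeholder rescaling in the spirit of RSA (assumption, not the paper's formula)
        return (np.sqrt(en) - np.sqrt(e_target)) ** 2

    best_x, best_e = x.copy(), e
    for _ in range(n_iter):
        cand = x + 0.1 * rng.standard_normal(x.size)
        e_cand = energy(cand)
        # Metropolis criterion applied to the *rescaled* energy difference
        if rescale(e_cand) - rescale(e) < -T * np.log(rng.uniform()):
            x, e = cand, e_cand
            if e < best_e:
                best_x, best_e = x.copy(), e
        T *= cooling                       # geometric cooling schedule (assumption)
    return best_x, best_e

# Toy objective standing in for truss weight with stress/displacement penalties
best_x, best_e = rescaled_annealing(lambda v: np.sum(v ** 2), x0=np.ones(5) * 3.0)
print(best_e)
```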

A Comparison Study of Hybrid Monte Carlo Algorithm

  • 황진수;전성해;이찬범
    • The Korean Statistical Society: Conference Proceedings
    • /
    • Proceedings of the 2000 Fall Conference of the Korean Statistical Society
    • /
    • pp.135-140
    • /
    • 2000
  • In Bayesian neural network models, a given input is passed through the layers of the network, which acts like a black box, to produce an output. The prediction for a new input is computed as the mean of the posterior distribution. Because the posterior distribution, obtained from the given prior distribution and the likelihood functions of the training data, has a very complicated structure, the integral required for this posterior mean is difficult to compute. In this case Monte Carlo integration, an approximation based on stochastic estimation, is used, and the Hybrid Monte Carlo algorithm gives excellent results (Neal, 1996). In this paper we compare the Hybrid Monte Carlo algorithm with other widely used Monte Carlo methods such as Gibbs sampling, the Metropolis algorithm, and slice sampling (a leapfrog-based sketch of the Hybrid Monte Carlo update follows below).

  • PDF
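
For reference, a compact Python sketch of one Hybrid (Hamiltonian) Monte Carlo update, the method compared in the abstract above: leapfrog integration of an auxiliary momentum followed by a Metropolis accept/reject step. The standard-normal toy target, step size, and trajectory length are assumptions for illustration, not the Bayesian neural network posterior of the paper.

```python
import numpy as np

def hmc_step(log_post, grad_log_post, theta, eps=0.1, n_leapfrog=20, rng=None):
    """One Hybrid Monte Carlo update: leapfrog dynamics plus a Metropolis correction."""
    rng = rng or np.random.default_rng()
    p = rng.standard_normal(theta.size)                     # auxiliary momentum
    theta_new, p_new = theta.copy(), p.copy()
    p_new += 0.5 * eps * grad_log_post(theta_new)           # half step for momentum
    for _ in range(n_leapfrog - 1):
        theta_new += eps * p_new                            # full step for position
        p_new += eps * grad_log_post(theta_new)             # full step for momentum
    theta_new += eps * p_new
    p_new += 0.5 * eps * grad_log_post(theta_new)           # final half step
    # Metropolis test on the change in total energy H = -log_post + kinetic term
    h_old = -log_post(theta) + 0.5 * np.sum(p ** 2)
    h_new = -log_post(theta_new) + 0.5 * np.sum(p_new ** 2)
    return theta_new if np.log(rng.uniform()) < h_old - h_new else theta

# Toy target: bivariate standard normal; run a short chain
rng = np.random.default_rng(3)
theta = np.zeros(2)
draws = []
for _ in range(2000):
    theta = hmc_step(lambda t: -0.5 * t @ t, lambda t: -t, theta, rng=rng)
    draws.append(theta)
print(np.mean(draws, axis=0))
```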

Bayesian Estimation of the Two-Parameter Kappa Distribution

  • Oh, Mi-Ra;Kim, Sun-Worl;Park, Jeong-Soo;Son, Young-Sook
    • Communications for Statistical Applications and Methods
    • /
    • Vol. 14, No. 2
    • /
    • pp.355-363
    • /
    • 2007
  • In this paper a Bayesian estimation of the two-parameter kappa distribution is discussed under a noninformative prior. The Bayesian estimators are obtained by Gibbs sampling. The shape and scale parameters are generated in the Gibbs sampler using the adaptive rejection Metropolis sampling algorithm of Gilks et al. (1995). A Monte Carlo study showed that the proposed Bayesian estimators outperform other estimators in terms of mean squared error (a skeleton of such a Monte Carlo comparison is sketched below).
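
The abstract above reports a Monte Carlo comparison of estimators by mean squared error. The skeleton below shows how such a study is typically organized; the exponential-mean toy estimand and the two competing estimators are illustrative assumptions and are not taken from the kappa-distribution setting of the paper.

```python
import numpy as np

def mc_mse_study(n_rep=5000, n=30, true_mean=2.0, seed=4):
    """Monte Carlo comparison of two estimators by mean squared error.
    Toy setting: estimating the mean of an exponential sample."""
    rng = np.random.default_rng(seed)
    est_a = np.empty(n_rep)      # estimator A: the sample mean
    est_b = np.empty(n_rep)      # estimator B: a simple shrinkage-style alternative
    for r in range(n_rep):
        x = rng.exponential(true_mean, size=n)
        est_a[r] = x.mean()
        est_b[r] = n / (n + 1) * x.mean()
    mse = lambda est: np.mean((est - true_mean) ** 2)
    return mse(est_a), mse(est_b)

print(mc_mse_study())            # the estimator with the smaller MSE is preferred
```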

On the Bayesian Statistical Inference

  • 이호석
    • Korean Institute of Information Scientists and Engineers (KIISE): Conference Proceedings
    • /
    • Proceedings of the 2007 KIISE Korea Computer Congress, Vol. 34, No. 1 (C)
    • /
    • pp.263-266
    • /
    • 2007
  • This paper discusses Bayesian statistical inference. It proceeds through Bayesian inference, Markov chains and Monte Carlo integration, MCMC (Markov Chain Monte Carlo) techniques, the Metropolis-Hastings algorithm, Gibbs sampling, maximum likelihood estimation, the EM algorithm, missing-data imputation techniques, and BMA (Bayesian Model Averaging), in that order. These statistical techniques are used in biology, medicine, biotechnology, science and engineering, and general data surveys and processing involving large volumes of data, and they provide important methods for drawing optimal inferences. Finally, the paper discusses PC (Principal Component) analysis, which is also widely used in data analysis and research (a minimal Gibbs sampling sketch follows below).

  • PDF
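
Of the methods surveyed above, Gibbs sampling is the easiest to show in a few lines. The sketch below alternates draws from the two exact full conditionals of a bivariate normal with correlation rho, a standard textbook example chosen here for brevity rather than anything taken from the paper.

```python
import numpy as np

def gibbs_bivariate_normal(rho=0.8, n_iter=5000, seed=5):
    """Gibbs sampler for a standard bivariate normal with correlation rho:
    each coordinate is drawn from its exact full conditional in turn."""
    rng = np.random.default_rng(seed)
    x = y = 0.0
    out = np.empty((n_iter, 2))
    sd = np.sqrt(1.0 - rho ** 2)          # conditional standard deviation
    for i in range(n_iter):
        x = rng.normal(rho * y, sd)       # x | y ~ N(rho*y, 1 - rho^2)
        y = rng.normal(rho * x, sd)       # y | x ~ N(rho*x, 1 - rho^2)
        out[i] = x, y
    return out

draws = gibbs_bivariate_normal()
print(np.corrcoef(draws[1000:].T))        # should be close to rho after burn-in
```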

DEFAULT BAYESIAN INFERENCE OF REGRESSION MODELS WITH ARMA ERRORS UNDER EXACT FULL LIKELIHOODS

  • Son, Young-Sook
    • Journal of the Korean Statistical Society
    • /
    • Vol. 33, No. 2
    • /
    • pp.169-189
    • /
    • 2004
  • Under the assumption of default priors, such as noninformative priors, Bayesian model determination and parameter estimation of regression models with stationary and invertible ARMA errors are developed under exact full likelihoods. The default Bayes factors, the fractional Bayes factor (FBF) of O'Hagan (1995) and the arithmetic intrinsic Bayes factor (AIBF) of Berger and Pericchi (1996a), are used as tools for Bayesian model selection (their standard forms are recalled below). Bayesian estimates are obtained by running a Metropolis-Hastings subchain within the Gibbs sampler. Finally, the results of numerical studies, designed to check the performance of the theoretical results discussed here, are presented.
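
For reference, the two default Bayes factors named in the abstract are commonly written as below, with $L_i$ the likelihood under model $M_i$, $\pi_i^N$ the default (noninformative) prior, $b$ the likelihood fraction, $B^N_{ij}$ the Bayes factor computed from the default priors, and $y(\ell)$ the $\ell$-th minimal training sample; these are the standard textbook forms, and the paper's own notation may differ.

```latex
% Fractional Bayes factor (O'Hagan, 1995), comparing model M_i to M_j
B^{F}_{ij}(b) \;=\; \frac{q_i(b,y)}{q_j(b,y)},
\qquad
q_i(b,y) \;=\; \frac{\int L_i(\theta_i; y)\,\pi_i^N(\theta_i)\,d\theta_i}
                    {\int L_i(\theta_i; y)^{\,b}\,\pi_i^N(\theta_i)\,d\theta_i}

% Arithmetic intrinsic Bayes factor (Berger and Pericchi, 1996a),
% averaging over the L minimal training samples y(l)
B^{AI}_{ij} \;=\; B^{N}_{ij}(y)\cdot\frac{1}{L}\sum_{\ell=1}^{L} B^{N}_{ji}\bigl(y(\ell)\bigr)
```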

On an Optimal Bayesian Variable Selection Method for Generalized Logit Model

  • Kim, Hea-Jung;Lee, Ae Kuoung
    • Communications for Statistical Applications and Methods
    • /
    • Vol. 7, No. 2
    • /
    • pp.617-631
    • /
    • 2000
  • This paper is concerned with suggesting a Bayesian method for variable selection in the generalized logit model. It is based on the Laplace-Metropolis algorithm, which provides a simple method for estimating the marginal likelihood of the model. The algorithm then leads to a criterion for the selection of variables: find the subset of variables that maximizes the marginal likelihood of the model. This criterion is a Bayes rule in the sense that it minimizes the risk of the variable selection under a 0-1 loss function. Based on two examples, the suggested method is illustrated and compared with existing frequentist methods (a rough sketch of the Laplace-Metropolis estimate follows below).

  • PDF
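
The Laplace-Metropolis idea mentioned above approximates the marginal likelihood by a Laplace approximation evaluated at a posterior point estimate, with the posterior covariance estimated from the simulation output. The Python sketch below follows that recipe; the conjugate-normal toy example used to check it is an assumption for illustration, not the generalized logit model of the paper.

```python
import numpy as np

def laplace_metropolis_logml(draws, log_like, log_prior):
    """Laplace-Metropolis estimate of the log marginal likelihood from posterior draws:
    log m(y) ~ (d/2) log(2*pi) + 0.5 log|Cov| + log L(theta*) + log pi(theta*),
    evaluated at the draw theta* that maximizes the unnormalized log posterior."""
    draws = np.atleast_2d(draws)
    d = draws.shape[1]
    lp = np.array([log_like(t) + log_prior(t) for t in draws])   # unnormalized log posterior
    cov = np.cov(draws, rowvar=False).reshape(d, d)              # posterior covariance estimate
    _, logdet = np.linalg.slogdet(cov)
    return 0.5 * d * np.log(2 * np.pi) + 0.5 * logdet + lp.max()

# Toy check: y | theta ~ N(theta, 1) with one observation, prior theta ~ N(0, 1)
y = 1.3
rng = np.random.default_rng(6)
post_draws = rng.normal(y / 2, np.sqrt(0.5), size=(4000, 1))     # exact posterior draws
log_like = lambda t: -0.5 * np.log(2 * np.pi) - 0.5 * (y - t[0]) ** 2
log_prior = lambda t: -0.5 * np.log(2 * np.pi) - 0.5 * t[0] ** 2
print(laplace_metropolis_logml(post_draws, log_like, log_prior))
# Exact log marginal for comparison: y ~ N(0, 2)
print(-0.5 * np.log(4 * np.pi) - y ** 2 / 4)
```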

Bayesian Parameter Estimation using the MCMC method for the Mean Change Model of Multivariate Normal Random Variates

  • Oh, Mi-Ra;Kim, Eoi-Lyoung;Sim, Jung-Wook;Son, Young-Sook
    • Communications for Statistical Applications and Methods
    • /
    • Vol. 11, No. 1
    • /
    • pp.79-91
    • /
    • 2004
  • In this paper, a Bayesian parameter estimation procedure is discussed for the mean change model of multivariate normal random variates under the assumption of noninformative priors for all the parameters. The parameters are estimated by the Gibbs sampling method, and in the Gibbs sampler the change point parameter is generated by a Metropolis-Hastings algorithm. We apply the methodology to numerical data to examine it (a simplified change-point update is sketched below).
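
To illustrate the Metropolis-Hastings step for the change point described above, the sketch below proposes a new change point uniformly at random and accepts it by the usual ratio of unnormalized posteriors. The univariate unit-variance likelihood, flat prior, and fixed mean parameters are simplifying assumptions, not the multivariate setup of the paper.

```python
import numpy as np

def log_like_changepoint(y, k, mu1, mu2):
    """Log-likelihood of a mean change after index k (unit variance assumed)."""
    return -0.5 * np.sum((y[:k] - mu1) ** 2) - 0.5 * np.sum((y[k:] - mu2) ** 2)

def mh_update_changepoint(y, k, mu1, mu2, rng):
    """Metropolis-Hastings update of the change point inside a Gibbs sweep:
    propose k' uniformly over 1..n-1 (symmetric proposal, flat prior assumed)."""
    k_prop = rng.integers(1, len(y))                 # candidate change point
    log_ratio = (log_like_changepoint(y, k_prop, mu1, mu2)
                 - log_like_changepoint(y, k, mu1, mu2))
    return k_prop if np.log(rng.uniform()) < log_ratio else k

# Toy data: mean shifts from 0 to 2 after observation 40
rng = np.random.default_rng(7)
y = np.concatenate([rng.normal(0, 1, 40), rng.normal(2, 1, 60)])
k = 50
for _ in range(2000):                                # mean parameters held fixed here
    k = mh_update_changepoint(y, k, 0.0, 2.0, rng)
print(k)                                             # typically settles near the true value 40
```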