• Title/Abstract/Keyword: Monte Carlo sampling

Search results: 291 items (processing time: 0.025 s)

Bayesian estimates of genetic parameters of non-return rate and success in first insemination in Japanese Black cattle

  • Setiaji, Asep;Arakaki, Daichi;Oikawa, Takuro
    • Animal Bioscience / Vol. 34, No. 7 / pp.1100-1104 / 2021
  • Objective: The objective of the present study was to estimate the heritability of non-return rate (NRR) and success of first insemination (SFI) using a Bayesian approach with Gibbs sampling. Methods: Heifer traits were denoted as NRR-h and SFI-h, and cow traits as NRR-c and SFI-c. The variance-covariance components were estimated using a threshold model under the Bayesian procedure THRGIBBS1F90. Results: SFI was more relevant for evaluating insemination success because a high percentage of animals that showed no return under NRR did not successfully conceive. Estimated heritabilities of NRR and SFI in heifers were 0.032 and 0.039, and the corresponding estimates for cows were 0.020 and 0.027. The model showed low Geweke test values (p-values ranging between 0.012 and 0.018) and a low Monte Carlo chain error, indicating that the posterior samples underlying the heritability estimates were valid for these binary traits. Genetic correlations between the same traits in heifers and cows, estimated with the two-trait threshold model, were low: 0.485 and 0.591 for NRR and SFI, respectively. High genetic correlations were observed between NRR-h and SFI-h (0.922) and between NRR-c and SFI-c (0.954). Conclusion: SFI showed slightly higher heritability than NRR, but the two traits are genetically correlated. Based on these results, both traits could be used as early indicators to evaluate the capacity of cows to conceive.
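
As a rough illustration of the post-processing behind such a Gibbs-sampling analysis, the minimal Python sketch below turns hypothetical posterior draws of the additive and residual variances into a heritability estimate, its Monte Carlo chain error, and a Geweke-style convergence statistic. It is not the THRGIBBS1F90 workflow; the chains, window fractions, and function names are illustrative assumptions.

```python
import numpy as np

def heritability_summary(sigma_a2, sigma_e2):
    """Summarize heritability from Gibbs-sampling draws of variance components.

    sigma_a2, sigma_e2 : 1-D arrays of posterior draws of the additive-genetic
    and residual variances (hypothetical chains, thinned and post burn-in).
    """
    h2 = sigma_a2 / (sigma_a2 + sigma_e2)          # heritability per draw

    # Monte Carlo chain error: posterior std. dev. scaled by chain length
    mc_error = h2.std(ddof=1) / np.sqrt(len(h2))

    # Geweke-style diagnostic: compare means of the first 10% and last 50%
    a, b = h2[: len(h2) // 10], h2[-len(h2) // 2:]
    z = (a.mean() - b.mean()) / np.sqrt(a.var(ddof=1) / len(a)
                                        + b.var(ddof=1) / len(b))
    return h2.mean(), mc_error, z

# Example with synthetic draws standing in for a real chain
rng = np.random.default_rng(0)
sa2 = rng.gamma(2.0, 0.02, size=5000)   # additive-variance draws
se2 = rng.gamma(2.0, 0.50, size=5000)   # residual-variance draws
print(heritability_summary(sa2, se2))
```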

A Bayesian state-space production model for Korean chub mackerel (Scomber japonicus) stock

  • Jung, Yuri;Seo, Young Il;Hyun, Saang-Yoon
    • Fisheries and Aquatic Sciences / Vol. 24, No. 4 / pp.139-152 / 2021
  • The main purpose of this study is to fit catch-per-unit-effort (CPUE) data for the Korean chub mackerel (Scomber japonicus) stock with a state-space production (SSP) model and to provide stock assessment results. We chose a surplus production model for the chub mackerel data, namely annual yield and CPUE. We then employed a state-space layer for the production model to account for two sources of variability, arising from unmodelled factors (process error) and noise in the data (observation error). We implemented the model in the script software ADMB-RE because it reduces the computational cost of high-dimensional integration and provides Markov chain Monte Carlo sampling, which is required for Bayesian approaches. To stabilize the numerical optimization, we considered prior distributions for the model parameters. Applying the SSP model to data collected from commercial fisheries from 1999 to 2017, we estimated model parameters and management reference points, as well as the uncertainties of those estimates. We also applied various production models and compared their parameter estimates and goodness-of-fit statistics to assess model performance. This study presents two significant findings. First, we concluded that the stock was overexploited in terms of harvest rate from 1999 to 2017. Second, we recommend the SSP model, which had the smallest goodness-of-fit statistics among the production models considered, especially for fitting CPUE data with fluctuations.
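
To make the model structure concrete, the sketch below simulates a Schaefer-type state-space production model with lognormal process and observation errors, the two variability sources described above. It is a hypothetical simulation, not the authors' ADMB-RE implementation; all parameter values and the catch series are placeholders.

```python
import numpy as np

def simulate_ssp(r, K, q, catches, sigma_proc, sigma_obs, seed=1):
    """Simulate a Schaefer state-space production model.

    B[t+1] = (B[t] + r*B[t]*(1 - B[t]/K) - C[t]) * exp(process error)
    CPUE[t] = q * B[t] * exp(observation error)
    All parameter values and the catch series are placeholders.
    """
    rng = np.random.default_rng(seed)
    n = len(catches)
    B = np.empty(n)
    B[0] = K                                    # assume unfished at the start
    for t in range(n - 1):
        surplus = r * B[t] * (1.0 - B[t] / K)
        B[t + 1] = max(B[t] + surplus - catches[t], 1e-6)
        B[t + 1] *= np.exp(rng.normal(0.0, sigma_proc))   # process error
    cpue = q * B * np.exp(rng.normal(0.0, sigma_obs, n))  # observation error
    harvest_rate = catches / B
    return B, cpue, harvest_rate

# Hypothetical 19-year catch series (1999-2017), in thousand tonnes
catch = np.linspace(120.0, 180.0, 19)
B, cpue, u = simulate_ssp(r=0.4, K=1500.0, q=0.001, catches=catch,
                          sigma_proc=0.1, sigma_obs=0.2)
print(u.round(3))   # annual harvest rates implied by the simulated biomass
```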

The Impact of Foreign Ownership on Capital Structure: Empirical Evidence from Listed Firms in Vietnam

  • NGUYEN, Van Diep;DUONG, Quynh Nga
    • The Journal of Asian Finance, Economics and Business / Vol. 9, No. 2 / pp.363-370 / 2022
  • The study aims to probe the impact of foreign ownership on the capital structure of Vietnamese listed firms. This study employs panel data on 288 non-financial firms listed on the Ho Chi Minh City Stock Exchange (HOSE) and the Hanoi Stock Exchange (HNX) in 2015-2019. In this research, we applied a Bayesian linear regression method to provide probabilistic explanations of the model uncertainty and of the effect of foreign ownership on the capital structure of non-financial listed enterprises in Vietnam. The findings of the empirical analysis, using Bayesian linear regression with the Markov chain Monte Carlo (MCMC) technique combined with a Gibbs sampler, suggest that foreign ownership has a substantial adverse effect on firms' capital structure. Our findings also indicate that a firm's size, age, and growth opportunities all have a strong, significant positive effect on its debt ratio. We found that firms' profitability, tangible assets, and liquidity negatively and strongly affect their capital structure. Meanwhile, dividends and inflation have a weak negative impact on the debt ratio. This research has ramifications for business managers, since it shows how a company can strengthen its financial resources by developing a sound capital structure and considering foreign investment as a source of funding.
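
For readers unfamiliar with the estimation machinery, the following sketch shows a standard conjugate Gibbs sampler for Bayesian linear regression of the kind such an MCMC analysis relies on. The priors, toy data, and variable names (e.g. `foreign`, the debt-ratio response) are assumptions for illustration, not the paper's specification.

```python
import numpy as np

def gibbs_blr(X, y, n_iter=5000, burn=1000, tau2=100.0, a0=2.0, b0=1.0, seed=0):
    """Gibbs sampler for Bayesian linear regression (illustrative only).

    Model:  y = X @ beta + e,  e ~ N(0, sigma2 * I)
    Priors: beta ~ N(0, tau2 * I),  sigma2 ~ Inverse-Gamma(a0, b0)
    """
    rng = np.random.default_rng(seed)
    n, p = X.shape
    beta, sigma2 = np.zeros(p), 1.0
    draws = []
    XtX, Xty = X.T @ X, X.T @ y
    for it in range(n_iter):
        # beta | sigma2, y  ~  N(m, V)
        V = np.linalg.inv(XtX / sigma2 + np.eye(p) / tau2)
        m = V @ Xty / sigma2
        beta = rng.multivariate_normal(m, V)
        # sigma2 | beta, y  ~  Inverse-Gamma
        resid = y - X @ beta
        sigma2 = 1.0 / rng.gamma(a0 + n / 2.0, 1.0 / (b0 + 0.5 * resid @ resid))
        if it >= burn:
            draws.append(beta)
    return np.asarray(draws)

# Toy example: a "debt ratio" regressed on an intercept and "foreign ownership"
rng = np.random.default_rng(1)
foreign = rng.uniform(0.0, 0.49, size=288)
X = np.column_stack([np.ones(288), foreign])
y = 0.5 - 0.3 * foreign + rng.normal(0.0, 0.1, 288)
print(gibbs_blr(X, y).mean(axis=0))   # posterior means of the coefficients
```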

Importance measure analysis of uncertainty parameters in bridge probabilistic seismic demands

  • Song, Shuai;Wu, Yuan H.;Wang, Shuai;Lei, Hong G.
    • Earthquakes and Structures / Vol. 22, No. 2 / pp.157-168 / 2022
  • A moment-independent importance measure analysis approach was introduced to quantify the effects of structural uncertainty parameters on the probabilistic seismic demands of simply supported girder bridges. Based on the probability distributions of the main uncertainty parameters in bridges, conditional and unconditional bridge samples were constructed with Monte Carlo sampling and analyzed in the OpenSees platform with a series of real seismic ground motion records. Conditional and unconditional probability density functions were developed using kernel density estimation with the results of nonlinear time history analyses of the bridge samples. Moment-independent importance measures of the uncertainty parameters were derived by numerical integration with the conditional and unconditional probability density functions, and the uncertainty parameters were ranked in descending order of importance. Unlike the Tornado diagram approach, the importance measure analysis simultaneously considers the impacts of the uncertainty parameters on the whole probability distributions of bridge seismic demands and the interactions among the uncertainty parameters. Results show that the interaction of uncertainty parameters had significant impacts on the seismic demands of components and, in some cases, changed the most significant parameters for piers, bearings and abutments.
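
A minimal sketch of a moment-independent (Borgonovo-type) importance measure computed from Monte Carlo samples with kernel density estimation is given below. It assumes binned conditioning as a crude stand-in for the conditional sampling used in the study, and the synthetic demand data are purely illustrative.

```python
import numpy as np
from scipy.stats import gaussian_kde

def moment_independent_importance(x, y, n_bins=10, grid_size=256):
    """Approximate a moment-independent importance measure for one parameter.

    x : samples of one uncertainty parameter;  y : matching demand samples
    (e.g., pier drift from nonlinear time-history analyses). Conditional
    densities are approximated by binning x into quantile intervals.
    """
    grid = np.linspace(y.min(), y.max(), grid_size)
    f_y = gaussian_kde(y)(grid)                          # unconditional density
    edges = np.quantile(x, np.linspace(0.0, 1.0, n_bins + 1))
    shift = 0.0
    for lo, hi in zip(edges[:-1], edges[1:]):
        mask = (x >= lo) & (x <= hi)
        f_cond = gaussian_kde(y[mask])(grid)             # conditional density
        shift += np.trapz(np.abs(f_y - f_cond), grid) / n_bins
    return 0.5 * shift                                   # importance measure

# Synthetic check: one influential and one non-influential parameter
rng = np.random.default_rng(2)
x1, x2 = rng.normal(size=5000), rng.normal(size=5000)
demand = 2.0 * x1 + rng.normal(scale=0.5, size=5000)
print(moment_independent_importance(x1, demand),
      moment_independent_importance(x2, demand))
```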

The Effect of Distribution Project Manager Leadership and Performance of Project Team Members with the Mediation Role of Self-Efficacy

  • SHOKORY, Suzyanty Mohd;ZAINOL, Zuraidah;AWANG, Marinah;ABDUL HAMID, Suriani;RAMDAN, Mohamad Rohieszan
    • Journal of Distribution Science / Vol. 20, No. 9 / pp.29-38 / 2022
  • Purpose: The purpose of this study is to determine the effect of distribution transformational and transactional project manager leadership styles on the extra-role performance of project team members using multi-level modelling analysis. Research design, data and methodology: The role of a psychological factor, namely self-efficacy, as a mediating variable in the effect of the project manager's leadership style on project team members' performance was also studied using the Monte Carlo bootstrapping method. The sample of 370 project team members from 74 contractors registered with the Construction Industry Development Board in the Klang Valley was selected by simple random sampling and surveyed with a questionnaire. Results: The findings showed that the transformational leadership of project managers was a dominant predictor of the extra-role performance of project team members. Furthermore, the study shows that the self-efficacy of project team members acted as a mediator in the relationship between the transformational and transactional leadership of project managers and the extra-role performance of project team members. Conclusions: The findings are expected to be used by the relevant parties in planning, arranging and implementing efforts to improve work performance and to ensure that projects are implemented according to the specified specifications.
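
The Monte Carlo bootstrapping test of the mediated (indirect) effect can be sketched as follows, assuming hypothetical path estimates and standard errors (a for leadership → self-efficacy, b for self-efficacy → extra-role performance); these numbers are not the authors' model output.

```python
import numpy as np

def monte_carlo_indirect(a, se_a, b, se_b, n_draws=20000, alpha=0.05, seed=3):
    """Monte Carlo confidence interval for an indirect (mediated) effect.

    a, se_a : estimate and standard error of the leadership -> self-efficacy path
    b, se_b : estimate and standard error of the self-efficacy -> performance path
    The indirect effect is a*b; all values below are purely hypothetical.
    """
    rng = np.random.default_rng(seed)
    draws = rng.normal(a, se_a, n_draws) * rng.normal(b, se_b, n_draws)
    lower, upper = np.quantile(draws, [alpha / 2, 1 - alpha / 2])
    return draws.mean(), (lower, upper)

est, ci = monte_carlo_indirect(a=0.42, se_a=0.08, b=0.35, se_b=0.07)
print(est, ci)   # mediation is supported if the interval excludes zero
```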

An Application of Realistic Evaluation Model to the Large Break LOCA Analysis of Ulchin 3&4

  • C. H. Ban;B. D. Chung;Lee, K. M.;J. H. Jeong;S. T. Hwang
    • Korean Nuclear Society: Conference Proceedings / Proceedings of the Korean Nuclear Society 1996 Spring Meeting (2) / pp.429-434 / 1996
  • K-REM [1], which is under development as a realistic evaluation model for large break LOCA, is applied to the analysis of a cold leg guillotine break of Ulchin 3&4. Fuel parameters, for which a statistical analysis of their effects on the peak cladding temperature (PCT) is performed, and system parameters, to which the concept of the limiting value approach (LVA) is applied, are determined from a single-parameter sensitivity study. Three fuel-related parameters are selected (fuel gap conductance, fuel thermal conductivity and power peaking factor), along with four plant-system-related parameters (axial power shape, reactor power, decay heat and the gas pressure of the safety injection tank (SIT)). A response surface of PCT is generated from the plant calculation results, and Monte Carlo sampling is performed on it to obtain the plant application uncertainty, which is statistically combined with the code uncertainty to produce the 95th percentile PCT. From the break spectrum analysis, a blowdown PCT of 1350.23 K and a reflood PCT of 1195.56 K are obtained for break discharge coefficients of 0.8 and 0.5, respectively.
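
A minimal sketch of the response-surface Monte Carlo step is shown below: a fitted quadratic surface for PCT is sampled over assumed fuel-parameter distributions and the 95th percentile is read off. The coefficients and distributions are placeholders, not the Ulchin 3&4 values.

```python
import numpy as np

def percentile_pct(coef, dists, n_samples=100_000, q=95.0, seed=4):
    """Monte Carlo sampling on a quadratic PCT response surface (illustrative).

    coef  : coefficients of a fitted response surface
            PCT = c0 + sum_i c1[i]*x[i] + sum_i c2[i]*x[i]**2
    dists : list of (mean, std) pairs for the sampled fuel parameters
    All numbers here are placeholders, not the plant calculation results.
    """
    rng = np.random.default_rng(seed)
    c0, c1, c2 = coef
    x = np.column_stack([rng.normal(m, s, n_samples) for m, s in dists])
    pct = c0 + x @ c1 + (x ** 2) @ c2       # evaluate the response surface
    return np.percentile(pct, q)            # 95th percentile PCT

# Three standardized fuel parameters: gap conductance, conductivity, peaking factor
coef = (1200.0, np.array([-30.0, -20.0, 80.0]), np.array([5.0, 3.0, 10.0]))
dists = [(0.0, 1.0), (0.0, 1.0), (0.0, 0.5)]
print(percentile_pct(coef, dists))
```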

Model-independent Constraints on Type Ia Supernova Light-curve Hyperparameters and Reconstructions of the Expansion History of the Universe

  • Koo, Hanwool;Shafieloo, Arman;Keeley, Ryan E.;L'Huillier, Benjamin
    • The Bulletin of the Korean Astronomical Society / Vol. 45, No. 1 / pp.48.4-49 / 2020
  • We reconstruct the expansion history of the universe using type Ia supernovae (SN Ia) in a manner independent of any cosmological model assumptions. To do so, we implement a nonparametric iterative smoothing method on the Joint Light-curve Analysis (JLA) data while exploring the SN Ia light-curve hyperparameter space by Markov chain Monte Carlo (MCMC) sampling. We test how the posteriors of these hyperparameters depend on cosmology, i.e., whether using different dark energy models or reconstructions shifts these posteriors. Our constraints on the SN Ia light-curve hyperparameters from this model-independent analysis are very consistent with the constraints obtained using different parameterizations of the equation of state of dark energy, namely the flat ΛCDM cosmology, the Chevallier-Polarski-Linder model, and the Phenomenologically Emergent Dark Energy (PEDE) model. This implies that the distance moduli constructed from the JLA data are largely independent of the cosmological model. We also studied the possibility that the light-curve parameters evolve with redshift, and our results are consistent with no evolution. The reconstructed expansion history of the universe and dark energy properties also appear to be in good agreement with the expectations of the standard ΛCDM model. However, our results also indicate that the data still allow considerable flexibility in the expansion history of the universe. This work is published in ApJ.
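
The hyperparameter exploration can be illustrated with a bare-bones random-walk Metropolis sampler over the Tripp standardization parameters (alpha, beta, M_B), as sketched below. The distance-modulus callable stands in for the iteratively smoothed expansion history, and the toy low-redshift data are synthetic, generated only for the demo.

```python
import numpy as np

def log_like(theta, data, mu_model):
    """Gaussian log-likelihood for SN Ia standardization (Tripp relation).

    theta    : (alpha, beta, M_B) light-curve hyperparameters
    data     : columns z, m_B, x1, c, sigma_mu (hypothetical JLA-like table)
    mu_model : callable giving the distance modulus at redshift z; in the paper
               this role is played by the iteratively smoothed expansion history.
    """
    alpha, beta, M_B = theta
    z, m_B, x1, c, sig = data.T
    mu_obs = m_B - M_B + alpha * x1 - beta * c
    return -0.5 * np.sum(((mu_obs - mu_model(z)) / sig) ** 2)

def metropolis(data, mu_model, n_steps=20000, step=(0.01, 0.1, 0.02), seed=5):
    """A minimal random-walk Metropolis sampler over (alpha, beta, M_B)."""
    rng = np.random.default_rng(seed)
    step = np.asarray(step)
    theta = np.array([0.14, 3.1, -19.05])          # rough starting point
    logp = log_like(theta, data, mu_model)
    chain = []
    for _ in range(n_steps):
        prop = theta + rng.normal(0.0, step)
        logp_prop = log_like(prop, data, mu_model)
        if np.log(rng.uniform()) < logp_prop - logp:
            theta, logp = prop, logp_prop
        chain.append(theta.copy())
    return np.asarray(chain)

def mu_lowz(z):
    """Low-redshift Hubble-law distance modulus (H0 = 70 km/s/Mpc)."""
    return 5.0 * np.log10(3.0e5 * z / 70.0) + 25.0

# Toy demo with synthetic low-redshift supernovae
rng = np.random.default_rng(6)
z = rng.uniform(0.02, 0.1, 200)
x1, c = rng.normal(0, 1, 200), rng.normal(0, 0.1, 200)
m_B = mu_lowz(z) - 19.05 - 0.14 * x1 + 3.1 * c + rng.normal(0, 0.12, 200)
data = np.column_stack([z, m_B, x1, c, np.full(200, 0.12)])
print(metropolis(data, mu_lowz).mean(axis=0))   # posterior means of the chain
```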

Uncertainty analysis of heat transfer of TMSR-SF0 simulator

  • Jiajun Wang;Ye Dai;Yang Zou;Hongjie Xu
    • Nuclear Engineering and Technology / Vol. 56, No. 2 / pp.762-769 / 2024
  • The TMSR-SF0 simulator is an integral-effect thermal-hydraulic experimental system for the development of the thorium molten salt reactor (TMSR) program in China. The simulator has two heat transport loops with liquid FLiNaK. In the literature, uncertainties of the thermophysical properties of FLiNaK are recommended at the 95% confidence level: the uncertainties of density, heat capacity, thermal conductivity and viscosity are ±2%, ±10%, ±10% and ±10%, respectively. In order to investigate the effects of thermophysical property uncertainties on the molten salt heat transport system, uncertainty and sensitivity analyses of the heat transfer characteristics of the simulator system are carried out on a RELAP5 model. The uncertainties of the thermophysical properties are incorporated in the simulation model and the Monte Carlo sampling method is used to propagate the input uncertainties through the model. The simulation results indicate that the uncertainty propagated to the core outlet temperature is about ±10 ℃ at a 95% confidence level in a steady-state operation condition. This result should be noted in the design, operation and code validation of molten salt reactors. In addition, more experimental data are necessary for quantifying the uncertainty of the thermophysical properties of molten salts.
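
A highly simplified sketch of the propagation step follows: sampled property values are pushed through a single heat-balance stand-in for the RELAP5 model, and the 95% interval of the core outlet temperature is reported. The nominal FLiNaK values, loop power and flow are assumptions, not TMSR-SF0 data.

```python
import numpy as np

def core_outlet_temperature(rho, cp, k, mu, power=400e3, mdot_nominal=20.0,
                            t_inlet=600.0):
    """Crude stand-in for the RELAP5 model: a single heat-balance estimate of
    the core outlet temperature of a FLiNaK loop. Nominal values are
    placeholders; the real study runs the full system code."""
    # Assume a fixed volumetric flow, so mass flow scales with density.
    # Conductivity and viscosity would enter through the heat-transfer
    # correlation in the full model; they are carried here only to show
    # where the sampled values would be used.
    mdot = mdot_nominal * rho / 2020.0
    return t_inlet + power / (mdot * cp)

def propagate(n_samples=1000, seed=7):
    """Propagate +/-2% (density) and +/-10% (cp, k, mu) uncertainties,
    taken as roughly 2-sigma at the 95% level, by Monte Carlo sampling."""
    rng = np.random.default_rng(seed)
    rho = rng.normal(2020.0, 2020.0 * 0.01, n_samples)
    cp = rng.normal(1880.0, 1880.0 * 0.05, n_samples)
    k = rng.normal(0.92, 0.92 * 0.05, n_samples)
    mu = rng.normal(2.9e-3, 2.9e-3 * 0.05, n_samples)
    t_out = core_outlet_temperature(rho, cp, k, mu)
    lo, hi = np.percentile(t_out, [2.5, 97.5])
    return t_out.mean(), (lo, hi)

print(propagate())   # mean outlet temperature and its 95% interval
```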

Multihazard capacity optimization of an NPP using a multi-objective genetic algorithm and sampling-based PSA

  • Eujeong Choi;Shinyoung Kwag;Daegi Hahm
    • Nuclear Engineering and Technology / Vol. 56, No. 2 / pp.644-654 / 2024
  • After the Tohoku earthquake and tsunami (Japan, 2011), regulatory efforts to mitigate external hazards have increased both the safety requirements and the total capital cost of nuclear power plants (NPPs). In these circumstances, identifying not only disaster robustness but also cost-effective capacity settings for NPPs has become one of the most important tasks for the nuclear power industry. A few studies have been performed to reallocate the seismic capacity of NPPs, yet the effects of multiple hazards have not been accounted for in NPP capacity optimization. The major challenges in extending this problem to the multihazard dimension are (1) the high computational cost of both multihazard risk quantification and system-level optimization and (2) the lack of capital cost databases for NPPs. To resolve these issues, this paper proposes an effective method that identifies the optimal multihazard capacity of NPPs using a multi-objective genetic algorithm and the two-stage direct quantification of fault trees using the Monte Carlo simulation method, called the two-stage DQFM. A capacity-based indirect capital cost measure is also proposed. The proposed method enables an NPP to achieve safety and cost-effectiveness against multiple hazards simultaneously within a computationally efficient platform. The proposed multihazard capacity optimization framework is demonstrated and tested with an earthquake-tsunami example.
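
The sampling-based quantification at the heart of DQFM-style methods can be sketched, under strong simplifications, as follows: lognormal component capacities are sampled, compared with an earthquake and tsunami demand pair, and a toy fault tree is evaluated. The component list, medians and log-standard deviations are hypothetical.

```python
import numpy as np

def dqfm_failure_probability(pga, wave_h, n_samples=100_000, seed=8):
    """Sampling-based fault-tree quantification in the spirit of DQFM.

    For a given earthquake intensity (pga, in g) and tsunami wave height
    (wave_h, in m), sample lognormal component capacities, compare them with
    the demands, and evaluate a toy fault tree:
        core damage = (pump A AND pump B) OR switchgear flooding
    Median capacities and log-standard deviations are hypothetical.
    """
    rng = np.random.default_rng(seed)
    # Seismic capacities (median, beta) of two redundant pumps
    cap_a = rng.lognormal(np.log(0.9), 0.4, n_samples)
    cap_b = rng.lognormal(np.log(0.9), 0.4, n_samples)
    # Tsunami capacity of the switchgear room (flooding height)
    cap_sw = rng.lognormal(np.log(6.0), 0.3, n_samples)
    pump_a_fail = pga > cap_a
    pump_b_fail = pga > cap_b
    switchgear_fail = wave_h > cap_sw
    core_damage = (pump_a_fail & pump_b_fail) | switchgear_fail
    return core_damage.mean()

# Conditional core damage probability for one multihazard scenario
print(dqfm_failure_probability(pga=0.7, wave_h=5.0))
```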

Non-Simultaneous Sampling Deactivation during the Parameter Approximation of a Topic Model

  • Jeong, Young-Seob;Jin, Sou-Young;Choi, Ho-Jin
    • KSII Transactions on Internet and Information Systems (TIIS) / Vol. 7, No. 1 / pp.81-98 / 2013
  • Since Probabilistic Latent Semantic Analysis (PLSA) and Latent Dirichlet Allocation (LDA) were introduced, many revised or extended topic models have appeared. Due to the intractable likelihood of these models, training any topic model requires an approximation algorithm such as variational approximation, Laplace approximation, or Markov chain Monte Carlo (MCMC). Although these approximation algorithms perform well, training a topic model is still computationally expensive given the large amount of data it requires. In this paper, we propose a new method, called non-simultaneous sampling deactivation, for efficient approximation of the parameters in a topic model. Whereas each random variable is normally sampled or obtained over a single predefined burn-in period in traditional approximation algorithms, our new method is based on the observation that the random variable nodes in a topic model all have different convergence periods. During the iterative approximation process, the proposed method allows each random variable node to be terminated, or deactivated, once it has converged. Therefore, compared to traditional approximation schemes, in which every node is usually deactivated concurrently, the proposed method achieves inference efficiency in terms of time and memory. We do not propose a new approximation algorithm, but a new process applicable to existing approximation algorithms. Through experiments, we show the time and memory efficiency of the method, and discuss the tradeoff between the efficiency of the approximation process and parameter consistency.
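
A compact, hypothetical sketch of the idea is given below: a collapsed Gibbs sampler for LDA in which a token whose topic assignment has stayed fixed for a given number of sweeps is deactivated and no longer resampled. The convergence rule here is a simplified stand-in for the paper's deactivation criterion, and the toy corpus is illustrative.

```python
import numpy as np

def lda_gibbs_deactivate(docs, n_topics, vocab_size, alpha=0.1, beta=0.01,
                         n_iter=200, patience=20, seed=9):
    """Collapsed Gibbs sampling for LDA with per-token deactivation (sketch).

    docs : list of lists of word ids. A token whose topic assignment has not
    changed for `patience` consecutive sweeps is treated as converged and is
    no longer resampled -- a simplified stand-in for the paper's
    non-simultaneous sampling deactivation criterion.
    """
    rng = np.random.default_rng(seed)
    ndk = np.zeros((len(docs), n_topics))            # document-topic counts
    nkw = np.zeros((n_topics, vocab_size))           # topic-word counts
    nk = np.zeros(n_topics)                          # topic totals
    z = [rng.integers(n_topics, size=len(doc)) for doc in docs]
    for d, doc in enumerate(docs):
        for w, k in zip(doc, z[d]):
            ndk[d, k] += 1
            nkw[k, w] += 1
            nk[k] += 1
    unchanged = [np.zeros(len(doc), dtype=int) for doc in docs]

    for _ in range(n_iter):
        for d, doc in enumerate(docs):
            for i, w in enumerate(doc):
                if unchanged[d][i] >= patience:      # deactivated: skip sampling
                    continue
                k_old = z[d][i]
                ndk[d, k_old] -= 1
                nkw[k_old, w] -= 1
                nk[k_old] -= 1
                p = (ndk[d] + alpha) * (nkw[:, w] + beta) / (nk + vocab_size * beta)
                k_new = rng.choice(n_topics, p=p / p.sum())
                ndk[d, k_new] += 1
                nkw[k_new, w] += 1
                nk[k_new] += 1
                z[d][i] = k_new
                unchanged[d][i] = unchanged[d][i] + 1 if k_new == k_old else 0
    return ndk, nkw

# Tiny toy corpus of word ids over a 6-word vocabulary
corpus = [[0, 1, 1, 2], [2, 3, 3, 4], [4, 5, 0, 5]]
doc_topic, topic_word = lda_gibbs_deactivate(corpus, n_topics=2, vocab_size=6)
print(doc_topic)   # per-document topic counts after sampling
```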