• Title/Summary/Keyword: conditional probability distribution


Improved Exact Inference in Logistic Regression Model

  • Kim, Donguk;Kim, Sooyeon
    • Communications for Statistical Applications and Methods
    • /
    • v.10 no.2
    • /
    • pp.277-289
    • /
    • 2003
  • We propose modified exact inferential methods for the logistic regression model. The exact conditional distribution in the logistic regression model is often highly discrete, and ordinary exact inference in logistic regression is conservative because of the discreteness of the distribution. For exact inference in the logistic regression model we utilize a modified P-value. The modified P-value cannot exceed the ordinary P-value, so the test of size $\alpha$ based on the modified P-value is less conservative. The modified exact confidence interval maintains at least the fixed confidence level but tends to be much narrower. The approach inverts the results of a test with a modified P-value, utilizing the test statistic and table probabilities in the logistic regression model.
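A minimal sketch of the idea behind a modified P-value that never exceeds the ordinary exact P-value — here the familiar mid-P variant for a one-sided binomial test (the paper's own modification may differ; this only illustrates why discreteness makes the ordinary exact test conservative):

```python
from math import comb

def binom_pmf(k, n, p):
    """Binomial point probability P(T = k)."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

def exact_p_values(t_obs, n, p=0.5):
    """Upper-tail exact P-value and its mid-P modification for T ~ Bin(n, p)."""
    pmf = [binom_pmf(k, n, p) for k in range(n + 1)]
    ordinary = sum(pmf[t_obs:])                       # P(T >= t_obs)
    mid_p = sum(pmf[t_obs + 1:]) + 0.5 * pmf[t_obs]   # P(T > t_obs) + 0.5 * P(T = t_obs)
    return ordinary, mid_p

ordinary, mid_p = exact_p_values(8, n=10)
```

Because only half the probability of the observed point counts toward the tail, the mid-P value is strictly smaller whenever P(T = t_obs) > 0, which is exactly what makes the resulting test less conservative on a highly discrete conditional distribution.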

Bayesian Methods for Generalized Linear Models

  • Paul E. Green;Kim, Dae-Hak
    • Communications for Statistical Applications and Methods
    • /
    • v.6 no.2
    • /
    • pp.523-532
    • /
    • 1999
  • Generalized linear models have various applications for data arising from many kinds of statistical studies. Although the response variable is generally assumed to be generated from a wide class of probability distributions, we focus on count data that are most often analyzed using binomial models for proportions or Poisson models for rates. The methods and results presented here also apply to many other categorical data models in general, due to the relationship between multinomial and Poisson sampling. The novelty of the approach suggested here is that all conditional distributions can be specified directly, so that straightforward Gibbs sampling is possible. The prior distribution consists of two stages. We rely on a normal nonconjugate prior at the first stage and a vague prior for hyperparameters at the second stage. The methods are demonstrated with an illustrative example using data collected by Rosenkranz and Raftery (1994) concerning the number of hospital admissions due to back pain in Washington state.
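With a nonconjugate normal prior, the conditional distributions inside the Gibbs sampler are typically sampled with a Metropolis step. The toy below is a one-parameter stand-in (a single Poisson rate with a normal prior on its log), not the paper's full two-stage hierarchical model; all names and numbers are illustrative:

```python
import math, random

def metropolis_poisson_lograte(y, mu0=0.0, tau0=10.0, n_iter=5000, step=0.3, seed=1):
    """Random-walk Metropolis for theta = log(lambda), with a Normal(mu0, tau0^2)
    prior on theta and a Poisson likelihood for the counts y."""
    random.seed(seed)

    def log_post(theta):
        lam = math.exp(theta)
        loglik = sum(k * theta - lam for k in y)        # Poisson log-likelihood (up to a constant)
        logprior = -0.5 * ((theta - mu0) / tau0) ** 2   # normal prior (up to a constant)
        return loglik + logprior

    theta = math.log(sum(y) / len(y) + 0.5)             # start near the data
    samples = []
    for _ in range(n_iter):
        prop = theta + random.gauss(0.0, step)
        if math.log(random.random()) < log_post(prop) - log_post(theta):
            theta = prop                                # accept the proposal
        samples.append(theta)
    return samples

samples = metropolis_poisson_lograte([3, 5, 4, 6, 2, 4])
post_mean_rate = sum(math.exp(t) for t in samples[1000:]) / len(samples[1000:])
```

In a full GLM this update would be applied coordinate-by-coordinate within the Gibbs cycle, with a further draw for the second-stage hyperparameters.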

Review of Screening Procedure as Statistical Hypothesis Testing (통계적 가설검정으로서의 선별검사절차의 검토)

  • 권혁무;이민구;김상부;홍성훈
    • Journal of Korean Society for Quality Management
    • /
    • v.26 no.2
    • /
    • pp.39-50
    • /
    • 1998
  • A screening procedure, in which one or more correlated variables are used for screening, is reviewed from the viewpoint of statistical hypothesis testing. Without assuming a specific probability model for the joint distribution of the performance and screening variables, some principles are provided to establish the best screening region. Application examples are provided for two cases: ⅰ) the case where the performance variable is dichotomous, and ⅱ) the case where the performance variable is continuous. In case ⅰ), a normal model is assumed for the conditional distribution of the screening variable given the performance variable. In case ⅱ), the performance and screening variables are assumed to be jointly normally distributed.
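The jointly normal case reduces to a simple conditional-normal computation: for standardized performance Y and screening variable X with correlation rho, Y | X = x is N(rho·x, 1 − rho²), so the probability of a conforming item given a screening reading follows directly. A sketch (the spec limit, correlation, and cutoffs are illustrative, not from the paper):

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def accept_prob_given_screen(x, y_spec, rho):
    """P(Y > y_spec | X = x) for standardized jointly normal (Y, X) with
    correlation rho, using Y | X = x ~ N(rho * x, 1 - rho^2)."""
    mean = rho * x
    sd = math.sqrt(1.0 - rho ** 2)
    return 1.0 - norm_cdf((y_spec - mean) / sd)

# Illustrative: spec Y > -1, correlation 0.8, two screening readings.
p_pass = accept_prob_given_screen(1.5, y_spec=-1.0, rho=0.8)
p_fail = accept_prob_given_screen(-1.5, y_spec=-1.0, rho=0.8)
```

A screening region would then be chosen so that this conditional acceptance probability meets a required level over the accepted readings.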

Investigation on Exact Tests (정확검정들에 대한 고찰)

  • 강승호
    • The Korean Journal of Applied Statistics
    • /
    • v.15 no.1
    • /
    • pp.187-199
    • /
    • 2002
  • When the sample size is small, exact tests are often employed because the asymptotic distribution of the test statistic is in doubt. The advantage of exact tests is that they are guaranteed to bound the type I error probability at the nominal level. In this paper we review the methods of constructing exact tests, the algorithms, and commercial software. We also examine the difference between exact p-values obtained from exact tests and true p-values obtained from the true underlying distribution.
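The canonical small-sample example is Fisher's exact test for a 2×2 table, whose p-value is computed directly from the hypergeometric null distribution (conditioning on both margins) rather than from an asymptotic approximation — a short self-contained version:

```python
from math import comb

def fisher_exact_one_sided(a, b, c, d):
    """One-sided Fisher exact p-value for a 2x2 table [[a, b], [c, d]]:
    P(X >= a) under the hypergeometric null, conditioning on the margins."""
    row1, col1, n = a + b, a + c, a + b + c + d

    def pmf(k):
        return comb(col1, k) * comb(n - col1, row1 - k) / comb(n, row1)

    k_max = min(row1, col1)
    return sum(pmf(k) for k in range(a, k_max + 1))

p = fisher_exact_one_sided(8, 2, 1, 5)
```

Because the p-value is a sum of exact point probabilities, the test's size never exceeds the nominal level — the guarantee the abstract refers to — though discreteness makes it conservative in general.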

Business Strategy, Corporate Governance and Sustainability Reporting: An Analysis of the Fit Contingency Approach

  • HERNAWATI, Erna
    • The Journal of Asian Finance, Economics and Business
    • /
    • v.7 no.12
    • /
    • pp.761-771
    • /
    • 2020
  • This study discusses the role of Board Monitoring Effectiveness (BME) in managers' decisions regarding the business strategies that fit external business environmental conditions, using a contingency analysis approach. Furthermore, this study examines how fit strategies affect Sustainability Reporting (SR) of listed companies on the Indonesia Stock Exchange (IDX) from 2014 to 2017. This study uses the Conditional Mixed Process (CMP) technique, which is claimed to be more efficient in analyzing the TSL models. This study found that in highly uncertain conditions, BME had a positive influence on the probability of managers choosing prospector and defender strategies rather than analyzers. These results indicate that BME has a positive impact on the contingency fit between business strategies and environmental uncertainty. In addition, the study documents that only prospectors have a positive impact on SR; however, it fails to document that defenders have a positive impact on SR. Meanwhile, an unexpected result is that analyzers have a significantly positive effect on SR. This study is the first to investigate the role of BME in the contingency fit between business strategies and environmental uncertainties and how it produces effects up to the level of SR.

Non-stationary Frequency Analysis with Climate Variability using Conditional Generalized Extreme Value Distribution (기후변동을 고려한 조건부 GEV 분포를 이용한 비정상성 빈도분석)

  • Kim, Byung-Sik;Lee, Jung-Ki;Kim, Hung-Soo;Lee, Jin-Won
    • Journal of Wetlands Research
    • /
    • v.13 no.3
    • /
    • pp.499-514
    • /
    • 2011
  • An underlying assumption of traditional hydrologic frequency analysis is that climate, and hence the frequency of hydrologic events, is stationary, or unchanging over time. Under stationary conditions, the distribution of the variable of interest is invariant to temporal translation. Water resources infrastructure planning and design, such as dams, levees, canals, bridges, and culverts, relies on an understanding of past conditions and projection of future conditions. However, water managers have always known our world is inherently non-stationary, and they routinely deal with this in management and planning. The aim of this paper is to give a brief introduction to non-stationary extreme value analysis methods. In this paper, a non-stationary hydrologic frequency analysis approach is introduced in order to determine probability rainfall under a changing climate. The non-stationary statistical approach is based on the conditional Generalized Extreme Value (GEV) distribution and maximum likelihood parameter estimation. The method is applied to the annual maximum 24-hour rainfall. The results show that the non-stationary GEV approach is suitable for determining probability rainfall under a changing climate, such as a trend. Moreover, non-stationary frequency analysis is performed using the SOI (Southern Oscillation Index) of ENSO (El Niño Southern Oscillation).
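In the conditional-GEV setting, non-stationarity typically enters by letting a parameter depend on a covariate, e.g. a linear trend in the location parameter, mu(t) = mu0 + mu1·t; the return level then varies with the year of interest. A sketch of that evaluation step (the parameter values are illustrative — in the paper they would come from maximum likelihood fitting of the annual-maximum series):

```python
import math

def gev_quantile(p, mu, sigma, xi):
    """GEV quantile (return level) at non-exceedance probability p; xi != 0 branch."""
    return mu + (sigma / xi) * ((-math.log(p)) ** (-xi) - 1.0)

def nonstationary_return_level(T, t, mu0, mu1, sigma, xi):
    """Return level for return period T in year t, with a linear trend in the
    GEV location parameter: mu(t) = mu0 + mu1 * t."""
    p = 1.0 - 1.0 / T                    # non-exceedance probability for period T
    return gev_quantile(p, mu0 + mu1 * t, sigma, xi)

# Illustrative 100-year rainfall (mm) now vs. 30 years ahead under a trend.
rl_now = nonstationary_return_level(100, t=0, mu0=150.0, mu1=1.2, sigma=40.0, xi=0.1)
rl_future = nonstationary_return_level(100, t=30, mu0=150.0, mu1=1.2, sigma=40.0, xi=0.1)
```

With a trend only in the location parameter, the design value shifts by exactly mu1·t between the two evaluation years — the practical consequence of dropping the stationarity assumption.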

Probabilistic Analysis of Drought Characteristics in Pakistan Using a Bivariate Copula Model

  • Jehanzaib, Muhammad;Kim, Ji Eun;Park, Ji Yeon;Kim, Tae-Woong
    • Proceedings of the Korea Water Resources Association Conference
    • /
    • 2019.05a
    • /
    • pp.151-151
    • /
    • 2019
  • Because drought is a complex and stochastic phenomenon in nature, statistical approaches for drought assessment receive great attention for water resource planning and management. Generally, drought characteristics such as severity, duration, and intensity are modelled separately. This study aims to develop a relationship between drought characteristics using a bivariate copula model. To achieve the objective, we calculated the Standardized Precipitation Index (SPI) using rainfall data at 6 rain gauge stations for the period 1961-1999 in the Jehlum River Basin, Pakistan, and investigated the drought characteristics. Since there is a significant correlation between drought severity and duration, they are usually modeled using different marginal distributions and a joint distribution function. Using an exponential distribution for drought severity and a log-logistic distribution for drought duration, the Galambos copula was recognized as the best copula for modeling the joint distribution of drought severity and duration based on the KS statistic. Various return periods of drought were calculated to identify the time interval of repeated drought events. The results of this study can provide useful information for effective water resource management and show the superiority of this approach over univariate drought analysis.
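The Galambos copula has the closed form C(u, v) = u·v·exp(((−ln u)^(−θ) + (−ln v)^(−θ))^(−1/θ)), from which a joint "severity AND duration exceeded" return period follows via the survival probability. A sketch (θ and the marginal probabilities are illustrative — in the paper they come from the fitted exponential and log-logistic marginals):

```python
import math

def galambos_cdf(u, v, theta):
    """Galambos copula C(u, v) = u*v*exp(((-ln u)^-theta + (-ln v)^-theta)^(-1/theta))."""
    a = (-math.log(u)) ** (-theta)
    b = (-math.log(v)) ** (-theta)
    return u * v * math.exp((a + b) ** (-1.0 / theta))

def joint_return_period_and(u, v, theta, mean_interarrival=1.0):
    """Return period of {severity > s AND duration > d}: the survival
    probability is 1 - u - v + C(u, v) by inclusion-exclusion."""
    p_and = 1.0 - u - v + galambos_cdf(u, v, theta)
    return mean_interarrival / p_and

# Illustrative: both marginals at their 90th percentile, theta = 2.
T_and = joint_return_period_and(0.9, 0.9, theta=2.0)
```

Because the Galambos copula models upper-tail dependence, C(u, v) exceeds the independence copula u·v, so severe-and-long droughts recur more often than an independence assumption would suggest.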

Dependency-based Framework of Combining Multiple Experts for Recognizing Unconstrained Handwritten Numerals (무제약 필기 숫자를 인식하기 위한 다수 인식기를 결합하는 의존관계 기반의 프레임워크)

  • Kang, Hee-Joong;Lee, Seong-Whan
    • Journal of KIISE:Software and Applications
    • /
    • v.27 no.8
    • /
    • pp.855-863
    • /
    • 2000
  • Although the Behavior-Knowledge Space (BKS) method, one of the well-known decision combination methods, does not need any assumptions in combining multiple experts, it theoretically requires exponential storage space for storing and managing the jointly observed K decisions from K experts. That is, combining K experts needs a (K+1)st-order probability distribution. However, it is well known that this distribution becomes unmanageable to store and estimate, even for small K. In order to overcome this weakness, it has been studied how to decompose a probability distribution into a number of component distributions and to approximate the distribution with a product of the component distributions. One such previous work applies a conditional independence assumption to the distribution. Another work approximates the distribution with a product of only first-order tree dependencies or second-order distributions, as shown in [1]. In this paper, dependency of higher order than the first is considered in approximating the distribution, and a dependency-based framework is proposed to optimally approximate the (K+1)st-order probability distribution with a product set of dth-order dependencies, where ($1{\le}d{\le}K$), and to combine multiple experts based on the product set using the Bayesian formalism. This framework was experimented with and evaluated on a standardized CENPARMI database.
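The simplest member of this family is the conditional-independence (first-order) approximation the abstract mentions: P(m | d1..dK) ∝ P(m) ∏k P(dk | m), which replaces the full (K+1)st-order table with K small per-expert confusion tables. A sketch with two hypothetical experts on a binary digit problem (all probabilities are illustrative):

```python
def combine_experts(decisions, prior, cond):
    """Bayesian combination of K expert decisions under conditional
    independence: P(m | d1..dK) proportional to P(m) * prod_k P(dk | m)."""
    scores = {}
    for m in prior:
        s = prior[m]
        for k, d in enumerate(decisions):
            s *= cond[k][m][d]          # expert k's confusion probability P(dk | m)
        scores[m] = s
    z = sum(scores.values())
    return {m: s / z for m, s in scores.items()}

prior = {"0": 0.5, "1": 0.5}
cond = [
    {"0": {"0": 0.9, "1": 0.1}, "1": {"0": 0.2, "1": 0.8}},  # expert 1
    {"0": {"0": 0.8, "1": 0.2}, "1": {"0": 0.3, "1": 0.7}},  # expert 2
]
post = combine_experts(["0", "0"], prior, cond)
```

The paper's framework generalizes this by keeping dth-order dependency terms between experts instead of treating them as fully independent given the true class.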

Strategy of the Fracture Network Characterization for Groundwater Modeling

  • Ji, Sung-Hoon;Park, Young-Jin;Lee, Kang-Kun;Kim, Kyoung-Su
    • Proceedings of the Korean Radioactive Waste Society Conference
    • /
    • 2009.06a
    • /
    • pp.186-186
    • /
    • 2009
  • The characterization strategy of fracture networks is classified into a deterministic or statistical characterization according to the type of required information. A deterministic characterization is most efficient for a sparsely fractured system, while statistics are sufficient for densely fractured rock. In this study, the ensemble mean and variability of the effective connectivity are systematically analyzed with various density values for different network structures with a power-law size distribution. The results of high-resolution Monte Carlo analyses show that statistical characteristics can be necessary information to determine the transport properties of a fracture system when the fracture density is greater than a percolation threshold. When the percolation probability (II) approaches unity with increasing fracture density, the effective connectivity of the network can be safely estimated using statistics only (sufficient condition). It is inferred from conditional simulations that deterministic information on main pathways can reduce the uncertainty in estimating system properties as the network becomes denser. Overall, the results imply that most pathways need to be identified when II < 0.5; statistics are sufficient when II $\rightarrow$ 1; and statistics are necessary, while the identification of main pathways can significantly reduce the uncertainty in estimating transport properties, when 0.5 < II < 1. It is suggested that proper estimation of the percolation probability of a fracture network is a prerequisite for an appropriate conceptualization and further characterization.
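The percolation probability II is naturally estimated by Monte Carlo: generate many random realizations at a given density and count the fraction that contain a spanning path. A toy stand-in using site percolation on a square grid rather than a power-law fracture network (grid size, densities, and trial counts are illustrative):

```python
import random
from collections import deque

def percolates(grid, n):
    """BFS from open sites in the top row; True if an open path reaches the bottom."""
    seen = set((0, j) for j in range(n) if grid[0][j])
    q = deque(seen)
    while q:
        i, j = q.popleft()
        if i == n - 1:
            return True
        for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            ni, nj = i + di, j + dj
            if 0 <= ni < n and 0 <= nj < n and grid[ni][nj] and (ni, nj) not in seen:
                seen.add((ni, nj))
                q.append((ni, nj))
    return False

def percolation_probability(n, density, trials=200, seed=7):
    """Monte Carlo estimate of the percolation probability (II): the fraction
    of random realizations at the given density that span the domain."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        grid = [[rng.random() < density for _ in range(n)] for _ in range(n)]
        hits += percolates(grid, n)
    return hits / trials

p_sparse = percolation_probability(10, 0.2)   # well below the threshold
p_dense = percolation_probability(10, 0.8)    # well above the threshold
```

The sharp rise of II with density around the threshold is what separates the "identify pathways deterministically" regime from the "statistics suffice" regime in the abstract.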

PROBABILISTIC SEISMIC ASSESSMENT OF BASE-ISOLATED NPPS SUBJECTED TO STRONG GROUND MOTIONS OF TOHOKU EARTHQUAKE

  • Ali, Ahmer;Hayah, Nadin Abu;Kim, Dookie;Cho, Ung Gook
    • Nuclear Engineering and Technology
    • /
    • v.46 no.5
    • /
    • pp.699-706
    • /
    • 2014
  • The probabilistic seismic performance of a standard Korean nuclear power plant (NPP) with an idealized isolation is investigated in the present work. A probabilistic seismic hazard analysis (PSHA) of the Wolsong site on the Korean peninsula is performed by considering peak ground acceleration (PGA) as an earthquake intensity measure. A procedure is reported on the categorization and selection of two sets of ground motions of the Tohoku earthquake, i.e. long-period and common as Set A and Set B respectively, for the nonlinear time history response analysis of the base-isolated NPP. Limit state values as multiples of the displacement responses of the NPP base isolation are considered for the fragility estimation. The seismic risk of the NPP is further assessed by incorporation of the rate of frequency exceedance and conditional failure probability curves. Furthermore, this framework attempts to show the unacceptable performance of the isolated NPP in terms of the probabilistic distribution and annual probability of limit states. The comparative results for long and common ground motions are discussed to contribute to the future safety of nuclear facilities against drastic events like Tohoku.
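Conditional failure probability curves of the kind used here are commonly modeled as a lognormal fragility function of the intensity measure: P(failure | PGA) = Φ(ln(PGA/median)/β). A sketch (the median capacity and dispersion β are illustrative, not the paper's fitted values):

```python
import math

def fragility(pga, median, beta):
    """Conditional failure probability at a given PGA, modeled as a lognormal
    CDF with median capacity `median` (in g) and log-standard deviation beta."""
    z = (math.log(pga) - math.log(median)) / beta
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))  # standard normal CDF of z

# Illustrative fragility evaluated at a weak and a strong shaking level.
p_low = fragility(0.2, median=0.6, beta=0.4)
p_high = fragility(1.0, median=0.6, beta=0.4)
```

Convolving such a curve with the rate of exceedance from the PSHA (here, for the Wolsong site) yields the annual probability of each limit state.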