• Title/Summary/Keyword: probability distributions


Statistical investigation on size distribution of suspended cohesive sediment (점착성 부유사의 입도분포형 검증)

  • Park, Byeoungeun;Byun, Jisun;Son, Minwoo
    • Journal of Korea Water Resources Association / v.53 no.10 / pp.917-928 / 2020
  • The purpose of this study is to find an appropriate probability distribution for the size distribution of suspended cohesive sediment. Based on a goodness-of-fit test at the 5% significance level using the Kolmogorov-Smirnov test, it is found that the floc size distributions measured in the laboratory experiment and the field study show different results. For sample data collected from field experiments, the Gamma distribution is the best-fitting form. For the laboratory experiment results, the sample data show a positively skewed distribution and the GEV distribution fits best. The lognormal distribution, which is generally assumed for floc size distributions, is not suitable for either the field or the laboratory results. Using the 3-parameter lognormal distribution, it is shown that a size distribution similar to the measured floc size distribution can be simulated.
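A minimal sketch of the kind of goodness-of-fit comparison described in the abstract, assuming scipy is available. The candidate forms (Gamma, GEV, lognormal) follow the abstract, but the synthetic sample and the 5% decision rule shown here are only illustrative, not the study's data or code:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
floc_sizes = rng.gamma(shape=2.5, scale=40.0, size=500)  # hypothetical floc sizes (micrometres)

# candidate size-distribution forms named in the abstract
candidates = {"gamma": stats.gamma, "genextreme": stats.genextreme, "lognorm": stats.lognorm}

for name, dist in candidates.items():
    params = dist.fit(floc_sizes)                             # maximum-likelihood fit
    d_stat, p_value = stats.kstest(floc_sizes, name, args=params)
    verdict = "not rejected" if p_value > 0.05 else "rejected"
    print(f"{name:>10}: D = {d_stat:.3f}, p = {p_value:.3f} ({verdict} at the 5% level)")
```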

Regional Drought Frequency Analysis of Monthly Rainfall Data by the Method of L-Moments (L-Moment법을 이용한 월 강우량 자료의 지역가뭄빈도 해석)

  • Yun, Yong-Nam;Park, Mu-Jong
    • Journal of Korea Water Resources Association / v.30 no.1 / pp.55-62 / 1997
  • To quantitatively investigate nationwide drought characteristics and to evaluate the 1994-1995 drought against several past droughts of significant magnitude, a regional frequency analysis is made for the meteorological stations in each of the 47 subbasins covering the whole nation. Using monthly precipitation data for the period of record at the stations in each subbasin, low-precipitation data series of various durations are formed from running totals of the monthly data and fitted to probability distributions. The method of L-moments is used to determine unbiased parameter estimates for each distribution, and, using the best-fit distribution for each subbasin, the low precipitations of various durations with return periods of 5, 10, 20, 30, and 50 years are estimated. Drought frequency maps are drawn from these estimates. Based on the drought frequency analysis, the drought of 1994-1995 is evaluated in its severity and areal extent in comparison with four other past droughts of significance. The current practice of safety standards for the design of impounding facilities is also evaluated with reference to the recurrence interval of the severe drought, and a recommendation is made for the future design standard.
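As a rough illustration of the L-moment machinery mentioned above, the sketch below computes unbiased sample L-moments from a synthetic low-precipitation series via probability-weighted moments; the series and its parameters are hypothetical, and fitting a specific distribution (GEV, Pearson III, etc.) would then proceed from these quantities:

```python
import numpy as np

def sample_l_moments(x):
    """Return the first three unbiased sample L-moments (l1, l2, l3)."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    i = np.arange(1, n + 1)                        # ranks of the ordered sample
    b0 = x.mean()                                  # probability-weighted moments
    b1 = np.sum((i - 1) / (n - 1) * x) / n
    b2 = np.sum((i - 1) * (i - 2) / ((n - 1) * (n - 2)) * x) / n
    l1 = b0                                        # L-location (mean)
    l2 = 2 * b1 - b0                               # L-scale
    l3 = 6 * b2 - 6 * b1 + b0                      # used for L-skewness l3/l2
    return l1, l2, l3

rng = np.random.default_rng(1)
annual_low_precip = rng.gamma(shape=4.0, scale=50.0, size=60)   # hypothetical series (mm)
l1, l2, l3 = sample_l_moments(annual_low_precip)
print(f"l1 = {l1:.1f}, l2 = {l2:.1f}, L-skewness = {l3 / l2:.3f}")
```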


Performance Analysis of Request Handling Schemes for Intelligent Peripherals (IP용 서비스 요청 처리방식의 성능 분석)

  • 최고봉;윤종호;권기호
    • The Journal of Korean Institute of Communications and Information Sciences / v.19 no.7 / pp.1321-1334 / 1994
  • This paper presents the service request handling schemes of an intelligent peripheral (IP), which provides service functions as a physical entity in the intelligent network. Four service request handling schemes are compared for an IP that can handle both ordinary requests and prioritized requests on a blocked-call-delayed basis. Delayed requests are assumed to be stored in a finite buffer. Scheme I exclusively allows prioritized requests to be stored and to use a fixed number of reserved servers. The other three schemes without reserved servers (Schemes II, III, and IV) allow both types of requests to be stored and let prioritized requests push out ordinary requests if the buffer is full. For these four schemes, the blocking probabilities and delay distributions of both types of requests are obtained numerically. The numerical results show that the schemes without reserved servers reduce the blocking probability of ordinary requests without a severe penalty on prioritized requests. For the three schemes without reserved servers, it is noted that prioritized requests should be served on a first-in, first-out basis, and ordinary requests on a last-in, first-out basis.
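For flavor, a simplified single-class version of the kind of blocking-probability calculation the abstract refers to: the state probabilities of an M/M/c/K queue with c servers and a finite buffer, from which the blocking probability is read off. The paper's four schemes with priorities, reserved servers, and pushout need a richer state space; the parameter values below are made up:

```python
import math

def mmck_blocking(lam, mu, c, K):
    """Blocking probability p_K of an M/M/c/K queue (K = servers + buffer places)."""
    a = lam / mu                                   # offered load in Erlangs
    unnorm = []
    for n in range(K + 1):
        if n <= c:
            unnorm.append(a ** n / math.factorial(n))
        else:
            unnorm.append(a ** n / (math.factorial(c) * c ** (n - c)))
    norm = sum(unnorm)
    return unnorm[K] / norm                        # probability an arrival finds the system full

# hypothetical numbers: 10 servers, buffer of 5, offered load 9 Erlangs
print(f"blocking = {mmck_blocking(lam=9.0, mu=1.0, c=10, K=15):.4f}")
```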


A Study on the Reliability-Based Optimum Design of Reinforced Concrete Frames (철근(鐵筋)콘크리트 뼈대구조(構造) 신뢰성(信賴性) 최적설계(最適設計)에 관한 연구(硏究))

  • Kim, Kee Dae;Yang, Chang Hyun;Cho, Hyo Nam
    • KSCE Journal of Civil and Environmental Engineering Research / v.9 no.3 / pp.57-64 / 1989
  • This study presents a reliability-based optimum design of reinforced concrete frames, in which the AFOSM and SOSM methods are applied for the evaluation of the failure probabilities, and the sequential linear programming method is used as a practical approach to the system optimization. A one-story, two-bay reinforced concrete frame is chosen for the numerical illustration of the proposed reliability-based optimum design. As a result, it is found that the proposed procedure for the reliability-based optimization of RC frames provides an accurate estimation of the optimal level of safety and appears applicable to real structures of reasonable complexity. It is shown that the probability distributions of the basic random variables and the uncertainties of the applied loadings and material strengths may have a great effect on the optimum design. The AFOSM and SOSM methods do not show significant discrepancy in the optimum design results, but the former appears more realistic and time-saving than the latter for this specific study.
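A minimal sketch of the first-order reliability idea underlying AFOSM, assuming a linear limit state g = R - S with independent normal resistance R and load effect S; AFOSM itself iterates to the design point for general nonlinear limit states, and the member statistics below are hypothetical:

```python
from math import sqrt
from scipy.stats import norm

def failure_probability(mu_R, sigma_R, mu_S, sigma_S):
    """Reliability index and failure probability for g = R - S, R and S normal."""
    beta = (mu_R - mu_S) / sqrt(sigma_R ** 2 + sigma_S ** 2)   # reliability index
    return beta, norm.cdf(-beta)                                # Pf = Phi(-beta)

# hypothetical member: resistance 500 kN.m (cov 12%), load effect 300 kN.m (cov 20%)
beta, pf = failure_probability(500.0, 0.12 * 500.0, 300.0, 0.20 * 300.0)
print(f"beta = {beta:.2f}, Pf = {pf:.2e}")
```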


Complex Segregation Analysis of Categorical Traits in Farm Animals: Comparison of Linear and Threshold Models

  • Kadarmideen, Haja N.;Ilahi, H.
    • Asian-Australasian Journal of Animal Sciences / v.18 no.8 / pp.1088-1097 / 2005
  • The main objectives of this study were to investigate the accuracy, bias, and power of linear and threshold model segregation analysis methods for the detection of major genes affecting categorical traits in farm animals. A Maximum Likelihood Linear Model (MLLM), a Bayesian Linear Model (BALM), and a Bayesian Threshold Model (BATM) were applied to simulated data on normal, categorical, and binary scales as well as to disease data in pigs. Simulated data on the underlying normally distributed liability (NDL) were used to create the categorical and binary data. The MLLM method was applied to data on all scales (normal, categorical, and binary), and the BATM method was developed and applied only to binary data. The MLLM analyses underestimated parameters for binary as well as categorical traits compared with normal traits, the bias being very severe for binary traits. The accuracy of major gene and polygene parameter estimates was also very low for binary data compared with categorical data; the latter gave results similar to normal data. When disease incidence (on the binary scale) is close to 50%, segregation analysis has higher accuracy and less bias than for diseases with rare incidence. NDL data were always better than categorical data. Under the MLLM method, the test statistics for categorical and binary data were consistently and unusually high (while the opposite is expected owing to the loss of information in categorical data), indicating high false discovery rates of major genes if linear models are applied to categorical traits. In the Bayesian segregation analysis, the 95% highest probability density regions of major gene variances were checked to determine whether they included the value of zero (a boundary parameter); by the nature of this difference between likelihood and Bayesian approaches, the Bayesian methods are likely to be more reliable for categorical data. The BATM segregation analysis of binary data also showed a significant advantage over MLLM in terms of higher accuracy. Based on the results, threshold models are recommended when the trait distributions are discontinuous. Further, segregation analysis could be used for an initial scan of the data for evidence of major genes before embarking on molecular genome mapping.
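To make the liability-threshold setup concrete, the sketch below generates binary and categorical records by cutting a normally distributed liability at fixed thresholds, which is the data-generating view behind the NDL simulations in the abstract; the incidence and category cut-points are arbitrary choices for illustration:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10_000
liability = rng.normal(loc=0.0, scale=1.0, size=n)      # underlying NDL scale

# binary trait: "diseased" if liability exceeds the threshold set by incidence
incidence = 0.25
threshold = np.quantile(liability, 1.0 - incidence)
binary_trait = (liability > threshold).astype(int)

# ordered categorical trait with three classes via two thresholds
categories = np.digitize(liability, bins=np.quantile(liability, [0.5, 0.9]))

print(f"observed incidence: {binary_trait.mean():.3f}")
print(f"category counts: {np.bincount(categories)}")
```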

A Study on Comparison of Risk Estimates Among Various Exposure Scenario of Several Volatile Organic Compounds in Tap Water (음용수중 휘발성 유기오염물질의 노출경로에 따른 위해도 추정치 비교연구)

  • Chung, Yong;Shin, Dong-Chun;Kim, Jong-Man;Yang, Ji-Yeon;Park, Seong-Eun
    • Environmental Analysis Health and Toxicology / v.10 no.1_2 / pp.21-35 / 1995
  • Risk assessment processes, which include the estimation of human cancer potency from animal bioassay data and the calculation of human exposure, entail uncertainties. In the exposure assessment process, exposure scenarios with various assumptions can affect the estimated exposure amount and excess cancer risk. We compared risk estimates among various exposure scenarios for vinyl chloride, trichloroethylene, and tetrachloroethylene in tap water. The contaminant concentrations were analyzed from tap water samples in Seoul from 1993 to 1994. The oral and inhalation cancer potencies of the contaminants were estimated using the multistage, Weibull, lognormal, and Mantel-Bryan models in the TOX-RISK computer software. In the first case, human excess cancer risk was estimated by the US EPA method used to set the MCL (maximum contaminant level). In the second and third cases, the risk was estimated for multi-route exposure with and without Monte Carlo simulation, respectively. In the second case, exposure input parameters and cancer potencies were represented by probability distributions, and in the third case, those values were given as point estimates (mean, and maximum or 95% upper-bound values). As a result, while the excess cancer risk estimated by the US EPA method considering only direct ingestion tended to be underestimated, the risk estimated by considering multi-route exposure without Monte Carlo simulation, using the maximum or 95% upper-bound values as input parameters, tended to be overestimated. In risk assessment for volatile organic compounds, considering multi-route exposure with Monte Carlo analysis seems to provide the most reasonable estimates.
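A hedged sketch of the Monte Carlo multi-route approach the abstract favors: exposure inputs are sampled from probability distributions rather than fixed upper-bound values, ingestion and inhalation doses are combined, and the resulting risk distribution is summarized. Every numerical value and the simple inhalation-to-ingestion relation below are placeholders, not the study's parameters or potencies:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

conc = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=n)    # contaminant in tap water (ug/L)
intake = rng.normal(2.0, 0.5, size=n).clip(min=0.5)          # drinking-water intake (L/day)
inhal_factor = rng.uniform(0.3, 1.0, size=n)                 # inhaled dose relative to ingested (assumed)
body_weight = rng.normal(60.0, 10.0, size=n).clip(min=30.0)  # kg
potency_oral = 1.5e-5                                         # (ug/kg/day)^-1, hypothetical
potency_inhal = 1.0e-5                                        # (ug/kg/day)^-1, hypothetical

dose_oral = conc * intake / body_weight                       # ug/kg/day
dose_inhal = dose_oral * inhal_factor
risk = dose_oral * potency_oral + dose_inhal * potency_inhal  # excess lifetime cancer risk

print(f"mean risk = {risk.mean():.2e}, 95th percentile = {np.quantile(risk, 0.95):.2e}")
```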


Forensic Characterization of Four New Bovine Tri-nucleotide Microsatellite Markers in Korean Cattle (Hanwoo)

  • Sim, Yong Teak;Na, Jong Gil;Lee, Chul-Sang
    • Journal of Animal Science and Technology / v.55 no.2 / pp.87-93 / 2013
  • We identified four new bovine tri-nucleotide microsatellite loci and analyzed their sequence structures and genetic parameters in 105 randomly selected Korean cattle (Hanwoo). Allele numbers at the loci B17S0808, B15S6253, B8S7996, and B17S4998 were 10, 11, 12, and 29, respectively. These alleles contained simple or compound repeat sequences with some variations. Allele distributions at all these loci were in Hardy-Weinberg equilibrium (P > 0.05). Observed heterozygosity and expected heterozygosity ranged from 0.54 (B15S6253) to 0.92 (B17S4998) and from 0.599 (B15S6253) to 0.968 (B17S4998), respectively, and the two measures of heterozygosity at each locus were highly correlated. Polymorphism information content (PIC) for these 4 loci ranged from 0.551 (B15S6253) to 0.932 (B17S4998), which means that all these loci are highly informative (PIC > 0.5). The other genetic parameters, power of discrimination (PD) and probability of exclusion (PE), ranged from 0.783 (B15S6253) to 0.984 (B17S4998) and from 0.210 (B15S6253) to 0.782 (B17S4998), respectively. The combined PD and PE values were 0.9999968 and 0.98005176, respectively. Capillary electrophoresis revealed that the average peak height ratio for a stutter was 13.89% at B17S0808, 26.67% at B15S6253, 9.09% at B8S7996, and 43.75% at B17S4998. Although the degree of genetic variability of the locus B15S6253 was relatively low among the four markers, the favorable parameters and low stutter peak height ratios indicate that these four new tri-nucleotide microsatellite loci could be useful multiplex PCR markers for forensic and population genetic studies in cattle, including the Korean native breed.
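Expected heterozygosity and PIC in the abstract are simple functions of allele frequencies; a short sketch, using made-up frequencies rather than the Hanwoo estimates:

```python
import numpy as np

def expected_heterozygosity(p):
    """He = 1 - sum(p_i^2) for allele frequencies p."""
    p = np.asarray(p, dtype=float)
    return 1.0 - np.sum(p ** 2)

def pic(p):
    """PIC = 1 - sum(p_i^2) - sum_{i<j} 2 * p_i^2 * p_j^2."""
    p = np.asarray(p, dtype=float)
    sum_sq = np.sum(p ** 2)
    sum_quad = np.sum(p ** 4)
    return 1.0 - sum_sq - (sum_sq ** 2 - sum_quad)

freqs = np.array([0.30, 0.25, 0.20, 0.15, 0.10])   # hypothetical allele frequencies at one locus
print(f"He  = {expected_heterozygosity(freqs):.3f}")
print(f"PIC = {pic(freqs):.3f}")
```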

Using the Sample IQR for Calculating Sample Size (표본크기 결정을 위한 IQR의 활용방법)

  • 홍종선;김현태;윤상호;정민정
    • The Korean Journal of Applied Statistics / v.16 no.1 / pp.181-193 / 2003
  • When no sample standard deviation is available as an estimator of the population standard deviation $\sigma$ in sample size computations, some function of the sample range (R) or interquartile range (IQR) is often used as an estimator of $\sigma$. In order to avoid under-powered studies, these estimates must have a high probability of being greater than or equal to $\sigma$. In this paper, these probabilities are estimated for the IQR under various parent distributions and are compared with the corresponding probabilities for R/4 (Browne 2001). Alternative divisors (K) are explored and discussed for which the probabilities of R/K and IQR/K being greater than or equal to $\sigma$ are at least 95%.
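The comparison described above can be sketched directly: for a chosen parent distribution and sample size, simulate the probability that IQR/K (or R/K) is at least the true $\sigma$ and compare divisors. The sample size, divisors, and replication count below are arbitrary illustrative choices:

```python
import numpy as np

def prob_at_least_sigma(sampler, sigma, n, K, estimator, reps=20_000, seed=0):
    """Estimate P(estimator(sample)/K >= sigma) by simulation."""
    rng = np.random.default_rng(seed)
    hits = 0
    for _ in range(reps):
        x = sampler(rng, n)
        hits += (estimator(x) / K) >= sigma
    return hits / reps

n = 30
normal_sampler = lambda rng, size: rng.normal(0.0, 1.0, size)      # parent distribution, sigma = 1
iqr = lambda x: np.subtract(*np.percentile(x, [75, 25]))            # sample IQR
sample_range = lambda x: x.max() - x.min()                          # sample range R

p_iqr = prob_at_least_sigma(normal_sampler, 1.0, n, K=1.0, estimator=iqr)
p_r4 = prob_at_least_sigma(normal_sampler, 1.0, n, K=4.0, estimator=sample_range)
print(f"P(IQR >= sigma) = {p_iqr:.3f},  P(R/4 >= sigma) = {p_r4:.3f}")
```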

Warping and Buckling Prediction Model of Wooden Hollow Core Flush Door due to Moisture Content Change (II): Simple Measurement of LMC and MOE, and Monte Carlo Simulation for Predicting Rejection Ratio (목제(木製) 프러쉬 문의 함수율 변동에 따른 틀어짐과 좌굴 예측모델 (II) : 치수변동과 탄성계수의 간이측정법과 불량율 예측 Monte Carlo 시뮬레이션)

  • Kang, Wook;Jung, Hee-Suk
    • Journal of the Korean Wood Science and Technology / v.28 no.1 / pp.18-27 / 2000
  • Even when the same materials are assembled into a flush door skin panel, warping is not easily prevented under changing environmental conditions, since wood and wood-based materials have large variations in their physical and mechanical properties. The parameters required to predict warping, such as the linear movement coefficient (LMC) and the modulus of elasticity (MOE), could be estimated by the oven-drying method and a dynamic method instead of the American Society for Testing and Materials (ASTM) procedures. The relationship between warping and LMC was curvilinear, while that between warping and MOE was linear. LMC had a larger effect on warping than MOE. Material properties of skin panels such as hardboard and plywood showed normal distributions; the variation of material properties, however, was much larger in plywood than in hardboard. Monte Carlo simulation also indicated that the rejection ratio of flush doors due to warping could be predicted by combining the warping relationships with the probability distribution parameters of MOE, LMC, and moisture content.
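As a loose sketch of the Monte Carlo rejection-ratio idea in the abstract: draw LMC, MOE, and moisture-content change from assumed distributions, push them through a warp response, and count tolerance exceedances. The response relation, distributions, and tolerance below are entirely hypothetical placeholders, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 50_000

lmc_diff = rng.normal(0.10, 0.03, size=n)      # LMC mismatch between skin panels (%/% MC), assumed
moe = rng.normal(3500.0, 500.0, size=n)        # MOE of skin panel (MPa), assumed normal
dmc = rng.normal(3.0, 1.0, size=n)             # moisture content change (%), assumed

# hypothetical warp response: grows with LMC mismatch and MC change, shrinks with stiffness
warp = 40.0 * np.abs(lmc_diff) * np.abs(dmc) / (moe / 3500.0)   # mm

tolerance = 6.0                                 # allowable warp (mm), placeholder
reject_ratio = np.mean(warp > tolerance)
print(f"predicted rejection ratio: {reject_ratio:.1%}")
```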


Bandwidth Management of WiMAX Systems and Performance Modeling

  • Li, Yue;He, Jian-Hua;Xing, Weixi
    • KSII Transactions on Internet and Information Systems (TIIS) / v.2 no.2 / pp.63-81 / 2008
  • WiMAX has been introduced as a competitive alternative among metropolitan broadband wireless access technologies. It is connection-oriented and can provide very high data rates, large service coverage, and flexible quality of service (QoS). Because of the large number of connections and the flexible QoS supported by WiMAX, uplink access in WiMAX networks is very challenging: the medium access control (MAC) protocol must efficiently manage the bandwidth and the related channel allocations. In this paper, we propose and investigate a cost-effective WiMAX bandwidth management scheme, named the WiMAX partial sharing scheme (WPSS), in order to provide good QoS while achieving better bandwidth utilization and network throughput. The proposed scheme is compared with a simple but inefficient scheme, named the WiMAX complete sharing scheme (WCPS). A maximum entropy (ME) based analytical model (MEAM) is proposed for the performance evaluation of the two bandwidth management schemes. MEAM is used because it can efficiently model a large-scale system in which the number of stations or connections is generally very high, whereas traditional simulation and analytical approaches (e.g., Markov models) cannot perform well owing to their high computational complexity. We model the bandwidth management scheme as a queueing network model (QNM) consisting of interacting multiclass queues for the different service classes. Closed-form expressions for the state and blocking probability distributions are derived for these schemes. Simulation results verify the MEAM numerical results and show that WPSS can significantly improve the network's performance compared to WCPS.
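Not the paper's maximum-entropy model, but a toy contrast between complete sharing and a partitioned allocation of channels, using the classical Erlang-B recursion, to show how the sharing policy shifts blocking probabilities; the loads and channel counts are invented:

```python
def erlang_b(channels, load):
    """Erlang-B blocking probability via the standard recursion."""
    b = 1.0
    for n in range(1, channels + 1):
        b = load * b / (n + load * b)
    return b

C = 30
load_a, load_b = 12.0, 10.0          # offered loads (Erlangs) of two service classes, hypothetical

# complete sharing: both classes see one pool of C channels
shared = erlang_b(C, load_a + load_b)

# partitioned allocation: class A gets 18 channels, class B gets 12 (hypothetical split)
part_a, part_b = erlang_b(18, load_a), erlang_b(12, load_b)

print(f"complete sharing blocking: {shared:.4f}")
print(f"partitioned blocking:      A = {part_a:.4f}, B = {part_b:.4f}")
```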