• Title/Summary/Keyword: 조건부 확률 분포 (conditional probability distribution)


Reynolds Shear Stress Distribution in Turbulent Channel Flows (난류 채널 유동 내부의 레이놀즈 전단 응력 분포)

  • Kim, Kyoung-Youn
    • Transactions of the Korean Society of Mechanical Engineers B / v.36 no.8 / pp.829-837 / 2012
  • Direct numerical simulations were carried out for turbulent channel flows at $Re_{\tau}$ = 180, 395, and 590 to investigate the turbulent flow structures related to the Reynolds shear stress. By examining the probability density function, the second-quadrant (Q2) events, which make the largest contribution to the mean Reynolds shear stress, were identified. The inclination angle of Q2 events scales with wall units for $y^+ < 50$ and with the channel half-height for $y/h > 0.5$. Conditionally averaged flow fields for the Q2 events show that the flow structures associated with the Reynolds shear stress are a quasi-streamwise vortex in the buffer layer and a hairpin-shaped vortex in the outer layer. Three-dimensional visualization of the regions of high Reynolds shear stress reveals that the organization of hairpin vortices in the outer layer, with sizes of 1.5-3h, is associated with large-scale motions carrying high Reynolds shear stress in the outer layer.
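For readers unfamiliar with the quadrant decomposition used in this abstract, the sketch below illustrates, on synthetic fluctuation data of our own making (not the authors' DNS fields), how Q2 (ejection) events and their contribution to the mean Reynolds shear stress can be identified.

```python
import numpy as np

# Synthetic velocity fluctuations u', v' standing in for DNS samples at one wall-normal location
rng = np.random.default_rng(0)
u = rng.normal(size=100_000)
v = -0.4 * u + rng.normal(scale=0.9, size=100_000)  # negatively correlated, as in a shear flow

uv_mean = np.mean(u * v)                 # mean u'v' (the Reynolds shear stress up to sign/density factors)

# Quadrant decomposition: Q2 "ejection" events have u' < 0 and v' > 0
q2 = (u < 0) & (v > 0)
q2_contribution = np.sum(u[q2] * v[q2]) / u.size

print(f"mean u'v'            : {uv_mean:.4f}")
print(f"Q2 contribution      : {q2_contribution:.4f}")
print(f"Q2 fraction of total : {q2_contribution / uv_mean:.2%}")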

On asymptotics for a bias-corrected version of the NPMLE of the probability of discovering a new species (신종발견확률의 편의보정 비모수 최우추정량에 관한 연구)

  • 이주호
    • The Korean Journal of Applied Statistics / v.6 no.2 / pp.341-353 / 1993
  • As an estimator of the conditional probability of discovering a new species at the next observation, after a sample of a certain size has been taken, the one proposed by Good (1953) has been most widely used. Recently, Clayton and Frees (1987) showed via simulation that their nonparametric maximum likelihood estimator (NPMLE) has a smaller MSE than Good's estimator when the population is relatively nonuniform. Lee (1989) proved that their conjecture is asymptotically true for truncated geometric population distributions. One shortcoming of the NPMLE, however, is that it has a considerable negative bias. In this study we propose a bias-corrected version of the NPMLE for virtually all realistic population distributions. We also show that it has a smaller asymptotic MSE than Good's estimator except when the population is very uniform. A Monte Carlo simulation was performed for small sample sizes, and the results support the asymptotic findings.
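As background, Good's (1953) estimator referred to above has a simple form: the conditional probability of discovering a new species at the next observation is estimated by the fraction of the sample made up of species seen exactly once. A minimal sketch with a made-up sample follows; it shows Good's estimator only, not the bias-corrected NPMLE proposed in the paper.

```python
from collections import Counter

def good_estimator(sample):
    """Good's (1953) estimate of the probability that the next observation
    is a species not yet seen: n1 / n, where n1 is the number of species
    observed exactly once and n is the sample size."""
    counts = Counter(sample)
    n1 = sum(1 for c in counts.values() if c == 1)
    return n1 / len(sample)

# Illustrative sample of species labels
sample = ["a", "a", "b", "c", "c", "c", "d", "e", "e", "f"]
print(good_estimator(sample))  # 3 singletons (b, d, f) out of 10 observations -> 0.3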


Bayesian analysis of cumulative logit models using the Monte Carlo Gibbs sampling (몬테칼로깁스표본기법을 이용한 누적로짓 모형의 베이지안 분석)

  • 오만숙
    • The Korean Journal of Applied Statistics / v.10 no.1 / pp.151-161 / 1997
  • An easy Monte Carlo Gibbs sampling approach is suggested for the Bayesian analysis of cumulative logit models for ordinal polytomous data. Because the posterior conditional distributions of the parameters in the cumulative logit model are not given in forms convenient for random sample generation, appropriate latent variables are introduced into the model so that, in the new model, all the conditional distributions take forms that are very convenient for implementing the Gibbs sampler.
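A sketch of the latent-variable idea mentioned in the abstract, written in the standard data-augmentation notation for cumulative link models (our notation, not necessarily the paper's):

```latex
% Observed ordinal response y_i with J categories, covariates x_i, cutpoints
% \gamma_0 = -\infty < \gamma_1 < \dots < \gamma_{J-1} < \gamma_J = \infty.
% Introduce a latent continuous variable z_i:
\begin{aligned}
z_i &= x_i^{\top}\beta + \varepsilon_i, \qquad \varepsilon_i \sim \text{Logistic}(0,1),\\
y_i &= j \quad \Longleftrightarrow \quad \gamma_{j-1} < z_i \le \gamma_j ,
\end{aligned}
% so that P(y_i \le j \mid x_i) = \operatorname{logit}^{-1}(\gamma_j - x_i^{\top}\beta).
% The Gibbs sampler then alternates between drawing each z_i from a truncated
% logistic distribution and drawing (\beta, \gamma) given the latent z_i.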


Investigation on Exact Tests (정확검정들에 대한 고찰)

  • 강승호
    • The Korean Journal of Applied Statistics / v.15 no.1 / pp.187-199 / 2002
  • When the sample size is small, exact tests are often employed because the asymptotic distribution of the test statistic is in doubt. The advantage of exact tests is that they are guaranteed to bound the type I error probability at the nominal level. In this paper we review methods of constructing exact tests, the algorithms, and commercial software. We also examine the difference between exact p-values obtained from exact tests and true p-values obtained from the true underlying distribution.
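As one illustration of the kind of exact test reviewed in the paper, the snippet below runs Fisher's exact test on a small 2x2 table using SciPy; the table values are made up for illustration.

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 table: rows = treatment/control, columns = success/failure
table = [[8, 2],
         [1, 5]]

# Fisher's exact test conditions on the margins, so the p-value comes from the
# exact (hypergeometric) null distribution rather than an asymptotic approximation.
odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(odds_ratio, p_value)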

Non-stationary Frequency Analysis with Climate Variability using Conditional Generalized Extreme Value Distribution (기후변동을 고려한 조건부 GEV 분포를 이용한 비정상성 빈도분석)

  • Kim, Byung-Sik; Lee, Jung-Ki; Kim, Hung-Soo; Lee, Jin-Won
    • Journal of Wetlands Research / v.13 no.3 / pp.499-514 / 2011
  • An underlying assumption of traditional hydrologic frequency analysis is that climate, and hence the frequency of hydrologic events, is stationary, or unchanging over time. Under stationary conditions, the distribution of the variable of interest is invariant to temporal translation. Water resources infrastructure planning and design, such as dams, levees, canals, bridges, and culverts, relies on an understanding of past conditions and a projection of future conditions. However, water managers have always known that our world is inherently non-stationary, and they routinely deal with this in management and planning. The aim of this paper is to give a brief introduction to non-stationary extreme value analysis methods. A non-stationary hydrologic frequency analysis approach is introduced in order to determine probability rainfall under a changing climate. The non-stationary statistical approach is based on the conditional Generalized Extreme Value (GEV) distribution and maximum likelihood parameter estimation. The method is applied to the annual maximum 24-hour rainfall. The results show that the non-stationary GEV approach is suitable for determining probability rainfall under a changing climate, such as in the presence of a trend. Moreover, a non-stationary frequency analysis was performed using the SOI (Southern Oscillation Index) of ENSO (El Niño Southern Oscillation).
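A minimal sketch of the kind of conditional (non-stationary) GEV fit described above, assuming the location parameter varies linearly with a covariate (e.g., year index or SOI); the variable names and synthetic data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from scipy.stats import genextreme
from scipy.optimize import minimize

# Illustrative data: annual maximum 24-hour rainfall x and a covariate t (year index or SOI)
rng = np.random.default_rng(1)
t = np.arange(50, dtype=float)
x = genextreme.rvs(c=-0.1, loc=100 + 0.5 * t, scale=20, size=t.size, random_state=rng)

def neg_log_lik(params):
    b0, b1, log_scale, shape = params
    loc = b0 + b1 * t               # conditional (time-varying) location: mu(t) = b0 + b1*t
    scale = np.exp(log_scale)       # keep the scale positive
    # SciPy's shape parameter c corresponds to -xi in the usual GEV convention
    return -np.sum(genextreme.logpdf(x, c=shape, loc=loc, scale=scale))

res = minimize(neg_log_lik, x0=[np.mean(x), 0.0, np.log(np.std(x)), -0.1], method="Nelder-Mead")
b0, b1, log_scale, shape = res.x
print("mu(t) = %.2f + %.3f t, sigma = %.2f, c = %.3f" % (b0, b1, np.exp(log_scale), shape))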

Nonstationary Intensity-Duration-Frequency Curves under Climate Change (기후변화를 고려한 비정상성 I-D-F 곡선 작성)

  • Jeung, Se Jin; Lee, Suk Ho; Kim, Byung Sik
    • Proceedings of the Korea Water Resources Association Conference / 2015.05a / pp.94-94 / 2015
  • As extreme weather driven by climate change and variability becomes more severe and more frequent, questions are being raised about whether current drainage infrastructure is well prepared to cope with these problems. Because current drainage infrastructure design relies mainly on I-D-F curves, which describe the relationship among rainfall intensity, duration, and frequency under the so-called assumption of stationarity, it cannot represent significant changes in extremes caused by climate change. However, since climate change is shifting the characteristics of climatic extremes toward what is called nonstationarity, the stationarity of rainfall statistical parameters, one of the basic assumptions of drainage infrastructure design, may no longer be valid in an era of climate change. In this paper, taking this nonstationarity into account, probability rainfall for each duration and a nonstationary I-D-F curve were derived using a conditional GEV distribution. In addition, the effect of increased rainfall intensity on design peak flows was analyzed using the distributed flood-runoff model S-RAT (Spatial Runoff Assessment Tool). The results show that, although there were differences by duration, the current I-D-F curves generally underestimate extreme precipitation, increasingly so toward higher return periods, and suggest that the stationary I-D-F curve construction method may not be suitable for capacity design of drainage infrastructure under climate change.
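The nonstationary probability rainfall (return level) mentioned in the abstract follows from the conditional GEV quantile function; stated for reference in our own notation, with a time-dependent location parameter:

```latex
% GEV return level for return period T when the location parameter depends on time t:
x_T(t) = \mu(t) + \frac{\sigma}{\xi}\left[\left(-\ln\!\left(1-\tfrac{1}{T}\right)\right)^{-\xi} - 1\right],
\qquad \mu(t) = \beta_0 + \beta_1 t,\quad \xi \neq 0 .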


Optimization of Data Recovery using Non-Linear Equalizer in Cellular Mobile Channel (셀룰라 이동통신 채널에서 비선형 등화기를 이용한 최적의 데이터 복원)

  • Choi, Sang-Ho; Ho, Kwang-Chun; Kim, Yung-Kwon
    • Journal of IKEEE / v.5 no.1 s.8 / pp.1-7 / 2001
  • In this paper, we have investigated a CDMA (Code Division Multiple Access) cellular system with a non-linear equalizer in the reverse link channel. In general, due to the unknown characteristics of the channel in wireless communication, the distribution of the observables cannot be specified by a finite set of parameters; instead, we partition the m-dimensional sample space into a finite number of disjoint regions by using quantiles and a vector quantizer based on training samples. The proposed algorithm is based on a piecewise approximation to the regression function built from quantiles and conditional partition moments, which are estimated by the Robbins-Monro stochastic approximation (RMSA) algorithm. The resulting equalizers and detectors are robust in the sense that they are insensitive to variations in the noise distribution. The main idea is that the robust equalizers and robust partition detectors yield better performance over an equiprobably partitioned observation space than the conventional equalizer over the unpartitioned observation space under any condition. We also apply this idea to the CDMA system and analyze the BER performance.
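A minimal sketch of the Robbins-Monro stochastic approximation idea for conditional partition moments over an equiprobable (quantile-based) partition; the data, partition size, and target are illustrative assumptions of ours, not the paper's equalizer.

```python
import numpy as np

rng = np.random.default_rng(2)

# Illustrative scalar observations x and a target y to be regressed on them
x = rng.normal(size=20_000)
y = np.tanh(x) + rng.standard_t(df=3, size=x.size) * 0.1   # heavy-tailed noise

# Equiprobable partition of the observation space via sample quantiles (8 cells)
edges = np.quantile(x, np.linspace(0, 1, 9))
cell = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, 7)

# Robbins-Monro update of the conditional mean (first partition moment) in each cell:
#   m_k <- m_k + a_n * (y - m_k), with step size a_n = 1 / n_k
m = np.zeros(8)
n = np.zeros(8)
for k, yi in zip(cell, y):
    n[k] += 1
    m[k] += (yi - m[k]) / n[k]

print(np.round(m, 3))   # piecewise-constant approximation to E[y | x in cell k]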


A Selection of Threshold for the Generalized Hough Transform: A Probabilistic Approach (일반화된 허프변환의 임계값 선택을 위한 확률적 접근방식)

  • Chang, Ji Y.
    • Journal of the Institute of Electronics and Information Engineers / v.51 no.1 / pp.161-171 / 2014
  • When the Hough transform is applied to identify an instance of a given model, the output is typically a histogram of votes cast by a set of image features into a parameter space. The next step is to threshold the histogram of counts to hypothesize a match. The question is: what is a reasonable choice of threshold? In a standard implementation of the Hough transform, the threshold is selected heuristically, e.g., as some fraction of the highest cell count. Setting the threshold too low can give rise to a false alarm of a given shape (Type I error). On the other hand, setting the threshold too high can result in mis-detection of a given shape (Type II error). In this paper, we derive two conditional probability functions of cell counts in the accumulator array of the generalized Hough transform (GHough) that can be used to select a principled threshold at the peak detection stage of the GHough.
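One way to see how conditional cell-count probabilities lead to a threshold is sketched below under a simple binomial null model of our own choosing (not necessarily the paper's derivation): if, in the absence of the target shape, each of n features votes into a given accumulator cell independently with probability p0, the threshold can be taken as the smallest count whose null exceedance probability falls below a desired false-alarm rate.

```python
from scipy.stats import binom

def ghough_threshold(n_features, p0, alpha=1e-3):
    """Smallest cell count k such that P(count >= k | no object) <= alpha,
    assuming votes into the cell are i.i.d. Bernoulli(p0) under the null."""
    for k in range(n_features + 1):
        if binom.sf(k - 1, n_features, p0) <= alpha:   # sf(k-1) = P(X >= k)
            return k
    return n_features + 1

print(ghough_threshold(n_features=500, p0=0.01, alpha=1e-3))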

Analysis of W-CDMA systems with different bandwidths over JTC channel model (JTC 채널 모델에서 W-CDMA의 대역폭에 따른 성능 분석)

  • 이주석; 오동진; 김철성
    • The Journal of Korean Institute of Communications and Information Sciences / v.26 no.11B / pp.1546-1555 / 2001
  • Conventionally, in CDMA system analysis, only a single path is assumed within one chip duration. In this paper, however, we assume a varying number of multipaths within one chip duration according to the spreading bandwidth in a fixed channel model. We thus take into account the effects of autocorrelation and of the relative phases among multipath components within one chip duration for different bandwidths, and analyze the fading effects. We derive the pdf of the output signal, and from this pdf we derive the average error probability versus the number of users. We use a Maximal Ratio Combining (MRC) RAKE receiver under the JTC channel model, one of the popular realistic wideband channel models. We also employ hybrid FDMA/CDMA systems to compare the performance of the W-CDMA system for the same total occupied bandwidth, and then compare and analyze them for different bandwidths with the number of users as a parameter. The simulation results for different bandwidths show that better performance is obtained with the wider-bandwidth system, where more resolvable multipath components are available.
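For context, a standard textbook closed form (Proakis) for the average BER of BPSK with L-branch MRC over i.i.d. Rayleigh-faded paths is sketched below; this is a generic reference expression, not the paper's JTC-channel analysis, which accounts for correlated intra-chip multipath.

```python
from math import comb, sqrt

def ber_bpsk_mrc_rayleigh(avg_snr_per_branch, L):
    """Average BER of BPSK with L-branch maximal ratio combining over
    i.i.d. Rayleigh fading (Proakis, Digital Communications)."""
    mu = sqrt(avg_snr_per_branch / (1.0 + avg_snr_per_branch))
    return ((1 - mu) / 2) ** L * sum(
        comb(L - 1 + k, k) * ((1 + mu) / 2) ** k for k in range(L)
    )

for L in (1, 2, 4):   # more resolvable multipath components -> better performance
    print(L, ber_bpsk_mrc_rayleigh(avg_snr_per_branch=10.0, L=L))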


Frequency analysis for annual maximum of daily snow accumulations using conditional joint probability distribution (적설 자료의 빈도해석을 위한 확률밀도함수 개선 연구)

  • Park, Heeseong; Chung, Gunhui
    • Journal of Korea Water Resources Association / v.52 no.9 / pp.627-635 / 2019
  • In Korea, snow damage has occurred in regions with no historical record of snowfall, and heavy snow has also caused casualties. Accordingly, policy under the Natural Disaster Reduction Comprehensive Plan has been changed to include measures for mitigating snow damage. However, because heavy snow damage has not been frequent, snowfall has not been studied from as many perspectives as rainfall. The characteristics of snow data generally differ from those of rainfall data. For example, some parts of the southern coastal areas see no snow during the year, so missing or zero values often appear among the annual maximum daily snow accumulations. Data of this type resemble censored data; indeed, the Busan observation site has more than 36% missing or zero values. Despite these different characteristics, frequency analysis of snow data has been carried out following the procedures used for rainfall data. Such an analysis can be implemented either including or excluding the zero data, but neither approach fits well enough to represent the actual shape of the data. Therefore, in this study, a methodology for selecting a probability density function was suggested that considers the characteristics of snow data in Korea: a method to select the probability density function using a conditional joint probability distribution. As a result, the fit obtained from the proposed method was better than that of the conventional methods, which shows that the conventional methods (including or excluding zeros) overestimate snow depth. The results of this study can affect the design standards of buildings and can also contribute to establishing measures to reduce snow damage.
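A minimal sketch of the conditional (mixed) distribution idea for zero-inflated annual maxima, under our own assumptions about the conditional model (a Gumbel fit to the nonzero values; the paper's actual formulation may differ): F(x) = p0 + (1 - p0) * F_pos(x | X > 0) for x >= 0, where p0 is the probability of a snowless year.

```python
import numpy as np
from scipy.stats import gumbel_r

# Illustrative annual-maximum daily snow depths (cm); zeros mark snowless years
annual_max = np.array([0, 0, 12.3, 4.1, 0, 22.8, 7.5, 0, 15.0, 9.2,
                       0, 3.3, 18.4, 0, 6.7, 11.1, 0, 25.6, 2.0, 8.8])

p0 = np.mean(annual_max == 0)                           # probability of a zero (snowless) year
loc, scale = gumbel_r.fit(annual_max[annual_max > 0])   # conditional fit to nonzero maxima

def mixed_cdf(x):
    """Unconditional CDF: P(X <= x) = p0 + (1 - p0) * P(X <= x | X > 0), for x >= 0."""
    return p0 + (1 - p0) * gumbel_r.cdf(x, loc=loc, scale=scale)

# Design value for a 50-year return period from the mixed distribution
T = 50
q = 1 - 1 / T
x_T = gumbel_r.ppf((q - p0) / (1 - p0), loc=loc, scale=scale) if q > p0 else 0.0
print(f"p0 = {p0:.2f}, 50-year snow depth ~ {x_T:.1f} cm")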