• Title/Summary/Keyword: 확률 기반 (probability-based)

Search Results: 2,637

Estimation of underwater acoustic uncertainty based on the ocean experimental data measured in the East Sea and its application to predict sonar detection probability (동해 해역에서 측정된 해상실험 데이터 기반의 수중음향 불확정성 추정 및 소나 탐지확률 예측)

  • Dae Hyeok Lee;Wonjun Yang;Ji Seop Kim;Hoseok Sul;Jee Woong Choi;Su-Uk Son
    • The Journal of the Acoustical Society of Korea / v.43 no.3 / pp.285-292 / 2024
  • When calculating sonar detection probability, underwater acoustic uncertainty is conventionally assumed to be normally distributed with a standard deviation of 8 dB to 9 dB. However, because experimental areas and ocean environmental conditions vary, predicting detection performance requires accounting for underwater acoustic uncertainty estimated from ocean experimental data. In this study, underwater acoustic uncertainty was determined using measured mid-frequency (2.3 kHz, 3 kHz) noise level and transmission loss data collected in the shallow water of the East Sea. After calculating the predictable probability of detection reflecting this measurement-based uncertainty, we compared it with the conventional detection probability results, as well as with the predictable probabilities of detection obtained when the uncertainty is assumed to follow a Rayleigh distribution or a negatively skewed distribution. The results confirm that the detection area differs depending on the assumed underwater acoustic uncertainty.
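
A minimal sketch of the kind of calculation the abstract describes, assuming the common textbook form in which detection probability is the Gaussian exceedance probability of the signal excess; all sonar-equation terms and function names below are illustrative, not values from the paper:

```python
# Hedged sketch: detection probability as the Gaussian exceedance probability
# of the signal excess, with the 8 dB - 9 dB standard deviation mentioned above.
from math import erf, sqrt

def detection_probability(signal_excess_db: float, sigma_db: float = 9.0) -> float:
    """P(detection) = P(signal excess + uncertainty > 0) for zero-mean Gaussian uncertainty."""
    return 0.5 * (1.0 + erf(signal_excess_db / (sigma_db * sqrt(2.0))))

# Illustrative passive sonar-equation terms (dB); not measured values.
SL, TL, NL, DI, DT = 210.0, 150.0, 60.0, 15.0, 10.0
signal_excess = SL - TL - NL + DI - DT
print(detection_probability(signal_excess, sigma_db=8.0))
```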

Application of a large-scale climate ensemble simulation data to evaluate the scale of extreme rainfall: The case of 2018 Hiroshima extreme-scale rainfall event (극한 호우의 규모 평가를 위한 대규모 기후 앙상블 자료의 적용: 2018년 히로시마 극한 호우의 사례)

  • Kim, Youngkyu;Son, Minwoo
    • Proceedings of the Korea Water Resources Association Conference / 2022.05a / pp.290-290 / 2022
  • This study applies extreme rainfall depths estimated from large-scale climate ensemble simulations to the evaluation of the scale of a recent extreme rainfall event. Because the 2018 Hiroshima rainfall event reached an extreme scale corresponding to a return period of 1,000 years at a duration of 24 hours, its scale is difficult to evaluate using only observations collected over a short period. To evaluate it, the d4PDF dataset, which is based on large-scale climate ensemble simulations, was used. This dataset provides 3,000 annual maximum rainfall values, from which 24-hour probability rainfall depths for return periods of 10 to 1,000 years were estimated non-parametrically, without any statistical model or assumption. The probability rainfall estimated from d4PDF was compared with that estimated from observed rainfall; at a return period of 50 years, close to the observation period, the difference between the two was 3.53%, but as the gap between the observation period (33 years) and the return period (100 years or more) increased, the error grew to more than 10%. This means that probability rainfall estimated from observations carries uncertainty at long return periods. Against the d4PDF probability rainfall, the 2018 Hiroshima event corresponded to a return period of nearly 300 years. Probability rainfall was also estimated using d4PDF data for future climate conditions; compared with the present climate, probability rainfall for return periods of 10 to 1,000 years increased by more than 20% in all cases. Against the future-climate probability rainfall, the 2018 Hiroshima event corresponded to a return period of nearly 100 years, meaning that its annual probability of occurrence increases from 0.33% (present climate) to 1% (future climate). Consequently, d4PDF, based on large-scale climate ensemble simulations, can be usefully applied to the quantitative evaluation of extreme-scale rainfall events under present and future climate conditions.
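
A minimal sketch of the non-parametric step described above: with a large ensemble of annual maxima, the T-year probability rainfall can be read directly as the empirical quantile at non-exceedance probability 1 - 1/T. The synthetic Gumbel samples below stand in for the d4PDF data, which are not reproduced here:

```python
# Hedged sketch: T-year rainfall read non-parametrically from annual maxima.
import numpy as np

rng = np.random.default_rng(0)
# Placeholder stand-in for the 3,000 annual-maximum 24-hour rainfalls (mm).
annual_maxima = rng.gumbel(loc=150.0, scale=40.0, size=3000)

def nonparametric_quantile(samples: np.ndarray, return_period_years: float) -> float:
    """Empirical quantile at non-exceedance probability 1 - 1/T, no fitted distribution."""
    return float(np.quantile(samples, 1.0 - 1.0 / return_period_years))

for T in (10, 100, 1000):
    print(T, round(nonparametric_quantile(annual_maxima, T), 1))
```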


Computing Conditional Probabilities in a Latent Probabilistic Library Model (은닉 확률 라이브러리 모델에서의 조건부 확률의 계산)

  • Heo Min-Oh;Zhang Byoung-Tak
    • Proceedings of the Korean Information Science Society Conference / 2006.06a / pp.70-72 / 2006
  • The Probabilistic Library Model (PLM) is a model, based on DNA-computing methodology, that represents a joint probability distribution through the frequencies of the elements composing a library. Since the PLM needs a way to compute conditional probabilities in addition to the joint distribution, this paper presents a method that uses the Latent Probabilistic Library Model (LPLM) to compute arbitrary conditional probabilities with DNA in vitro, and supplements the proof of validity of the algorithm that was left incomplete in the previous paper. In addition, simulations produced results that agree with the true probabilities to within 1%.
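
A minimal in-silico sketch of the frequency-based idea (not the in vitro DNA protocol): a library is treated as a multiset of element tuples, and joint and conditional probabilities are ratios of counts. The element names and counts are hypothetical:

```python
# Hedged sketch: probabilities as ratios of element frequencies in a "library".
from collections import Counter

# Hypothetical library: counts of (X, Y) element combinations.
library = Counter({("x1", "y1"): 40, ("x1", "y0"): 10,
                   ("x0", "y1"): 20, ("x0", "y0"): 30})
total = sum(library.values())

def joint(x: str, y: str) -> float:
    """P(X = x, Y = y) from relative frequency."""
    return library[(x, y)] / total

def conditional(x: str, y: str) -> float:
    """P(X = x | Y = y) = count(x, y) / count(y)."""
    count_y = sum(c for (xi, yi), c in library.items() if yi == y)
    return library[(x, y)] / count_y

print(joint("x1", "y1"), conditional("x1", "y1"))
```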


Integrated Stochastic Admission Control Policy in Clustered Continuous Media Storage Server (클러스터 기반 연속 미디어 저장 서버에서의 통합형 통계적 승인 제어 기법)

  • Kim, Yeong-Ju;No, Yeong-Uk
    • The KIPS Transactions:PartA / v.8A no.3 / pp.217-226 / 2001
  • In this paper, for the continuous media access operations performed by a Clustered Continuous Media Storage Server (CCMSS) system, we present an analytical model based on an open queueing network that simultaneously considers the two critical delay factors in the CCMSS system: disk I/O and the internal network. Using this analytical model, we derive a stochastic model for the total service delay time in the system. We then propose an integrated stochastic admission control model for the CCMSS system, which uses the derived stochastic model to estimate the maximum number of admittable service requests at the allowable service failure rate and applies that number in the admission control operation. For the performance evaluation of the proposed model, we computed deadline miss rates with the previous stochastic model, which considers only disk I/O, and with the proposed stochastic model, which considers both disk I/O and the internal network, and compared them with results obtained from simulation in a real cluster-based distributed media server environment. The evaluation showed that the proposed admission control policy reflects the delay factors in the CCMSS system more precisely.
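
A minimal sketch of the admission-control principle described above, under a placeholder delay model rather than the paper's open queueing network: admit the largest number of requests for which the estimated deadline-miss probability stays below the allowable service failure rate. All numeric parameters are illustrative:

```python
# Hedged sketch: largest admittable load under an allowable failure rate,
# using a placeholder normal approximation of the total service delay.
from math import erf, sqrt

def miss_probability(n_requests: int, deadline_ms: float,
                     mean_per_request: float = 2.0, std_per_request: float = 1.0) -> float:
    """P(total delay > deadline) under the hypothetical delay model."""
    mean = n_requests * mean_per_request
    std = sqrt(n_requests) * std_per_request
    z = (deadline_ms - mean) / std
    return 1.0 - 0.5 * (1.0 + erf(z / sqrt(2.0)))

def max_admittable(deadline_ms: float = 100.0, allowable_failure: float = 0.01) -> int:
    """Admit the largest n whose estimated miss probability stays within the bound."""
    n = 0
    while miss_probability(n + 1, deadline_ms) <= allowable_failure:
        n += 1
    return n

print(max_admittable())
```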


A study on the optimization of network resource allocation scheme based on access probabilities (접근확률 기반의 네트워크 자원할당방식의 최적화에 관한 연구)

  • Kim Do-Kyu
    • Journal of the Korea Institute of Information and Communication Engineering / v.10 no.8 / pp.1393-1400 / 2006
  • This paper optimizes the access probabilities (APs) in a network resource allocation scheme based on access probabilities so that the waiting time and the blocking probability are minimized under the given constraints, and evaluates the resulting performance. To optimize the APs, the infinite set of balance equations is reduced to a finite set by applying Neuts' matrix-geometric method, and the nonlinear programming problem is converted into a linear programming problem. As a numerical example, the waiting time and blocking probability for the optimal access probabilities and the maximum utilization under the given constraints are obtained. It is also shown that the scheme with optimal APs yields better performance than the scheme without optimization.
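
A heavily simplified sketch of the final optimization step: once the performance measures have been expressed linearly in the access probabilities, the problem can be posed as a linear program. The cost coefficients and constraint below are placeholders, not the paper's queueing-derived quantities:

```python
# Hedged sketch: a toy linear program over two access probabilities.
from scipy.optimize import linprog

c = [3.0, 1.5]                      # hypothetical linearized delay/blocking cost per unit AP
A_ub = [[-1.0, -1.0]]               # throughput-style constraint: p1 + p2 >= 0.8
b_ub = [-0.8]
bounds = [(0.0, 1.0), (0.0, 1.0)]   # access probabilities stay in [0, 1]

result = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
print(result.x)                     # optimal access probabilities under the toy model
```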

Development of New Probabilistic Seismic Hazard Analysis and Seismic Coefficients of Korea Part II: Derivation of Probabilistic Site Coefficients (신(新) 확률론적 지진분석 및 지진계수 개발 Part II: 확률론적 지진계수 도출)

  • Kwak, Dong-Yeop;Jeong, Chang-Gyun;Lee, Hyunwoo;Park, Duhee
    • Journal of the Korean GEO-environmental Society / v.10 no.7 / pp.111-115 / 2009
  • In Korea, probabilistically developed seismic hazard maps are used together with deterministically derived seismic site coefficients when developing the design response spectrum of a specific site. Even though the seismic hazard maps and the seismic site coefficients are incompatible, the current design code ignores this incompatibility. If the seismic hazard map and the seismic coefficients are both developed within an identical probabilistic framework, the problem can be resolved. Unfortunately, the available method cannot be used to derive "true" probabilistic site coefficients. This study uses the ground motion time histories developed from the new probabilistic seismic hazard analysis in the companion paper as input motions for one-dimensional equivalent linear site response analyses, from which uniform hazard response spectra are generated. Another important characteristic of the hazard response spectra is that the uncertainties and randomness of the ground properties are accounted for. The uniform hazard spectra are then used to derive probabilistic site coefficients. Comparison of the probabilistic and deterministic site coefficients demonstrates a distinct discrepancy between the two.
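
A minimal sketch of one way probabilistic site coefficients can be formed from uniform hazard spectra, namely as the band-averaged ratio of the soil-surface spectrum to the rock spectrum; the spectra and the period band below are hypothetical, not the paper's results:

```python
# Hedged sketch: a site coefficient as the band-averaged spectral ratio of
# soil-surface to rock uniform hazard spectra (hypothetical spectra).
import numpy as np

periods = np.linspace(0.05, 2.0, 40)              # spectral periods (s)
uhs_rock = 0.30 * np.exp(-periods)                # hypothetical rock UHS (g)
uhs_soil = 0.45 * np.exp(-0.8 * periods)          # hypothetical surface UHS (g)

short_period_band = (periods >= 0.1) & (periods <= 0.5)
Fa = float(np.mean(uhs_soil[short_period_band] / uhs_rock[short_period_band]))
print(round(Fa, 2))                               # probabilistic short-period site coefficient
```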


Comparison of confidence intervals for testing probabilities of a system (시스템의 확률 값 시험을 위한 신뢰구간 비교 분석)

  • Hwang, Ik-Soon
    • The Journal of the Korea institute of electronic communication sciences / v.5 no.5 / pp.435-443 / 2010
  • When testing systems that incorporate probabilistic behavior, test inputs must be applied a number of times in order to reach a test verdict. Interval estimation can be used to assert the correctness of probabilities, and the choice of confidence interval is one of the important issues for the quality of testing. The Wald interval has been widely accepted for interval estimation. In this paper, we compare the Wald interval and the Agresti-Coull interval for various sample sizes. The comparison is based on the test pass probability of correct implementations and the test fail probability of incorrect implementations when these confidence intervals are used for probability testing. We consider two-sided confidence intervals to check whether the probability is close to a given value, and one-sided confidence intervals to check whether the probability is not less than a given value. When testing probabilities using two-sided confidence intervals, we recommend the Agresti-Coull interval. For one-sided confidence intervals, the Agresti-Coull interval is recommended when the sample size is large, while either of the two confidence intervals can be used for small sample sizes.
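
A minimal sketch of the two intervals being compared, using the standard two-sided formulas; the sample counts in the example are arbitrary:

```python
# Hedged sketch: two-sided Wald and Agresti-Coull intervals for a proportion.
from math import sqrt
from statistics import NormalDist

def wald_interval(successes: int, n: int, alpha: float = 0.05):
    z = NormalDist().inv_cdf(1.0 - alpha / 2.0)
    p = successes / n
    half = z * sqrt(p * (1.0 - p) / n)
    return p - half, p + half

def agresti_coull_interval(successes: int, n: int, alpha: float = 0.05):
    z = NormalDist().inv_cdf(1.0 - alpha / 2.0)
    n_tilde = n + z ** 2
    p_tilde = (successes + z ** 2 / 2.0) / n_tilde
    half = z * sqrt(p_tilde * (1.0 - p_tilde) / n_tilde)
    return p_tilde - half, p_tilde + half

# Example: 47 passes in 100 trials when checking whether a probability is near 0.5.
print(wald_interval(47, 100))
print(agresti_coull_interval(47, 100))
```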

Noise Removal using a Convergence of the posteriori probability of the Bayesian techniques vocabulary recognition model to solve the problems of the prior probability based on HMM (HMM을 기반으로 한 사전 확률의 문제점을 해결하기 위해 베이시안 기법 어휘 인식 모델에의 사후 확률을 융합한 잡음 제거)

  • Oh, Sang-Yeob
    • Journal of Digital Convergence / v.13 no.8 / pp.295-300 / 2015
  • Vocabulary recognition with an HMM, which models the prior distribution of observations as a discrete probability distribution, has the advantage of low computational complexity but a relatively low recognition rate. To improve the vocabulary recognition model, this paper proposes converging two Bayesian techniques so that noise is removed and recognition is improved: the HMM-based prior probability method is combined with the Bayesian posterior probability, which removes noise and raises the recognition rate. Applying the proposed method yielded a vocabulary recognition rate of 97.9%.
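
A minimal sketch of the underlying Bayesian step (not the paper's HMM implementation): prior word probabilities are combined with acoustic likelihoods to obtain posteriors, and the most probable word is selected. The vocabulary and the numbers are hypothetical:

```python
# Hedged sketch: posterior = prior x likelihood / evidence, then argmax.
priors = {"open": 0.40, "close": 0.35, "cancel": 0.25}        # hypothetical P(word)
likelihoods = {"open": 0.02, "close": 0.05, "cancel": 0.01}   # hypothetical P(observation | word)

evidence = sum(priors[w] * likelihoods[w] for w in priors)
posteriors = {w: priors[w] * likelihoods[w] / evidence for w in priors}

best = max(posteriors, key=posteriors.get)
print(best, round(posteriors[best], 3))
```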

A Parser of Definitions in Korean Dictionary based on Probabilistic Grammar Rules (확률적 문법규칙에 기반한 국어사전의 뜻풀이말 구문분석기)

  • Lee, Su-Gwang;Ok, Cheol-Yeong
    • Journal of KIISE:Software and Applications / v.28 no.5 / pp.48-460 / 2001
  • The definitions in a Korean dictionary not only describe the meanings of headwords but also implicitly contain a great deal of semantic information such as hypernym/hyponym relations, part-whole relations, polysemy, homonymy, synonymy, antonymy, and semantic features. The goal of this study is to implement a parser for the definitions in a Korean dictionary as a basic tool for acquiring such semantic information. To this end, a corpus of dictionary definitions annotated with part-of-speech tags and syntactic structure was first constructed, and from it the frequencies of part-of-speech-ambiguous word forms, together with grammar rules and their probabilities, were extracted automatically by statistical methods. The definition parser of this study is a probabilistic chart parser that uses these resources. The frequency information for ambiguous word forms and the grammar rules with their probabilities resolve noun-phrase ambiguity during parsing. In addition, grammar factoring, best-first search, and Viterbi search are used to reduce the number of nodes generated during parsing and to increase parsing speed. Experiments with grammar-rule probabilities, left-first parsing, and left-first search showed that combining left-first search with grammar probabilities gave the most accurate results: a recall of 51.74% and a precision of 87.47% on unseen sentences.
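
A minimal sketch of the statistical core described above: grammar-rule probabilities are estimated from treebank counts by maximum likelihood, and a candidate derivation is scored Viterbi-style as the product of its rule probabilities. The rule counts and the derivation are hypothetical, not drawn from the dictionary-definition corpus:

```python
# Hedged sketch: maximum-likelihood rule probabilities and a Viterbi-style
# derivation score (hypothetical rule counts).
from collections import Counter
from math import exp, log

rule_counts = Counter({("NP", ("DET", "N")): 60, ("NP", ("N",)): 40,
                       ("VP", ("V", "NP")): 70, ("VP", ("V",)): 30})

lhs_counts = Counter()
for (lhs, _), count in rule_counts.items():
    lhs_counts[lhs] += count

def rule_prob(lhs: str, rhs: tuple) -> float:
    """P(lhs -> rhs) = count(lhs -> rhs) / count(lhs)."""
    return rule_counts[(lhs, rhs)] / lhs_counts[lhs]

# Score one candidate derivation as the product of its rule probabilities.
derivation = [("VP", ("V", "NP")), ("NP", ("DET", "N"))]
score = exp(sum(log(rule_prob(lhs, rhs)) for lhs, rhs in derivation))
print(round(score, 3))
```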


System Reliability-Based Design Optimization Using Performance Measure Approach (성능치 접근법을 이용한 시스템 신뢰도 기반 최적설계)

  • Kang, Soo-Chang;Koh, Hyun-Moo
    • KSCE Journal of Civil and Environmental Engineering Research / v.30 no.3A / pp.193-200 / 2010
  • Structural design requires simultaneously ensuring safety, by quantitatively considering uncertainties in the applied loadings, material properties, and fabrication errors, and maximizing economic efficiency. As a solution, system reliability-based design optimization (SRBDO), which takes both uncertainty and economic efficiency into account, has been extensively researched, and numerous attempts have been made to apply it to structural design. Contrary to conventional deterministic optimization, SRBDO involves the evaluation of component and system probabilistic constraints. However, because of the complicated algorithms for calculating component reliability indices and system reliability, excessive computational time is required when large-scale finite element analysis is involved in evaluating the probabilistic constraints. Accordingly, a more stable and efficient SRBDO algorithm needs to be developed for large-scale problems. In this study, such an SRBDO based on the performance measure approach (PMA) is developed. PMA performs well when applied to reliability-based design optimization (RBDO), which has only component probabilistic constraints, but it cannot be applied directly to SRBDO because it only calculates the probabilistic performance measure for limit state functions and does not evaluate the reliability indices. To overcome these difficulties, a decoupled algorithm is proposed in which RBDO based on PMA is performed sequentially with updated target component reliability indices until the calculated system reliability index approaches the target system reliability index. Through a mathematical problem and a ten-bar truss problem, the proposed method shows better convergence and efficiency than other approaches.
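
A minimal sketch of the performance-measure evaluation at the heart of PMA: the probabilistic performance measure is the minimum of the limit-state function over the sphere of radius beta_target in standard normal space, and the probabilistic constraint is satisfied when that minimum is non-negative. The two-variable limit-state function below is hypothetical, not the truss problem from the paper:

```python
# Hedged sketch: PMA evaluates min g(u) on the sphere ||u|| = beta_target in
# standard normal space; the constraint holds when that minimum is >= 0.
import numpy as np

def g(u: np.ndarray) -> float:
    """Hypothetical limit-state function (g < 0 means failure)."""
    return 3.0 - u[0] - 0.5 * u[1]

def performance_measure(g, beta_target: float, n_points: int = 3600) -> float:
    """Crude 2-D search for the minimum of g over the circle of radius beta_target."""
    thetas = np.linspace(0.0, 2.0 * np.pi, n_points, endpoint=False)
    points = beta_target * np.column_stack((np.cos(thetas), np.sin(thetas)))
    return min(g(u) for u in points)

print(performance_measure(g, beta_target=2.5))   # >= 0, so the probabilistic constraint is met
```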