• Title/Summary/Keyword: Probabilistic Method.

Analysis and Prediction for Spatial Distribution of Functional Feeding Groups of Aquatic Insects in the Geum River (금강 수계 수서곤충 섭식기능군의 공간분포 분석 및 예측)

  • Kim, Ki-Dong; Park, Young-Jun; Nam, Sang-Ho
    • Journal of the Korean Association of Geographic Information Studies / v.15 no.1 / pp.99-118 / 2012
  • The aim of this study is to define the correlation between the spatial distribution characteristics of functional feeding groups (FFG) of aquatic insects and related environmental factors in the Geum River, based on the River Continuum Concept (RCC). For that objective we used stepwise multiple regression analysis (SMRA) to analyze the relationship between the distribution of aquatic insects and the physical and chemical factors that may affect their habitat in the study area. A probabilistic method, the Frequency Ratio Model (FRM), and the spatial analysis functions of GIS were then applied to produce a predictive distribution map of the biotic community, considering its distribution characteristics with respect to the environmental factors as related variables. As a result of the SMRA, the coefficients of determination for elevation, stream width, flow velocity, conductivity, temperature, and percentage of sand were higher than 0.5, so these six environmental factors were considered the major factors affecting the distribution characteristics of aquatic insects. Finally, we calculated the root mean square error (RMSE) between the predicted distribution map and survey databases from previous studies to verify the results. The RMSE values ranged from 0.1892 to 0.4242 across the FFGs, indicating the high reliability of this study. The results may be used to develop a new assessment method for aquatic ecosystems based on the macroinvertebrate community, and as preliminary data for the conservation and restoration of stream habitats.
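
The abstract combines stepwise multiple regression with a Frequency Ratio Model (FRM) and verifies the resulting map with RMSE. Below is a minimal sketch, under the usual FRM definition (share of occurrences in a factor class divided by the share of cells in that class), of how one factor's ratios and the verification RMSE might be computed; the data and variable names are illustrative, not the study's.

```python
import numpy as np

def frequency_ratio(factor_class, presence):
    """Frequency ratio for each class of one environmental factor.

    factor_class : integer class label per survey cell (e.g. binned elevation)
    presence     : 1 if the feeding group was observed in the cell, else 0
    """
    factor_class = np.asarray(factor_class)
    presence = np.asarray(presence)
    ratios = {}
    for c in np.unique(factor_class):
        in_class = factor_class == c
        occ_share = presence[in_class].sum() / max(presence.sum(), 1)   # share of occurrences in class c
        area_share = in_class.sum() / factor_class.size                 # share of cells in class c
        ratios[int(c)] = occ_share / area_share
    return ratios

def rmse(predicted, observed):
    """Root mean square error used to verify a predicted distribution map."""
    predicted = np.asarray(predicted, dtype=float)
    observed = np.asarray(observed, dtype=float)
    return float(np.sqrt(np.mean((predicted - observed) ** 2)))

# toy usage: three elevation classes over six cells, presence of one FFG
print(frequency_ratio([0, 0, 1, 1, 2, 2], [1, 1, 1, 0, 0, 0]))
print(rmse([0.8, 0.2, 0.5], [1.0, 0.0, 0.6]))
```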

A Prediction Method of the Gas Pipeline Failure Using In-line Inspection and Corrosion Defect Clustering (In-line Inspection과 부식결함 클러스터링을 이용한 가스배관의 고장예측)

  • Kim, Seong-Jun; Choe, Byung Hak; Kim, Woosik
    • Journal of the Korean Institute of Intelligent Systems / v.24 no.6 / pp.651-656 / 2014
  • Corrosion has a significant influence on the reliability assessment and maintenance planning of gas pipelines. Corrosion defects on underground pipelines can be detected by conducting periodic in-line inspection (ILI); however, little work has been done on the practical use of ILI data. This paper deals with remaining-lifetime prediction of a gas pipeline in the presence of corrosion defects. Because pipeline parameters are subject to uncertainty during operation, a probabilistic approach is adopted. A pipeline fails when its operating pressure is larger than the pipe failure pressure. In order to estimate the failure probability, this paper uses the First Order Reliability Method (FORM), which is popular in the field of structural engineering. The well-known Battelle code is chosen as the computational model for the pipe failure pressure. This paper also develops a Matlab GUI for illustrating failure probability predictions. Our results indicate that clustering of corrosion defects is helpful for improving prediction accuracy and preventing unnecessary maintenance.
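
The abstract defines failure as the operating pressure exceeding the failure pressure and estimates its probability with FORM and the Battelle model. As a simplified stand-in, the sketch below estimates the same probability by Monte Carlo sampling; the burst-pressure expression and the assumed distributions of defect depth and wall thickness are illustrative placeholders, not the paper's model.

```python
import numpy as np

rng = np.random.default_rng(0)

def failure_pressure(depth, wall, diameter=762.0, smys=359.0):
    """Simplified burst-pressure expression for a corroded pipe segment (MPa).
    Illustrative placeholder, NOT the Battelle model used in the paper;
    the Folias factor is fixed at 1.5 for brevity."""
    flow_stress = 1.1 * smys                 # MPa
    d_t = depth / wall                       # defect depth / wall thickness
    return (2.0 * flow_stress * wall / diameter) * (1.0 - d_t) / (1.0 - d_t / 1.5)

def prob_of_failure(p_op, n=100_000):
    """Monte Carlo estimate of P(operating pressure > failure pressure);
    the paper itself uses FORM, which approximates this probability analytically."""
    depth = rng.normal(3.0, 0.5, n)          # assumed ILI-measured defect depth (mm)
    wall = rng.normal(8.0, 0.3, n)           # assumed wall thickness (mm)
    return float(np.mean(p_op > failure_pressure(depth, wall)))

print(prob_of_failure(p_op=7.0))
```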

Effective Reference Probability Incorporating the Effect of Expiration Time in Web Cache (웹 캐쉬에서 만기시간의 영향을 고려한 유효참조확률)

  • Lee, Jeong-Joon; Moon, Yang-Se; Whang, Kyu-Young; Hong, Eui-Kyung
    • Journal of KIISE:Databases / v.28 no.4 / pp.688-701 / 2001
  • Web caching has become an important problem in addressing the performance issues of web applications. In this paper we propose a method that enhances the performance of web caching by incorporating the expiration time of web data. We introduce the notion of the effective reference probability, which incorporates the effect of expiration time into the reference probability used in existing cache replacement algorithms. We formally define the effective reference probability and derive it theoretically using a probabilistic model. By simply replacing the probabilities in existing cache replacement algorithms with the effective reference probability, we can take the effect of expiration time into account. Performance evaluation through experiments shows that the replacement algorithms using the effective reference probability always outperform the existing ones. The reason is that the proposed method precisely reflects the theoretical probability of obtaining the cache effect and thus incorporates the influence of the expiration time more effectively. In particular, when the cache fraction is 0.05 and data updates are comparatively frequent (i.e., the update frequency is more than 1/10 of the reference frequency), the performance enhancement is more than 30% in LRU-2 and 13% in Aggarwal's method (PSS integrating a refresh overhead factor). The results show that the effective reference probability contributes significantly to the performance enhancement of web caches in the presence of expiration time.
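
A minimal sketch of the underlying idea, under an assumed exponential model for inter-reference times (an illustration, not the paper's exact derivation): the value of keeping an object is its reference probability discounted by the chance that the cached copy is still valid when the next reference arrives.

```python
import math

def effective_reference_probability(ref_prob, ref_rate, expiration_time):
    """Discount a reference probability by the chance that the cached copy is
    still valid when the next reference arrives.

    Illustrative assumption (not the paper's exact formula): inter-reference
    times are exponential with rate `ref_rate`, so the next reference arrives
    before the copy expires with probability 1 - exp(-ref_rate * expiration_time).
    """
    p_fresh_at_next_ref = 1.0 - math.exp(-ref_rate * expiration_time)
    return ref_prob * p_fresh_at_next_ref

# objects that are updated often (short expiration time) are worth less in the cache
print(effective_reference_probability(0.4, ref_rate=0.2, expiration_time=10.0))
print(effective_reference_probability(0.4, ref_rate=0.2, expiration_time=1.0))
```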

Formation Estimation of Shaly Sandstone Reservoir using Joint Inversion from Well Logging Data (복합역산을 이용한 물리검층자료로부터의 셰일성 사암 저류층의 지층 평가)

  • Choi, Yeonjin; Chung, Woo-Keen; Ha, Jiho; Shin, Sung-ryul
    • Geophysics and Geophysical Exploration / v.22 no.1 / pp.1-11 / 2019
  • Well-logging technologies are used to measure the physical properties of reservoirs through boreholes. These technologies have been utilized to estimate reservoir characteristics, such as porosity and fluid saturation, using equations based on rock physics models. The analysis of well logs is performed by selecting a reliable rock physics model adequate for the reservoir conditions or characteristics, comparing the results obtained with the Archie equation or the Simandoux method, and determining the most feasible reservoir properties. In this study, we developed a joint inversion algorithm to estimate the physical properties of shaly sandstone reservoirs, based on a pre-existing algorithm for sandstone reservoirs. For this purpose, we proposed a rock physics model that accounts for shale volume, constructed the Jacobian matrix, and performed a sensitivity analysis to understand the relationship between well-logging data and rock properties. The joint inversion algorithm was implemented by adopting the least-squares method with a probabilistic approach. The developed algorithm was applied to well-logging data obtained from the Colony gas sandstone reservoir, and the results were compared with the Simandoux method and the joint inversion algorithm for sandstone reservoirs.
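
The abstract describes a Jacobian-based least-squares inversion. The sketch below shows a generic damped Gauss-Newton update applied to a toy two-log, two-parameter problem; the forward model (a density mixing law and an Archie-like log-resistivity) and the damping value are illustrative assumptions, not the paper's shaly-sand rock physics model.

```python
import numpy as np

def gauss_newton_step(m, d_obs, forward, jacobian, damping=1e-2):
    """One damped least-squares model update:
    m_new = m + (J^T J + damping * I)^(-1) J^T (d_obs - g(m)).
    `forward` and `jacobian` stand in for the rock physics model and the
    Jacobian described in the abstract (illustrative placeholders)."""
    r = d_obs - forward(m)                          # data residual
    J = jacobian(m)
    H = J.T @ J + damping * np.eye(len(m))          # damped normal equations
    return m + np.linalg.solve(H, J.T @ r)

# Toy forward model: bulk density and log-resistivity (Archie-like, m = n = 2)
# as functions of porosity (phi) and water saturation (sw).
def forward(m):
    phi, sw = m
    density = 2.65 * (1.0 - phi) + 1.0 * phi        # matrix 2.65, fluid 1.0 g/cc
    log_rt = np.log(0.2) - 2.0 * np.log(phi) - 2.0 * np.log(sw)
    return np.array([density, log_rt])

def jacobian(m, eps=1e-6):
    """Central-difference Jacobian of the toy forward model."""
    J = np.zeros((2, 2))
    for j in range(2):
        dm = np.zeros(2)
        dm[j] = eps
        J[:, j] = (forward(m + dm) - forward(m - dm)) / (2.0 * eps)
    return J

m = np.array([0.15, 0.6])                           # initial porosity, saturation
d_obs = forward(np.array([0.25, 0.4]))              # synthetic "logs"
for _ in range(20):
    m = gauss_newton_step(m, d_obs, forward, jacobian)
print(m)                                            # approaches [0.25, 0.4]
```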

Data analysis by Integrating statistics and visualization: Visual verification for the prediction model (통계와 시각화를 결합한 데이터 분석: 예측모형 대한 시각화 검증)

  • Mun, Seong Min; Lee, Kyung Won
    • Design Convergence Study / v.15 no.6 / pp.195-214 / 2016
  • Predictive analysis is based on probabilistic learning algorithms known as pattern recognition or machine learning. If users want to extract more information from the data, they therefore need a high level of statistical knowledge, and it is difficult to identify patterns and characteristics in the data. This study conducted statistical data analyses and visual data analyses to supplement these weaknesses of predictive analysis, and found implications that had not been reported in previous studies. First, we could identify data patterns when adjusting the data selection according to the splitting criteria of the decision tree method. Second, we could identify what types of data were included in the final prediction model. In the statistical analysis we found relations among the multiple variables and derived a prediction model for high box-office performance. In the visualization analysis we proposed a visual analysis method with various interactive functions. Finally, through this study we verified the final prediction model and suggested an analysis method that extracts a variety of information from the data.
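
The abstract mentions examining how data selection changes with the decision tree's splitting criteria. A minimal sketch with scikit-learn is shown below; the synthetic features and their names are placeholders for the study's box-office variables, and comparing criteria here only illustrates the kind of inspection described.

```python
import numpy as np
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(0)
# synthetic stand-in for the study's movie data (illustrative only)
X = rng.random((200, 3))                       # e.g. screens, buzz, rating (scaled)
y = (0.6 * X[:, 0] + 0.4 * X[:, 1] + 0.1 * rng.standard_normal(200) > 0.55).astype(int)

# changing the splitting criterion can change which variables drive the early splits
for criterion in ("gini", "entropy"):
    tree = DecisionTreeClassifier(criterion=criterion, max_depth=3, random_state=0)
    tree.fit(X, y)
    print(criterion, tree.feature_importances_)
    print(export_text(tree, feature_names=["screens", "buzz", "rating"]))
```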

Assessment of Cerebral Hemodynamic Changes in Pediatric Patients with Moyamoya Disease Using Probabilistic Maps on Analysis of Basal/Acetazolamide Stress Brain Perfusion SPECT (소아 모야모야병에서 뇌확률지도를 이용한 수술전후 혈역학적 변화 분석)

  • Lee, Ho-Young; Lee, Jae-Sung; Kim, Seung-Ki; Wang, Kyu-Chang; Cho, Byung-Kyu; Chung, June-Key; Lee, Myung-Chul; Lee, Dong-Soo
    • Nuclear Medicine and Molecular Imaging / v.42 no.3 / pp.192-200 / 2008
  • To evaluate the hemodynamic changes and the predictive factors of the clinical outcome in pediatric patients with moyamoya disease, we analyzed pre- and post-operative basal/acetazolamide stress brain perfusion SPECT with an automated volume-of-interest (VOI) method. Methods: Fifty-six pediatric patients with moyamoya disease (M:F = 33:24, age 6.7 ± 3.2 years), who underwent basal/acetazolamide stress brain perfusion SPECT within 6 months before and after revascularization surgery (encephalo-duro-arterio-synangiosis (EDAS) with frontal encephalo-galeo-synangiosis (EGS), followed by EDAS only on the contralateral hemisphere) and were followed up for more than 6 months after the post-operative SPECT, were included. The mean follow-up period after post-operative SPECT was 33 ± 21 months. Each patient's SPECT image was spatially normalized to a Korean template with SPM2. For regional count normalization, the count of the pons was used as the reference region. The basal/acetazolamide-stressed cerebral blood flow (CBF), the cerebral vascular reserve index (CVRI), and the extent of the area with significantly lower basal/acetazolamide-stressed rCBF than in age-matched normal controls were evaluated in the medial frontal, frontal, parietal, and occipital lobes and the whole brain of each patient. The post-operative clinical outcome was graded as good or poor according to the presence of transient ischemic attacks and/or fixed neurological deficits by a pediatric neurosurgeon. Results: In a paired t-test, the basal/acetazolamide-stressed rCBF and the CVRI were significantly improved after revascularization (p<0.05). The significant difference in the pre-operative basal/acetazolamide-stressed rCBF and CVRI between the hemispheres where EDAS with frontal EGS was performed and their contralateral counterparts where EDAS only was done disappeared after the operation (p<0.05). In an independent Student's t-test, the pre-operative basal rCBF in the medial frontal gyrus, the post-operative CVRI in the frontal lobe and the parietal lobe of the hemispheres with EDAS and frontal EGS, the post-operative CVRI, and the ΔCVRI showed significant differences between patients with a good and a poor clinical outcome (p<0.05). In a multivariate logistic regression analysis, the ΔCVRI and the post-operative CVRI of the medial frontal gyrus on the hemispheres where EDAS with frontal EGS was performed were the significant predictive factors for the clinical outcome (p = 0.002, p = 0.015). Conclusion: With probabilistic maps, we could objectively evaluate the pre- and post-operative hemodynamic changes in pediatric patients with moyamoya disease. Specifically, the ΔCVRI and the post-operative CVRI of the medial frontal gyrus where EDAS with frontal EGS was done were significant predictive factors for the clinical outcome.
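
A minimal sketch of the paired pre- versus post-operative comparison reported above (here for the CVRI), using synthetic numbers purely as placeholders for the study's measurements.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# synthetic pre/post-operative CVRI values for 56 patients (placeholders only)
cvri_pre = rng.normal(0.10, 0.05, 56)
cvri_post = cvri_pre + rng.normal(0.04, 0.03, 56)   # assumed improvement after surgery

t_stat, p_value = stats.ttest_rel(cvri_post, cvri_pre)  # paired t-test
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```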

Adaptive RFID anti-collision scheme using collision information and m-bit identification (충돌 정보와 m-bit인식을 이용한 적응형 RFID 충돌 방지 기법)

  • Lee, Je-Yul; Shin, Jongmin; Yang, Dongmin
    • Journal of Internet Computing and Services / v.14 no.5 / pp.1-10 / 2013
  • RFID (Radio Frequency Identification) is a non-contact identification technology. A basic RFID system consists of a reader and a set of tags. RFID tags can be divided into active and passive tags: active tags have their own power source and can operate on their own, whereas passive tags are small and low-cost, which makes them more suitable for the distribution industry. A reader processes the information received from tags, and the RFID system achieves fast identification of multiple tags using radio frequency. RFID systems have been applied to a variety of fields such as distribution, logistics, transportation, inventory management, access control, and finance. To encourage the introduction of RFID systems, several problems (price, size, power consumption, security) should be resolved. In this paper, we propose an algorithm that significantly alleviates the collision problem caused by simultaneous responses of multiple tags. Anti-collision schemes in RFID systems fall into three categories: probabilistic, deterministic, and hybrid. We introduce ALOHA-based protocols as probabilistic methods and Tree-based protocols as deterministic ones. In ALOHA-based protocols, time is divided into multiple slots; tags randomly select a slot and transmit their IDs in it, but because these protocols are probabilistic they cannot guarantee that all tags are identified. In contrast, Tree-based protocols guarantee that a reader identifies all tags within its transmission range. In Tree-based protocols, a reader sends a query and tags respond with their own IDs; when two or more tags respond to a query, a collision occurs and the reader generates and sends a new query. Frequent collisions degrade identification performance, so to identify tags quickly it is necessary to reduce collisions efficiently. Each RFID tag has a 96-bit EPC (Electronic Product Code) ID, and the tags of one company or manufacturer have similar IDs sharing the same prefix. Unnecessary collisions therefore occur while identifying multiple tags with the Query Tree protocol, which increases the number of query-responses and the idle time and thus significantly lengthens the identification time. To solve this problem, the Collision Tree protocol and the M-ary Query Tree protocol have been proposed. However, in the Collision Tree and Query Tree protocols only one bit is identified per query-response, and when similar tag IDs exist the M-ary Query Tree protocol generates unnecessary query-responses. In this paper, we propose an Adaptive M-ary Query Tree protocol that improves identification performance using m-bit recognition, collision information of tag IDs, and a prediction technique. We compare the proposed scheme with other Tree-based protocols under the same conditions and show that it outperforms them in terms of identification time and identification efficiency.
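
The baseline query-response mechanism described above can be sketched as a small binary Query Tree simulation; the code below counts the queries needed to identify a set of tag IDs and shows how a shared prefix inflates that count (this illustrates the baseline protocol only, not the proposed Adaptive M-ary scheme).

```python
from collections import deque

def query_tree_identify(tag_ids):
    """Baseline binary Query Tree: the reader queries a prefix, tags whose IDs
    start with it respond; a collision (2+ responders) spawns queries for
    prefix+'0' and prefix+'1'. Returns identified IDs and the query count."""
    identified, queries = [], 0
    pending = deque([""])                       # start with the empty prefix
    while pending:
        prefix = pending.popleft()
        queries += 1
        responders = [t for t in tag_ids if t.startswith(prefix)]
        if len(responders) == 1:
            identified.append(responders[0])    # readable response: tag identified
        elif len(responders) > 1:
            pending.append(prefix + "0")        # collision: split the prefix
            pending.append(prefix + "1")
        # zero responders: idle slot, nothing to do
    return identified, queries

# similar IDs (same prefix) inflate the number of query-responses
tags = ["00010110", "00010111", "00011000", "10110001"]
ids, n_queries = query_tree_identify(tags)
print(sorted(ids) == sorted(tags), n_queries)
```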

Fragility Analysis Method Based on Seismic Performance of Bridge Structure considering Earthquake Frequencies (지진 진동수에 따른 교량의 내진성능기반 취약도 해석 방법)

  • Lee, Dae-Hyoung; Chung, Young-Soo; Yang, Dong-Wook
    • Journal of the Korea Concrete Institute / v.21 no.2 / pp.187-197 / 2009
  • This paper presents a systematic approach for estimating fragility curves and damage probability matrices for different frequencies. Fragility curves and damage probability matrices indicate the probabilities that a structure will sustain different degrees of damage at different ground motion levels. Seismic damage must be evaluated probabilistically because of the uncertainty of earthquakes. In contrast to previous approaches, this paper presents a method based on nonlinear dynamic analysis of the structure using empirical data. The probability of damage is expressed as a function of peak ground acceleration, and the probabilities of five damage levels are estimated for a prestressed concrete (PSC) bridge pier subjected to a given ground acceleration. At each level, 100 artificial earthquake motions were generated according to soil conditions, and nonlinear time-domain analyses were performed for the damage states of PSC bridge pier structures. These damage states are described by the displacement ductility derived from seismic performance in existing research results. Using the damage states and ground motion parameters, five fragility curves for the PSC bridge pier with five types of dominant frequencies were constructed assuming a log-normal distribution. The effect of the dominant frequency on the fragility curves was found to be significant.
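
Under the log-normal assumption mentioned in the abstract, a fragility curve takes the form P(damage state reached | PGA = a) = Φ(ln(a / a_m) / β), with median a_m and dispersion β. The sketch below evaluates such curves; the medians and dispersions are illustrative placeholders, not the paper's fitted values.

```python
import numpy as np
from scipy.stats import norm

def fragility(pga, median_pga, beta):
    """P(damage state reached | PGA = pga) under a log-normal fragility model:
    Phi(ln(pga / median_pga) / beta)."""
    pga = np.asarray(pga, dtype=float)
    return norm.cdf(np.log(pga / median_pga) / beta)

# illustrative medians (g) and dispersions for five damage states (not the paper's values)
states = {"slight": (0.15, 0.6), "moderate": (0.30, 0.6),
          "extensive": (0.55, 0.6), "severe": (0.80, 0.6), "collapse": (1.10, 0.6)}
pga = np.array([0.1, 0.2, 0.4, 0.8])  # g
for name, (a_m, beta) in states.items():
    print(name, np.round(fragility(pga, a_m, beta), 3))
```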

Computation of Criterion Rainfall for Urban Flood by Logistic Regression (로지스틱 회귀에 의한 도시 침수발생의 한계강우량 산정)

  • Kim, Hyun Il; Han, Kun Yeun
    • KSCE Journal of Civil and Environmental Engineering Research / v.39 no.6 / pp.713-723 / 2019
  • Due to climate change and varying rainfall patterns, it is difficult to estimate a rainfall criterion that causes inundation in urban drainage districts. It is necessary to examine the results of inundation analysis considering the detailed topography of the watershed, the drainage system, and various rainfall scenarios. In this study, various rainfall scenarios were constructed from probabilistic rainfall and Huff's time distribution method in order to identify the rainfall characteristics affecting the inundation of the Hyoja drainage basin. Flood analysis was performed with SWMM and a two-dimensional inundation analysis model, and the parameters of SWMM were optimized with a flood trace map and a genetic algorithm (GA). By linking SWMM and the two-dimensional flood analysis model, the fitness ratio between the existing flood trace and the simulated inundation map turned out to be 73.6%. The occurrence of inundation under each rainfall scenario was identified, and the rainfall criterion could then be estimated through logistic regression. Reflecting the results of the one- and two-dimensional flood analyses and AWS/ASOS data for 2010~2018, the rainfall criteria for inundation occurrence were estimated as 72.04 mm, 146.83 mm, and 203.06 mm for rainfall durations of 1, 2, and 3 hr, respectively. The rainfall criterion can be re-estimated as continuously observed rainfall data are added. The methodology presented in this study is expected to provide a quantitative rainfall criterion for urban drainage areas and basic data for flood warning and evacuation planning.
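
A minimal sketch of estimating a criterion rainfall with logistic regression, as described above: fit P(inundation | rainfall depth) to scenario results and solve for the depth where the probability crosses 0.5. The synthetic data below are placeholders, not the Hyoja basin simulations.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# synthetic 1-hr rainfall depths (mm) and inundation occurrence from scenario runs
rain = rng.uniform(20, 120, 200).reshape(-1, 1)
inundated = (rain[:, 0] + rng.normal(0, 10, 200) > 72).astype(int)

model = LogisticRegression().fit(rain, inundated)
b0, b1 = model.intercept_[0], model.coef_[0, 0]
criterion = -b0 / b1          # rainfall depth at which P(inundation) = 0.5
print(f"criterion rainfall ~ {criterion:.1f} mm")
```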

The Seismic Hazard Study on Chung-Nam Province using HAZUS (HAZUS를 이용한 충남지역의 지진피해 연구)

  • Kang, Ik-Bum; Park, Jung-Ho
    • Journal of the Korean Society of Hazard Mitigation / v.2 no.2 s.5 / pp.73-83 / 2002
  • HAZUS, developed by FEMA, is applied to estimate the seismic hazard in Chung-Nam Province using basic data on general buildings, population, and geology from well logging. Based on an investigation of historical and instrumental earthquakes in the Korean Peninsula, the seismic hazard in Chung-Nam Province is estimated with two ways of calculating the acceleration: deterministic and probabilistic. In the deterministic method, the seismic hazard is estimated by generating the maximum event, which occurs in Hongsung with a magnitude of 6.0. According to the result, Hongsung Gun, Yesan Gun, and Boryung City suffer the most severe building damage, and the expected numbers of people requiring hospitalization due to the earthquake in Hongsung Gun and Yesan Gun are 1.1 and 0.4, respectively. In the probabilistic method (5,000-year return period), the seismic hazard is estimated as well. According to that result, Gongju City suffers the most severe building damage, and the expected numbers of people requiring hospitalization in Gongju City and Nonsan City are 0.1 and 0.15, respectively.
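
The probabilistic branch of the study works with acceleration levels tied to a return period. As a toy illustration of that idea (entirely separate from HAZUS and from the paper's inputs), the sketch below computes the annual exceedance rate of a PGA level from a single seismic source and converts it to a return period; every coefficient is a placeholder assumption.

```python
import numpy as np
from scipy.stats import norm

def annual_exceedance_rate(pga_g, dist_km, rate_mmin=0.05, b=0.9, m_min=4.0, m_max=6.5):
    """Toy single-source probabilistic seismic hazard calculation: combine a
    truncated Gutenberg-Richter magnitude distribution with a generic
    attenuation relation to get the annual rate at which `pga_g` is exceeded.
    All coefficients are illustrative placeholders."""
    mags = np.linspace(m_min, m_max, 400)
    beta = b * np.log(10.0)
    # truncated exponential (Gutenberg-Richter) magnitude density on [m_min, m_max]
    pdf = beta * np.exp(-beta * (mags - m_min)) / (1.0 - np.exp(-beta * (m_max - m_min)))
    # generic attenuation: ln PGA = -3.5 + 0.8*M - 1.1*ln(R + 10), sigma = 0.6
    ln_median = -3.5 + 0.8 * mags - 1.1 * np.log(dist_km + 10.0)
    p_exceed = 1.0 - norm.cdf((np.log(pga_g) - ln_median) / 0.6)
    dm = mags[1] - mags[0]
    return rate_mmin * np.sum(pdf * p_exceed) * dm

lam = annual_exceedance_rate(pga_g=0.2, dist_km=30.0)
print(f"annual exceedance rate = {lam:.2e}, return period = {1.0 / lam:.0f} years")
```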