• Title/Summary/Keyword: Probabilistic Models

A Study on the Economic Valuation of the Suncheon Bay Wetland according to the Logit Model (로짓모형에 따른 순천만습지의 경제적 가치평가)

  • Lee, Jeong;Kim, Sa-rang;Kweon, Dae-gon;Jung, Bom-bi;Song, Sung-hwan;Kim, Sun-hwa
    • Journal of the Korean Institute of Landscape Architecture / v.45 no.6 / pp.10-27 / 2017
  • Recently, awareness of the natural environment and the need for its conservation have been increasing due to rapid urbanization. Suncheon Bay, designated as Scenic Site No. 41 and counted among the world's five great coastal wetlands, is the only tidal flat in Korea with salt marsh reserves and has high conservation value from an ecological standpoint. Together with the Suncheon Bay National Garden, it provides various benefits not only to visitors but also to local residents in economic, environmental, and historical-cultural terms. Two million tourists visit the site annually, which has repeatedly exposed the limits of its ecological capacity. Valuing the Suncheon Bay wetland therefore matters more for its sustainability than for its role as a tourism resource stimulating the local economy. This study used the logit model, one of the most commonly used probabilistic choice models, to evaluate the economic value of the Suncheon Bay wetland with the contingent valuation method (CVM). Taking the conservation value of the wetland as a benefit of KRW 8,200 per person per day, the benefit from exploration is KRW 2,050, the management and conservation value is KRW 3,034, and the heritage value is KRW 3,116. On an annual basis, the benefit from exploration of the Suncheon Bay wetland was KRW 44.3 billion, the management and conservation value was KRW 6.55 billion, and the heritage value was KRW 6.73 billion. Converted to the number of paying visitors per year, the conservation value is about KRW 177.1 billion. This study evaluated both the use and the conservation aspects of the economic value of the Suncheon Bay wetland and, based on its latent value, provides basic data for its efficient management and for policy establishment. The study is significant in that the ecological sustainability and non-market value of the Suncheon Bay wetland were evaluated based on recognition of 'benefit through exploration', 'management and conservation value', and 'heritage value'. It can serve as policy data on the integrated collection of admission fees for the Suncheon Bay wetland and the Suncheon Bay National Garden.
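The dichotomous-choice logit used in such CVM studies reduces to a simple formula: a respondent accepts a suggested bid with probability 1/(1+exp(-(b0 + b1*bid))), and in the linear specification the mean willingness to pay is -b0/b1. The sketch below is illustrative only; the coefficients are assumptions chosen so the mean WTP lands near the abstract's KRW 8,200 figure, not the study's estimates:

```python
import math

# Hypothetical logit coefficients for a dichotomous-choice CVM survey
# (assumed values, not the study's estimates).
b0, b1 = 2.0, -0.00025

def p_yes(bid):
    """Probability that a respondent accepts paying `bid` KRW."""
    return 1.0 / (1.0 + math.exp(-(b0 + b1 * bid)))

# For the linear logit model, mean willingness to pay is -b0 / b1.
mean_wtp = -b0 / b1  # 8000.0 KRW with these assumed coefficients
```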

Reliability Analysis on Stability of Armor Units for Foundation Mound of Composite Breakwaters (혼성제 기초 마운드의 피복재 안정성에 대한 신뢰성 해석)

  • Cheol-Eung Lee
    • Journal of Korean Society of Coastal and Ocean Engineers / v.35 no.2 / pp.23-32 / 2023
  • Probabilistic and deterministic analyses are carried out for the armor units of the rubble foundation mound of composite breakwaters, which protect the upright section against scour of the foundation mound. By slightly modifying and consolidating the previous empirical formulas commonly applied to the design of armor units for foundation mounds, a new stability-number formula is suggested that can take into account the slope of the foundation mound, the damage ratio of the armor units, and the number of incident waves. The proposed formula becomes mathematically identical to the previous empirical formula under the conditions used in its development. Deterministic design was first carried out to evaluate the minimum weights of armor units for several conditions associated with a typical composite breakwater section. As the slope of the foundation mound steepens and the number of incident waves increases, larger armor units than those given by the previous empirical formula are required; the opposite trend appears when a larger damage ratio is allowed. Meanwhile, reliability analysis, one class of probabilistic models, was performed to quantify how stable the armor unit resulting from the deterministic design is. An annual encounter failure probability of 1.2% was evaluated for a 1% damage ratio of armor units under the design wave with a 50-year return period. By additionally calculating the influence factors of the related random variables on the failure probability, it was found that Hudson's stability coefficient, the significant wave height, and the water depth above the foundation mound have, in that order, the greatest impact on failure regardless of the incident wave angle. Finally, a sensitivity analysis was performed with respect to variations of the random variables implicitly involved in the stability-number formula for the armor units of the foundation mound. The probability of failure decreases rapidly as the water depth above the foundation mound deepens, but increases as the berm width of the foundation mound widens and the wave period shortens.
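Reliability analyses of this kind are often estimated by Monte Carlo simulation of a limit-state function. The sketch below is a generic illustration, not the paper's stability formula: capacity and demand stability numbers are treated as normal random variables with assumed parameters, and the failure probability is the fraction of samples in which demand exceeds capacity.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Illustrative limit state g = capacity - demand (assumed distributions,
# not the paper's formula or parameters).
capacity = rng.normal(loc=3.0, scale=0.4, size=n)  # assumed stability-number capacity
demand = rng.normal(loc=2.0, scale=0.3, size=n)    # assumed demand from design wave

g = capacity - demand
p_f = np.mean(g < 0)  # Monte Carlo estimate of the failure probability
```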

Eye Movements in Understanding Combinatorial Problems (순열 조합 이해 과제에서의 안구 운동 추적 연구)

  • Choi, In Yong;Cho, Han Hyuk
    • Journal of Educational Research in Mathematics / v.26 no.4 / pp.635-662 / 2016
  • Combinatorics, the basis of probabilistic thinking, is an important area of mathematics and is closely linked with other subjects such as informatics and the STEAM areas, but it is one of the most difficult units in school mathematics to learn and teach. Using designed combinatorial models and executable expressions, this study analyzes the eye movements of graduate students as they translate written combinatorial problems into the corresponding executable expressions, and examines both the process of understanding the written combinatorial sentences and the degree of difficulty depending on the combinatorial semantic structure. The results show two types of solving process: some participants choose the right executable expression by frequently comparing the sentence and the executable expressions, while others find the corresponding executable expression after deriving a suitable mental model by translating the combinatorial sentence. We identified the participants' cognitive processing patterns, showing how they attend to the words and numbers carrying the essential information hidden in the sentence. We also found that when participants attempt problems whose semantic structure is rarely used in school mathematics, their eyes rest longer on the essential combinatorial sentences and executable expressions, and they perform complicated cognitive processing such as comparing the written sentence with the executable expressions. The eye-movement data provide meaningful information for analyzing the cognitive processes involved in the participants' solving process.
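Eye-movement studies of this kind typically summarize fixations by dwell time within areas of interest (AOIs), such as the written sentence and the executable expression. The sketch below is a hypothetical illustration (the coordinates, AOI names, and durations are made up, not the study's data):

```python
# Hypothetical fixation log: (x, y, duration_ms) samples.
fixations = [
    (120, 80, 250), (460, 90, 400), (470, 95, 300), (130, 85, 150),
]
# Rectangular AOIs as (x0, y0, x1, y1); names are assumptions.
aois = {"sentence": (0, 0, 300, 200), "expression": (400, 0, 700, 200)}

def dwell_times(fixations, aois):
    """Total fixation duration inside each rectangular AOI."""
    totals = {name: 0 for name in aois}
    for x, y, dur in fixations:
        for name, (x0, y0, x1, y1) in aois.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                totals[name] += dur
    return totals

print(dwell_times(fixations, aois))  # {'sentence': 400, 'expression': 700}
```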

A Study on Random Selection of Pooling Operations for Regularization and Reduction of Cross Validation (정규화 및 교차검증 횟수 감소를 위한 무작위 풀링 연산 선택에 관한 연구)

  • Ryu, Seo-Hyeon
    • Journal of the Korea Academia-Industrial cooperation Society / v.19 no.4 / pp.161-166 / 2018
  • In this paper, we propose a method for the random selection of pooling operations for regularization and the reduction of cross validation in convolutional neural networks. The pooling operation in convolutional neural networks is used to reduce the size of the feature map and for its shift-invariance properties. In the existing approach, one pooling operation is applied in each pooling layer. Because this fixes the convolutional network, the network suffers from overfitting, meaning that it fits the training samples excessively. In addition, to find the combination of pooling operations that maximizes performance, cross validation must be performed. To solve these problems, we introduce the concept of probability into the pooling layers. The proposed method does not select one pooling operation per pooling layer. Instead, we randomly select one pooling operation among multiple pooling operations in each pooling region during training, and at test time we use probabilistic weighting to produce the expected output. The proposed method can be seen as a technique in which many networks, each using a different pooling operation in each pooling region, are approximately averaged. Therefore, this method avoids the overfitting problem and reduces the amount of cross validation needed. The experimental results show that the proposed method achieves better generalization performance and reduces the need for cross validation.
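The per-region rule can be sketched as follows; this is a minimal illustration of the idea, not the paper's network code. During training, one of two pooling operations (max or average) is drawn at random for each region; at test time the expected output is formed by weighting the operations with their selection probabilities:

```python
import numpy as np

rng = np.random.default_rng(1)

def pool_region(region, training, p_max=0.5):
    """Pool one region: random max/average choice during training,
    probability-weighted mixture at test time (illustrative sketch)."""
    if training:
        return region.max() if rng.random() < p_max else region.mean()
    return p_max * region.max() + (1 - p_max) * region.mean()

x = np.array([[1.0, 2.0], [3.0, 4.0]])
print(pool_region(x, training=False))  # 0.5*4.0 + 0.5*2.5 = 3.25
```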

Causal Effects Along Transitive Causal Routes: Reconsidering Two Concepts of Effects Founded on Structural Equation Model (이행적 인과 경로를 통한 원인 효과에 대한 해명: 구조 방정식에 토대한 인과 모형의 원인 효과 개념에 대한 평가와 대안)

  • Kim, Joonsung
    • Korean Journal of Logic / v.18 no.1 / pp.83-133 / 2015
  • In this paper, I pose a problem for Hitchcock's arguments for two concepts of effects that are intended to explicate dual causal effects, and put forth a theory intended not just to meet the problem but also to accommodate both Hitchcock's theory and Eells' theory. First, I introduce an example of dual causal effects and examine the accounts of Otte (1985) and Eells (1987) on how to explicate them. I show that their accounts help us understand the problem of dual effects and see how differently Cartwright (1979, 1989, 1995), Eells (1991, 1995), and Hitchcock (2001a) meet the problem. Second, I introduce Hitchcock's (2001a) two concepts of effects, net effect and component effect, which are allegedly analogous to two effects in the structural equation model. Third, I reveal the significance of homogeneous subpopulations and causal interaction for the problem of dual effects while examining Cartwright's theory and Eells' theory. Fourth, I critically examine Hitchcock's two concepts of effects and argue against his criticism of Eells' theory. Fifth, I take a moderator variable of the structural equation model and its moderator effect into the probabilistic theory of causality, and formally generalize the causal interaction due to dual effects in terms of disjunctive relations and counterfactual conditionals. I expect this account of disjunctive relations and counterfactual conditionals to contribute not just to several problems that the received theories of causal modelling confront, but also to the structural equation models many people exploit as a promising statistical methodology.

Efficient Management of Statistical Information of Keywords on E-Catalogs (전자 카탈로그에 대한 효율적인 색인어 통계 정보 관리 방법)

  • Lee, Dong-Joo;Hwang, In-Beom;Lee, Sang-Goo
    • The Journal of Society for e-Business Studies / v.14 no.4 / pp.1-17 / 2009
  • E-catalogs, which describe products or services, are among the most important data for electronic commerce. E-catalogs are created, updated, and removed to keep the information in the e-catalog database up to date. However, as the number of catalogs increases, information integrity is violated for several reasons, such as catalog duplication and incorrect classification. Catalog search, duplication checking, and automatic classification are important functions for utilizing e-catalogs and keeping the integrity of the e-catalog database. To implement these functions, probabilistic models that use statistics of index words extracted from e-catalogs have been suggested, and their feasibility has been shown in several papers. However, even though these functions are used together in an e-catalog management system, there has not been enough consideration of how to share the common data used by each function and how to effectively manage the index-word statistics. In this paper, we suggest a method to implement these three functions using simple SQL supported by a relational database management system. In addition, we use materialized views to reduce the load of implementing an application that manages index-word statistics, letting the database management system optimize the statistics updates. We show with an empirical evaluation that our method is feasible for implementing the three functions and effective for managing index-word statistics.
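As a toy illustration of how per-catalog index-word statistics support one of these functions, duplication checking, one can keep word counts per catalog and compare them with cosine similarity. This is a generic sketch with made-up catalog words; the paper's actual implementation is SQL- and materialized-view-based:

```python
from collections import Counter
import math

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity over index-word counts (duplicate-check sketch)."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# Hypothetical index words from two catalog entries.
c1 = Counter("usb memory 8gb usb".split())
c2 = Counter("usb memory 8gb".split())
sim = cosine(c1, c2)  # high similarity suggests a likely duplicate
```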

OD trip matrix estimation from urban link traffic counts (comparison with GA and SAB algorithm) (링크관측교통량을 이용한 도시부 OD 통행행렬 추정 (GA와 SAB 알고리즘의 비교를 중심으로))

  • 백승걸;김현명;임용택;임강원
    • Journal of Korean Society of Transportation / v.18 no.6 / pp.89-99 / 2000
  • To cope with the limits of conventional O-D trip matrix collection methods, several approaches have been developed. One of them is the bilevel programming method proposed by Yang (1995), which uses the Sensitivity Analysis Based (SAB) algorithm to solve a Generalized Least Squares (GLS) problem. However, the SAB algorithm has revealed two critical shortcomings. The first is that when there is a significant difference between the target O-D matrix and the true O-D matrix, the SAB algorithm may not produce the correct solution; this stems from its heavy dependence on historical O-D information, especially when travel patterns have changed dramatically. The second is its iterative linear approximation of the original problem; because of this approximation, the SAB algorithm has difficulty converging to a perfect Stackelberg game condition. To avoid these problems, a more robust and stable solution method is needed. The main purpose of this paper is to show the dependency problem of previous models and to propose an alternative solution method to handle it. The O-D matrix estimation problem is intrinsically nonlinear and nonconvex and thus has multiple solutions, so a method for searching for the global solution is required. In this paper, we develop a solution algorithm combined with a genetic algorithm (GA), which is widely used as a probabilistic global search method. To compare the efficiency of the algorithm, the SAB algorithm suggested by Yang et al. (1992, 1995) is used. The results of a numerical example show that the proposed algorithm is superior to the SAB algorithm irrespective of travel patterns.
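A minimal sketch of the GA approach under toy assumptions (a fixed network of 2 links and 3 O-D pairs with known link-use proportions, and mutation-only elitist evolution; none of this is the paper's network or GA configuration). The fitness is the negated squared error between observed and assigned link counts, in the spirit of the GLS objective:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy link-use proportion matrix P (rows: links, cols: O-D pairs) - assumed.
P = np.array([[1.0, 0.5, 0.0],
              [0.0, 0.5, 1.0]])
true_od = np.array([100.0, 60.0, 80.0])
observed = P @ true_od  # "observed" link traffic counts

def fitness(od):
    """Negated GLS-style squared error between assigned and observed counts."""
    return -np.sum((P @ od - observed) ** 2)

# Minimal elitist GA: keep the 10 best, mutate them to produce 20 children.
pop = rng.uniform(0, 200, size=(30, 3))
for _ in range(200):
    scores = np.array([fitness(ind) for ind in pop])
    parents = pop[np.argsort(scores)[-10:]]
    children = parents[rng.integers(0, 10, 20)] + rng.normal(0, 5, (20, 3))
    pop = np.vstack([parents, np.clip(children, 0, None)])

best = pop[np.argmax([fitness(ind) for ind in pop])]
```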

Economic Evaluation and Budget Impact Analysis of the Surveillance Program for Hepatocellular Carcinoma in Thai Chronic Hepatitis B Patients

  • Sangmala, Pannapa;Chaikledkaew, Usa;Tanwandee, Tawesak;Pongchareonsuk, Petcharat
    • Asian Pacific Journal of Cancer Prevention / v.15 no.20 / pp.8993-9004 / 2014
  • Background: The incidence rate and treatment costs of hepatocellular carcinoma (HCC) are high, especially in Thailand. Previous studies indicated that early detection by a surveillance program could help through down-staging. This study aimed to compare the costs and health outcomes of introducing an HCC surveillance program against no program, and to estimate the budget impact if the surveillance program were implemented. Materials and Methods: A cost-utility analysis using decision tree and Markov models compared lifetime costs and outcomes from a societal perspective between alternative HCC surveillance strategies and no program. Costs included direct medical, direct non-medical, and indirect costs. Health outcomes were measured as life years (LYs) and quality-adjusted life years (QALYs). Results were presented as incremental cost-effectiveness ratios (ICERs) in THB per QALY gained. One-way and probabilistic sensitivity analyses were applied to investigate parameter uncertainties, and a budget impact analysis (BIA) was performed from the governmental perspective. Results: Semi-annual ultrasonography (US) and semi-annual ultrasonography plus alpha-fetoprotein (US plus AFP) as the first screening for HCC surveillance would be cost-effective options at the willingness-to-pay (WTP) threshold of 160,000 THB per QALY gained compared with no surveillance program (ICER = 118,796 and 123,451 THB/QALY, respectively). Semi-annual US plus AFP yielded more net monetary benefit but required a substantially higher budget (237 to 502 million THB) than semi-annual US (81 to 201 million THB) over the next ten fiscal years. Conclusions: Our results suggest that a semi-annual US program should be used as the first screening for HCC surveillance and included in the benefit package of Thai health insurance schemes for chronic hepatitis B patients, male and female, aged 40-50 years. Policy makers considered the program feasible, but additional evidence is needed to support the whole prevention system before implementation of a strategic plan.
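The cost-effectiveness decision in the abstract reduces to a simple ratio. In the sketch below, the cost and QALY figures are placeholders rather than the study's model inputs; only the 160,000 THB/QALY threshold is taken from the abstract:

```python
# Hypothetical per-patient lifetime figures (assumed, not the study's inputs).
cost_surveillance, cost_none = 250_000.0, 150_000.0  # THB
qaly_surveillance, qaly_none = 11.5, 10.8            # QALYs

# ICER = incremental cost / incremental QALYs.
icer = (cost_surveillance - cost_none) / (qaly_surveillance - qaly_none)

wtp_threshold = 160_000.0  # THB per QALY, as stated in the abstract
cost_effective = icer <= wtp_threshold
```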

Simulation-Based Stochastic Markup Estimation System (S²ME) (시뮬레이션을 기반(基盤)으로 하는 영업이윤율(營業利潤率) 추정(推定) 시스템)

  • Yi, Chang-Yong;Kim, Ryul-Hee;Lim, Tae-Kyung;Kim, Wha-Jung;Lee, Dong-Eun
    • Proceedings of the Korean Institute of Building Construction Conference / 2007.11a / pp.109-113 / 2007
  • This paper introduces the Simulation-based Stochastic Markup Estimation System (S²ME) for estimating the optimum markup for a project. The system was designed and implemented to better represent the real-world system involved in construction bidding. Findings from an analysis of the assumptions used in previous quantitative markup estimation methods were incorporated to improve the accuracy and predictability of S²ME. The existing methods rest on four categories of assumption: (1) the number of competitors and their identities are known; (2) a fictitious typical competitor is assumed for ease of computation; (3) the ratio of bid price to cost estimate (B/C) is assumed to follow a normal distribution; and (4) the deterministic output obtained from the probabilistic equation of existing models is assumed to be acceptable. However, these assumptions compromise the accuracy of prediction, since in practice the bidding patterns of bidders in competitive bidding are random. To compensate for the inaccuracy these assumptions introduce, a bidding project was randomly selected from a pool of bidding records in the simulation experiment, and the probability of winning the bid was computed using the profile of the competitors appearing in the selected record. The expected profit and the probability of winning were calculated by randomly selecting a bidding record in each iteration of the simulation experiment, under the assumption that the bidding patterns retained in the historical bidding DB recur. The existing deterministic computation was converted into a stochastic model using simulation modeling and analysis as follows: (1) estimating the probability distribution functions of competitors' B/C ratios from the historical bidding DB; (2) analyzing the sensitivity to markup increments using both a normal distribution and the actual probability distribution obtained by distribution fitting; and (3) estimating the maximum expected profit and the optimum markup range. In the case study, the best-fitted probability distribution function was estimated from the historical bidding DB of competitors' bidding behavior, improving the reliability of the output obtained from the simulation experiment.
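The core loop of such a simulation can be sketched generically: resample competitors' B/C ratios from a historical pool, estimate the probability that a bid at cost times (1 + markup) undercuts all competitors, and pick the markup maximizing expected profit. Everything below (the B/C distribution, pool size, number of competitors) is an assumption for illustration, not S²ME itself:

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical historical pool of competitors' bid/cost (B/C) ratios.
historical_bc = rng.normal(1.08, 0.05, size=500)

def win_prob(markup, n_trials=5000, n_competitors=4):
    """P(our bid = cost * (1 + markup) undercuts all sampled competitors)."""
    samples = rng.choice(historical_bc, size=(n_trials, n_competitors))
    return np.mean((1 + markup) < samples.min(axis=1))

# Expected profit (per unit cost) as a function of markup; pick the maximizer.
markups = np.arange(0.00, 0.20, 0.01)
expected = [m * win_prob(m) for m in markups]
best = markups[int(np.argmax(expected))]
```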

Comparison of Methods for the Analysis Percentile of Seismic Hazards (지진재해도의 백분위수 분석 방법 비교)

  • Rhee, Hyun-Me;Seo, Jung-Moon;Kim, Min-Kyu;Choi, In-Kil
    • Journal of the Earthquake Engineering Society of Korea / v.15 no.2 / pp.43-51 / 2011
  • Probabilistic seismic hazard analysis (PSHA), which can effectively incorporate the inevitable uncertainties in seismic data, considers a number of seismotectonic models and attenuation equations. The hazard calculated by PSHA is generally expressed as an annual exceedance probability as a function of peak ground acceleration (PGA). To represent the uncertainty range of a hazard derived from various seismic data, a hazard curve figure shows both a mean curve and percentile curves (15th, 50th, and 85th). The percentiles play an important role in indicating the uncertainty range of the calculated hazard, and they can be computed by various methods relating the weights and the hazards. This study calculated percentiles of the hazard computed by PSHA for the Shinuljin 1, 2 site using the weight accumulation method, the weighted hazard method, the maximum likelihood method, and the moment method. The percentiles calculated with the weight accumulation, weighted hazard, and maximum likelihood methods show similar trends and represent the range of all hazards computed by PSHA, while the percentiles calculated with the moment method effectively show the range of hazards for the source that includes the site. Given that the mean hazards for the seismotectonic models and for the source including the site are almost the same, this study suggests the moment method as an effective percentile calculation method.
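The weight accumulation idea can be sketched concretely: sort the branch hazards, accumulate the logic-tree weights, and read off the smallest hazard whose accumulated weight reaches the desired quantile. The hazard values and weights below are illustrative assumptions, not results from the study:

```python
import numpy as np

# Hypothetical annual exceedance probabilities from logic-tree branches
# (alternative seismotectonic models / attenuation equations), with
# branch weights summing to 1 - assumed values.
hazards = np.array([1e-5, 3e-5, 5e-5, 8e-5, 2e-4])
weights = np.array([0.10, 0.25, 0.30, 0.25, 0.10])

def weighted_percentile(h, w, q):
    """Smallest hazard whose accumulated weight reaches quantile q."""
    order = np.argsort(h)
    cum = np.cumsum(w[order])
    return h[order][np.searchsorted(cum, q)]

median = weighted_percentile(hazards, weights, 0.50)  # 50th percentile hazard
```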