• Title/Abstract/Keyword: probabilistic study

Search results: 1,439

An Investigation of Reliability and Safety Factors in RC Flexural Members Designed by Current WSD Standard Code (현행(現行) 허용응력설계법(許容應力設計法)으로 설계(設計)되는 RC 휨부재(部材)의 신뢰성(信賴性)과 안전율(安全率) 고찰(考察))

  • Shin, Hyun Mook;Cho, Hyo Nam;Chung, Hwan Ho
    • KSCE Journal of Civil and Environmental Engineering Research / v.1 no.1 / pp.33-42 / 1981
  • The current standard code for R.C. design consists of two conventional design parts, the so-called WSD and USD, which are based on the ACI 318-63 and 318-71 code provisions. The safety factors of our WSD and USD design criteria, taken primarily from the ACI 318-63 code, are considered inappropriate for our country's design and construction practices. Furthermore, even the ACI safety factors were determined not from probabilistic study but merely from experience and practice. This study investigates the safety level of R.C. flexural members designed under the current WSD safety provisions on the basis of second-moment reliability theory, and proposes a rational yet efficient way of determining the nominal safety factors and the associated allowable flexural stresses of steel bars and concrete so as to provide a consistent level of target reliability. Cornell's mean first-order second-moment method formulae, with a lognormal transformation of the resistance and load variables, are adopted as the reliability analysis method for this study. The compressive allowable stress formulae are derived by a unique approach in which the balanced steel ratios of the resulting designs are chosen to correspond to under-reinforced sections designed by the strength design method with an optimum reinforcing ratio. The target reliability index for the safety provisions is taken as ${\beta}=4$, which is well suited to our level of construction and design practice. From a series of numerical applications investigating the safety and reliability of R.C. flexural members designed by the current WSD code, it is found that designs based on the WSD provisions are uneconomical because of their unusual and inconsistent reliability. A rational set of reliability-based safety factors and allowable stresses of steel bars and concrete for flexural members is proposed for the target reliability ${\beta}=4$.
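The Cornell mean first-order second-moment index with a lognormal transformation, as named in this abstract, can be sketched in a few lines. The member statistics below are hypothetical illustration values, not figures from the paper.

```python
import math

def lognormal_beta(mu_R, cov_R, mu_S, cov_S):
    """Cornell MFOSM reliability index for lognormal resistance R and load S:
    beta = ln(mu_R / mu_S) / sqrt(V_R^2 + V_S^2) (small-COV approximation)."""
    return math.log(mu_R / mu_S) / math.sqrt(cov_R**2 + cov_S**2)

# Hypothetical flexural member: mean resistance 1.8x the mean load effect,
# coefficients of variation 0.15 (resistance) and 0.20 (load effect).
beta = lognormal_beta(1.8, 0.15, 1.0, 0.20)
print(round(beta, 2))  # → 2.35
```

Calibrating allowable stresses then amounts to adjusting the nominal safety margin until this index reaches the target value (β = 4 in the paper).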


A Study on the Dose Assessment Methodology Using the Probabilistic Characteristics of TL Element Response (확률분포 특성을 이용한 열형광선량계의 선량평가방법에 관한 연구)

  • Cho, Dae-Hyung;Oh, Jang-Jin;Han, Seung-Jae;Na, Seong-Ho;Hwang, Won-Guk;Lee, Won-Keun
    • Journal of Radiation Protection and Research / v.23 no.3 / pp.123-138 / 1998
  • The element-response characteristics of Panasonic UD802 personnel dosimeters in X, ${\beta}$, ${\gamma}$, ${\gamma}/X$, ${\gamma}/{\beta}$ and ${\gamma}$/neutron mixed fields were assessed. A dose-response algorithm was developed to determine the most probable radiation type and energy by using the distribution of all six ratios among the elements of the multi-element TLD. To calculate the four element response factors and the ratios between the elements of the Panasonic TLDs in the X, $\beta$, and $\gamma$ radiation fields, Panasonic UD802 TLDs were irradiated at KINS's reference irradiation facility. In the photon radiation field, this study confirms that element-3 (E3) and element-4 (E4) of the Panasonic TLDs are energy dependent in both the low- and intermediate-energy ranges, while element-1 (E1) and element-2 (E2) show little energy dependence over the entire range. The algorithm developed in this study was applied to the Panasonic personnel dosimetry system with the UD716AGL reader and UD802 TLDs. Performance tests of the developed algorithm were conducted according to the standards and criteria recommended in ANSI N13.11. The sum of biases and standard deviations was less than 0.232, and the biases and standard deviations are distributed within a triangle of lateral value 0.3 on the ordinate and abscissa. With the above algorithm, Panasonic TLDs perform satisfactory dose assessment even under an abnormal response of the TLD elements to the energy imparted. The algorithm can be applied to more rigorous dose assessment by distinguishing an unexpected dose from the planned dose for most practical purposes, and is useful in conducting an effective personnel dose control program.
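The "six ratios" of a four-element dosimeter mentioned above are simply the pairwise ratios E_i/E_j. A minimal sketch, with hypothetical normalized readings (not values from the paper):

```python
from itertools import combinations

def element_ratios(readings):
    """All six pairwise ratios E_i/E_j (i < j) of a 4-element TLD reading."""
    return {(i + 1, j + 1): readings[i] / readings[j]
            for i, j in combinations(range(4), 2)}

# Hypothetical normalized element readings E1..E4 for a photon field.
ratios = element_ratios([1.00, 0.95, 0.40, 0.55])
print(len(ratios))  # → 6
```

A classification algorithm of the kind described would compare such a ratio vector against reference distributions for each radiation type and energy.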


Comparative Study of Reliability Design Methods by Application to Donghae Harbor Breakwaters. 1. Stability of Armor Blocks (동해항 방파제를 대상으로 한 신뢰성 설계법의 비교 연구. 1 피복 블록의 안정성)

  • Kim Seung-Woo;Suh Kyung-Duck;Oh Young Min
    • Journal of Korean Society of Coastal and Ocean Engineers / v.17 no.3 / pp.188-201 / 2005
  • This is the first part of a two-part paper comparing reliability design methods by application to the Donghae Harbor breakwaters. This paper, Part 1, is restricted to the stability of armor blocks, while Part 2 deals with sliding of caissons. Reliability design methods have been developed for breakwater design since the mid-1980s. They are classified into three categories depending on the level of probabilistic concepts employed. In the Level 1 method, partial safety factors are used, which are predetermined depending on the allowable probability of failure. In the Level 2 method, the probability of failure is evaluated with the reliability index, which is calculated from the means and standard deviations of the load and resistance; the load and resistance are assumed to be normally distributed. In the Level 3 method, the cumulative quantity of failure (e.g. cumulative damage of armor blocks) during the lifetime of the breakwater is calculated without assuming normal distributions of load and resistance. Each method calculates different design parameters, but they can all be expressed in terms of probability of failure so that the differences among the methods can be compared. In this study, we applied the reliability design methods to the stability of the armor blocks of the breakwaters of Donghae Harbor, which were designed by traditional deterministic methods and damaged in 1987. Analyses are made for the breakwaters before the damage and after reinforcement. The probability of failure before the damage is much higher than the target probability of failure, while that for the reinforced breakwater is much lower than the target value, indicating that the breakwaters before damage and after reinforcement were under- and over-designed, respectively. On the other hand, the results of the different reliability design methods were in fairly good agreement, confirming that there is not much difference among the methods.
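The Level 2 formulation summarized in this abstract (reliability index from means and standard deviations of normally distributed load and resistance) can be sketched as follows; the input statistics are hypothetical, not the Donghae Harbor values.

```python
import math

def level2_beta(mu_R, sigma_R, mu_S, sigma_S):
    """Level 2 reliability index for independent normal resistance R and load S."""
    return (mu_R - mu_S) / math.sqrt(sigma_R**2 + sigma_S**2)

def failure_probability(beta):
    """P_f = Phi(-beta), the standard normal tail, via the complementary error function."""
    return 0.5 * math.erfc(beta / math.sqrt(2.0))

# Hypothetical normalized statistics for an armor-block stability function.
beta = level2_beta(mu_R=1.5, sigma_R=0.20, mu_S=1.0, sigma_S=0.15)
print(round(beta, 2), round(failure_probability(beta), 5))  # → 2.0 0.02275
```

Comparing this P_f against a target probability of failure is exactly the check the paper performs for the pre-damage and reinforced cross sections.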

Determinants of Consumer Preference by Type of Accommodation: Two-Step Cluster Analysis (이단계 군집분석에 의한 농촌관광 편의시설 유형별 소비자 선호 결정요인)

  • Park, Duk-Byeong;Yoon, Yoo-Shik;Lee, Min-Soo
    • Journal of Global Scholars of Marketing Science / v.17 no.3 / pp.1-19 / 2007
  • 1. Purpose Rural tourism is undertaken by individuals with different characteristics, needs and wants, so it is important to have information on the characteristics and preferences of the consumers of the different types of existing rural accommodation. The study aims to identify the determinants of consumer preference by type of accommodation. 2. Methodology 2.1 Sample Data were collected from 1,000 people by telephone survey with three-stage stratified random sampling in seven metropolitan areas in Korea. Respondents were chosen at sampling intervals from the telephone book published in 2006. Surveys were conducted between 4:00 and 10:30 p.m. to support systematic sampling that accounts for respondents' daily routines. 2.2 Two-Step Cluster Analysis The study uses a two-step cluster method to classify the accommodations into a reduced number of groups, so that each group constitutes a type. This method has been suggested as appropriate for clustering large data sets with mixed attributes: it is based on a distance measure that enables data with both continuous and categorical attributes to be clustered, derived from a probabilistic model in which the distance between two clusters is equivalent to the decrease in the log-likelihood function resulting from merging them. 2.3 Multinomial Logit Analysis The estimation of a Multinomial Logit model determines the characteristics of the tourists most likely to opt for each type of accommodation. The Multinomial Logit model is an appropriate framework for exploring and explaining a choice process where the choice set consists of more than two alternatives, and owing to its easy and quick parameter estimation it has been used in many empirical studies of choice in tourism. 3. Findings The auto-clustering algorithm indicated that a five-cluster solution was the best model, because it minimized the BIC value and the change in BIC between adjacent numbers of clusters.
The accommodation establishments can be classified into five types: Traditional House, Typical Farmhouse, Farmstay House for Group Tour, Log Cabin for Family, and Log Cabin for Individuals. Group 1 (Traditional House) mainly includes large establishments of original (traditional) construction, with Ondol-style rooms, meals provided, and one shower room shared by family tourists. Group 2 (Typical Farmhouse) encompasses establishments with Ondol rooms, each with its own bathroom, providing meals; in other words, the tourist accommodations known as "rural houses." Group 3 (Farmstay House for Group) has establishments with Ondol rooms that do not provide meals but offer self-cooking facilities and large rooms for more than five persons. Group 4 (Log Cabin for Family) mainly includes popular establishments of Western-style log construction, with Ondol-style rooms and one shower room for family tourists; while the accommodations in this group are not defined by construction type, the group also includes original Korean-style construction. Finally, Group 5 (Log Cabin for Individuals) includes accommodations that are Western-style wooden houses with bedrooms, each with its own bathroom. First, a Multinomial Logit model is estimated including all the explanatory variables considered, taking accommodation group 2 as the base alternative. The results give the significant variables and estimated parameter values of the model describing the probability of choosing each of the five types of accommodation available in rural tourism villages in Korea, according to the socio-economic and trip-related characteristics of the individuals. An initial observation of the analysis reveals that none of the variables income, number of journeys, distance, and residential house style is explicative in the choice of rural accommodation. The age and accompaniment variables are significant for the accommodation establishments of group 1.
The education and rural residential experience variables are significant for the accommodation establishments of groups 4 and 5. The expenditure and marital status variables are significant for group 4. The gender and occupation variables are significant for group 3. The loyalty variable is significant for groups 3 and 4. The study indicates that significant differences exist among the individuals who choose each type of accommodation at a destination. From this investigation it is evident that several profiles of tourists can be attracted by a rural destination, according to the types of accommodation existing at that destination. Moreover, the tourist profiles may be used as the basis for investment policy and promotion for each type of accommodation, making use in each case of the variables that indicate a greater likelihood of influencing the tourist's choice of accommodation.
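The Multinomial Logit choice probabilities underlying this analysis take a softmax form over the systematic utilities of the alternatives. A minimal sketch, with hypothetical utilities for the five accommodation types (not estimates from the paper):

```python
import math

def mnl_probabilities(utilities):
    """Multinomial logit choice probabilities: P_k = exp(V_k) / sum_j exp(V_j)."""
    m = max(utilities)                       # subtract max to stabilize exponentials
    exps = [math.exp(v - m) for v in utilities]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical systematic utilities for groups 1..5 for one respondent profile.
p = mnl_probabilities([0.4, 0.0, -0.3, 0.7, 0.1])
print([round(x, 3) for x in p])
```

With group 2 as the base alternative, its utility is fixed (here at 0.0) and the estimated coefficients shift the other alternatives' utilities according to the respondent's characteristics.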


A Study on GPU-based Iterative ML-EM Reconstruction Algorithm for Emission Computed Tomographic Imaging Systems (방출단층촬영 시스템을 위한 GPU 기반 반복적 기댓값 최대화 재구성 알고리즘 연구)

  • Ha, Woo-Seok;Kim, Soo-Mee;Park, Min-Jae;Lee, Dong-Soo;Lee, Jae-Sung
    • Nuclear Medicine and Molecular Imaging / v.43 no.5 / pp.459-467 / 2009
  • Purpose: Maximum likelihood-expectation maximization (ML-EM) is a statistical reconstruction algorithm derived from a probabilistic model of the emission and detection processes. Although ML-EM has many advantages in accuracy and utility, its use is limited by the computational burden of iterative processing on a CPU (central processing unit). In this study, we developed a parallel computing technique on a GPU (graphics processing unit) for the ML-EM algorithm. Materials and Methods: Using a GeForce 9800 GTX+ graphics card and NVIDIA's CUDA (compute unified device architecture), the projection and backprojection steps of the ML-EM algorithm were parallelized. The time spent per iteration on computing the projection, the errors between measured and estimated data, and the backprojection was measured. Total time included the latency of data transmission between RAM and GPU memory. Results: The total computation times of the CPU- and GPU-based ML-EM with 32 iterations were 3.83 and 0.26 sec, respectively; the computing speed was thus improved about 15-fold on the GPU. When the number of iterations increased to 1024, the CPU- and GPU-based computations took 18 min and 8 sec in total, respectively. The improvement was about 135-fold, caused by delays in the CPU-based computation after a certain number of iterations. The GPU-based computation, on the other hand, showed very little variation in time delay per iteration owing to its use of shared memory. Conclusion: GPU-based parallel computation for ML-EM significantly improved computing speed and stability. The developed GPU-based ML-EM algorithm could easily be modified for other imaging geometries.
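The projection/backprojection structure of one ML-EM iteration, which is what the paper parallelizes on the GPU, can be sketched on a CPU with NumPy. The tiny 3-detector, 2-voxel system below is a hypothetical illustration, not the paper's geometry.

```python
import numpy as np

def ml_em(A, y, n_iter=32, eps=1e-12):
    """Plain ML-EM update: x <- x / (A^T 1) * A^T (y / (A x)).
    A: system matrix (detectors x voxels); y: measured counts."""
    x = np.ones(A.shape[1])
    sens = A.sum(axis=0)                           # sensitivity image A^T 1
    for _ in range(n_iter):
        proj = A @ x                               # forward projection
        ratio = y / np.maximum(proj, eps)          # measured / estimated
        x *= (A.T @ ratio) / np.maximum(sens, eps) # backprojection and update
    return x

# Hypothetical noiseless system: the iterates converge to the true image.
A = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
x_true = np.array([2.0, 4.0])
print(np.round(ml_em(A, A @ x_true, n_iter=200), 2))
```

The forward projection, ratio, and backprojection are each embarrassingly parallel over detectors or voxels, which is why a CUDA kernel per step gives the large speedups reported.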

Comparative Study of Reliability Design Methods by Application to Donghae Harbor Breakwaters. 2. Sliding of Caissons (동해항 방파제를 대상으로 한 신뢰성 설계법의 비교 연구. 2. 케이슨의 활동)

  • Kim, Seung-Woo;Suh, Kyung-Duck;Oh, Young-Min
    • Journal of Korean Society of Coastal and Ocean Engineers / v.18 no.2 / pp.137-146 / 2006
  • This is the second part of a two-part paper comparing reliability design methods by application to the Donghae Harbor breakwaters. This paper, Part 2, deals with sliding of caissons. The failure modes of a vertical breakwater, which consists of a caisson mounted on a rubble mound, include sliding and overturning of the caisson and failure of the rubble mound or subsoil, of which sliding of the caisson occurs most frequently. The traditional deterministic design method for sliding failure of a caisson uses the concept of a safety factor, requiring that the resistance exceed the load by a certain factor (e.g. 1.2). The safety of a structure, however, cannot be quantitatively evaluated with a safety factor. The reliability design method, which has recently been the subject of active research, enables one to quantitatively evaluate the safety of a structure by calculating its probability of failure. Reliability design methods are classified into three categories, Levels 1, 2, and 3, depending on the level of probabilistic concepts employed. In this study, we apply the reliability design methods to the sliding of the caissons of the breakwaters of Donghae Harbor, which were designed by traditional deterministic methods and damaged in 1987. Analyses are made for the breakwaters before the damage and after reinforcement. The probability of failure before the damage is much higher than the allowable value, indicating that the breakwater was under-designed. The probability of failure after reinforcement, however, is close to the allowable value, indicating that the breakwater is no longer in danger. On the other hand, the results of the different reliability design methods are in fairly good agreement, confirming that there is not much difference among the methods.
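The deterministic sliding check contrasted with the reliability methods above can be sketched as a generic textbook-style computation. The friction coefficient and forces below are hypothetical, not the Donghae Harbor values.

```python
def sliding_safety_factor(mu, weight, buoyancy, uplift, horizontal_force):
    """Deterministic sliding check for a caisson:
    SF = mu * (W - B - U) / P, required to exceed a prescribed factor (e.g. 1.2)."""
    return mu * (weight - buoyancy - uplift) / horizontal_force

# Hypothetical caisson (forces in kN per unit length).
sf = sliding_safety_factor(mu=0.6, weight=5000.0, buoyancy=2000.0,
                           uplift=300.0, horizontal_force=1200.0)
print(round(sf, 2), sf >= 1.2)  # → 1.35 True
```

A single SF value like this says nothing about how likely failure is; the Level 1-3 methods replace it with partial factors or an explicit probability of failure.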

An Analysis of the Uncertainty Factors for the Life Cycle Cost of Light Railroad Transit (경량전철 교량 LCC분석을 위한 불확실성 인자 분석)

  • Won, Seo-Kyung;Lee, Du-Heon;Kim, Kyoon-Tai;Kim, Hyun-Bae;Jun, Jin-Taek;Han, Choong-Hee
    • Proceedings of the Korean Institute of Construction Engineering and Management / 2007.11a / pp.396-400 / 2007
  • Various automated guideway transit projects are being planned recently, owing to the policies of the national government and local municipalities as well as increasing investment from the private sector. In particular, private investment in SOC (social overhead capital) is increasing greatly. This trend of promoting private-sector investment must rest on a thorough analysis of the economic feasibility of the project by the government and the construction companies in the private sector. In other words, an accurate life-cycle cost analysis covering the initial investment (construction cost), maintenance/repair cost, profit from operating the facilities, cost of demolition, and so on is very much needed. Nevertheless, the analysis of uncertainty factors and the underlying probabilistic theory still need development before they can be used to analyze the economic feasibility of a construction project. Above all, actual studies on the maintenance/repair cost of automated guideway transit are as yet scarce, preventing an accurate computation of the cost and its economic analysis. Accordingly, this study focused on the uncertainty analysis of economic feasibility for the civil engineering structures of automated guideway transit projects, in view of the rapidly increasing private investment in such structures. For this purpose, a cost classification system for automated guideway transit is first proposed, and data on the life cycle of the civil structures and their unit costs are collected and analyzed. The uncertainty in the cost is then analyzed from the LCC perspective. Considering that there are at present almost no studies on the maintenance/repair of such facilities, the cost classification system and the uncertainty analysis technique proposed in this study are expected to greatly enhance LCC analysis and economic feasibility studies for automated guideway transit projects in the future.
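One common way to propagate cost uncertainty into an LCC figure is Monte Carlo sampling over uncertain cost inputs. The sketch below uses triangular distributions and a discount rate that are purely hypothetical illustration values, not the study's data.

```python
import random

def lcc_samples(n, rate=0.04, horizon=30):
    """Monte Carlo LCC: uncertain initial cost plus discounted annual
    maintenance/repair cost over the service horizon (triangular inputs)."""
    random.seed(7)  # reproducible illustration
    samples = []
    for _ in range(n):
        initial = random.triangular(90.0, 130.0, 100.0)  # construction cost
        annual = random.triangular(1.0, 3.0, 2.0)        # maintenance/repair per year
        pv = sum(annual / (1 + rate) ** t for t in range(1, horizon + 1))
        samples.append(initial + pv)
    return samples

s = lcc_samples(10_000)
print(round(sum(s) / len(s), 1))
```

The resulting sample distribution, rather than a single deterministic total, is what supports the kind of uncertainty-aware feasibility comparison the paper argues for.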


A Study of Adjustment for Beginning & Ending Points of Climbing Lanes (오르막차로 시.종점 위치의 보정에 관한 연구)

  • 김상윤;오흥운
    • Journal of Korean Society of Transportation / v.24 no.5 s.91 / pp.35-44 / 2006
  • Acceleration and deceleration curves have been used for design purposes worldwide. At the design level, a single deterministic curve has been used for the design of climbing lanes. It should be noted that the curve was originally derived from an ideal driving truck, and that it is applied in design under the assumption that there is no difference between ideal and real driving conditions. However, observations show that aged vehicles and inattentive driver behavior may yield lower vehicle performance than the ideal. The present paper provides observed truck speeds at climbing lanes and the resulting probabilistic variation of the acceleration and deceleration curves. For these purposes, vehicle makes and weights of trucks were identified at freeway toll gates, and vehicle-following speeds were then observed. The 85th-percentile results were compared with the deterministic performance curves for 180, 200, and 220 lb/hp. The performance indicated by the 85th-percentile vehicle-following-speed observations was lower than that of the deterministic performance curves. From these results, it may be concluded that an additional $16.19{\sim}67.94m$ is necessary at the beginning point of climbing lanes and an extension of $53.12{\sim}103.24m$ is necessary at the end point.
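The 85th-percentile speed used above is an order statistic; a minimal sketch with linear interpolation between ranked observations follows. The spot speeds are hypothetical, not the paper's data.

```python
def percentile_85(speeds):
    """85th-percentile speed by linear interpolation between order statistics."""
    s = sorted(speeds)
    k = 0.85 * (len(s) - 1)
    i = int(k)
    frac = k - i
    if i + 1 >= len(s):
        return s[i]
    return s[i] + frac * (s[i + 1] - s[i])

# Hypothetical truck spot speeds (km/h) observed on a grade.
print(percentile_85([52, 55, 58, 60, 61, 63, 64, 66, 68, 72, 75]))  # → 70.0
```

Repeating this at successive stations along the grade yields the observed speed profile that is compared against the deterministic lb/hp performance curves.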

A Proposal for Simplified Velocity Estimation for Practical Applicability (실무 적용성이 용이한 간편 유속 산정식 제안)

  • Tai-Ho Choo;Jong-Cheol Seo;Hyeon-Gu Choi;Kun-Hak Chun
    • Journal of Wetlands Research / v.25 no.2 / pp.75-82 / 2023
  • Stream flow measurements are used as important basic data for the development and maintenance of water resources, and many experts are conducting research to make them more accurate. In Korea especially, monsoon and heavy rains are concentrated in summer owing to the nature of the climate, so floods occur frequently; it is therefore necessary to measure the flow rate most accurately during a flood in order to predict and prevent flooding. The U.S. Geological Survey (USGS) introduces the 1-, 2-, and 3-point methods using a current meter as one way to measure the mean velocity. However, it is difficult to calculate the mean velocity accurately with the existing 1-, 2-, and 3-point methods alone. This paper proposes a new, more accurate set of 1-, 2-, and 3-point formulae utilizing the probabilistic entropy concept, and is considered a highly practical study that can supplement the limitations of the existing measurement methods. Coleman data and flume data were used to demonstrate the utility of the proposed formulae. In the analysis of the flume data, the existing USGS 1-point method showed an average error of 7.6% against the measured values, the 2-point method 8.6%, and the 3-point method 8.1%; for the Coleman data, the 1-point method showed an average error rate of 5%, the 2-point method 5.6%, and the 3-point method 5.3%. The proposed entropy-based formulae, in contrast, reduced the error rate to about 60% of that of the existing methods on the flume data, averaging 4.7% for the 1-point method, 5.7% for the 2-point method, and 5.2% for the 3-point method. In addition, on the Coleman data they showed an average error of 2.5% for the 1-point method, 3.1% for the 2-point method, and 2.8% for the 3-point method, about 50% of the error of the existing methods. This study can thus calculate the mean velocity more accurately than the existing 1-, 2-, and 3-point methods, which can be useful in many ways, including future river disaster management, design and administration.
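The existing USGS point methods referenced above estimate the mean velocity in a vertical from readings at fixed fractions of the depth. A minimal sketch of the standard formulae (the velocities are hypothetical example values; the paper's entropy-based corrections are not reproduced here):

```python
def one_point(v06):
    """USGS 1-point method: mean velocity taken as the velocity at 0.6 depth."""
    return v06

def two_point(v02, v08):
    """USGS 2-point method: mean of the velocities at 0.2 and 0.8 depth."""
    return (v02 + v08) / 2.0

def three_point(v02, v06, v08):
    """USGS 3-point method: weighted mean with the 0.6-depth reading counted twice."""
    return (v02 + 2.0 * v06 + v08) / 4.0

# Hypothetical vertical: velocities (m/s) at 0.2, 0.6, and 0.8 of the depth.
print(one_point(1.10), two_point(1.25, 0.95), three_point(1.25, 1.10, 0.95))
```

The paper's contribution is to replace these fixed-weight averages with entropy-derived estimates that track the measured mean velocity more closely.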

Semi-automated Tractography Analysis using the Allen Mouse Brain Atlas: Comparing DTI Acquisition between NEX and SNR (알렌 마우스 브레인 아틀라스를 이용한 반자동 신경섬유지도 분석 : 여기수와 신호대잡음비간의 DTI 획득 비교)

  • Im, Sang-Jin;Baek, Hyeon-Man
    • Journal of the Korean Society of Radiology / v.14 no.2 / pp.157-168 / 2020
  • Advancements in segmentation methodology have made automatic segmentation of brain structures from structural images accurate and consistent. One automatic segmentation method, which registers atlas information from template space to subject space, requires a high-quality atlas with accurate boundaries for consistent segmentation. The Allen Mouse Brain Atlas, widely accepted as a high-quality reference for the mouse brain, has been used in various segmentations and can provide accurate coordinates and boundaries of mouse brain structures for tractography. Through probabilistic tractography, diffusion tensor images can be used to map the comprehensive neuronal network of white matter pathways of the brain. Comparisons between the neural networks of mouse and human brains have shown that various clinical tests on mouse models can simulate the disease pathology of human brains, increasing the importance of clinical mouse brain studies. However, the difference in brain size between humans and mice has made it difficult to achieve the image quality necessary for analysis, and the conditions required for sufficient image quality, such as a long scan time, make using live samples unrealistic. In order to secure a mouse brain image with a sufficient scan time, an ex-vivo experiment on a mouse brain was conducted for this study. Using FSL, a tool for analyzing tensor images, we propose a semi-automated segmentation and tractography analysis pipeline for the mouse brain and apply it to various mouse models. In addition, to determine a useful signal-to-noise ratio for the diffusion tensor images acquired for tractography analysis, images with various numbers of excitations were compared.
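The NEX-versus-SNR comparison in this abstract rests on the standard MRI averaging relationship: with signal averaging, SNR grows with the square root of the number of excitations. A minimal sketch of that scaling (illustrative only, not the study's measured values):

```python
import math

def relative_snr(nex, nex_ref=1):
    """SNR relative to a reference acquisition: SNR scales as sqrt(NEX),
    since averaging N excitations reduces noise by sqrt(N)."""
    return math.sqrt(nex / nex_ref)

# Quadrupling the excitations doubles SNR; 9x the excitations triples it.
print(relative_snr(4), relative_snr(9))  # → 2.0 3.0
```

This square-root scaling is why long ex-vivo scans with many excitations are needed to reach an SNR sufficient for mouse-brain tractography, at scan times impractical for live animals.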