• Title/Summary/Keyword: weighted source

Search Results: 152

Correlation analysis between energy indices and source-to-node shortest pathway of water distribution network (상수도관망 수원-절점 최소거리와 에너지 지표 상관성 분석)

  • Lee, Seungyub;Jung, Donghwi
    • Journal of Korea Water Resources Association / v.51 no.11 / pp.989-998 / 2018
  • Connectivity between a water source and a demand node can serve as a critical performance indicator of the severity of water distribution network (WDN) failure under abnormal conditions. Graph theory-based approaches have been widely applied to quantify this connectivity because of the graph-like topological structure of WDNs. However, most previous studies used undirected, unweighted graph theory, which is not well suited to WDNs. In this study, directed, weighted graph theory was applied for WDN connectivity analysis. We also propose novel connectivity indicators, the Source-to-Node Shortest Pathway (SNSP) and the SNSP-Degree (SNSP-D), the inverse of the SNSP value, neither of which requires complicated hydraulic simulation of the WDN of interest. The proposed SNSP-D index was demonstrated on a total of 42 networks in J City, South Korea, where the Pearson correlation coefficient (PCC) between the proposed SNSP-D and four other system performance indicators was computed: three resilience indices and an energy efficiency metric. A system-representative value of the SNSP-D was confirmed to be strongly correlated with all resilience and energy efficiency indices (PCC = 0.87 on average). In particular, the PCC exceeded 0.93 for the modified resilience index (MRI) and the energy efficiency indicator. In addition, a multiple linear regression analysis was performed to identify the hydraulic characteristics of the systems that affect the correlation between SNSP-D and the other performance indicators. The proposed SNSP is expected to serve as a useful surrogate for resilience and/or energy efficiency indices in practice.
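
As a rough illustration of the directed, weighted shortest-path idea behind SNSP, the sketch below runs Dijkstra's algorithm over a small hypothetical pipe network (node names and pipe lengths are invented) and takes the reciprocal of each distance as an SNSP-D-style value; it is not the authors' implementation.

```python
# Minimal sketch: source-to-node shortest pathways on a directed, weighted graph.
import heapq

def shortest_pathways(graph, source):
    """Dijkstra's algorithm; graph maps node -> list of (neighbor, weight)."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Hypothetical WDN: edges directed along flow, weighted by pipe length (m).
wdn = {
    "source": [("A", 120.0), ("B", 200.0)],
    "A": [("B", 80.0), ("C", 150.0)],
    "B": [("C", 60.0)],
    "C": [],
}

snsp = shortest_pathways(wdn, "source")                   # SNSP per demand node
snsp_d = {n: 1.0 / d for n, d in snsp.items() if d > 0}   # SNSP-Degree (inverse)
print(snsp_d)
```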

A Minimum Cut Algorithm Using Maximum Adjacency Merging Method of Undirected Graph (무방향 그래프의 최대인접병합 방법을 적용한 최소절단 알고리즘)

  • Choi, Myeong-Bok;Lee, Sang-Un
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.13 no.1 / pp.143-152 / 2013
  • Given a weighted graph G=(V,E) with n=|V| and m=|E|, the minimum cut problem is classified into cases with a source s and sink t and cases without them. For an undirected weighted graph without s and t, the Stoer-Wagner algorithm is the most popular. It fixes an arbitrary vertex and constructs a maximum adjacency (MA) ordering; the sum of the weights of the edges incident to the last-ordered vertex is taken as a cut value, and the last two vertices are merged. The algorithm therefore runs $\frac{n(n-1)}{2}$ times. For a graph with s and t, the Ford-Fulkerson algorithm determines the bottleneck edge of an arbitrary augmenting path from s to t; when no augmenting path remains, the minimum cut value is obtained by combining all of the bottleneck edges. This paper proposes a minimum cut algorithm for an undirected weighted graph with s and t. The algorithm performs MA-merging and computes the cut value simultaneously. It runs n-1 times and successfully divides V into disjoint sets S and V-S on the basis of the minimum cut, whereas Stoer-Wagner sometimes fails. The proposed algorithm performs more work than the Ford-Fulkerson algorithm, but finds the minimum cut value within n-1 iterations.
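
For reference, the sketch below is a compact implementation of the MA-ordering-and-merge idea as used in the standard Stoer-Wagner global minimum cut; it is not the authors' s-t variant, and the adjacency-matrix input is a simplifying assumption.

```python
# Compact MA-ordering / MA-merging sketch in the style of Stoer-Wagner's
# global minimum cut. Input is a symmetric non-negative weight matrix.
def min_cut_ma(weights):
    n = len(weights)
    w = [row[:] for row in weights]
    vertices = list(range(n))
    groups = [[v] for v in range(n)]        # original vertices merged into each node
    best_cut, best_side = float("inf"), []
    while len(vertices) > 1:
        order = [vertices[0]]               # maximum adjacency (MA) ordering
        rest = vertices[1:]
        while rest:
            nxt = max(rest, key=lambda v: sum(w[v][u] for u in order))
            order.append(nxt)
            rest.remove(nxt)
        s, t = order[-2], order[-1]
        cut_of_phase = sum(w[t][u] for u in order[:-1])
        if cut_of_phase < best_cut:
            best_cut, best_side = cut_of_phase, list(groups[t])
        for v in vertices:                  # merge t into s
            if v not in (s, t):
                w[s][v] += w[t][v]
                w[v][s] = w[s][v]
        groups[s].extend(groups[t])
        vertices.remove(t)
    return best_cut, best_side

# Small example: the minimum cut separates vertex 3 with total weight 3.
W = [[0, 3, 1, 0],
     [3, 0, 3, 1],
     [1, 3, 0, 2],
     [0, 1, 2, 0]]
print(min_cut_ma(W))
```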

Analysis of Importance in Available Space for Creating Urban Forests to Reduce Particulate Matter - Using the Analytic Hierarchy Process - (미세먼지 저감 도시숲 조성을 위한 가용공간의 중요도 분석 - AHP 기법을 이용하여 -)

  • Jeong, Dae-Young;Choi, Yun-Eui;Chon, Jin-Hyung
    • Journal of the Korean Institute of Landscape Architecture / v.47 no.6 / pp.103-114 / 2019
  • Despite recent projects to create urban forests to reduce levels of particulate matter, objective evaluation criteria for selecting suitable project sites have not been provided. The purposes of this study are to identify assessment items for evaluating available spaces for urban forests intended to reduce particulate matter and to analyze the relative importance of those items using the Analytic Hierarchy Process (AHP). We identified a total of 19 items in five categories through a literature review and a panel discussion. A total of 29 responses were collected from expert surveys, and an AHP analysis was conducted on the results. 'Locational characteristics' (0.355) received the highest weighted value among the five categories, followed by 'planting type of existing green space' (0.184), 'weather conditions' (0.183), 'physical characteristics' (0.15), and 'human social environment' (0.128). Among all assessment items, 'proximity of source apportionment of particulate matter' (0.143) had the highest weighted value, while 'plantation of existing green space' (0.024) had the lowest. This study presents objective criteria and directions for selecting available spaces in which to create urban forests for the reduction of particulate matter.
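
As a hedged illustration of how AHP weights of this kind are derived, the sketch below computes priority weights and a consistency ratio from a small hypothetical pairwise comparison matrix; the matrix values are invented, not the study's survey data.

```python
# Minimal AHP sketch: derive category weights from a pairwise comparison matrix.
import numpy as np

A = np.array([
    [1.0,   2.0, 3.0],   # hypothetical judgments, e.g. category 1 vs. the others
    [1/2.0, 1.0, 2.0],
    [1/3.0, 1/2.0, 1.0],
])

# Geometric-mean approximation of the principal eigenvector.
gm = np.prod(A, axis=1) ** (1.0 / A.shape[0])
weights = gm / gm.sum()

# Consistency ratio (CR) against Saaty's random index (RI = 0.58 for n = 3).
lam_max = (A @ weights / weights).mean()
ci = (lam_max - A.shape[0]) / (A.shape[0] - 1)
cr = ci / 0.58
print(weights.round(3), round(cr, 3))
```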

Locally adaptive intelligent interpolation for population distribution modeling using pre-classified land cover data and geographically weighted regression (지표피복 데이터와 지리가중회귀모형을 이용한 인구분포 추정에 관한 연구)

  • Kim, Hwahwan
    • Journal of the Korean Association of Regional Geographers / v.22 no.1 / pp.251-266 / 2016
  • Intelligent interpolation methods such as dasymetric mapping are considered the best way to disaggregate zone-based population data, because they observe and utilize the internal variation within each source zone. This research reviews the advantages and problems of the dasymetric mapping method and presents a geographically weighted regression (GWR)-based method that accounts for the spatial heterogeneity of the population density-land cover relationship. The locally adaptive intelligent interpolation method can make use of readily available ancillary information in the public domain without additional data processing. In the case study, we use the pre-classified National Land Cover Dataset 2011 to compare the performance of the proposed method (the GWR-based multi-class dasymetric method) with four other popular population estimation methods: areal weighting interpolation, pycnophylactic interpolation, the binary dasymetric method, and a globally fitted ordinary least squares (OLS)-based multi-class dasymetric method. The GWR-based multi-class dasymetric method outperforms all of the others, which is attributed to the fact that spatial heterogeneity is accounted for when determining density parameters for the land cover classes.
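
The sketch below is a simplified, synthetic-data illustration of the GWR-style idea of locally varying density coefficients for land-cover classes; the Gaussian kernel, bandwidth, and rescaling step are assumptions, not the paper's exact specification.

```python
# GWR-flavoured multi-class dasymetric sketch on synthetic zones: regress zone
# population on land-cover class areas with distance-based weights, so density
# coefficients vary over space, then rescale to preserve the zone total.
import numpy as np

rng = np.random.default_rng(0)
n_zones, n_classes = 50, 3
xy = rng.uniform(0, 100, size=(n_zones, 2))              # zone centroids (km)
areas = rng.uniform(1, 10, size=(n_zones, n_classes))    # class areas per zone (km^2)
true_density = np.array([120.0, 30.0, 2.0])              # persons per km^2 by class
pop = areas @ true_density + rng.normal(0, 20, n_zones)  # zone populations

def local_densities(target_xy, bandwidth=30.0):
    """Weighted least squares (no intercept) with Gaussian distance weights."""
    d = np.linalg.norm(xy - target_xy, axis=1)
    sw = np.sqrt(np.exp(-0.5 * (d / bandwidth) ** 2))     # sqrt of kernel weights
    beta, *_ = np.linalg.lstsq(sw[:, None] * areas, sw * pop, rcond=None)
    return beta                                           # local density per class

beta0 = local_densities(xy[0])
raw = areas[0] * np.clip(beta0, 0.0, None)                # class-level estimates
disagg = raw / raw.sum() * pop[0]                         # rescale to the zone total
print(beta0.round(1), disagg.round(1))
```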

Elastic Wave Modeling Including Surface Topography Using a Weighted-Averaging Finite Element Method in Frequency Domain (지형을 고려한 주파수 영역 가중평균 유한요소법 탄성파 모델링)

  • Choi, Ji-Hyang;Nam, Myung-Jin;Min, Dong-Joo;Shin, Chang-Soo;Suh, Jung-Hee
    • Geophysics and Geophysical Exploration / v.11 no.2 / pp.93-98 / 2008
  • Surface topography has a significant influence on seismic wave propagation in reflection seismic exploration. The effects of surface topography on two-dimensional elastic wave propagation are investigated through modeling with a weighted-averaging (WA) finite element method (FEM), which is computationally more efficient than conventional FEM. The effects of an air layer on wave propagation are also investigated using flat-surface models with and without air. To validate the scheme for models that include topography, WA FEM results for irregular topography models are compared against those derived from conventional FEM using one set of rectangular elements. For the irregular surface topography models, the simulations show that breaks in slope act as new sources of diffracted waves, and that Rayleigh waves are more severely distorted by surface topography than P-waves.
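
As a loosely related illustration only, the sketch below assembles and solves a 1D scalar (acoustic) frequency-domain finite element system; it is a simplified analogue of frequency-domain wave modeling and does not reproduce the authors' 2D elastic weighted-averaging scheme.

```python
# 1D scalar analogue of frequency-domain wave FEM: assemble (K - (w/c)^2 M) u = f
# on linear elements and solve for the harmonic response to a point source.
import numpy as np

nelem, L, c, freq = 200, 2000.0, 1500.0, 5.0       # elements, length (m), velocity, Hz
h = L / nelem
omega = 2 * np.pi * freq
n = nelem + 1

K = np.zeros((n, n), dtype=complex)
M = np.zeros((n, n), dtype=complex)
ke = (1.0 / h) * np.array([[1, -1], [-1, 1]])      # element stiffness
me = (h / 6.0) * np.array([[2, 1], [1, 2]])        # consistent element mass
for e in range(nelem):
    idx = [e, e + 1]
    K[np.ix_(idx, idx)] += ke
    M[np.ix_(idx, idx)] += me

A = K - (omega / c) ** 2 * M                       # Helmholtz operator
f = np.zeros(n, dtype=complex)
f[n // 2] = 1.0                                    # point source at the midpoint
A[0, :] = 0; A[0, 0] = 1; f[0] = 0                 # fixed (Dirichlet) boundaries
A[-1, :] = 0; A[-1, -1] = 1; f[-1] = 0
u = np.linalg.solve(A, f)
print(abs(u).max())
```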

Resource Weighted Load Distribution Policy for Effective Transcoding Load Distribution (효과적인 트랜스코딩 부하 분산을 위한 자원 가중치 부하분산 정책)

  • Seo, Dong-Mahn;Lee, Joa-Hyoung;Choi, Myun-Uk;Kim, Yoon;Jung, In-Bum
    • Journal of KIISE: Computing Practices and Letters / v.11 no.5 / pp.401-415 / 2005
  • Owing to improved wireless communication technologies, it is now possible to provide multimedia streaming services to PDAs and mobile phones in addition to desktop PCs. Since mobile client devices have low computing power and limited bandwidth over wireless networks, transcoding technology that adapts media to the characteristics of these devices is necessary. Transcoding servers transcode source media into target media at the corresponding grades and must provide QoS in real time. In particular, an effective load balancing policy for transcoding servers is essential to support QoS for large numbers of mobile users. In this paper, a resource-weighted load distribution policy is proposed for fair load balancing and more scalable performance in cluster-based transcoding servers. The proposed policy is based on a resource weight table and the maximum number of supported users, which are pre-computed for each pre-defined grade. We implement the proposed policy on cluster-based transcoding servers and evaluate the fairness of its load distribution and its scalability with respect to the number of transcoding servers.
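
A minimal sketch of the general idea, under the assumption of a weighted-least-load rule: each job is sent to the server whose load relative to its resource weight is lowest. The server weights and per-grade costs below are hypothetical, not the paper's resource weight table.

```python
# Weighted-least-load dispatch sketch for grade-based transcoding jobs.
servers = {"s1": {"weight": 4.0, "load": 0.0},   # weight ~ relative server capacity
           "s2": {"weight": 2.0, "load": 0.0},
           "s3": {"weight": 1.0, "load": 0.0}}
grade_cost = {"high": 3.0, "mid": 2.0, "low": 1.0}  # assumed cost per grade

def dispatch(grade):
    """Send the job to the server with the lowest load-to-weight ratio."""
    name = min(servers, key=lambda s: servers[s]["load"] / servers[s]["weight"])
    servers[name]["load"] += grade_cost[grade]
    return name

jobs = ["high", "low", "mid", "high", "low", "low", "mid", "high"]
assignment = [(g, dispatch(g)) for g in jobs]
print(assignment)
print({s: v["load"] for s, v in servers.items()})
```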

Relationship between Abnormal Hyperintensity on T2-Weighted Images Around Developmental Venous Anomalies and Magnetic Susceptibility of Their Collecting Veins: In-Vivo Quantitative Susceptibility Mapping Study

  • Yangsean Choi;Jinhee Jang;Yoonho Nam;Na-Young Shin;Hyun Seok Choi;So-Lyung Jung;Kook-Jin Ahn;Bum-soo Kim
    • Korean Journal of Radiology / v.20 no.4 / pp.662-670 / 2019
  • Objective: A developmental venous anomaly (DVA) is a vascular malformation of ambiguous clinical significance. We aimed to quantify the susceptibility of draining veins (χvein) in DVAs and determine its significance with respect to oxygen metabolism using quantitative susceptibility mapping (QSM). Materials and Methods: Brain magnetic resonance imaging of 27 consecutive patients with incidentally detected DVAs was retrospectively reviewed. Based on the presence of abnormal hyperintensity on T2-weighted images (T2WI) in the brain parenchyma adjacent to the DVA, patients were grouped into edema (E+, n = 9) and non-edema (E-, n = 18) groups. A 3T MR scanner was used to obtain fully flow-compensated gradient echo images for susceptibility-weighted imaging, with the source images used for QSM processing. The χvein was measured semi-automatically using QSM, and the normalized χvein was also estimated. Clinical and MR measurements were compared between the E+ and E- groups using Student's t-test or the Mann-Whitney U test. Correlations between χvein and the area of hyperintensity on T2WI, and between χvein and the diameter of the collecting veins, were assessed; the correlation coefficient was also calculated using normalized veins. Results: DVAs in the E+ group had significantly higher χvein (196.5 ± 27.9 vs. 167.7 ± 33.6, p = 0.036) and larger draining vein diameters (p = 0.006), and the patients were older (p = 0.006) than those in the E- group. The χvein was also linearly correlated with the hyperintense area on T2WI (r = 0.633, 95% confidence interval 0.333-0.817, p < 0.001). Conclusion: DVAs with abnormal hyperintensity on T2WI have higher susceptibility values in their draining veins, indicating an increased oxygen extraction fraction that might be associated with venous congestion.
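
For the reported correlation statistic, the sketch below shows how a Pearson coefficient with a Fisher-z 95% confidence interval can be computed; the data are synthetic, not the patients' measurements.

```python
# Pearson correlation with a Fisher-z confidence interval on synthetic data.
import numpy as np

rng = np.random.default_rng(7)
chi_vein = rng.normal(180, 30, 27)                   # synthetic vein susceptibility
area = 0.05 * chi_vein + rng.normal(0, 1.5, 27)      # synthetic T2WI hyperintense area

r = np.corrcoef(chi_vein, area)[0, 1]
z = np.arctanh(r)                                    # Fisher z-transform
se = 1.0 / np.sqrt(len(area) - 3)
lo, hi = np.tanh(z - 1.96 * se), np.tanh(z + 1.96 * se)
print(round(r, 3), (round(lo, 3), round(hi, 3)))
```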

Comparison of Validity of Food Group Intake by Food Frequency Questionnaire Between Pre- and Post-adjustment Estimates Derived from 2-day 24-hour Recalls in Combination with the Probability of Consumption

  • Kim, Dong-Woo;Oh, Se-Young;Kwon, Sung-Ok;Kim, Jeong-Seon
    • Asian Pacific Journal of Cancer Prevention / v.13 no.6 / pp.2655-2661 / 2012
  • Validation of a food frequency questionnaire (FFQ) using a short-term measurement method is challenging when the reference method does not accurately reflect usual food intake. This is especially problematic for food groups that are not consumed on a daily basis, when episodically consumed foods are being related and compared. To overcome these challenges, several statistical approaches have been developed to estimate usual food intake distributions. The Multiple Source Method (MSM) can calculate usual food intake by combining the frequency questions of an FFQ with short-term food intake amount data. In this study, we applied the MSM to estimate usual food group intake and evaluate the validity of an FFQ in a group of 333 Korean children (aged 3-6 y) who completed two 24-hour recalls (24HR) and one FFQ in 2010. After adjusting the data using the MSM procedure, the true rate of non-consumption for all food groups was less than 1%, except for the beans group. The median Spearman correlation coefficients against the FFQ were 0.20 (range: 0.11 to 0.40) for the mean of the 2-day 24HR data and 0.35 (range: 0.14 to 0.60) for the MSM-adjusted data. The weighted kappa values against the FFQ ranged from 0.08 to 0.25 for the mean of the 2-day 24HR data and from 0.10 to 0.41 for the MSM-adjusted data. For most food groups, the MSM-adjusted data showed stronger correlations with the FFQ than the raw 2-day 24HR data, with improvements ranging from 0.03 (beverages) to 0.34 (mushrooms). The results indicate that applying the MSM, which provided a better estimate of usual intake, is worth considering in FFQ validation studies among Korean children.
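
As a hedged illustration of the agreement statistic used here, the sketch below computes a linearly weighted kappa between two ordinal ratings on synthetic quartile data; it is not the study's analysis code.

```python
# Linearly weighted kappa for two ordinal ratings (e.g. intake quartiles).
import numpy as np

def weighted_kappa(r1, r2, k, weights="linear"):
    obs = np.zeros((k, k))
    for a, b in zip(r1, r2):
        obs[a, b] += 1
    obs /= obs.sum()
    p1, p2 = obs.sum(axis=1), obs.sum(axis=0)
    exp = np.outer(p1, p2)                       # chance-expected proportions
    i, j = np.indices((k, k))
    if weights == "linear":
        w = abs(i - j) / (k - 1)                 # disagreement weights
    else:
        w = ((i - j) / (k - 1)) ** 2             # quadratic alternative
    return 1 - (w * obs).sum() / (w * exp).sum()

rng = np.random.default_rng(1)
ffq = rng.integers(0, 4, 300)                         # synthetic FFQ quartiles 0..3
ref = np.clip(ffq + rng.integers(-1, 2, 300), 0, 3)   # noisy reference quartiles
print(round(weighted_kappa(ffq, ref, 4), 3))
```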

Meeting of Gauss and Shannon at Coin Leaf in 5G Massive MIMO (5G Massive MIMO에서 가우스(Gauss)와 샤논(Shannon)이 동전 한 닢에서 만남)

  • Kim, Jeong-Su;Lee, Moon-Ho;Park, Daechul
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.18 no.2 / pp.89-103 / 2018
  • How did the genius Gauss, the "Prince of Mathematicians," and Shannon, the "Father of Communication," come to meet through the same creative idea? The answer is a single coin. Gauss found a creative shortcut in the problem of summing the integers from 1 to 100, and the resulting pattern matches the probability distribution curve obtained when coins are tossed. Shannon extended the Gaussian probability distribution to define entropy, taking the logarithm of the reciprocal probability of each source symbol and forming the weighted average. This is where Gauss and Shannon meet in the same coin toss. This paper focuses on that point and gives simple derivations of the Gaussian distribution and Shannon entropy. As an application example, we obtain the capacity and transition probability of the Jeongnang channel (the traditional Jeju gate code); the Shannon channel capacity is 1 when the equivalent transition probability is 1/2.
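
The entropy definition the abstract refers to, i.e. the probability-weighted average of log2(1/p), can be checked in a few lines; for a fair coin (p = 1/2) it evaluates to exactly 1 bit.

```python
# Shannon entropy as the weighted average of log2(1/p) over source symbols.
import math

def entropy(probs):
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit for a fair coin toss
print(entropy([0.9, 0.1]))   # < 1 bit for a biased coin
```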

Hydrologic Response Analysis Considering the Scale Problem : Part 1. Derivation of the Model (규모문제를 고려한 수문응답의 해석 : 1. 모형이론의 유도)

  • 성기원;선우중호
    • Water for Future / v.28 no.4 / pp.185-194 / 1995
  • The objective of this study is to explore the scale problem and to analyze the relationship between scale and the geomorphologic parameters of a rainfall-runoff model. In general, the measurement and calculation of geomorphologic parameters depend on, and are sensitive to, the resolution of the available source information. Therefore, rainfall-runoff models that use geomorphologic parameters should account for the effect of the map scale used in their development. The rainfall-runoff model derived in this research to address the scale problem is a GIUH-type model, that is, a basin IUH consisting of a channel network response and a hillslope response. The channel network response is computed by means of the diffusion analogy obtained from the linearized St. Venant equations, and the hillslope response is calculated with a two-parameter gamma distribution function. The width function represents the geomorphologic structure of the channel network and the initial distribution of its response. This width function is derived using fractal theory and Melton's law to account for scale problems, and is weighted by the source location function (SLF) proposed in this research to increase its applicability.
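
As an illustration of the two-parameter gamma (Nash-type) response mentioned for the hillslope component, the sketch below evaluates a gamma IUH and convolves it with a short synthetic effective-rainfall series; the parameter values are illustrative, not calibrated results from the study.

```python
# Two-parameter gamma IUH: u(t) = t^(N-1) exp(-t/K) / (K^N * Gamma(N)),
# convolved with effective rainfall to give a direct-runoff hydrograph.
import math

def gamma_iuh(t, N=3.0, K=2.0):                # t and K in hours (assumed units)
    return t ** (N - 1) * math.exp(-t / K) / (K ** N * math.gamma(N))

dt = 0.5
iuh = [gamma_iuh(i * dt, N=3.0, K=2.0) for i in range(100)]
rain = [0.0, 5.0, 12.0, 4.0] + [0.0] * 96      # synthetic effective rainfall (mm/step)

# Discrete convolution of effective rainfall with the IUH gives direct runoff.
runoff = [dt * sum(rain[j] * iuh[i - j] for j in range(i + 1)) for i in range(100)]
print(round(max(runoff), 3))
```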
