• Title/Summary/Keyword: Digital Engineering Model


Long-term Runoff Analysis Using the TOPMODEL (TOPMODEL을 이용한 장기유출 해석)

  • Jo, Hong-Je;Kim, Jeong-Sik;Lee, Geun-Bae
    • Journal of Korea Water Resources Association
    • /
    • v.33 no.4
    • /
    • pp.393-405
    • /
    • 2000
  • Monthly runoff was estimated using TOPMODEL, which simulates groundwater movement as well as surface runoff over a catchment. The SAYUN dam, operated by the Korea Water Resources Corporation, was selected for the study, and the topographic factors of the watershed were analyzed using a 1/5,000 digital map and GIS software (Arc/Info). The comparison shows good agreement between the observed monthly runoff and the results simulated with TOPMODEL. The catchment of the SAYUN dam was modeled with various grid sizes to check sensitivity to grid size, and a grid size of 180 m was found to be the most suitable of the six sizes tested. TOPMODEL was also found to be superior to existing monthly runoff models such as the Kajiyama, KRIHS and Tank models. Because the model requires only a limited number of parameters and accounts for topography, it is considered very useful in practice (see the sketch after this entry).

  • PDF
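
TOPMODEL's central topographic quantity is the wetness index ln(a / tan β), computed from the gridded DEM mentioned in the abstract above. The following Python sketch illustrates one minimal way to obtain that index with a single-flow-direction (D8) scheme on a regular grid; it is not the authors' code, and the function name, the NaN handling and the choice of cell size are assumptions for illustration.

```python
import numpy as np

def topographic_index(dem, cell_size):
    """Topographic wetness index ln(a / tan(beta)) on a D8 grid.

    dem       : 2-D array of elevations (no-data cells as np.nan)
    cell_size : grid spacing in metres (e.g. a 180 m grid)
    """
    rows, cols = dem.shape
    # Eight D8 neighbours and the distances to them.
    offsets = [(-1, -1), (-1, 0), (-1, 1),
               (0, -1),           (0, 1),
               (1, -1),  (1, 0),  (1, 1)]
    dists = [np.hypot(dr, dc) * cell_size for dr, dc in offsets]

    # Contributing area starts with each cell's own area.
    area = np.full(dem.shape, cell_size ** 2, dtype=float)
    slope = np.zeros(dem.shape)

    # Visit cells from highest to lowest so upslope area is complete
    # before it is handed downslope.
    order = np.argsort(dem, axis=None)[::-1]
    for idx in order:
        r, c = divmod(idx, cols)
        if np.isnan(dem[r, c]):
            continue
        # Steepest downslope neighbour (single flow direction).
        best, best_slope = None, 0.0
        for (dr, dc), d in zip(offsets, dists):
            rr, cc = r + dr, c + dc
            if 0 <= rr < rows and 0 <= cc < cols and not np.isnan(dem[rr, cc]):
                s = (dem[r, c] - dem[rr, cc]) / d
                if s > best_slope:
                    best, best_slope = (rr, cc), s
        if best is not None:
            area[best] += area[r, c]      # pass accumulated area downslope
            slope[r, c] = best_slope
    a = area / cell_size                  # specific catchment area
    tan_beta = np.maximum(slope, 1e-6)    # avoid division by zero on flats
    twi = np.log(a / tan_beta)
    twi[np.isnan(dem)] = np.nan
    return twi
```

Running this on a DEM resampled to different cell sizes (such as the 180 m grid reported as most suitable) is one way to probe the grid-size sensitivity discussed in the abstract.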

Application of the L-index to the Delineation of Market Areas of Retail Businesses

  • Lee, Sang-Kyeong;Lee, Byoungkil
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography
    • /
    • v.32 no.3
    • /
    • pp.245-251
    • /
    • 2014
  • As delineating the market areas of retail businesses has become a topic of interest in the marketing field, Lee and Lee recently suggested a noteworthy method that applies the hydrological analysis of a geographic information system (GIS), based on Christaller's central place theory. They used a digital elevation model (DEM) obtained by inverting the kernel density of retail businesses, measured with pre-determined bandwidths of 500, 1000 and 5000 m. Their method is not fully data-driven, however, in that it relies on pre-determined kernel bandwidths, so this paper improves it with a data-driven approach based on the L-index, which describes the clustering level of a point-feature distribution. A case study is carried out for automobile-related retail businesses in Seoul, Korea, with kernel bandwidths of 1211.5, 2120.2 and 7067.2 m selected from the L-index analysis. The kernel density is then measured, the density DEM is created by inverting it, and the boundaries of the market areas are extracted. The analysis results are summarized as follows. First, the L-index is a useful tool to complement Lee and Lee's market area analysis method. Second, the kernel bandwidths pre-determined by Lee and Lee cannot be applied uniformly to all kinds of retail businesses. Finally, the L-index method is useful for analyzing the spatial structure of the market areas of retail businesses based on Christaller's central place theory.
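
The L-index used above is a second-order point-pattern statistic; a common form is Ripley's L-function, L(r) = sqrt(K(r) / π). The sketch below is a minimal, edge-correction-free illustration of evaluating such a function and reading off a bandwidth where L(r) − r peaks; the estimator variant and the bandwidth-selection rule are assumptions and may differ from the paper's exact procedure.

```python
import numpy as np

def ripley_l(points, area, radii):
    """Ripley's L-function L(r) = sqrt(K(r) / pi), without edge correction.

    points : (n, 2) array of x, y coordinates in metres
    area   : area of the study region in square metres
    radii  : 1-D array of distances r at which to evaluate L
    """
    n = len(points)
    # Pairwise distances, ignoring each point's distance to itself.
    diff = points[:, None, :] - points[None, :, :]
    d = np.hypot(diff[..., 0], diff[..., 1])
    np.fill_diagonal(d, np.inf)

    l_vals = []
    for r in radii:
        k = area * np.sum(d <= r) / (n * (n - 1))
        l_vals.append(np.sqrt(k / np.pi))
    return np.array(l_vals)

# A kernel bandwidth can then be chosen where clustering is strongest,
# i.e. where L(r) - r reaches its maximum:
#   radii = np.linspace(100, 10000, 200)
#   l = ripley_l(xy, study_area, radii)
#   bandwidth = radii[np.argmax(l - radii)]
```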

A Digital Elevation Analysis: Spatially Distributed Flow Apportioning Algorithm (수치 고도 분석 : 분포형 흐름 분배 알고리즘)

  • Kim, Sang-Hyeon;Kim, Gyeong-Hyeon;Jeong, Seon-Hui
    • Journal of Korea Water Resources Association
    • /
    • v.34 no.3
    • /
    • pp.241-251
    • /
    • 2001
  • A flow determination algorithm is proposed for distributed hydrologic modeling. The advantages of single and multiple flow direction schemes are selectively combined to address the drawbacks of existing algorithms. A spatially varied flow apportioning factor is introduced to account for the area accumulated from upslope cells. The channel initiation threshold area (CIT) concept is extended and integrated into the spatially distributed flow apportioning algorithm in order to delineate a realistic channel network. Application to a field example suggests that the linearly distributed flow apportioning scheme offers several advantages over existing approaches, such as relaxing the over-dissipation problem near channel cells, preserving the connectivity of river cells and the continuity of saturated areas, and removing the need to optimize the few parameters required by existing algorithms. The effects of grid size are explored spatially as well as statistically (see the sketch after this entry).

  • PDF
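
For context, the sketch below illustrates a generic slope-weighted multiple-flow-direction scheme that apportions accumulated area among all downslope neighbours of each cell. The paper's algorithm additionally blends single-flow-direction behaviour through a spatially varied apportioning factor and applies a CIT threshold to initiate channels, none of which is reproduced here; the exponent p and the function name are assumptions.

```python
import numpy as np

def apportion_flow(dem, cell_size, p=1.1):
    """Multiple-flow-direction accumulation with slope-weighted apportioning.

    Each cell passes its accumulated area to all lower D8 neighbours,
    weighted by slope**p, so steeper directions receive a larger share.
    The exponent p is illustrative only.
    """
    rows, cols = dem.shape
    offsets = [(-1, -1), (-1, 0), (-1, 1),
               (0, -1),           (0, 1),
               (1, -1),  (1, 0),  (1, 1)]
    dists = [np.hypot(dr, dc) * cell_size for dr, dc in offsets]

    area = np.full(dem.shape, cell_size ** 2, dtype=float)
    order = np.argsort(dem, axis=None)[::-1]   # highest cells first
    for idx in order:
        r, c = divmod(idx, cols)
        weights, targets = [], []
        for (dr, dc), d in zip(offsets, dists):
            rr, cc = r + dr, c + dc
            if 0 <= rr < rows and 0 <= cc < cols and dem[rr, cc] < dem[r, c]:
                slope = (dem[r, c] - dem[rr, cc]) / d
                weights.append(slope ** p)
                targets.append((rr, cc))
        total = sum(weights)
        if total > 0:
            # Split the accumulated area among downslope neighbours.
            for w, (rr, cc) in zip(weights, targets):
                area[rr, cc] += area[r, c] * w / total
    return area
```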

Computer Vision Based Measurement, Error Analysis and Calibration (컴퓨터 시각(視覺)에 의거한 측정기술(測定技術) 및 측정오차(測定誤差)의 분석(分析)과 보정(補正))

  • Hwang, H.;Lee, C.H.
    • Journal of Biosystems Engineering
    • /
    • v.17 no.1
    • /
    • pp.65-78
    • /
    • 1992
  • When a computer vision system is used for measurement, geometric distortion of the input image usually restricts the location and size of the measuring window. A geometrically distorted image, caused by the image sensing and processing hardware, degrades the accuracy of the visual measurement and prevents arbitrary selection of the measuring scope, so image calibration is essential to improve measuring accuracy. Calibration is usually done in four steps: measurement, modeling, parameter estimation and compensation. In this paper, an efficient technique for calibrating the error of a geometrically distorted input image was developed using a neural network. After calibrating a unit pixel, the distorted image was compensated by training a CMLAN (Cerebellar Model Linear Associator Network), without modeling the behavior of any system element. The input/output training pairs for the network were obtained by processing the image of a devised sampling pattern. The generalization property of the network successfully compensates the distortion errors of untrained, arbitrary pixel points in the image space. The error convergence of the trained network with respect to the network control parameters is also presented. The image compensated through the network was then post-processed with a simple DDA (Digital Differential Analyzer) to avoid pixel disconnectivity. The compensation effect was verified using geometric primitives of known size. A method for directly extracting real-scale geometric quantities of an object from its 8-directional chain code was also devised and coded (see the sketch after this entry). Since the developed calibration algorithm requires neither modeling of system elements nor parameter estimation, it can be applied readily to any image processing system; it efficiently enhances measurement accuracy and allows arbitrary sizing and positioning of the measuring window. The algorithms were implemented as a menu-driven program using MS-C Ver. 6.0, PC VISION PLUS library functions and VGA graphics functions.

  • PDF
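
As an illustration of the chain-code step mentioned in the abstract above, the sketch below converts an 8-directional chain code into a real-scale boundary length: axis-aligned moves contribute one pixel, diagonal moves contribute √2 pixels, and a calibrated pixel size converts the total into physical units. The direction convention, function name and pixel size are assumptions for illustration only.

```python
import math

def chain_code_perimeter(codes, pixel_size_mm=1.0):
    """Real-scale boundary length implied by an 8-directional chain code.

    codes         : iterable of Freeman codes 0..7 (0 = east, counted CCW);
                    even codes are axis-aligned moves, odd codes are diagonal
    pixel_size_mm : calibrated size of one pixel (hypothetical value)
    """
    length_px = sum(math.sqrt(2) if c % 2 else 1.0 for c in codes)
    return length_px * pixel_size_mm

# Example: a closed square traced with axis-aligned moves only.
# print(chain_code_perimeter([0, 0, 2, 2, 4, 4, 6, 6], pixel_size_mm=0.5))
```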

Comparison of Topex/Poseidon sea level data and Tide Gauge sea level data from the South Indian Ocean (남인도양에서의 해수면에 대한 위성자료(Topex/Poseidon 고도계)와 현장자료(Tide Gauge 해면계)간의 비교)

  • 윤홍주;김상우;이문옥;박일흠
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2001.10a
    • /
    • pp.281-285
    • /
    • 2001
  • Following the standard procedures defined in the users' handbook for sea level data processing, Topex/Poseidon sea level data from the first 350 days of the mission were compared with tide gauge sea level data from the Amsterdam-Crozet-Kerguelen region of the South Indian Ocean. The comparison improves significantly once the correction factors are applied; over this period only the aliased oceanic tidal energy is removed, using an ocean tide model. After applying the corrections, smoothing the altimeter data over 60 km along-track segments and forming time series against the tide gauge records yields correlations and RMS differences between the two data sets of c = -0.12 and RMS = 11.4 cm at Amsterdam, c = 0.55 and RMS = 5.38 cm at Crozet, and c = 0.83 and RMS = 2.83 cm at the Kerguelen Plateau. The comparison at the Kerguelen Plateau is also affected by propagating signals (a baroclinic Rossby wave with a velocity of -3.9 to -4.2 cm/s, a period of 167 days and an amplitude of 10 cm) that introduce temporal lags (τ = 10-30 days) between the altimeter and tide gauge time series. The conclusion is that on timescales longer than about 10 days the RMS sea level errors are less than, or of the order of, several centimeters and are mainly due to the effects of currents rather than steric effects (water temperature, density) or winds (see the sketch after this entry).

  • PDF
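
The statistics quoted above are a correlation coefficient, an RMS difference and a temporal lag between two sea level series. The sketch below shows a minimal, generic way to compute them for equally sampled, gap-free, co-located daily series; it is not the handbook procedure followed in the study, and the daily sampling and function name are assumptions.

```python
import numpy as np

def compare_series(altimeter, tide_gauge, max_lag_days=30):
    """Correlation, RMS difference and best lag between two equally sampled
    sea level time series (values in centimetres, one sample per day)."""
    a = np.asarray(altimeter, dtype=float)
    g = np.asarray(tide_gauge, dtype=float)
    rms = np.sqrt(np.mean((a - g) ** 2))
    corr = np.corrcoef(a, g)[0, 1]

    # Lag (in days) at which the correlation between the series is highest.
    best_lag, best_corr = 0, corr
    for lag in range(1, max_lag_days + 1):
        c = np.corrcoef(a[lag:], g[:-lag])[0, 1]
        if c > best_corr:
            best_lag, best_corr = lag, c
    return corr, rms, best_lag, best_corr
```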

Analysis of Shadows Effect in Seoul Area for the Estimation of Roof-type PV Power Calculation (지붕형 태양광 발전량 산정을 위한 서울지역 그림자 효과 분석)

  • Yun, ChangYeol;Jung, BoRin;Kim, ShinYoung;Kim, ChangKi;Kim, JinYoung;Kim, HyunGoo;Kang, YongHeack;Kim, YongIl
    • Journal of the Korean Solar Energy Society
    • /
    • v.38 no.2
    • /
    • pp.45-53
    • /
    • 2018
  • As a preliminary step toward estimating the performance of roof-type photovoltaic systems in urban areas, we analyzed the ratio by which solar radiation is reduced by building shadows, using a DSM (Digital Surface Model) and GIS (Geographic Information System) tools. The average shading loss in Seoul is about 19%, a result related to building density and distribution. Monthly results show that the winter months (December and January) are more affected by shading than the summer months (June and July). Useful empirical formulas are expected to follow if more detailed correlation studies are performed.
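
As a rough illustration of the DSM-based shading analysis described above, the sketch below flags the cells of a DSM that are shaded for a given solar azimuth and altitude by marching from each cell toward the sun and testing whether any surface rises above the sun ray. It is a brute-force illustration only; the grid conventions, the search distance and the function name are assumptions, and a real PV loss estimate would further weight such masks by irradiance over the year.

```python
import numpy as np

def shadow_mask(dsm, cell_size, sun_azimuth_deg, sun_altitude_deg, max_dist=500.0):
    """Boolean mask of DSM cells shaded by surrounding surfaces.

    Azimuth is measured clockwise from north; rows are assumed to
    increase southwards and columns eastwards.
    """
    rows, cols = dsm.shape
    az = np.radians(sun_azimuth_deg)
    alt = np.radians(sun_altitude_deg)
    dx, dy = np.sin(az), -np.cos(az)     # step toward the sun in grid units
    tan_alt = np.tan(alt)

    shaded = np.zeros(dsm.shape, dtype=bool)
    steps = int(max_dist / cell_size)
    for r in range(rows):
        for c in range(cols):
            for k in range(1, steps + 1):
                rr = int(round(r + dy * k))
                cc = int(round(c + dx * k))
                if not (0 <= rr < rows and 0 <= cc < cols):
                    break
                # Height of the sun ray above the cell at this distance.
                horizon = dsm[r, c] + k * cell_size * tan_alt
                if dsm[rr, cc] > horizon:
                    shaded[r, c] = True
                    break
    return shaded
```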

Spatial analysis of Shoreline change in Northwest coast of Taean Peninsula

  • Yun, MyungHyun;Choi, ChulUong
    • Korean Journal of Remote Sensing
    • /
    • v.31 no.1
    • /
    • pp.29-38
    • /
    • 2015
  • The coastline, influenced both naturally and artificially, changes dynamically. Long-term change is driven by sea level rise and changes in river water levels, while short-term change is driven by tides, earthquakes and storms. In addition, ill-considered development, such as the construction of embankments and reclaimed land without regard for coastal erosion and deformation, has impaired coastal functions and damaged the natural environment. To manage the coastal environment and its resources effectively, this study analyzes and predicts coastal erosion and changes in sedimentation quantitatively by detecting shoreline changes from satellite images and aerial LiDAR data. The 2007 and 2012 shorelines were extracted from Digital Surface Models (DSM) built from aerial LiDAR data. The 2009 and 2010 shorelines were extracted with the Normalized Difference Vegetation Index (NDVI) method from KOMPSAT-2 images selected with tide level and wave height taken into account. The rate of shoreline change varies with the form of the observed feature, but most of the topography shows a tendency toward erosion over time. Compared with the relatively uniform beaches of Taean, the gravel and rock shores have very complex forms, so shoreline extraction and the intersection of transects with the shoreline carry larger errors, which affect the overall change estimates. Correction of the anomalies caused by these characteristics is therefore required in future research.
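
The NDVI step described above reduces to a band ratio and a threshold; the sketch below shows a minimal version that produces a land/water mask whose edge approximates the shoreline. The band names, the zero threshold and the function name are assumptions and would need tuning for actual KOMPSAT-2 scenes.

```python
import numpy as np

def ndvi_shoreline_mask(nir, red, threshold=0.0):
    """Binary land mask from an NDVI threshold.

    nir, red  : 2-D arrays of reflectance (e.g. KOMPSAT-2 NIR and red bands)
    threshold : NDVI value separating water from land; 0.0 is only a
                starting point and must be tuned for a real scene.
    """
    nir = nir.astype(float)
    red = red.astype(float)
    ndvi = (nir - red) / np.maximum(nir + red, 1e-6)
    return ndvi > threshold   # True = land; the shoreline is the mask edge
```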

Applicable Evaluation of the Latest Land-use Data for Developing a Real-time Atmospheric Field Prediction of RAMS (RAMS의 실시간 기상장 예측 향상을 위한 최신 토지피복도 자료의 적용가능성)

  • Won, Gyeong-Mee;Lee, Hwa-Woon;Yu, Jeong-Ah;Hong, Hyun-Su;Hwang, Man-Sik;Chun, Kwang-Su;Choi, Kwang-Su;Lee, Moon-Soon
    • Journal of Korean Society for Atmospheric Environment
    • /
    • v.24 no.1
    • /
    • pp.1-15
    • /
    • 2008
  • The Chemical Accident Response Information System (CARIS), designed for efficient emergency response to chemical accidents, produces real-time atmospheric fields with the Regional Atmospheric Modeling System (RAMS). Previous studies emphasized that improving the initial input data is one of the most effective ways to improve the predictive ability of an atmospheric model. Continuing this effort, we replaced the land-use dataset used in RAMS, a high-resolution USGS digital dataset constructed in April 1993, with the latest land-use data of the Korea Ministry of Environment over South Korea, and simulated atmospheric fields to improve real-time prediction of chemical dispersion. The new land-use data were written in the standard RAMS format and reflect the modified surface characteristics and the landscape heterogeneity resulting from land-use change. The sensitivity experiments produced improved atmospheric fields, giving confidence that the new data will provide CARIS users with more reliable real-time atmospheric fields for dispersion forecasts associated with hazardous chemical releases as well as general air pollutants.
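
The replacement described above amounts to reclassifying a national land-cover raster into the vegetation classes the atmospheric model expects and writing it in the model's format. The sketch below shows only the reclassification step, with an entirely hypothetical class correspondence; the actual mapping and the RAMS output format used in the paper are not reproduced here.

```python
import numpy as np

# Hypothetical correspondence between land-cover codes and model
# vegetation classes; the mapping actually used in the paper is not shown.
CLASS_MAP = {
    10: 19,   # e.g. urban / built-up
    20: 15,   # e.g. cropland
    30: 6,    # e.g. deciduous forest
    40: 4,    # e.g. evergreen needleleaf forest
    70: 0,    # e.g. water
}

def remap_landuse(landcover, class_map=CLASS_MAP, fill_value=0):
    """Remap a land-cover code raster to model vegetation classes."""
    out = np.full(landcover.shape, fill_value, dtype=np.int16)
    for src, dst in class_map.items():
        out[landcover == src] = dst
    return out
```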

Design and Implementation of Tree-based Reliable Dissemination Multicast Protocol With Differential Control and Key Management (차별 제어와 키 관리 기능을 통한 트리 기반의 신뢰성 있는 멀티캐스트 프로토콜의 설계 및 구현)

  • Kim, Yeong-Jae;Park, Eun-Yong;An, Sang-Jun;Hyeon, Ho-Jae;Han, Seon-Yeong
    • The KIPS Transactions:PartC
    • /
    • v.9C no.2
    • /
    • pp.235-246
    • /
    • 2002
  • While the Internet suffers under massive data volumes such as video streams, IP multicast can ease the load by enabling one copy of digital information to be received by multiple computers simultaneously. If multicast runs over UDP, however, packets are delivered with a best-effort policy and without reliability, congestion control or flow control. Moreover, because members can join or leave a multicast group at will and multicast relies on a broadcast mechanism, it is very hard to secure the traffic against unauthorized members. In this paper, we introduce TRDMF, a new reliable multicast protocol suited to the one-to-many multicast model, which provides reliability, flow control, congestion control and key management.
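
Tree-based reliable multicast protocols typically let a designated node in each subtree cache packets and answer its children's retransmission requests locally. The sketch below illustrates that generic idea only; it is not TRDMF, and the class name, NACK handling and cache policy are assumptions.

```python
from collections import OrderedDict

class RepairNode:
    """Generic tree-based local recovery node (illustrative only).

    A designated node in the multicast tree caches recently seen packets
    and answers NACKs from its children, so retransmissions stay within
    a subtree instead of burdening the sender.
    """

    def __init__(self, cache_size=1024):
        self.cache = OrderedDict()          # sequence number -> payload
        self.cache_size = cache_size
        self.highest_seq = -1

    def on_data(self, seq, payload):
        """Cache a received packet and report any sequence gap."""
        self.cache[seq] = payload
        if len(self.cache) > self.cache_size:
            self.cache.popitem(last=False)  # evict the oldest entry
        missing = list(range(self.highest_seq + 1, seq))
        self.highest_seq = max(self.highest_seq, seq)
        return missing                      # a child would NACK these upstream

    def on_nack(self, seq):
        """Return the cached payload for a child's NACK, if still held."""
        return self.cache.get(seq)
```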

Complexity Estimation Based Work Load Balancing for a Parallel Lidar Waveform Decomposition Algorithm

  • Jung, Jin-Ha;Crawford, Melba M.;Lee, Sang-Hoon
    • Korean Journal of Remote Sensing
    • /
    • v.25 no.6
    • /
    • pp.547-557
    • /
    • 2009
  • LIDAR (LIght Detection And Ranging) is an active remote sensing technology which provides 3D coordinates of the Earth's surface by performing range measurements from the sensor. Early small-footprint LIDAR systems recorded multiple discrete returns from the back-scattered energy. Recent advances in LIDAR hardware make it possible to record full digital waveforms of the returned energy. LIDAR waveform decomposition involves separating the return waveform into a mixture of components which are then used to characterize the original data. The most common statistical mixture model used for this process is the Gaussian mixture. Waveform decomposition plays an important role in LIDAR waveform processing, since the resulting components are expected to represent reflecting surfaces within waveform footprints; the decomposition results therefore ultimately affect the interpretation of LIDAR waveform data. The computational requirements of the waveform decomposition process result from two factors: (1) estimation of the number of components in a mixture and of the resulting parameters, which are inter-related and cannot be solved separately, and (2) parameter optimization, which has no closed-form solution and thus must be solved iteratively. A current state-of-the-art airborne LIDAR system acquires more than 50,000 waveforms per second, so decomposing this enormous number of waveforms is challenging on a traditional single-processor architecture. To tackle this issue, four parallel LIDAR waveform decomposition algorithms with different work-load balancing schemes - (1) no weighting, (2) decomposition-results-based linear weighting, (3) decomposition-results-based squared weighting, and (4) decomposition-time-based linear weighting - were developed and tested with varying numbers of processors (8-256), and the results were compared in terms of efficiency. Overall, the decomposition-time-based linear weighting approach yielded the best performance among the four approaches.
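
The work-load balancing compared above can be sketched generically as weighted assignment of waveforms to processors, where the weight is a per-waveform cost estimate such as a previous decomposition time or component count. The code below shows a simple greedy largest-weight-first scheme as an illustration; it is not the authors' implementation, and the function name is an assumption.

```python
import heapq

def balance_work(weights, n_workers):
    """Greedy weighted work-load balancing (largest weight first).

    weights   : per-waveform cost estimates, e.g. earlier decomposition
                times or numbers of mixture components
    n_workers : number of processors
    Returns a list of waveform-index lists, one per processor.
    """
    # Min-heap keyed by each worker's accumulated load.
    heap = [(0.0, w) for w in range(n_workers)]
    heapq.heapify(heap)
    assignment = [[] for _ in range(n_workers)]

    # Assign the heaviest remaining waveform to the least-loaded worker.
    for idx in sorted(range(len(weights)), key=lambda i: -weights[i]):
        load, worker = heapq.heappop(heap)
        assignment[worker].append(idx)
        heapq.heappush(heap, (load + weights[idx], worker))
    return assignment
```

With uniform weights this reduces to the "no weighting" case; supplying measured decomposition times corresponds to the time-based weighting that performed best in the study.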