• Title/Summary/Keyword: speed data

Groundwater Recharge Evaluation on Yangok-ri Area of Hongseong Using a Distributed Hydrologic Model (VELAS) (분포형 수문모형(VELAS)을 이용한 홍성 양곡리 일대 지하수 함양량 평가)

  • Ha, Kyoochul;Park, Changhui;Kim, Sunghyun;Shin, Esther;Lee, Eunhee
    • Economic and Environmental Geology
    • /
    • v.54 no.2
    • /
    • pp.161-176
    • /
    • 2021
  • In this study, VELAS, a distributed hydrologic model, was used to analyze the variation of hydrologic components through water balance analysis and to evaluate groundwater recharge at a finer-than-annual time scale for both the past and the future. The study area, Yanggok-ri, Seobu-myeon, Hongseong-gun, Chungcheongnam-do, is highly vulnerable to drought. To run the VELAS model, spatial characteristic data such as a digital elevation model (DEM), vegetation, and slope were prepared, and GIS input layers were constructed by spatially interpolating the daily air temperature, precipitation, average wind speed, and relative humidity observed at Korea Meteorological Administration stations. Over the 18 years from 2001 to 2018, annual precipitation in the study area ranged from 799.1 to 1750.8 mm (average 1210.7 mm) and groundwater recharge from 28.8 to 492.9 mm (average 196.9 mm). The annual recharge rate relative to annual precipitation varied widely, from 3.6 to 28.2%, with an average of 14.9%. Under the RCP 8.5 climate change scenario, annual precipitation from 2019 to 2100 was projected at 572.8-1996.5 mm (average 1078.4 mm) and groundwater recharge at 26.7-432.5 mm (16.2% of average precipitation). Future annual recharge rates were projected at 2.8-45.1%, averaging 18.2%. The components of the water balance correlated well with precipitation, particularly in the annual rather than the daily data, although evapotranspiration appeared to be more strongly affected by other climatic factors such as temperature. Estimating groundwater recharge at time scales finer than annual is expected to provide basic data for groundwater development and management when precipitation varies severely over time, as during droughts or floods.
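The recharge-rate arithmetic described above (annual recharge divided by annual precipitation) can be sketched as follows; the yearly values here are hypothetical placeholders, not the study's full series:

```python
# Hypothetical annual values in mm; illustrative only, not the study's data.
precip = {2001: 1210.7, 2002: 799.1, 2003: 1750.8}
recharge = {2001: 196.9, 2002: 28.8, 2003: 492.9}

# Annual recharge rate (%) = annual recharge / annual precipitation * 100
recharge_rate = {yr: recharge[yr] / precip[yr] * 100 for yr in precip}
mean_rate = sum(recharge_rate.values()) / len(recharge_rate)
```

With these placeholder values the yearly rates span roughly 3.6% to 28.2%, which is how a wide interannual variation like the one reported above arises from the ratio.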

A Study on the Application of the Smartphone Hiking Apps for Analyzing the User Characteristics in Forest Recreation Area: Focusing on Daegwallyoung Area (산림휴양공간 이용특성 분석을 위한 국내 스마트폰 산행앱(APP)의 적용성 및 활용방안 연구: 대관령 선자령 일대를 중심으로)

  • Jang, Youn-Sun;Yoo, Rhee-Hwa;Lee, Jeong-Hee
    • Journal of Korean Society of Forest Science
    • /
    • v.108 no.3
    • /
    • pp.382-391
    • /
    • 2019
  • This study was conducted to verify whether smartphone hiking apps, which generate social network data including location information, are useful tools for analyzing the use characteristics of a forest recreation area. For this purpose, the study identified the functions and service characteristics of smartphone hiking apps, analyzed the use characteristics of the Daegwallyoung area, compared them with the results of a field survey, and reviewed the applicability of the apps. The service types of hiking apps fell into three categories: "information offering," "hiking record," and "information sharing." This study focused on the "hiking record" app with the greatest number of users. Analysis of the app data and a field survey in the Daegwallyoung area showed that both can identify movement patterns, but GPS-based hiking apps are more efficient and objective tools for understanding use patterns in a forest recreation area, as well as for extracting user-generated photos. Second, although the walking-speed data generated by the apps allow objective analysis of movement patterns, field surveys and observation are needed as complements for understanding the types of activities in each space. Because the hiking apps depend on cellphone use and are specific to "hiking," user bias can limit the usefulness of the data. This research is significant in showing the applicability of hiking apps for analyzing the use patterns of forest recreation areas through location-based social network data from users who voluntarily record their hiking information.
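The walking-speed data such apps derive from GPS tracks can be illustrated with a minimal sketch: per-segment speed as great-circle distance over elapsed time. The function names and track points are illustrative assumptions, not the apps' actual implementation:

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two WGS-84 points."""
    R = 6371000.0  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

def walking_speeds(track):
    """track: list of (t_seconds, lat, lon); returns speed in m/s per segment."""
    speeds = []
    for (t0, la0, lo0), (t1, la1, lo1) in zip(track, track[1:]):
        speeds.append(haversine_m(la0, lo0, la1, lo1) / (t1 - t0))
    return speeds

# Hypothetical two-point track: ~111 m northward in 100 s, i.e. ~1.1 m/s.
track = [(0, 37.677, 128.718), (100, 37.678, 128.718)]
segment_speeds = walking_speeds(track)
```

Aggregating such per-segment speeds over many users is the kind of objective movement-pattern analysis the abstract credits to the apps.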

A Study on the Digital Drawing of Archaeological Relics Using Open-Source Software (오픈소스 소프트웨어를 활용한 고고 유물의 디지털 실측 연구)

  • LEE Hosun;AHN Hyoungki
    • Korean Journal of Heritage: History & Science
    • /
    • v.57 no.1
    • /
    • pp.82-108
    • /
    • 2024
  • With the transition of archaeological recording methods from analog to digital, 3D scanning technology has been actively adopted in the field, and research on archaeological digital data gathered from 3D scanning and photogrammetry continues. However, due to cost and manpower issues, most buried-cultural-heritage organizations hesitate to adopt such digital technology. This paper presents a digital recording method for relics using open-source software and photogrammetry, believed to be the most efficient of the 3D scanning approaches. The digital recording process consists of three stages: acquiring a 3D model, creating a joining map with the edited 3D model, and producing a digital drawing. To enhance accessibility, the method uses only open-source software throughout. The results confirm that, in quantitative evaluation, the deviation between measurements of the actual artifact and of the 3D model was minimal, and the quantitative quality analyses from the open-source and commercial software showed high similarity. However, data processing was overwhelmingly faster in the commercial software, presumably a result of higher computational speed from improved algorithms. In qualitative evaluation, some differences in mesh and texture quality occurred: the 3D models generated by open-source software showed noise on the mesh surface, a harsh mesh surface, and difficulty in confirming the production marks of relics and the expression of patterns. Nevertheless, some of the open-source software produced quality comparable to that of commercial software in both quantitative and qualitative evaluations. Open-source software for editing 3D models supported not only post-processing, matching, and merging of the 3D model but also scale adjustment, join-surface production, and rendering of the images needed for the actual measurement of relics. The final drawing was traced in a CAD program that is also open-source. In archaeological research, photogrammetry is applicable to many processes, including excavation, report writing, and research on numerical data from 3D models. With the rapid development of computer vision, open-source software has diversified and its performance has significantly improved. Given the high accessibility of such digital technology, 3D model data acquired in archaeology will serve as basic data for the preservation and active study of cultural heritage.
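The quantitative evaluation described above, comparing measurements of the actual artifact against the 3D model, amounts to simple deviation statistics. A minimal sketch with made-up caliper and model measurements (not the paper's data):

```python
# Hypothetical paired measurements in mm: caliper on the artifact vs. the
# same dimension read off the photogrammetric 3D model. Illustrative only.
actual = [152.4, 98.7, 45.2, 210.0]
model  = [152.1, 98.9, 45.3, 209.6]

deviations = [m - a for a, m in zip(actual, model)]
mean_abs_dev = sum(abs(d) for d in deviations) / len(deviations)
max_abs_dev = max(abs(d) for d in deviations)
```

Sub-millimetre mean deviations of this kind are what "minimal deviation" in such evaluations typically refers to.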

An Iterative, Interactive and Unified Seismic Velocity Analysis (반복적 대화식 통합 탄성파 속도분석)

  • Suh Sayng-Yong;Chung Bu-Heung;Jang Seong-Hyung
    • Geophysics and Geophysical Exploration
    • /
    • v.2 no.1
    • /
    • pp.26-32
    • /
    • 1999
  • Among the various seismic data processing steps, velocity analysis is the most time-consuming and labor-intensive. Production seismic data processing requires a good velocity analysis tool as well as a high-performance computer, and the tool must deliver fast and accurate results. There are two approaches to velocity analysis: batch and interactive. In batch processing, a velocity plot is made at every analysis point, generally consisting of a semblance contour, a super gather, and a stack panel; the interpreter chooses the velocity function by analyzing the plot. The technique is highly dependent on the interpreter's skill and requires substantial human effort. As high-speed graphic workstations have become more common, various interactive velocity analysis programs have been developed. Although these programs enable faster picking of velocity nodes with the mouse, their main improvement is simply the replacement of the paper plot by the graphic screen. The velocity spectrum is highly sensitive to noise, especially the coherent noise often found in the shallow region of marine seismic data; for accurate velocity analysis, this noise must be removed before the spectrum is computed. The analysis must also be carried out by carefully choosing the location of the analysis point and computing the spectrum accurately. The analyzed velocity function must be verified by mute and stack, and the sequence usually must be repeated. Therefore an iterative, interactive, and unified velocity analysis tool is highly desirable. Such an interactive program, xva (X-Window based Velocity Analysis), was developed. It handles all processes required in velocity analysis, such as composing the super gather, computing the velocity spectrum, NMO correction, mute, and stack. Most parameter changes yield the final stack in a few mouse clicks, enabling iterative and interactive processing. A simple trace indexing scheme is introduced, and a program to make the index of the Geobit seismic disk file was developed; the index is used to reference the original input, i.e., the CDP sort, directly. A technique for transforming the mute function between the T-X domain and the NMOC domain is introduced and adopted in the program. The transform is similar to the remove-NMO technique in suppressing shallow noise such as the direct wave and refracted wave, but offers two improvements: no interpolation error and very fast computation. With this technique, mute times can easily be designed in the NMOC domain and applied to the super gather in the T-X domain, producing a more accurate velocity spectrum interactively. The xva program consists of 28 files, 12,029 lines, 34,990 words, and 304,073 characters. It references the Geobit utility libraries and can be installed in a Geobit-preinstalled environment; it runs under X-Window/Motif, with a menu designed according to the Motif style guide. A brief usage of the program is discussed. The program allows fast and accurate seismic velocity analysis, which is necessary for computing the AVO (Amplitude Versus Offset) based DHI (Direct Hydrocarbon Indicator) and for making high-quality seismic sections.
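The NMO correction step mentioned above follows the standard hyperbolic moveout equation t(x) = sqrt(t0^2 + x^2/v^2); the sketch below illustrates that textbook relation, not xva's actual code:

```python
import math

def nmo_time(t0, offset, velocity):
    """Two-way traveltime at a given offset for the hyperbolic moveout
    model: t(x) = sqrt(t0^2 + (x / v)^2), with t0 in s, x in m, v in m/s."""
    return math.sqrt(t0 ** 2 + (offset / velocity) ** 2)

def nmo_correction(t0, offset, velocity):
    """Moveout (in s) to subtract so the reflection flattens at t0;
    picking the velocity that best flattens the event is the analysis."""
    return nmo_time(t0, offset, velocity) - t0
```

For example, at t0 = 1.0 s, a 1500 m offset, and a 2000 m/s stacking velocity, the event arrives at 1.25 s, so 0.25 s of moveout is removed.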

DC Resistivity method to image the underground structure beneath river or lake bottom (하저 지반특성 규명을 위한 전기비저항 탐사)

  • Kim Jung-Ho;Yi Myeong-Jong;Song Yoonho;Cho Seong-Jun;Lee Seong-Kon;Son Jeongsul
    • Korean Society of Earth and Exploration Geophysicists: Conference Proceedings (한국지구물리탐사학회: 학술대회논문집)
    • /
    • 2002.09a
    • /
    • pp.139-162
    • /
    • 2002
  • Since weak zones or geological lineaments are likely to be eroded, weak zones may develop beneath rivers, and a careful evaluation of ground conditions is important when constructing structures that pass under a river. DC resistivity surveys, however, have seldom been applied to the investigation of water-covered areas, possibly because of difficulties in data acquisition and interpretation. Acquiring high-quality data may be the most important factor, and it is more difficult than in a land survey because of the water layer overlying the underground structure to be imaged. Through numerical modeling and the analysis of case histories, we studied resistivity surveying in water-covered areas, from the characteristics of the measured data, through the data acquisition method, to the interpretation method. We organize the discussion according to the installed locations of the electrodes, i.e., floating on the water surface or installed at the water bottom, since the methods of data acquisition and interpretation vary with electrode location. This study confirms that the DC resistivity method can provide fairly reasonable subsurface images, and that installing electrodes at the water bottom gives a subsurface image with much higher resolution than floating them on the water surface. Since data acquired in water-covered areas have much lower sensitivity to the underground structure than land data, and can be contaminated by higher noise such as streaming potential, it is very important to select an acquisition method and electrode array that provide data with a high signal-to-noise ratio as well as high resolving power. Installing electrodes at the water bottom is suitable for detailed surveys because of its much higher resolving power, whereas floating them, especially in a streamer DC resistivity survey, suits reconnaissance surveys owing to the very high speed of field work.
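The basic measurement behind any DC resistivity survey can be sketched as the apparent-resistivity calculation; the Wenner array and the numbers below are an illustrative assumption, not the configuration used in the paper:

```python
import math

def wenner_apparent_resistivity(a_spacing, delta_v, current):
    """Apparent resistivity (ohm-m) for a Wenner array with electrode
    spacing a (m), measured potential difference dV (V), and injected
    current I (A): rho_a = 2 * pi * a * (dV / I)."""
    return 2 * math.pi * a_spacing * delta_v / current

# Hypothetical reading: a = 10 m, dV = 50 mV, I = 100 mA.
rho_a = wenner_apparent_resistivity(10.0, 0.05, 0.1)
```

Inversion of many such readings over varying spacings and positions is what produces the subsurface resistivity images discussed above; the overlying water layer lowers the sensitivity of each reading to the structure beneath it.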

Analysis and Evaluation of Frequent Pattern Mining Technique based on Landmark Window (랜드마크 윈도우 기반의 빈발 패턴 마이닝 기법의 분석 및 성능평가)

  • Pyun, Gwangbum;Yun, Unil
    • Journal of Internet Computing and Services
    • /
    • v.15 no.3
    • /
    • pp.101-107
    • /
    • 2014
  • With the development of online services, databases have shifted from static structures to dynamic stream structures. Earlier data mining techniques served as decision-making tools for tasks such as establishing marketing strategies and DNA analysis, but areas of recent interest, such as sensor networks, robotics, and artificial intelligence, require the capability to analyze real-time data more quickly. Landmark window-based frequent pattern mining, one of the stream mining approaches, performs mining operations over parts of the database or over individual transactions instead of over all the data. In this paper, we analyze and evaluate two well-known landmark window-based frequent pattern mining algorithms, Lossy Counting and hMiner. When Lossy Counting mines frequent patterns from a set of new transactions, it performs union operations between the previous and current mining results. hMiner, a state-of-the-art algorithm based on the landmark window model, conducts mining operations whenever a new transaction occurs. Because hMiner extracts frequent patterns as soon as a new transaction is entered, it yields the latest mining results reflecting real-time information; such algorithms are therefore also called online mining approaches. We evaluate and compare the performance of the primitive algorithm, Lossy Counting, and the more recent hMiner. As performance criteria, we first consider total runtime and average processing time per transaction; to compare the efficiency of their storage structures, maximum memory usage is also evaluated. Lastly, we show how stably the two algorithms mine databases featuring gradually increasing numbers of items. In terms of mining time and transaction processing, hMiner is faster than Lossy Counting: hMiner stores candidate frequent patterns in a hash structure and can access them directly, whereas Lossy Counting stores them in a lattice and must search multiple nodes to reach a candidate pattern. On the other hand, hMiner performs worse than Lossy Counting in maximum memory usage. hMiner must keep all the information for each candidate pattern in its hash buckets, while Lossy Counting reduces that information through the lattice, whose storage can share items concurrently included in multiple patterns, making its memory usage more efficient. However, hMiner is more efficient than Lossy Counting in the scalability evaluation, for the following reasons: as the number of items increases, shared items decrease, weakening Lossy Counting's memory efficiency, and as the number of transactions grows, its pruning effect worsens. From the experimental results, we conclude that landmark window-based frequent pattern mining algorithms are suitable for real-time systems, although they require a significant amount of memory; their data structures need to be made more efficient before they can also be used in resource-constrained environments such as wireless sensor networks (WSNs).
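A minimal sketch of the Lossy Counting idea discussed above: counts carry a maximum-error bound, and entries are pruned at bucket boundaries so frequency is undercounted by at most epsilon * N. This simplified item-level version omits the pattern-lattice storage the paper analyzes:

```python
import math

class LossyCounting:
    """Item-level sketch of Lossy Counting over a data stream.
    Each tracked item maps to (count, max_error); true frequency is
    undercounted by at most epsilon * n."""
    def __init__(self, epsilon):
        self.epsilon = epsilon
        self.width = math.ceil(1 / epsilon)   # bucket width
        self.n = 0                            # items seen so far
        self.counts = {}                      # item -> (count, max_error)

    def add(self, item):
        self.n += 1
        bucket = math.ceil(self.n / self.width)
        count, err = self.counts.get(item, (0, bucket - 1))
        self.counts[item] = (count + 1, err)
        if self.n % self.width == 0:          # bucket boundary: prune
            self.counts = {k: (c, e) for k, (c, e) in self.counts.items()
                           if c + e > bucket}

    def frequent(self, support):
        """Items whose true frequency may reach support * n."""
        return {k for k, (c, e) in self.counts.items()
                if c >= (support - self.epsilon) * self.n}

# Hypothetical stream: 'a' dominates; 'b' and 'c' are pruned as rare.
lc = LossyCounting(epsilon=0.1)
for item in ['a'] * 8 + ['b', 'c']:
    lc.add(item)
```

The pruning step is what bounds memory, and the shared error bound is why the guarantee is one-sided (no false negatives above the support threshold).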

A Fast Processor Architecture and 2-D Data Scheduling Method to Implement the Lifting Scheme 2-D Discrete Wavelet Transform (리프팅 스킴의 2차원 이산 웨이브릿 변환 하드웨어 구현을 위한 고속 프로세서 구조 및 2차원 데이터 스케줄링 방법)

  • Kim Jong Woog;Chong Jong Wha
    • Journal of the Institute of Electronics Engineers of Korea SD
    • /
    • v.42 no.4 s.334
    • /
    • pp.19-28
    • /
    • 2005
  • In this paper, we propose a parallel, fast 2-D discrete wavelet transform hardware architecture based on the lifting scheme. The proposed architecture improves 2-D processing speed and reduces the internal memory buffer size. Previous lifting-based parallel 2-D wavelet transform architectures consisted of row-direction and column-direction modules, each a pair of prediction and update filter modules. In the 2-D wavelet transform, the column-direction processing uses the row-direction results, which are generated in row-direction rather than column-direction order, so most hardware architectures need an internal buffer memory. The proposed architecture focuses on reducing both the internal memory buffer size and the total calculation time. To reduce the total calculation time, we propose a 4-way data flow scheduling and a memory-based parallel hardware architecture. The 4-way data flow scheduling increases row-direction parallelism and reduces the initial latency before the row-direction calculation starts. In this architecture, the internal buffer memory is not used to store the results of the row-direction calculation; instead, it holds intermediate values of the column-direction calculation. This is very effective in column-direction processing, because the input data for the column direction are not generated in column-direction order. The proposed architecture was implemented in VHDL on an Altera Stratix device. The implementation results show the overall calculation time reduced from $N^2/2+\alpha$ to $N^2/4+\beta$ and the internal buffer memory size reduced by around $50\%$ compared with previous works.
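The prediction (detail) and update (approximation) filter pair described above can be illustrated with a one-level 1-D Haar lifting step; in the 2-D transform this step is applied along rows and then along columns, which is where the ordering problem the paper addresses comes from. This is a software sketch only, not the proposed hardware architecture:

```python
def lifting_haar_1d(signal):
    """One level of the Haar wavelet via lifting:
    split into even/odd samples, predict (detail = odd - even),
    then update (approx = even + detail / 2). Assumes even length."""
    even = signal[0::2]
    odd = signal[1::2]
    detail = [o - e for e, o in zip(even, odd)]          # predict step
    approx = [e + d / 2 for e, d in zip(even, detail)]   # update step
    return approx, detail

# With this normalization each approximation sample is a pairwise average.
approx, detail = lifting_haar_1d([2, 4, 6, 8])
```

Because the column pass consumes row-pass outputs that arrive in row order, a software version would simply transpose between passes; the hardware architecture above instead buffers intermediate column-direction values to avoid that reordering cost.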

The Nopsae: a Foehn-type wind over the Young Suh region of central Korea (영서지방의 푄현상)

  • ;Lee, Hyon-Young
    • Journal of the Korean Geographical Society
    • /
    • v.29 no.3
    • /
    • pp.266-280
    • /
    • 1994
  • Upper-air synoptic data and surface weather elements such as temperature, relative humidity, wind speed, cloud, and precipitation were analyzed in some detail to determine the characteristics of the Nopsae, a foehn-like surface wind over the Youngsuh region of central Korea. NOAA AVHRR and GMS images were also referenced to identify the distribution of clouds and precipitation and to classify the types of foehn over the study area. The data cover the spring and summer months, March through August, from 1982 to 1993. The results of the analysis are as follows. Warm, dry air penetrated the Youngsuh region on foehn days occurring between March 21 and August 10 during the study period, with a mean of 28 foehn days per year. Foehn phenomena were prominent during the March 21-25, April 5-15, May 25-June 10, and June 26-30 pentads. The intensity of the phenomenon can be evaluated as the difference in daily maximum temperature and relative humidity between windward and leeward sites. The daily maximum temperature difference reached 14.5°C, but most values were in the range 5.0-7.5°C (61%). Although strong foehns usually develop in June, farmers in the region commonly experience more aridity during the foehn days of April and May because of the transplantation of rice seedlings. Long-lasting foehns are uncommon, and 55% terminate within one day, but there is a record of a Nopsae persisting for up to 9 days. Using the cloud and precipitation data from the NOAA-11 AVHRR and GMS images, the author identified a foehn type with no precipitation on the windward side. The available data and the results of the analysis remain somewhat inadequate; since they imply that the wave phenomenon is potentially important for local surface weather and vertical momentum transport, more detailed theoretical and observational studies are necessary to clarify the mechanism and impacts of the Nopsae.
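The intensity measure above, the leeward-minus-windward difference in daily maximum temperature, can be sketched as follows; the station values are hypothetical, not the observed data:

```python
def foehn_intensities(windward_tmax, leeward_tmax):
    """Leeward-minus-windward daily maximum temperature differences
    (deg C), the intensity measure described in the abstract above."""
    return [l - w for w, l in zip(windward_tmax, leeward_tmax)]

# Hypothetical paired daily maxima for four days (deg C).
windward = [18.0, 20.5, 19.0, 21.0]
leeward  = [24.5, 26.0, 33.5, 23.0]
intensity = foehn_intensities(windward, leeward)

# Share of days in the most common intensity band reported above.
share_moderate = sum(1 for i in intensity if 5.0 <= i <= 7.5) / len(intensity)
```

With these placeholder values, two of the four days fall in the 5.0-7.5°C band and one reaches the 14.5°C extreme, mirroring the kind of distribution the abstract reports.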

Design and Implementation of Content-based Video Database using an Integrated Video Indexing Method (통합된 비디오 인덱싱 방법을 이용한 내용기반 비디오 데이타베이스의 설계 및 구현)

  • Lee, Tae-Dong;Kim, Min-Koo
    • Journal of KIISE:Computing Practices and Letters
    • /
    • v.7 no.6
    • /
    • pp.661-683
    • /
    • 2001
  • With the rapid increase in the use of digital video information in recent years, it has become more important to manage video databases efficiently. The development of high-speed data networks and digital technology has produced new multimedia applications, such as internet broadcasting and Video On Demand (VOD), that combine video data processing with computing. A video database should be constructed for fast, efficient retrieval and should extract accurate feature information from video with increasingly massive and complex characteristics. There are essential differences between video databases and traditional databases, and these differences raise interesting new issues in video retrieval and data modeling, requiring new database construction methods and efficient video retrieval methods. In this paper, we propose a construction and generation method for a content-based video database that can accumulate the meaningful structure of video and the prior production information. Using the proposed method, we implemented a video database that can produce new content for internet broadcasting centered on the video database. For this, we proposed a video indexing method that integrates annotation-based retrieval and content-based retrieval in order to extract and retrieve the feature information of the video data, using the relationship between the meaningful structure and the prior production information during video parsing and representative key frame extraction. The integrated video indexing method improves the performance of video content retrieval because it simultaneously uses content-based metadata, represented at the low level of the video, and annotation-based metadata, expressed at the high level, where feature information is difficult to extract.
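Representative key-frame extraction during video parsing commonly starts from shot-boundary detection via histogram differences between consecutive frames; the sketch below assumes that common approach rather than the paper's exact method, and uses tiny synthetic frames:

```python
def histogram(frame, bins=8, max_val=256):
    """Grey-level histogram of a frame given as a flat list of pixel values."""
    h = [0] * bins
    for v in frame:
        h[v * bins // max_val] += 1
    return h

def shot_boundaries(frames, threshold):
    """Indices where the consecutive-frame histogram difference exceeds
    the threshold; each detected cut starts a new shot, from which a
    representative key frame can be chosen."""
    cuts = []
    for i, (a, b) in enumerate(zip(frames, frames[1:]), start=1):
        ha, hb = histogram(a), histogram(b)
        if sum(abs(x - y) for x, y in zip(ha, hb)) > threshold:
            cuts.append(i)
    return cuts

# Synthetic 4-pixel frames: two dark frames, then two bright frames.
dark = [10] * 4
bright = [200] * 4
frames = [dark, dark, bright, bright]
```

The low-level, content-based metadata mentioned above (colour histograms, detected cuts, key frames) would then be stored alongside the high-level annotation metadata in the integrated index.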

Urban Climate Impact Assessment Reflecting Urban Planning Scenarios - Connecting Green Network Across the North and South in Seoul - (서울 도시계획 정책을 적용한 기후영향평가 - 남북녹지축 조성사업을 대상으로 -)

  • Kwon, Hyuk-Gi;Yang, Ho-Jin;Yi, Chaeyeon;Kim, Yeon-Hee;Choi, Young-Jean
    • Journal of Environmental Impact Assessment
    • /
    • v.24 no.2
    • /
    • pp.134-153
    • /
    • 2015
  • When making urban plans, it is important to understand the climate effects caused by urban structural changes. Seoul applies UPIS (Urban Plan Information System), which provides information on urban planning scenarios, and technology for analyzing the climate effects of urban planning needs to be developed by linking the UPIS scenarios with the climate analysis model CAS (Climate Analysis Seoul). CAS was developed to analyze urban climate conditions and provide realistic information on local air temperature and wind flow. CAS quantitatively analyzes the production, transport, and stagnation of cold air, wind flow, and thermal conditions by combining GIS analysis of land cover and elevation with meteorological analysis from MetPhoMod (Meteorology and atmospheric Photochemistry meso-scale model). To reflect the latest land cover and elevation information, CAS used highly accurate raster data (1 m) from LiDAR surveys and KOMPSAT-2 (KOrea Multi-Purpose SATellite) satellite imagery (4 m). For a more realistic representation of land surface characteristics, DSM (Digital Surface Model) and DTM (Digital Terrain Model) data were used as input to a CFD (Computational Fluid Dynamics) model. Eight inflow directions were considered to investigate changes in flow pattern and wind speed due to reconstruction and changes in the thermal environment due to the connected green area, and MetPhoMod data in CAS were used to represent realistic weather conditions. The results show that wind corridors change with reconstruction and that the surface temperature around the target area decreases overall with the connected green area. The CFD model coupled with CAS makes it possible to evaluate the wind corridors and thermal environment before and after reconstruction and green-area connection. This study analyzes the climate impact before and after creation of the green area that forms part of the 'Connecting green network across the north and south in Seoul' plan, one element of the '2020 Seoul master plan'.
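The before/after surface-temperature comparison described above reduces, at its simplest, to a gridded difference; the values below are illustrative placeholders, not CAS or CFD output:

```python
# Hypothetical surface-temperature grids (deg C) before and after the
# green-area connection, on a tiny 2x2 cell grid. Illustrative only.
before = [[30.2, 31.0], [32.4, 33.0]]
after  = [[29.8, 30.1], [31.9, 32.0]]

# Per-cell cooling and its area mean; a positive mean indicates an
# overall surface-temperature decrease after greening.
cooling = [[b - a for b, a in zip(rb, ra)] for rb, ra in zip(before, after)]
mean_cooling = sum(sum(r) for r in cooling) / sum(len(r) for r in cooling)
```

In the actual study this comparison would be made on the model grids for each of the eight inflow directions, not a single pair of fields.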