• Title/Summary/Keyword: Velocity Correction (속도보정)


The Shear Wave Velocity Analysis using Passive Method MASW in the Center of the Metropolis, Gyeongsan (Passive Method MASW 방법을 이용한 경산시 도심구간에서의 전단파 속도 분석)

  • Lee, Hong-Gyu;Kim, Woo-Hyuk;Jang, Seung-Ik;Lee, Seog-Kyu
    • The Journal of Engineering Geology
    • /
    • v.17 no.4
    • /
    • pp.511-516
    • /
    • 2007
  • Active-method MASW (Multichannel Analysis of Surface Waves), one of the surface-wave exploration methods, has difficulty providing a sufficient shear-wave velocity log because of short spread lengths and a lack of low-frequency energy. To compensate for this shortcoming, a passive-method MASW survey was carried out and analyzed at the Daegu subway construction site in Jungpyung-dong, Gyeongsan city. Passive-method MASW, which uses microtremors, improves the quality of the overtone record by applying an azimuth correction for off-line sources. Combining it with an active overtone record acquired with the same geometry offers the benefits of improved shallow-depth resolution and extended investigation depth. To determine the optimal acquisition parameters, geophone spacings of 2 m, 4 m, and 6 m were tested, and the 2 m spacing overtone image produced a reliable shear-wave velocity log.
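The active/passive combination described in the abstract can be sketched numerically. A minimal illustration follows; the normalization-and-sum weighting and all names are assumptions for illustration, not the paper's actual processing:

```python
import numpy as np

def combine_overtone_images(active, passive):
    # Hypothetical sketch of combining active- and passive-method
    # overtone (dispersion) images acquired with the same geometry:
    # each amplitude image, indexed by (frequency, phase velocity),
    # is normalized to unit peak and summed, so the passive record
    # contributes low-frequency energy while the active record keeps
    # shallow-depth resolution.
    a = np.asarray(active, dtype=float)
    p = np.asarray(passive, dtype=float)
    return a / np.abs(a).max() + p / np.abs(p).max()
```

In practice the dispersion curve would then be picked from the combined image and inverted for the shear-wave velocity log.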

Geostatistical Integration Analysis of Geophysical Survey and Borehole Data Applying Digital Map (수치지도를 활용한 탄성파탐사 자료와 시추조사 자료의 지구통계학적 통합 분석)

  • Kim, Hansaem;Kim, Jeongjun;Chung, Choongki
    • Journal of the Korean GEO-environmental Society
    • /
    • v.15 no.3
    • /
    • pp.65-74
    • /
    • 2014
  • Borehole investigation, which is mainly used to characterize geotechnical conditions for construction work, has the benefit of providing clear and convincing geotechnical information, but because it is performed at point locations it cannot capture the overall conditions of a construction site. In contrast, geophysical measurements such as seismic surveys have the advantage that the geological stratum information of a large area can be characterized as a continuous cross-section; however, the results span a wide range of values and are not suitable for determining geotechnical design values directly. It is therefore essential to combine borehole data and geophysical data complementarily. Accordingly, in this study, a three-dimensional spatial interpolation of the cross-sectional distribution of seismic refraction velocities was performed using digitizing and a geostatistical method (kriging). In the process, a digital map was used to increase the trustworthiness of the method: with it, errors in ground elevation that arise between borehole investigations and geophysical measurements can be corrected. Average seismic velocities were then derived by comparing borehole data with the geophysical velocity distribution of each soil layer, applying outlier analysis in the process. On the basis of the average seismic velocities, an integrated analysis technique for determining three-dimensional geological stratum information was established and applied to a dam construction site.
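The geostatistical interpolation step can be illustrated with a minimal ordinary-kriging routine. The exponential variogram and its parameters below are assumptions for illustration, not the study's fitted model:

```python
import numpy as np

def exp_variogram(h, sill=1.0, rng=100.0):
    # Exponential variogram model (assumed for illustration).
    return sill * (1.0 - np.exp(-3.0 * h / rng))

def ordinary_kriging(xy, values, target, sill=1.0, rng=100.0):
    # Ordinary kriging: solve the kriging system with a Lagrange
    # multiplier so the weights sum to one, then return the
    # weighted estimate at `target`.
    n = len(xy)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = exp_variogram(d, sill, rng)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = exp_variogram(np.linalg.norm(xy - target, axis=1), sill, rng)
    w = np.linalg.solve(A, b)
    return float(w[:n] @ values)
```

With a zero-nugget variogram, kriging honors the data exactly at borehole locations, which is the property that makes it a natural choice for integrating point measurements with continuous geophysical sections.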

Prediction of Pathway and Toxicity on Dechlorination of PCDDs by Linear Free Energy Relationship (다이옥신의 환원적 탈염화 분해 경로와 독성 변화예측을 위한 LFER 모델)

  • Kim, Ji-Hun;Chang, Yoon-Seok
    • Journal of Korean Society of Environmental Engineers
    • /
    • v.31 no.2
    • /
    • pp.125-131
    • /
    • 2009
  • Reductive dechlorination of polychlorinated dibenzo-p-dioxins (PCDDs) and the accompanying toxicity change were predicted by a linear free energy relationship (LFER) model to assess zero-valent iron (ZVI) and anaerobic dechlorinating bacteria (ADB) as electron donors in PCDD dechlorination. Reductive dechlorination of PCDDs involves 256 reactions linking 76 congeners of highly variable toxicity, so it is challenging to assess the overall effect of this process on the environmental impact of PCDD contamination. The Gibbs free energies of PCDDs in aqueous solution were updated to the density functional theory (DFT) calculation level from thermodynamic results in the literature. The dechlorination kinetics of all PCDDs were evaluated from the linear correlation between the experimental dechlorination kinetics and the calculated thermodynamics. As a result, it was predicted that over 100 years would be required for the complete dechlorination of octachlorinated dibenzo-p-dioxin (OCDD) to the non-chlorinated compound (dibenzo-p-dioxin, DD), and that the toxic equivalent quantity (TEQ) of the PCDDs could increase to 10 times the initial TEQ during the dechlorination process. The results imply that reductive dechlorination alone, using ZVI or ADB, is not a suitable treatment strategy for PCDD-contaminated soil, sediment, and fly ash. This LFER approach is applicable to predicting the dechlorination of organohalogen compounds and to assessing electron-donating systems for treatment strategies.
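The LFER idea (rate constants linearly correlated with reaction free energies) and the resulting sequential first-order kinetics can be sketched as follows. The slope, intercept, and rate values are placeholders, not the paper's fitted correlation:

```python
import numpy as np

def lfer_rate(delta_g, slope=-0.5, intercept=-3.0):
    # Hypothetical LFER: log10(k) is assumed linear in the reaction
    # free energy; slope/intercept are illustrative placeholders.
    return 10.0 ** (slope * delta_g + intercept)

def simulate_chain(ks, c0, t_end, dt=0.001):
    # First-order sequential dechlorination chain C1 -> C2 -> ... -> Cn
    # (e.g. OCDD stepwise toward DD), integrated with forward Euler.
    n = len(ks) + 1
    c = np.zeros(n)
    c[0] = c0
    t = 0.0
    while t < t_end:
        flux = ks * c[:-1]       # first-order loss from each congener
        c[:-1] -= flux * dt
        c[1:] += flux * dt       # gained by the next congener
        t += dt
    return c
```

A TEQ trajectory would then be obtained by weighting each congener concentration by its toxic equivalency factor at every time step; with slow terminal steps, intermediate congeners of higher toxicity can accumulate, which is the effect the paper reports.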

Intermediate View Image and its Digital Hologram Generation for an Virtual Arbitrary View-Point Hologram Service (임의의 가상시점 홀로그램 서비스를 위한 중간시점 영상 및 디지털 홀로그램 생성)

  • Seo, Young-Ho;Lee, Yoon-Hyuk;Koo, Ja-Myung;Kim, Dong-Wook
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.17 no.1
    • /
    • pp.15-31
    • /
    • 2013
  • This paper proposes a method to generate an intermediate-view image for the viewer's view point, found by tracking the viewer's face, which is then converted to a digital hologram. Its purpose is to increase the viewing angle of a digital hologram, a topic gathering ever higher interest these days. The method assumes that the image information for the leftmost and rightmost view points within the viewing angle to be covered is given. It uses stereo matching between the leftmost and rightmost depth images to obtain the pseudo-disparity increment per depth value. With this increment, positional information is generated from both the leftmost and rightmost view points and blended to obtain the information at the wanted intermediate view point. The dis-occlusion region that can occur in this case is defined, and an inpainting method for it is proposed. Implementation and experiments showed that the average image quality of the generated depth and RGB images was 33.83 dB and 29.5 dB, respectively, and that the average execution time was 250 ms per frame. We also propose a prototype system to serve digital holograms interactively to the viewer using the proposed intermediate-view generation method. It includes data acquisition for the leftmost and rightmost view points, camera calibration and image rectification, intermediate-view image generation, computer-generated hologram (CGH) generation, and reconstruction of the hologram image. The system is implemented in the LabVIEW(R) environment, in which CGH generation and hologram-image reconstruction run on GPGPUs while the other stages run in software. The implemented system processes about 5 frames per second.
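The warp-and-blend step can be sketched in one dimension. This is an illustrative reduction of the idea (disparity-scaled warping from both extreme views, blending, and a crude hole fill standing in for the paper's inpainting), not the authors' implementation:

```python
import numpy as np

def warp_row(row, disp, scale):
    # Shift each pixel of a 1D image row by scale * disparity
    # (rounded); unfilled positions stay NaN and represent the
    # dis-occlusion region.
    row = np.asarray(row, dtype=float)
    out = np.full(row.shape, np.nan)
    for x in range(row.size):
        nx = x + int(round(scale * disp[x]))
        if 0 <= nx < row.size:
            out[nx] = row[x]
    return out

def intermediate_row(left, right, disp, alpha):
    # Blend the warped leftmost and rightmost rows with weights
    # (1 - alpha) and alpha; remaining holes are filled by
    # propagating the last valid value (a stand-in for inpainting).
    wl = warp_row(left, disp, alpha)         # left view shifted forward
    wr = warp_row(right, disp, alpha - 1.0)  # right view shifted back
    out = np.where(np.isnan(wl), wr,
                   np.where(np.isnan(wr), wl,
                            (1 - alpha) * wl + alpha * wr))
    for x in range(out.size):
        if np.isnan(out[x]):
            out[x] = out[x - 1] if x > 0 else 0.0
    return out
```

At alpha = 0 the leftmost view is reproduced and at alpha = 1 the rightmost; intermediate alphas give the viewpoint tracked from the viewer's face.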

Evaluating Physical Characteristics of Raindrop in Anseong, Gyeonggi Province (강우입자의 물리적 특성평가: 경기도 안성시 지역을 사례로)

  • KIM, Jin Kwan;YANG, Dong Yoon;KIM, Min Seok
    • Journal of The Geomorphological Association of Korea
    • /
    • v.17 no.1
    • /
    • pp.49-57
    • /
    • 2010
  • To evaluate the physical characteristics of open rainfall in Korea, the terminal velocity of raindrops and drop size distributions (DSD) were continuously measured using a laser-optical disdrometer near Gosam reservoir, Anseong-si, Gyeonggi-do, during three rainfall events from 2008 to 2009. Relationships between kinetic energy (KE, J m-2 mm-1; KER, J m-2 h-1) and rainfall intensity were obtained. Moreover, we compared the rainfall intensity from the disdrometer with that from a tipping-bucket raingauge, so that the kinetic energy of rainfall can be estimated from tipping-bucket data. The established relationships between kinetic energy (KE and KER) and rainfall intensity can thus serve as a useful model for estimating the kinetic energy of raindrops from rainfall intensity, for maximum 5-min rainfall intensities below 40 mm h-1, in central South Korea. However, to better examine the relationship between kinetic energy and rainfall intensity, further measurements will be required.
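Fitting such a kinetic energy vs. intensity relationship is a small regression exercise. The logarithmic functional form KE = a + b·ln(I) below is a form commonly used in the erosion literature, assumed here for illustration; the paper's fitted coefficients are not reproduced:

```python
import numpy as np

def fit_ke_intensity(intensity, ke):
    # Fit KE = a + b * ln(I) by least squares on log-transformed
    # intensity. Functional form and names are assumptions for
    # illustration, not the study's published relationship.
    b, a = np.polyfit(np.log(intensity), np.asarray(ke, dtype=float), 1)
    return a, b
```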

Comparison of the 2D/3D Acoustic Full-waveform Inversions of 3D Ocean-bottom Seismic Data (3차원 해저면 탄성파 탐사 자료에 대한 2차원/3차원 음향 전파형역산 비교)

  • Hee-Chan, Noh;Sea-Eun, Park;Hyeong-Geun, Ji;Seok-Han, Kim;Xiangyue, Li;Ju-Won, Oh
    • Geophysics and Geophysical Exploration
    • /
    • v.25 no.4
    • /
    • pp.203-213
    • /
    • 2022
  • To understand an underlying geological structure via seismic imaging, velocity information for the subsurface medium is crucial. Although full-waveform inversion (FWI) is considered useful for estimating subsurface velocity models, 3D FWI requires a great deal of computing power and time. Herein, we compare the computational efficiency and accuracy of frequency-domain 2D and 3D acoustic FWI. We then demonstrate that the artifacts arising from the 2D approximation can be partially suppressed in frequency-domain 2D FWI by employing diffraction-angle filtering (DAF). By applying DAF, which retains only large reflection-angle components, the impact of noise and out-of-plane reflections can be reduced. Additionally, it is anticipated that DAF can recover long-wavelength velocity structures to serve as starting models for 3D FWI and migration.
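The forward problem at the heart of frequency-domain FWI is a Helmholtz solve per frequency. A minimal 1D version is sketched below purely to illustrate the structure of that computation (Dirichlet boundaries, no absorbing layer); it is not the authors' 2D/3D code and omits everything that makes real FWI expensive:

```python
import numpy as np

def helmholtz_1d(vel, freq, dx, src_idx):
    # Discrete 1D frequency-domain acoustic modeling:
    # (omega^2 / v^2) u + D2 u = s, with a unit point source.
    # Illustrative stand-in for the per-frequency forward solve
    # inside a frequency-domain FWI iteration.
    n = len(vel)
    omega = 2.0 * np.pi * freq
    k2 = (omega / np.asarray(vel, dtype=float)) ** 2
    A = np.zeros((n, n), dtype=complex)
    for i in range(n):
        A[i, i] = k2[i] - 2.0 / dx**2
        if i > 0:
            A[i, i - 1] = 1.0 / dx**2
        if i < n - 1:
            A[i, i + 1] = 1.0 / dx**2
    s = np.zeros(n, dtype=complex)
    s[src_idx] = 1.0
    return np.linalg.solve(A, s)
```

In 3D the same linear system has millions of unknowns per frequency, which is why the paper's comparison of 2D and 3D inversion cost is of practical interest.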

A Study on the Field Data Applicability of Seismic Data Processing using Open-source Software (Madagascar) (오픈-소스 자료처리 기술개발 소프트웨어(Madagascar)를 이용한 탄성파 현장자료 전산처리 적용성 연구)

  • Son, Woohyun;Kim, Byoung-yeop
    • Geophysics and Geophysical Exploration
    • /
    • v.21 no.3
    • /
    • pp.171-182
    • /
    • 2018
  • We performed seismic field-data processing using an open-source software package (Madagascar) to verify whether it is applicable to field data, which have a low signal-to-noise ratio and high uncertainties in velocities. Madagascar, whose processing flows are scripted in Python, is generally well suited to the development of processing technologies owing to its multidimensional data-analysis capabilities and reproducibility. However, it has not been widely used for field-data processing because of its complicated interfaces and data-structure system. To verify its effectiveness on field data, we applied Madagascar to a typical seismic processing flow: data loading, geometry build-up, F-K filtering, predictive deconvolution, velocity analysis, normal-moveout correction, stacking, and migration. The test data were acquired in the Gunsan Basin, Yellow Sea, using a streamer of 480 channels and 4 air-gun arrays. The results at each processing step were compared with those from Landmark's ProMAX (SeisSpace R5000), a commercial processing package. Madagascar shows relatively high efficiency in data I/O and management as well as reproducibility, and it performs quick and exact calculations in automated procedures such as stacking-velocity analysis. There were no remarkable differences between the two packages after applying the signal-enhancement flows. For the deeper part of the subsurface image, however, the commercial software gives better results, simply because it offers various de-multiple flows and interactive environments for delicate processing work, which Madagascar lacks. Considering that many researchers around the world are developing data-processing algorithms for Madagascar, we expect that open-source software such as Madagascar can come to be widely used for commercial-level processing, with the strengths of expandability, cost effectiveness, and reproducibility.
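One step of the flow above, normal-moveout correction, is simple enough to sketch. This is the textbook nearest-sample formulation, written from the standard NMO equation rather than from Madagascar's implementation:

```python
import numpy as np

def nmo_correct(gather, offsets, dt, vnmo):
    # Normal-moveout correction: for each output zero-offset time t0,
    # pick the sample at t(x) = sqrt(t0^2 + (x / vnmo)^2) from each
    # trace, flattening reflection hyperbolas before stacking.
    nt, nx = gather.shape
    out = np.zeros_like(gather)
    t0 = np.arange(nt) * dt
    for ix, x in enumerate(offsets):
        tx = np.sqrt(t0**2 + (x / vnmo) ** 2)
        idx = np.round(tx / dt).astype(int)
        valid = idx < nt
        out[valid, ix] = gather[idx[valid], ix]
    return out
```

Production code would interpolate between samples and apply a stretch mute at far offsets; those refinements are omitted here for brevity.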

A Study on the Development of High Sensitivity Collision Simulation with Digital Twin (디지털 트윈을 적용한 고감도 충돌 시뮬레이션 개발을 위한 연구)

  • Ki, Jae-Sug;Hwang, Kyo-Chan;Choi, Ju-Ho
    • Journal of the Society of Disaster Information
    • /
    • v.16 no.4
    • /
    • pp.813-823
    • /
    • 2020
  • Purpose: In order to maximize the stability and productivity of high-risk, high-cost work such as dismantling the facilities inside a reactor, simulation is performed beforehand using digital twin technology, which can closely mirror the specifications of the actual control equipment. Motion-control errors caused by the time gap between the precision control equipment and the simulation when applying digital twin technology can create hazards such as collisions between hazardous facilities and the control equipment. Prior research is needed to eliminate and control such situations. Method: Unity 3D is currently the most popular engine used to develop simulations, but control errors can be caused by time correction within the Unity 3D engine. The error is expected in many environments and may vary depending on the development environment, such as system specifications. To demonstrate this, we develop a collision simulation using the Unity 3D engine, conduct collision experiments under various conditions, organize and analyze the results, and derive tolerances for the precision control equipment from them. Result: In the collision-simulation experiments, the time correction applied at each 1/1000-second internal engine function call produces a per-step distance error in the movement control of the colliding objects, and this distance error is proportional to the collision velocity. Conclusion: Remote dismantling simulators using digital twin technology should therefore limit the speed of movement according to the precision required of the control devices, in both the hardware/software environment and under manual control. In addition, the system development environment, the hardware specifications, and the size of the modeling data for the simulated control equipment and facilities must be taken into account, along with the acceptable error of the operational control equipment and the required working speed.
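The reported result (per-step distance error proportional to collision velocity) can be reproduced with a toy stepper. This is a hypothetical sketch of the observation, not the Unity 3D code used in the study:

```python
def stop_overshoot(velocity, stop_distance, dt):
    # A body moving at constant speed is halted only when a discrete
    # simulation step detects it has reached stop_distance, so the
    # overshoot past the intended stop is bounded by velocity * dt:
    # the positional error grows in proportion to the speed.
    pos = 0.0
    while pos < stop_distance:
        pos += velocity * dt
    return pos - stop_distance
```

With dt = 1/1000 s, halving the commanded speed halves the worst-case overshoot, which is the rationale for limiting movement speed to meet a required control precision.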

Evaluation of Factors Related to Productivity and Yield Estimation Based on Growth Characteristics and Growing Degree Days in Highland Kimchi Cabbage (고랭지배추 생산성 관련요인 평가 및 생육량과 생육도일에 의한 수량예측)

  • Kim, Ki-Deog;Suh, Jong-Taek;Lee, Jong-Nam;Yoo, Dong-Lim;Kwon, Min;Hong, Soon-Choon
    • Horticultural Science & Technology
    • /
    • v.33 no.6
    • /
    • pp.911-922
    • /
    • 2015
  • This study was carried out to evaluate the growth characteristics of Kimchi cabbage cultivated in various highland areas and to create a model for predicting the production of highland Kimchi cabbage based on growth parameters and climatic elements. A regression model for estimating head weight was designed with non-destructively measured growth variables (NDGV) such as leaf length (LL), leaf width (LW), head height (HH), and head width (HW), together with growing degree days (GDD): y = 6897.5 - 3.57×GDD - 136×LW + 116×PH + 155×HH - 423×HW + 0.28×HH×HW×HW (r² = 0.989). The model was improved using compensation terms such as the ratio of LW estimated with GDD to measured LW, the leaf growth rate by soil moisture, and the relative growth rate of leaves during drought periods. In addition, we propose an Excel spreadsheet model for simulating the yield prediction of highland Kimchi cabbage. The spreadsheet is composed of four sheets: growth data measured at farmers' fields, daily average temperature data for calculating GDD, soil moisture content data for evaluating the effect of soil water on leaf growth, and an equation sheet for simulating the estimation of production. This spreadsheet model can be used in practice for predicting the production of highland Kimchi cabbage, calculated as (acreage of cultivation) × (number of plants) × (head weight estimated with growth variables and GDD) × (compensation terms derived from the relationship between GDD and growth by soil moisture) × (marketable head rate).
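The published head-weight regression can be evaluated directly; the helper below simply transcribes the equation from the abstract (the sample input values in the test are arbitrary, and variable units follow the original study):

```python
def head_weight(gdd, lw, ph, hh, hw):
    # Head-weight regression from the abstract (r^2 = 0.989):
    # y = 6897.5 - 3.57*GDD - 136*LW + 116*PH + 155*HH - 423*HW
    #     + 0.28*HH*HW^2
    return (6897.5 - 3.57 * gdd - 136 * lw + 116 * ph
            + 155 * hh - 423 * hw + 0.28 * hh * hw ** 2)
```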

Three-Dimensional Conversion of Two-Dimensional Movie Using Optical Flow and Normalized Cut (Optical Flow와 Normalized Cut을 이용한 2차원 동영상의 3차원 동영상 변환)

  • Jung, Jae-Hyun;Park, Gil-Bae;Kim, Joo-Hwan;Kang, Jin-Mo;Lee, Byoung-Ho
    • Korean Journal of Optics and Photonics
    • /
    • v.20 no.1
    • /
    • pp.16-22
    • /
    • 2009
  • We propose a method to convert a two-dimensional movie to a three-dimensional movie using normalized cut and optical flow. In this paper, we first segment each image of the two-dimensional movie into objects and then estimate the depth of each object. Normalized cut is an image segmentation algorithm; to improve its speed and accuracy, we use a watershed algorithm and a weight function based on optical flow. We estimate the depth of the objects segmented by the improved normalized cut using optical flow. Ordinal depth is estimated from the change of segmented object labels in the occluded region, which is identified from the difference of absolute values of optical flow. To complement the ordinal depth, we generate a relational depth, which is the absolute value of optical flow interpreted as motion parallax. The final depth map is determined by multiplying the ordinal depth by the relational depth and dividing by the average optical flow. The proposed two-dimensional/three-dimensional movie conversion method is applicable to all three-dimensional display devices and all two-dimensional movie formats. We present experimental results using sample two-dimensional movies.
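The final combination step stated in the abstract (ordinal depth × relational depth ÷ average optical flow) reduces to one array expression. Array names are assumptions for illustration:

```python
import numpy as np

def final_depth_map(ordinal, relational, flow_mag):
    # Final depth map as described in the abstract: ordinal depth
    # multiplied by relational depth (|optical flow| as motion
    # parallax), then normalized by the average optical-flow
    # magnitude over the frame.
    return np.asarray(ordinal, dtype=float) * relational / np.mean(flow_mag)
```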