• Title/Summary/Keyword: delay filter (지연필터)

Search results: 400 (processing time: 0.027 seconds)

Optimal Rejection of Sea Bottom, Peg-leg and Free-surface Multiples for Multichannel Seismic Data on South-eastern Sea, Korea (동해 남동해역 다중채널 해양탄성파 탐사자료의 해저면, 페그-레그 및 자유해수면 다중반사파 제거 최적화 전산처리)

  • Cheong, Snons;Koo, Nam-Hyung;Kim, Won-Sik;Lee, Ho-Young;Shin, Won-Chul;Park, Keun-Pil;Kim, Jin-Ho
    • Geophysics and Geophysical Exploration
    • /
    • v.12 no.4
    • /
    • pp.289-298
    • /
    • 2009
  • Optimal data processing parameters were designed to attenuate multiples in seismic data acquired in the south-eastern area of the East Sea in 2008. A train of multiples caused by the shallow water depth was observed periodically, at every 250 ms up to a two-way travel time of 1,750 ms, across the seismic traces. We abbreviate the sea-bottom multiple as SBM, the peg-leg multiple as PLM, and the free-surface multiple as FSM. To attenuate these multiples, a seismic data processing flow was constructed that includes NMO correction, stacking, a minimum-phase predictive deconvolution filter, and wave-equation multiple rejection (WEMR). The prevalent multiples were suppressed by predictive deconvolution, and the remaining multiples were attenuated by WEMR. We conclude that combining deconvolution with WEMR is effective for the seismic data of the study area. The derived parameters can be applied to seismic data processing in adjacent survey areas.
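The minimum-phase predictive deconvolution step described above exploits the fixed periodicity of the multiples: a prediction-error filter predicts the delayed repetition of the wavefield and subtracts it. A minimal numpy sketch of the idea (the lag and filter length in samples are illustrative parameters, not the paper's processing values):

```python
import numpy as np

def predictive_decon(trace, lag, length, eps=1e-3):
    """Suppress periodic multiples with a prediction-error filter.

    trace  : 1-D seismic trace (numpy array)
    lag    : prediction lag in samples (roughly the multiple period)
    length : prediction filter length in samples
    """
    n = len(trace)
    # One-sided autocorrelation of the trace
    ac = np.correlate(trace, trace, mode="full")[n - 1:].copy()
    ac[0] *= 1.0 + eps                      # pre-whitening for numerical stability
    # Normal equations R f = r with a symmetric Toeplitz R built from the autocorrelation
    idx = np.abs(np.subtract.outer(np.arange(length), np.arange(length)))
    f = np.linalg.solve(ac[idx], ac[lag:lag + length])
    # Predict the multiple train as delayed, filtered copies of the trace
    predicted = np.zeros(n)
    for i, fi in enumerate(f):
        s = lag + i
        predicted[s:] += fi * trace[:n - s]
    return trace - predicted
```

Applied trace-by-trace with a lag just below the multiple period, this removes the predictable (periodic) part of the signal while leaving the unpredictable primaries.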

The Extraction of the Edge Histogram using Wavelet Coefficients in the Wavelet Domain (웨이블릿 영역에서의 웨이블릿 계수들을 이용한 에지 히스토그램 추출 기법 연구)

  • Song, Jin-Ho;Eom, Min-Young;Choe, Yoon-Sik
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.42 no.5 s.305
    • /
    • pp.137-144
    • /
    • 2005
  • In this paper, a method is proposed for extracting the edge histogram directly from wavelet coefficients in the wavelet domain for JPEG2000 images. The MPEG-7 Edge Histogram Descriptor (EHD) extracts the edge histogram in the spatial domain, which requires many multiplications and additions because the image must first be decoded. Because the proposed algorithm extracts the edge histogram in the wavelet domain, it needs no decoding and reduces the number of multiplications and additions. The Discrete Wavelet Transform (DWT) is the standard transform in JPEG2000. The proposed algorithm uses the Le Gall 5/3 filter of JPEG2000 and the odd coefficients in the LH2 and HL2 sub-bands; the edge direction is decided from the ratio of the HL2 and LH2 odd coefficients. According to the experiments, there is no difference in retrieval efficiency between the EHD and the proposed algorithm, while the proposed algorithm requires far fewer multiplications and additions for edge extraction.
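The direction decision from the HL2/LH2 coefficient ratio can be sketched as follows. This is an illustrative simplification: the 4-bin scheme and the ratio threshold are assumptions for the example (the MPEG-7 EHD uses five bins, and the paper restricts the computation to the odd coefficients of the sub-bands):

```python
import numpy as np

def edge_histogram(hl2, lh2, ratio_thresh=2.0, floor=1e-6):
    """Bin edge directions from HL2/LH2 wavelet sub-band coefficients.

    hl2, lh2 : 2-D sub-band arrays of equal shape.
    Returns counts for [vertical, horizontal, diagonal, non-edge]
    (a simplified 4-bin variant of the 5-bin MPEG-7 EHD).
    """
    h = np.abs(hl2).ravel()    # HL responds strongly to vertical edges
    v = np.abs(lh2).ravel()    # LH responds strongly to horizontal edges
    hist = np.zeros(4, dtype=int)
    for a, b in zip(h, v):
        if a < floor and b < floor:
            hist[3] += 1                  # non-edge: both responses negligible
        elif a > ratio_thresh * b:
            hist[0] += 1                  # vertical edge dominates
        elif b > ratio_thresh * a:
            hist[1] += 1                  # horizontal edge dominates
        else:
            hist[2] += 1                  # comparable responses: diagonal edge
    return hist
```

Because the sub-band coefficients are available directly from the JPEG2000 codestream, no inverse transform is needed before binning.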

Analysis of Input/Output Transfer Characteristic to Transmit Modulated Signals through a Dynamic Frequency Divider (동적 주파수 분할기의 변조신호 전송 조건을 위한 입출력 전달 특성 분석과 설계에 대한 연구)

  • Ryu, Sungheon;Park, Youngcheol
    • The Journal of Korean Institute of Electromagnetic Engineering and Science
    • /
    • v.27 no.2
    • /
    • pp.170-175
    • /
    • 2016
  • In order to transmit baseband signals through frequency-dividing devices, we studied the transfer function of the device in terms of baseband signal distortion. The analysis shows that the magnitude of the envelope signal is related to the mixer gain and the insertion loss of the low-pass filter, while the phase is an additive function of half the phase delay. To verify the analysis, we designed a dynamic frequency divider at 1,400 MHz. The operating frequency range of the device is closely related to the conversion gain of the mixers and the amplitude of the input signal, and it widens as the mixer conversion gain increases. The designed frequency divider operates between 0.9 GHz and 3.2 GHz for -14.5 dBm input power. The circuit dissipates 20 mW at $V_{DD}=2.5V$, and simulation shows that an amplitude-modulated signal at 1,400 MHz with a modulation index of 0.9 was successfully down-converted to 700 MHz.

Optimization of Esterification of Jatropha Oil by Amberlyst-15 and Biodiesel Production (Amberlyst-15를 이용한 자트로파 오일의 에스테르화 반응 최적화 및 바이오디젤 생산)

  • Choi, Jong-Doo;Kim, Deog-Keun;Park, Ji-Yeon;Rhee, Young-Woo;Lee, Jin-Suk
    • Korean Chemical Engineering Research
    • /
    • v.46 no.1
    • /
    • pp.194-199
    • /
    • 2008
  • In this study, an effective method for esterifying the free fatty acids in jatropha oil was examined. Compared with other plant oils, the acid value of jatropha oil was remarkably high, 11.5 mgKOH/g, so direct transesterification with a base catalyst was not suitable for the oil. The free fatty acids were therefore first esterified with methanol, after which the jatropha oil was transesterified. The activities of four solid acid catalysts were tested, and Amberlyst-15 showed the best activity for the esterification. After constructing the experimental matrix based on response surface methodology (RSM) and analyzing the statistical data, the optimal esterification conditions were determined to be 6.79% methanol and 17.14% Amberlyst-15. After the pretreatment, jatropha biodiesel was produced by transesterification with KOH in a pressurized batch reactor. The jatropha biodiesel produced met the major specifications of the Korean biodiesel standard: 97.35% FAME, 8.17 h oxidation stability, 0.125% total glycerol, and a CFPP of $0^{\circ}C$.
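Response surface methodology of the kind used here fits a second-order polynomial to the measured responses over the experimental matrix and takes its stationary point as the candidate optimum. A minimal numpy sketch of that fitting step (the function names and the grid are illustrative, not the paper's measurements):

```python
import numpy as np

def fit_quadratic_surface(x, y, z):
    """Least-squares fit of z = b0 + b1*x + b2*y + b3*x^2 + b4*y^2 + b5*x*y."""
    A = np.column_stack([np.ones_like(x), x, y, x ** 2, y ** 2, x * y])
    beta, *_ = np.linalg.lstsq(A, z, rcond=None)
    return beta

def stationary_point(beta):
    """Solve grad(z) = 0 for the fitted surface (candidate optimum)."""
    _, b1, b2, b3, b4, b5 = beta
    H = np.array([[2 * b3, b5], [b5, 2 * b4]])   # Hessian of the quadratic
    return np.linalg.solve(H, -np.array([b1, b2]))
```

Checking the sign of the fitted Hessian tells whether the stationary point is a maximum, a minimum, or a saddle, which is why RSM analyses typically report the canonical form alongside the optimum.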

Development of Android Smart Phone App for Analysis of Remote Sensing Images (위성영상정보 분석을 위한 안드로이드 스마트폰 앱 개발)

  • Kang, Sang-Goo;Lee, Ki-Won
    • Korean Journal of Remote Sensing
    • /
    • v.26 no.5
    • /
    • pp.561-570
    • /
    • 2010
  • The purpose of this study is to develop an Android smartphone app that provides analysis capabilities for remote sensing images, using the open-source mobile browsing components of gvSIG, the open-source remote sensing software OTB, and the open-source DBMS PostgreSQL. In this app, five kinds of remote sensing algorithms for filtering, segmentation, and classification are implemented, and the processed results are stored and managed in an image database for retrieval. Smartphone users can easily access these functions through the app's graphical user interfaces, which are internally linked to an application server for image analysis processing and to the external DBMS. In addition, a practical tiling method for the smartphone environment is implemented to reduce the delay between user requests and the processing server's responses. Until now, most apps for remotely sensed image data have focused mainly on image visualization; this approach is distinguished by providing analysis capabilities. As smartphone apps with remote sensing analysis functions for general users and experts become widely used, remote sensing images can be regarded as information resources capable of producing actual mobile content, not merely potential resources. It is expected that this study could trigger technological progress and other unique attempts to develop a variety of smartphone apps for remote sensing images.

Deep Learning Based Group Synchronization for Networked Immersive Interactions (네트워크 환경에서의 몰입형 상호작용을 위한 딥러닝 기반 그룹 동기화 기법)

  • Lee, Joong-Jae
    • KIPS Transactions on Computer and Communication Systems
    • /
    • v.11 no.10
    • /
    • pp.373-380
    • /
    • 2022
  • This paper presents a deep learning based group synchronization method that supports networked immersive interactions between remote users. The goal of group synchronization is to enable all participants to interact with one another synchronously, increasing user presence. Most previous methods focus on NTP-based clock synchronization to enhance time accuracy, and moving-average filters are used to control the media playout time on the synchronization server. For example, the exponentially weighted moving average (EWMA) can track and estimate the playout time accurately as long as the changes in the input data are not significant; however, it needs more time to stabilize after changes caused by codec and system loads or by fluctuations in network status. To tackle this problem, this work proposes Deep Group Synchronization (DeepGroupSync), a deep-learning-based group synchronization method that models important features of the data. The model consists of two Gated Recurrent Unit (GRU) layers and one fully connected layer, and it predicts an optimal playout time from the sequence of past playout delays. Experiments were conducted with an existing method that uses the EWMA and the proposed method that uses DeepGroupSync. The results show that the proposed method is more robust against unpredictable or rapid changes in network conditions than the existing method.
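The EWMA baseline that DeepGroupSync is compared against can be sketched in a few lines; the smoothing factor below is an illustrative choice, not the paper's setting:

```python
class EwmaPlayoutEstimator:
    """Exponentially weighted moving average of observed playout delays."""

    def __init__(self, alpha=0.125):
        self.alpha = alpha      # smoothing factor in (0, 1]; illustrative value
        self.estimate = None

    def update(self, delay):
        """Fold one observed playout delay into the running estimate."""
        if self.estimate is None:
            self.estimate = float(delay)
        else:
            self.estimate = (1 - self.alpha) * self.estimate + self.alpha * delay
        return self.estimate
```

A small alpha makes the estimate smooth but slow to react to a sustained shift in network delay, which is exactly the lag the paper's GRU-based predictor is designed to avoid.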

Multiple Reference Network Data Processing Algorithms for High Precision of Long-Baseline Kinematic Positioning by GPS/INS Integration (GPS/INS 통합에 의한 고정밀 장기선 동적 측위를 위한 다중 기준국 네트워크 데이터 처리 알고리즘)

  • Lee, Hung-Kyu
    • KSCE Journal of Civil and Environmental Engineering Research
    • /
    • v.29 no.1D
    • /
    • pp.135-143
    • /
    • 2009
  • Integrating the Global Positioning System (GPS) and Inertial Navigation System (INS) sensor technologies using precise GPS carrier-phase measurements is a methodology that has been widely applied in fields requiring accurate and reliable positioning and attitude determination, ranging from kinematic geodesy to mobile mapping and imaging to precise navigation. However, such an integrated system may not fulfil the demanding performance requirements when the baseline between the reference and the mobile user GPS receiver is greater than a few tens of kilometers, because positioning and attitude determination still depend strongly on the so-called baseline-dependent errors of the GPS observations. This limitation can be remedied by integrating the GPS and INS sensors with multiple reference stations. Hence, in order to derive the distance-dependent GPS errors, this research proposes measurement processing algorithms for multiple reference stations: an ambiguity resolution procedure for the reference stations using linear combination techniques, error estimation based on a Kalman filter, and error interpolation. All the algorithms are evaluated by processing real observations, and the results are summarized in this paper.
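The error-estimation and interpolation steps can be illustrated with a deliberately simplified sketch: a scalar random-walk Kalman filter tracking one slowly varying, baseline-dependent range error per reference station, followed by inverse-distance weighting at the rover position. The state model, noise values, and weighting scheme here are assumptions for illustration; the paper's actual filter and interpolation model are richer than this:

```python
import numpy as np

class ScalarKalman:
    """Random-walk Kalman filter for one slowly varying range error."""

    def __init__(self, q=1e-4, r=1e-2):
        self.q, self.r = q, r        # process / measurement noise (illustrative)
        self.x, self.p = 0.0, 1.0    # state estimate and its variance

    def update(self, z):
        self.p += self.q                       # predict: random-walk state
        k = self.p / (self.p + self.r)         # Kalman gain
        self.x += k * (z - self.x)             # correct with measurement z
        self.p *= (1.0 - k)
        return self.x

def interpolate_error(station_xy, errors, rover_xy, eps=1e-9):
    """Inverse-distance-weighted interpolation of station errors at the rover."""
    d = np.linalg.norm(station_xy - rover_xy, axis=1)
    w = 1.0 / (d + eps)
    return float(np.dot(w, errors) / w.sum())
```

Running one such filter per station-satellite pair smooths the noisy derived errors; the interpolation then supplies a correction at the rover that shrinks the effective baseline.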

Flood Disaster Prediction and Prevention through Hybrid BigData Analysis (하이브리드 빅데이터 분석을 통한 홍수 재해 예측 및 예방)

  • Ki-Yeol Eom;Jai-Hyun Lee
    • The Journal of Bigdata
    • /
    • v.8 no.1
    • /
    • pp.99-109
    • /
    • 2023
  • Recently, not only Korea but the whole world has been experiencing constant disasters such as typhoons, wildfires, and heavy rains. The property damage caused by typhoons and heavy rain in South Korea alone has exceeded 1 trillion won. These disasters have resulted in significant loss of life and property damage, and the recovery process also takes a considerable amount of time. In addition, the government's contingency funds are insufficient for the current situation. To prevent and respond effectively to these issues, it is necessary to collect and analyze accurate data in real time. However, delays and data loss can occur depending on the environment where the sensors are located, the status of the communication network, and the receiving servers. In this paper, we propose a two-stage hybrid situation analysis and prediction algorithm that can analyze accurately even under such communication network conditions. In the first stage, river and stream level data are collected from diverse sensors of different types, filtered, refined, and stored in a big-data repository, and an AI rule-based inference algorithm is applied to analyze the crisis alert level. If the rainfall exceeds a certain threshold but the inferred alert remains below the level of interest, the second stage of deep-learning image analysis is performed to determine the final crisis alert level.
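The first-stage rule-based inference can be sketched as a threshold cascade over the filtered sensor readings. The threshold values and the rainfall trigger below are illustrative placeholders, not the paper's actual rules:

```python
def crisis_alert_level(water_level_m, rainfall_mm,
                       attention=2.0, warning=3.0, severe=4.0,
                       heavy_rain=50.0):
    """Map a filtered river level (m) and rainfall (mm) to an alert level.

    Returns 0 (normal) .. 3 (severe). A return of -1 flags the ambiguous
    case where heavy rainfall contradicts a low level reading, i.e. the
    case escalated to the second, image-analysis stage.
    """
    if water_level_m >= severe:
        return 3
    if water_level_m >= warning:
        return 2
    if water_level_m >= attention:
        return 1
    if rainfall_mm >= heavy_rain:   # heavy rain but a low sensor reading:
        return -1                   # possible sensor delay/loss, defer to stage 2
    return 0
```

Separating the cheap threshold rules from the expensive image analysis means the deep-learning stage only runs when the sensor evidence is inconsistent.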

Reducing error rates in general nuclear medicine imaging to increase patient satisfaction (핵의학 일반영상 검사업무 오류개선 활동에 따른 환자 만족도)

  • Kim, Ho-Sung;Im, In-Chul;Park, Cheol-Woo;Lim, Jong-Duek;Kim, Sun-Geun;Lee, Jae-Seung
    • Journal of the Korean Society of Radiology
    • /
    • v.5 no.5
    • /
    • pp.295-302
    • /
    • 2011
  • In the field of nuclear medicine, from the moment regular patients register for an examination until the doctor's reading, the person in charge of the examination may find errors, re-examine, re-analyze the results, or re-save images to PACS. Through this process, the reading results are delayed by the checks and additional tests that occur in the hospital, lowering patient satisfaction and affecting reliability. Accordingly, the purpose of this work is to include visual inspection of the results in order to minimize errors, improve efficiency, and increase patient satisfaction. General nuclear medicine imaging examinations performed at Asan Medical Center, Seoul, from March 2008 to December 2008 were analyzed for errors. The first stage, from January 2009 to December 2009, established procedures and know-how; the second stage, from January 2010 to June 2010, conducted pre- and post-filtering assessment; and the third stage, from July 2010 to October 2010, consisted of cross-checks, attaching stickers, and comparing error cases. Of 92 errors, 32 cases occurred in the first through third stages and 46 cases after the fourth stage, with the overall error rate reduced by 74.3% from 94.6%. In general nuclear medicine, where various kinds of examinations are performed according to the patient's needs, the analysis, image composition, and differing images in PACS all leave room for mistakes. To decrease the error rate, images should be continuously cross-checked and the diagnosis confirmed.

Optimization of Pre-treatment of Tropical Crop Oil by Sulfuric Acid and Bio-diesel Production (황산을 이용한 열대작물 오일의 전처리 반응 최적화 및 바이오디젤 생산)

  • Kim, Deog-Keun;Choi, Jong-Doo;Park, Ji-Yeon;Lee, Jin-Suk;Park, Seung-Bin;Park, Soon-Chul
    • Korean Chemical Engineering Research
    • /
    • v.47 no.6
    • /
    • pp.762-767
    • /
    • 2009
  • In this study, the feasibility of using vegetable oil extracted from tropical crop seed as a biodiesel feedstock was investigated by producing biodiesel and analyzing its quality parameters as a transport fuel. To produce biodiesel efficiently, a two-step reaction process (pre-treatment and transesterification) was required because the tropical crop oil has a high content of free fatty acids. To determine a suitable acid catalyst for the pre-esterification, three kinds of acid catalysts were tested, and sulfuric acid was identified as the best. After constructing the experimental matrix based on RSM and analyzing the statistical data, the optimal pre-treatment conditions were determined to be 26.7% methanol and 0.982% sulfuric acid. Transesterification experiments on the pre-esterified oil, also designed by RSM, identified 1.24% KOH catalyst and 22.76% methanol as the optimal transesterification conditions. However, this quantity of KOH was higher than the KOH concentration our team had previously established, so a supplemental experiment was carried out to determine the quantities of catalyst and methanol. As a result, the optimal transesterification conditions were determined to be 0.8% KOH and 16.13% methanol. After transesterification of the tropical crop oil, the produced biodiesel met the major quality standard specifications: 100.8% FAME, 0.45 mgKOH/g acid value, 0.00% water, 0.04% total glycerol, and $4.041mm^2/s$ kinematic viscosity (at $40^{\circ}C$).