• Title/Summary/Keyword: log-ratio technique

Search Result 52

Computational Latency Reduction via Simplified Soft-bit Estimation of Hierarchical Modulation (근사화된 계층 변조의 연판정 비트 검출을 통한 연산 지연시간 감소)

  • You, Dongho
    • Proceedings of the Korean Society of Broadcast Engineers Conference
    • /
    • 2022.06a
    • /
    • pp.175-178
    • /
    • 2022
  • This paper presents a simplified computation method for soft-bit detection of high-order hierarchical modulation, namely hierarchical 64QAM. By approximating the soft bits of conventional hierarchical modulation, i.e., the LLR (Log-Likelihood Ratio) values, unnecessary computation is removed and the associated latency is reduced. Since the proposed scheme maintains bit error rate (BER) performance very close to that of conventional soft-bit detection, it is expected to be widely applicable to broadcast and communication systems that make use of soft bits.

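The abstract above contrasts exact LLR computation with an approximation that skips unnecessary work. A generic illustration of this kind of soft-bit simplification is the standard max-log approximation, sketched below for one bit of a Gray-coded 4-PAM rail (the in-phase component of 16QAM); the constellation, labels, and parameters are illustrative assumptions, not the paper's hierarchical-64QAM derivation.

```python
import math

# Gray-coded 4-PAM levels and their 2-bit labels (an assumed mapping)
LEVELS = {-3: (0, 0), -1: (0, 1), 1: (1, 1), 3: (1, 0)}

def llr_exact(y, bit_idx, noise_var):
    """Exact LLR via the full log-sum of Gaussian likelihoods."""
    num = sum(math.exp(-(y - s) ** 2 / noise_var)
              for s, bits in LEVELS.items() if bits[bit_idx] == 0)
    den = sum(math.exp(-(y - s) ** 2 / noise_var)
              for s, bits in LEVELS.items() if bits[bit_idx] == 1)
    return math.log(num / den)

def llr_maxlog(y, bit_idx, noise_var):
    """Max-log LLR: keep only the nearest symbol under each bit hypothesis."""
    d0 = min((y - s) ** 2 for s, bits in LEVELS.items() if bits[bit_idx] == 0)
    d1 = min((y - s) ** 2 for s, bits in LEVELS.items() if bits[bit_idx] == 1)
    return (d1 - d0) / noise_var

y, noise_var = 1.4, 0.5
for b in range(2):
    print(f"bit {b}: exact={llr_exact(y, b, noise_var):+.3f}  "
          f"max-log={llr_maxlog(y, b, noise_var):+.3f}")
```

The max-log version needs only squared distances and a subtraction, no exponentials or logarithms, which is where the latency saving comes from.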

Two-dimensional Inundation Analysis Using Stochastic Rainfall Variation and Geographic Information System (추계학적 강우변동생성 기법과 GIS를 연계한 2차원 침수해석)

  • Lee, Jin-Young;Cho, Wan-Hee;Han, Kun-Yeun;Ahn, Ki-Hong
    • Journal of the Korean Association of Geographic Information Studies
    • /
    • v.13 no.1
    • /
    • pp.101-113
    • /
    • 2010
  • Recent rainfall patterns show fewer rainy days but higher rainfall intensity, and the frequency of flooding has also increased. To account for this, engineers use deterministic methods such as the PMP (Probable Maximum Precipitation). If the design storm never occurs, however, raising the design criteria is wasteful; moreover, oversized structures cause conflicts with residents and environmental problems. It is therefore necessary to consider the probability distribution of rainfall parameters in each sub-basin when designing hydraulic structures. In this study, stochastic rainfall patterns are generated using the log-ratio method, the Johnson system, and multivariate Monte Carlo simulation. Using these patterns, hydrological, hydraulic, and two-dimensional inundation analyses were performed on a GIS basis to examine their applicability. The simulated results are similar to the actual damage area, so the methodology of this study can be used to produce flood risk maps or residential evacuation route maps for the region.
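The log-ratio step mentioned in the abstract addresses a constraint problem: a storm's temporal pattern is a vector of hourly rainfall proportions that must stay positive and sum to one, so Gaussian perturbation cannot be applied to it directly. The sketch below shows the additive log-ratio (ALR) transform, a common way to map such a vector to unconstrained space, perturb it, and map it back; the Johnson-system fitting and inter-variable correlation structure of the paper are omitted, and the pattern values are invented.

```python
import math
import random

def alr(p):
    """Additive log-ratio transform: last component is the reference."""
    return [math.log(x / p[-1]) for x in p[:-1]]

def alr_inv(z):
    """Inverse ALR: back to positive proportions summing to 1."""
    expz = [math.exp(v) for v in z] + [1.0]
    total = sum(expz)
    return [v / total for v in expz]

random.seed(1)
pattern = [0.10, 0.35, 0.40, 0.15]             # observed hourly proportions
z = alr(pattern)                               # unconstrained coordinates
z_new = [v + random.gauss(0, 0.2) for v in z]  # Monte Carlo perturbation
new_pattern = alr_inv(z_new)                   # synthetic, still valid pattern
print([round(x, 3) for x in new_pattern], round(sum(new_pattern), 6))
```

Whatever noise is added in log-ratio space, the inverse transform always returns a legitimate rainfall pattern, which is the point of the technique.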

Discontinuity Analysis Using Well Log Methods from a Borehole-PABH1 in the Pungam Sedimentary Basin (풍암퇴적분지 내 시추공 PABH1에서 불연속면에 대한 물리검층방법의 적용)

  • 김영화;장승익;김중열;현혜자
    • The Journal of Engineering Geology
    • /
    • v.8 no.3
    • /
    • pp.261-273
    • /
    • 1998
  • A multiple well-log analysis combining geophysical well logs and geological core logs was carried out to analyze the discontinuities in test borehole PABH1, located in the Pungam sedimentary basin, Sosok, Hongchon-gun, Kangwon Province. The well-log methods comprise normal resistivity, focussed, single-point resistance, SP, gamma, and natural gamma logs, as well as acoustic televiewer and borehole television logs. A core-scanning technique was used as an aid to the geological core log. The analysis first compared the televiewer and core discontinuities, and then compared the results of conventional geophysical log analysis with those from the core log and the acoustic televiewer log. Fractures deduced from the acoustic televiewer log coincide well with discontinuities shown on the core and conventional geophysical logs; particularly close agreement was observed between fractures derived from the acoustic televiewer and from conventional geophysical log analysis. Geophysical logs such as the caliper, resistivity, density, and high-resolution gamma-gamma curves proved effective in delineating fractures. For example, the ratio between density and resistivity (BRD/SHN) also provides an alternative indicator for discerning fracture conditions in the study area.


Comparison of multiscale multiple change-points estimators (SMUCE와 FDR segmentation 방법에 의한 다중변화점 추정법 비교)

  • Kim, Jaehee
    • The Korean Journal of Applied Statistics
    • /
    • v.32 no.4
    • /
    • pp.561-572
    • /
    • 2019
  • We study the false discovery rate segmentation (FDRSeg) and simultaneous multiscale change-point estimator (SMUCE) methods for multiscale multiple change-point estimation and compare their empirical behavior via simulation. FDRSeg is based on control of a false discovery rate, while SMUCE is based on multiscale local likelihood-ratio tests. FDRSeg seems to work best when the number of change-points is large; however, the two methods provide similar estimates when there are only a few change-points. As a real-data application, multiple change-points are estimated for the well-log data.
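The local likelihood-ratio test underlying SMUCE-style methods reduces, for Gaussian noise with known variance, to comparing the residual sum of squares of a one-mean fit against a two-mean fit at each candidate split. The sketch below implements only this single-split statistic on a toy signal; it is an illustration of the test, not an implementation of SMUCE or FDRSeg.

```python
import statistics

def best_split(y):
    """Return the split index maximizing the reduction in residual SS."""
    n = len(y)
    total_mean = statistics.fmean(y)
    rss_full = sum((v - total_mean) ** 2 for v in y)
    best_k, best_gain = None, 0.0
    for k in range(1, n):
        left, right = y[:k], y[k:]
        rss = (sum((v - statistics.fmean(left)) ** 2 for v in left)
               + sum((v - statistics.fmean(right)) ** 2 for v in right))
        gain = rss_full - rss          # 2x log-likelihood ratio, up to scale
        if gain > best_gain:
            best_k, best_gain = k, gain
    return best_k, best_gain

# piecewise-constant toy signal with a jump at index 5
y = [0.1, -0.2, 0.0, 0.1, -0.1, 2.0, 2.2, 1.9, 2.1, 2.0]
k, gain = best_split(y)
print(k, round(gain, 2))
```

Multiscale methods apply this kind of statistic over many intervals simultaneously and control the resulting multiple-testing problem, which is where SMUCE and FDRSeg differ.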

Automatic Electrofacies Classification from Well Logs Using Multivariate Statistical Techniques (다변량 통계 기법을 이용한 물리검층 자료로부터의 암석물리학상 결정)

  • Lim Jong-Se;Kim Jungwhan;Kang Joo-Myung
    • Geophysics and Geophysical Exploration
    • /
    • v.1 no.3
    • /
    • pp.170-175
    • /
    • 1998
  • A systematic methodology is developed for predicting lithology through electrofacies classification from wireline log data. Multivariate statistical techniques are adopted to segment well-log measurements and group the segments into electrofacies types. To account for the contribution of each log and to reduce the computational dimension, the multivariate logs are transformed into a single variable through principal component analysis. The resulting principal-component logs are segmented using a statistical zonation method to enhance the quality and efficiency of the interpreted results. Hierarchical cluster analysis is then used to group the segments into electrofacies. The optimal number of groups is determined from the ratio of within-group variance to total variance together with core data. The technique is applied to wells on the Korean continental shelf. The results of the field application demonstrate that lithology prediction based on electrofacies classification agrees reliably with the core and cutting data. This methodology for electrofacies determination can be used in reservoir characterization, which is helpful for reservoir management.

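The first step the abstract describes, collapsing several standardized logs into a single principal-component log, can be sketched without any statistics library by running power iteration on the sample covariance matrix. The synthetic gamma-ray/resistivity values below are invented, and the zonation and hierarchical-clustering stages of the paper are not shown.

```python
import math
import random

def standardize(col):
    """Zero-mean, unit-variance scaling of one log."""
    m = sum(col) / len(col)
    s = math.sqrt(sum((v - m) ** 2 for v in col) / len(col))
    return [(v - m) / s for v in col]

def pc1_scores(rows):
    """Project samples onto the first principal component (power iteration)."""
    cols = [standardize(list(c)) for c in zip(*rows)]
    n, d = len(rows), len(cols)
    cov = [[sum(cols[i][k] * cols[j][k] for k in range(n)) / n
            for j in range(d)] for i in range(d)]
    v = [1.0] * d
    for _ in range(100):                      # converge to top eigenvector
        w = [sum(cov[i][j] * v[j] for j in range(d)) for i in range(d)]
        norm = math.sqrt(sum(x * x for x in w))
        v = [x / norm for x in w]
    return [sum(cols[i][k] * v[i] for i in range(d)) for k in range(n)]

random.seed(0)
# synthetic gamma-ray / resistivity pairs forming two loose "facies"
rows = ([(60 + random.gauss(0, 5), 10 + random.gauss(0, 2)) for _ in range(20)]
        + [(110 + random.gauss(0, 5), 40 + random.gauss(0, 2)) for _ in range(20)])
scores = pc1_scores(rows)
print(round(min(scores), 2), round(max(scores), 2))
```

On this toy data the two facies separate cleanly along the single PC1 log, which is what makes the subsequent zonation and clustering steps tractable in one dimension.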

Low-Complexity Soft-MIMO Detection Algorithm Based on Ordered Parallel Tree-Search Using Efficient Node Insertion (효율적인 노드 삽입을 이용한 순서화된 병렬 트리-탐색 기반 저복잡도 연판정 다중 안테나 검출 알고리즘)

  • Kim, Kilhwan;Park, Jangyong;Kim, Jaeseok
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.37A no.10
    • /
    • pp.841-849
    • /
    • 2012
  • This paper proposes a low-complexity soft-output multiple-input multiple-output (soft-MIMO) detection algorithm that achieves soft-output maximum-likelihood (soft-ML) performance under the max-log approximation. The proposed algorithm is based on a parallel tree search (PTS) that applies channel ordering via a sorted-QR decomposition (SQRD) with an altered sort order. The empty-set problem that can occur when calculating the log-likelihood ratio (LLR) of each bit is solved by inserting additional nodes at each search level. Since only the closest node with the opposite bit value to a selected node is inserted, the proposed node-insertion scheme is very efficient in terms of computational complexity. The computational complexity of the proposed algorithm is approximately 37-74% of that of existing algorithms, and simulation results for a 4×4 system show a performance degradation of less than 0.1 dB.
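The empty-set problem the abstract mentions is easy to state concretely: when max-log LLRs are computed from a finite candidate list, some bit may take the same value in every candidate, leaving one hypothesis set empty and the LLR undefined. The sketch below shows the problem and the spirit of the fix, adding a counter-hypothesis candidate for the affected bit; the candidate lists and distance metrics are invented, not the paper's tree-search output.

```python
def list_llrs(cands, nbits):
    """cands: list of (bit_tuple, distance metric); max-log LLRs or None."""
    llrs = []
    for b in range(nbits):
        d0 = [m for bits, m in cands if bits[b] == 0]
        d1 = [m for bits, m in cands if bits[b] == 1]
        # empty hypothesis set -> LLR undefined for this bit
        llrs.append(min(d1) - min(d0) if d0 and d1 else None)
    return llrs

# candidate list from a tree search: (bits, accumulated distance)
cands = [((0, 1), 0.3), ((0, 0), 0.9)]
print(list_llrs(cands, 2))   # bit 0 has no candidate with value 1: None

# insert the best counter-hypothesis node for bit 0, then recompute
cands.append(((1, 1), 2.1))
print(list_llrs(cands, 2))
```

Inserting only the closest opposite-bit node, as the paper proposes, bounds the extra work to one candidate per undefined bit rather than enlarging the whole list.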

Radius optimization for efficient list sphere decoding (효율적인 리스트 구복호기 검출방식을 위한 구반경의 최적화에 관한 연구)

  • Lee, Jae-Seok;Lee, Byoung-Ju;Shim, Byong-Hyo
    • Proceedings of the Korean Society of Broadcast Engineers Conference
    • /
    • 2010.07a
    • /
    • pp.46-49
    • /
    • 2010
  • In recent iterative detection and decoding (IDD) schemes, soft decoding increasingly relies on list sphere decoding (LSD), which forms a candidate list, rather than conventional sphere decoding, in order to raise the reliability of the log-likelihood ratio (LLR) values. Despite its superior performance, however, LSD must detect multiple lattice points; it therefore gains little complexity reduction as the signal-to-noise ratio (SNR) increases, and it is inefficient in that it sometimes detects points that contribute little to obtaining reliable LLR values. This paper therefore studies a method for discarding lattice points that have little influence on the LLR values, so as to improve the efficiency of list sphere decoding. The goal is to match the capacity and actual performance of conventional list sphere decoding in MIMO systems as closely as possible while substantially reducing its complexity, specifically by optimizing the initial sphere radius used for detection.


Soft-Decision Algorithm with Low Complexity for MIMO Systems Using High-Order Modulations (고차 변조 방식을 사용하는 MIMO 시스템을 위한 낮은 복잡도를 갖는 연판정 알고리즘)

  • Lee, Jaeyoon;Kim, Kyoungtaek
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.40 no.6
    • /
    • pp.981-989
    • /
    • 2015
  • In calculating the log-likelihood ratio (LLR) of detected symbols, a multiple-input multiple-output (MIMO) system applying an optimal or suboptimal algorithm such as maximum-likelihood (ML) detection, sphere decoding (SD), or QR decomposition with M-algorithm maximum-likelihood detection (QRM-MLD) suffers exponential complexity growth with the number of spatial streams and the modulation order. In this paper, we propose a very low-complexity LLR calculation method for a QRM-MLD-based symbol detector in a high-order-modulation N_T×N_R MIMO system. It approaches the bit error rate (BER) performance of the full maximum-likelihood detector to within 1 dB. We also analyze the BER performance through computer simulation to verify the validity of the proposed method.

Data Consistency-Control Scheme Using a Rollback-Recovery Mechanism for Storage Class Memory (스토리지 클래스 메모리를 위한 롤백-복구 방식의 데이터 일관성 유지 기법)

  • Lee, Hyun Ku;Kim, Junghoon;Kang, Dong Hyun;Eom, Young Ik
    • Journal of KIISE
    • /
    • v.42 no.1
    • /
    • pp.7-14
    • /
    • 2015
  • Storage class memory (SCM) has been considered a next-generation storage device because it can serve both as memory and as storage. However, recently proposed file systems for SCM have significant data-consistency problems, such as insufficient consistency guarantees or excessive consistency-control overhead. This paper proposes a novel data consistency-control scheme that changes the write mode for log data depending on the ratio of modified data in a block, using a rollback-recovery scheme instead of write-ahead logging (WAL). The proposed scheme reduces both the log data size and the synchronization cost of maintaining consistency. To evaluate the proposed scheme, we implemented it on a Linux 3.10.2-based system and measured its performance. The experimental results show that our scheme improves write throughput by 9 times on average compared to the legacy consistency-control scheme.
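The mode decision the abstract describes, choosing what to log based on how much of a block changed, can be sketched as a simple policy function. The block size, tracking granularity, and threshold below are illustrative assumptions, not the paper's parameters, and the actual rollback-recovery machinery is not shown.

```python
BLOCK_SIZE = 4096
CHUNK = 512               # dirty-tracking granularity (assumed)
RATIO_THRESHOLD = 0.5     # switch point between the two log modes (assumed)

def choose_log_entries(dirty_chunks):
    """Decide what to write to the log for one block before committing it."""
    ratio = len(dirty_chunks) * CHUNK / BLOCK_SIZE
    if ratio > RATIO_THRESHOLD:
        # mostly modified: one whole-block log record is cheaper
        return [("block", 0, BLOCK_SIZE)]
    # sparsely modified: log only the dirty chunks
    return [("chunk", c * CHUNK, CHUNK) for c in sorted(dirty_chunks)]

print(choose_log_entries({1, 6}))                  # 2 of 8 chunks dirty
print(choose_log_entries({0, 1, 2, 3, 4, 5, 6}))   # 7 of 8 chunks dirty
```

Switching modes this way keeps log records small for scattered writes while avoiding per-chunk bookkeeping overhead when most of the block changed anyway.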

Text Categorization Based on the Maximum Entropy Principle (최대 엔트로피 기반 문서 분류기의 학습)

  • 장정호;장병탁;김영택
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 1999.10b
    • /
    • pp.57-59
    • /
    • 1999
  • This paper proposes training a text classifier based on the maximum entropy principle. Maximum entropy is one of the methods widely used in natural language processing, for example in language modeling and part-of-speech tagging. Feature selection is important for the efficiency of a maximum entropy model; we experiment with the chi-square test, log-likelihood ratio, information gain, and mutual information as feature-selection criteria and compare the results against using the full set of candidate features. The Reuters-21578 collection was used as the data set, and binary classification experiments were performed for each class.

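One of the feature-selection criteria the abstract compares, the chi-square test, scores a term by the dependence between its presence and the class label in a 2x2 contingency table. The sketch below computes the standard 2x2 chi-square statistic; the counts for the term are toy values, not Reuters-21578 statistics.

```python
def chi_square(n11, n10, n01, n00):
    """chi^2 for a 2x2 table: term present/absent x class positive/negative.

    n11: term present, class positive   n10: term present, class negative
    n01: term absent,  class positive   n00: term absent,  class negative
    """
    n = n11 + n10 + n01 + n00
    num = n * (n11 * n00 - n10 * n01) ** 2
    den = (n11 + n01) * (n11 + n10) * (n10 + n00) * (n01 + n00)
    return num / den

# hypothetical counts: 40 positive and 10 negative documents contain the
# term; 20 positives and 430 negatives do not
score = chi_square(40, 10, 20, 430)
print(round(score, 1))
```

Ranking all candidate terms by this score and keeping the top ones is the usual way such a criterion feeds a maximum entropy model's feature set.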