• Title/Summary/Keyword: block state analysis

Search Result 158, Processing Time 0.029 seconds

Analysis on the Settlement Conditions in the Troubled Reclaimed Areas Under State Control (II) - Farming and Rural Economy - (미완공간척지의 정주생활 실태분석(II) -영농 및 농촌경제-)

  • 최수명;황한철
    • Magazine of the Korean Society of Agricultural Engineers
    • /
    • v.33 no.2
    • /
    • pp.104-111
    • /
    • 1991
  • In Korea, small-scale reclaimed areas have suffered from many problems because of the lack of a comprehensive development strategy, although considerable public-sector investment has been made since the 1970s. For three reclaimed sites in Chonnam Province chosen as case study areas, this analysis, the second of a series of wide-ranging studies on areal conditions, concentrated on farming and economic conditions. Its results were as follows: 1. Although the farming pattern has shifted to full-time rice cropping with the creation of reclaimed paddy fields, farming size in the areas has not increased beyond that in existing agricultural areas. This means that agricultural planning should be included in the initial stage of reclamation projects, especially with reference to the substantial enlargement of farming size. 2. Block parcelling of the severely fragmented holdings in new and old lands should be carried out, which can make farming activities efficient and shorten farming routes. In large-scale reclaimed areas, a new village could be planned in the central zone for efficient farming. 3. Because soil in the areas contains much more salt and water than that in other areas, a new design methodology should be introduced for the efficient use of agricultural machines in reclaimed areas. 4. There are deep-seated economic problems in the reclaimed areas, caused by the very low level and agriculturally dominated structure of household income. These problems discourage farmers from taking positive action toward qualitative and quantitative improvement in farming.

  • PDF

Implementation of platform for long-term evolution cell perspective resource utilization analysis

  • Um, Jungsun;Kim, Igor;Park, Seungkeun
    • ETRI Journal
    • /
    • v.43 no.2
    • /
    • pp.232-245
    • /
    • 2021
  • As wireless communication continues to develop in limited frequency resource environments, it is becoming important to identify the state of spectrum utilization and predict the amount needed in the future. It is essential to collect reliable information for such data analysis. This paper introduces a platform that enables the gathering of the scheduling information of a long-term evolution (LTE) cellular system without connecting to the network. A typical LTE terminal can confirm its assigned resource information using the configuration parameters delivered from the network. However, our platform receives and captures only the LTE signal over the air and then enables the estimation of the scheduling-related data for all terminals within an LTE cell. After extracting the control channel signal without loss from all LTE subframes, it detects valid downlink control information (DCI) using the proposed algorithm, which is based on the error vector magnitude of depatterned symbols. We verify the reliability of the developed platform by comparing it with real data from mobile phones and service operators. The average difference in resource block utilization is only 0.28%.
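The abstract describes the DCI-detection step only at a high level. As a rough Python illustration of how an error-vector-magnitude test over depatterned symbols can separate valid control-channel candidates from invalid ones, the sketch below re-modulates a candidate bit sequence and compares it with the received symbols; the QPSK mapping, function names, and the 30% threshold are assumptions, not details taken from the paper.

```python
# Hedged sketch: deciding whether a blind-decoded control-channel candidate is
# plausible by measuring the error vector magnitude (EVM) of its "depatterned"
# symbols, i.e. received QPSK symbols compared against the re-modulated
# hypothesized bit pattern. Threshold and mapping are illustrative assumptions.
import numpy as np

def qpsk_modulate(bits):
    """Map bit pairs to unit-energy QPSK symbols (Gray mapping assumed)."""
    b = bits.reshape(-1, 2)
    return ((1 - 2 * b[:, 0]) + 1j * (1 - 2 * b[:, 1])) / np.sqrt(2)

def evm_of_candidate(received_syms, candidate_bits):
    """EVM (in %) between received symbols and the re-modulated candidate bits."""
    ref = qpsk_modulate(candidate_bits)
    err = received_syms - ref
    return 100.0 * np.sqrt(np.mean(np.abs(err) ** 2) / np.mean(np.abs(ref) ** 2))

def is_valid_dci(received_syms, candidate_bits, evm_threshold=30.0):
    """Accept the candidate only if its EVM stays below the (assumed) threshold."""
    return evm_of_candidate(received_syms, candidate_bits) < evm_threshold

# Toy usage: a correct candidate under mild noise passes; a random one does not.
rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 72)
rx = qpsk_modulate(bits) + 0.05 * (rng.standard_normal(36) + 1j * rng.standard_normal(36))
print(is_valid_dci(rx, bits))                    # expected: True
print(is_valid_dci(rx, rng.integers(0, 2, 72)))  # expected: False
```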

A Study on Sequential Digital Logic Systems and Computer Architecture based on Extension Logic (확장논리에 기초한 순차디지털논리시스템 및 컴퓨터구조에 관한 연구)

  • Park, Chun-Myoung
    • The Journal of the Institute of Internet, Broadcasting and Communication
    • /
    • v.8 no.2
    • /
    • pp.15-21
    • /
    • 2008
  • This paper discusses sequential digital logic systems and arithmetic operation algorithms, which are important elements of computer architecture, using analysis and synthesis based on extension logic for binary logic over Galois fields. For sequential digital logic systems, we construct a Moore model without feedback after obtaining the next-state function and output function using the T-gate as a building block. We also derive algorithms for addition, subtraction, multiplication, and division based on the mathematical properties of finite fields. In particular, in the case of P=2 over GF($P^m$), the proposed algorithms have the advantage that traditional binary logic can be applied directly. The proposed method can construct more efficient digital logic systems because traditional binary logic can be extended to extension logic. (A minimal sketch of GF(2^m) arithmetic follows this entry.)

  • PDF
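As a rough illustration of the P=2 case mentioned above, where finite-field arithmetic reduces largely to bitwise operations, the following Python sketch implements addition, multiplication, and division over GF(2^m); the field size m=4 and the modulus x^4 + x + 1 are illustrative choices, not the construction used in the paper.

```python
# Hedged sketch of finite-field arithmetic over GF(2^m), the P=2 case: addition
# and subtraction reduce to XOR, and multiplication/division use polynomial
# arithmetic modulo an irreducible polynomial (assumed here: x^4 + x + 1).
M = 4
MODULUS = 0b10011          # x^4 + x + 1, irreducible over GF(2)
ORDER = 1 << M             # 16 field elements

def gf_add(a, b):
    """In GF(2^m), addition and subtraction are the same bitwise XOR."""
    return a ^ b

def gf_mul(a, b):
    """Carry-less multiply with reduction modulo the field polynomial."""
    result = 0
    while b:
        if b & 1:
            result ^= a
        b >>= 1
        a <<= 1
        if a & ORDER:      # degree reached m: reduce
            a ^= MODULUS
    return result

def gf_inv(a):
    """Multiplicative inverse via a^(2^m - 2), valid for nonzero a."""
    result, exp = 1, ORDER - 2
    while exp:
        if exp & 1:
            result = gf_mul(result, a)
        a = gf_mul(a, a)
        exp >>= 1
    return result

def gf_div(a, b):
    return gf_mul(a, gf_inv(b))

# Quick self-check: (x / y) * y == x for every x and nonzero y.
assert all(gf_mul(gf_div(x, y), y) == x for x in range(ORDER) for y in range(1, ORDER))
print(gf_add(0b0110, 0b0101), gf_mul(0b0110, 0b0101), gf_div(0b0110, 0b0101))
```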

Performance Analysis on Soft Decision Decoding using Erasure Technique (COFDM 시스템에서 채널상태정보를 이용한 Viterbi 디코더)

  • 이원철
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.24 no.10A
    • /
    • pp.1563-1570
    • /
    • 1999
  • This paper addresses a soft-decision method with an erasure technique for digital terrestrial television broadcasting systems. The proposed decoder uses the channel state information (CSI) derived from the pilots in the receiver. The active real (I) and imaginary (Q) data are passed to the branch metric calculation block, which computes the Euclidean distance for soft-decision decoding, and the estimated CSI values are passed to the same block. After the Euclidean distance is calculated, each branch metric is multiplied by the CSI, yielding new branch metric values that reflect the state of each carrier (see the sketch after this entry). Simulations show gains of about 0.15 dB to 0.17 dB, and 2.2 dB to 2.9 dB in a Rayleigh channel, over conventional soft-decision Viterbi decoding with or without a bit interleaver for QPSK, 16-QAM, and 64-QAM constellations.

  • PDF
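The CSI-weighted branch metric described above can be illustrated with a short Python sketch: the squared Euclidean distance from each received symbol to every candidate constellation point is scaled by that carrier's CSI, so faded carriers contribute less to the accumulated path metric. The QPSK constellation and the example CSI values are assumptions for illustration only.

```python
# Hedged sketch of a CSI-weighted soft-decision branch metric: Euclidean
# distances to the constellation points are multiplied by each carrier's CSI.
import numpy as np

QPSK = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)

def csi_weighted_branch_metrics(received, csi):
    """Return a (num_carriers, 4) array of CSI-weighted Euclidean metrics."""
    received = np.asarray(received).reshape(-1, 1)
    csi = np.asarray(csi).reshape(-1, 1)
    distances = np.abs(received - QPSK[None, :]) ** 2   # plain soft-decision metric
    return csi * distances                              # de-emphasize faded carriers

# Toy usage: carrier 0 is strong (CSI ~ 1), carrier 1 is deeply faded (CSI ~ 0.1),
# so its distances barely influence the accumulated path metric.
rx = np.array([0.7 + 0.6j, -0.1 + 0.05j])
csi = np.array([1.0, 0.1])
print(csi_weighted_branch_metrics(rx, csi))
```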

Effective Noise Reduction using STFT-based Content Analysis (STFT 기반 영상분석을 이용한 효과적인 잡음제거 알고리즘)

  • Baek, Seungin;Jeong, Soowoong;Choi, Jong-Soo;Lee, Sangkeun
    • Journal of the Institute of Electronics and Information Engineers
    • /
    • v.52 no.4
    • /
    • pp.145-155
    • /
    • 2015
  • Noise reduction has been actively studied in digital image processing, and block-based denoising algorithms have recently come into wide use. In particular, low-rank approximation employing WNNM (Weighted Nuclear Norm Minimization) together with block-based approaches has demonstrated the potential for effective noise reduction. However, algorithms based on low-rank approximation generate artifacts in the image restoration step. In this paper, we analyze the image content using the STFT (Short-Time Fourier Transform) and propose an effective method of minimizing the artifacts generated by the conventional algorithm. To evaluate the performance of the proposed scheme, we use test images containing a wide range of noise levels and compare the results with state-of-the-art algorithms.
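As a rough sketch of the STFT-style content analysis mentioned above, the following Python code splits an image into blocks, takes a per-block 2-D FFT, and uses the fraction of energy outside the DC bin as a simple texture measure; how the paper then suppresses the low-rank denoiser's artifacts is not reproduced, and the block size and criterion are assumptions.

```python
# Hedged sketch of local frequency-content analysis: each block gets a 2-D FFT
# (rectangular window for simplicity) and the non-DC energy fraction indicates
# how textured the block is; flat regions are where denoising artifacts show.
import numpy as np

def block_ac_energy_ratio(image, block=16):
    """Per-block fraction of spectral energy outside DC (0 = flat, ~1 = textured)."""
    h, w = image.shape
    ratios = np.zeros((h // block, w // block))
    for i in range(h // block):
        for j in range(w // block):
            patch = image[i*block:(i+1)*block, j*block:(j+1)*block]
            spec = np.abs(np.fft.fft2(patch)) ** 2
            ratios[i, j] = 1.0 - spec[0, 0] / spec.sum()
    return ratios

# Toy usage: a smooth gradient yields low ratios, white-noise texture yields
# ratios close to 1, so the map separates flat regions from detailed ones.
rng = np.random.default_rng(1)
smooth = np.tile(np.linspace(0.1, 1.0, 64), (64, 1))
texture = rng.standard_normal((64, 64))
print(block_ac_energy_ratio(smooth).mean())   # small
print(block_ac_energy_ratio(texture).mean())  # close to 1
```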

Time- and Frequency-Domain Block LMS Adaptive Digital Filters: Part Ⅱ - Performance Analysis (시간영역 및 주파수영역 블럭적응 여파기에 관한 연구 : 제 2 부- 성능분석)

  • Lee, Jae-Chon;Un, Chong-Kwan
    • The Journal of the Acoustical Society of Korea
    • /
    • v.7 no.4
    • /
    • pp.54-76
    • /
    • 1988
  • In Part Ⅰ of the paper, we developed various block least mean-square (BLMS) adaptive digital filters (ADFs) based on a unified matrix treatment. In Part Ⅱ we analyze the convergence behaviors of the self-orthogonalizing frequency-domain BLMS (FBLMS) ADF and the unconstrained FBLMS (UFBLMS) ADF, both for the overlap-save and overlap-add sectioning methods. We first show that, unlike the FBLMS ADF with a constant convergence factor, the convergence behavior of the self-orthogonalizing FBLMS ADF is governed by the same autocorrelation matrix as that of the UFBLMS ADF. We then show that the optimum solution of the UFBLMS ADF is the same as that of the constrained FBLMS ADF when the filter length is sufficiently long. The mean of the weight vector of the UFBLMS ADF is also shown to converge to the optimum Wiener weight vector under a proper condition. However, the steady-state mean-squared error (MSE) of the UFBLMS ADF turns out to be slightly worse than that of the constrained algorithm if the same convergence constant is used in both cases. On the other hand, when the filter length is not sufficiently long, the constrained FBLMS ADF yields poor performance, while the performance of the UFBLMS ADF can be improved to some extent by utilizing its extended filter-length capability. As for the self-orthogonalizing FBLMS ADF, we study how the autocorrelation matrix can be approximated by a diagonal matrix in the frequency domain. We also analyze the steady-state MSEs of the self-orthogonalizing FBLMS ADFs with and without the constraint. Finally, we present various simulation results to verify our analytical results. (A minimal time-domain BLMS sketch follows this entry.)

  • PDF
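For readers unfamiliar with the block update that the FBLMS variants accelerate, the following Python sketch shows the plain time-domain BLMS recursion, in which the weight vector is updated once per block using the gradient summed over that block; filter length, block length, and step size are illustrative, and this is not the paper's overlap-save frequency-domain implementation.

```python
# Hedged sketch of the basic block LMS (BLMS) update: one weight update per
# block of L samples, using the error gradient accumulated over that block.
import numpy as np

def block_lms(x, d, num_taps=8, block_len=8, mu=0.05):
    """Identify a FIR system from input x and desired output d, block by block."""
    w = np.zeros(num_taps)
    x_pad = np.concatenate([np.zeros(num_taps - 1), x])
    n_blocks = len(x) // block_len
    for b in range(n_blocks):
        grad = np.zeros(num_taps)
        for n in range(b * block_len, (b + 1) * block_len):
            u = x_pad[n:n + num_taps][::-1]   # most recent sample first
            e = d[n] - w @ u                  # a-priori error for this sample
            grad += e * u
        w += (mu / block_len) * grad          # one update per block
    return w

# Toy usage: recover an unknown 4-tap FIR channel from noisy observations.
rng = np.random.default_rng(2)
h_true = np.array([0.9, -0.4, 0.2, 0.1])
x = rng.standard_normal(4000)
d = np.convolve(x, h_true)[:len(x)] + 0.01 * rng.standard_normal(len(x))
print(np.round(block_lms(x, d), 3))           # should approximate h_true (zero-padded)
```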

Multivariate Time Series Simulation With Component Analysis (독립성분분석을 이용한 다변량 시계열 모의)

  • Lee, Tae-Sam;Salas, Jose D.;Karvanen, Juha;Noh, Jae-Kyoung
    • Proceedings of the Korea Water Resources Association Conference
    • /
    • 2008.05a
    • /
    • pp.694-698
    • /
    • 2008
  • In hydrology, it is a difficult task to deal with multivariate time series, such as modeling the streamflows of an entire complex river system. Normal-distribution-based models such as MARMA (Multivariate Autoregressive Moving Average) have been a major approach for modeling multivariate time series. There are some limitations to the normal-based models. One of them is the unfavorable data transformation that forces the data to follow the normal distribution. Furthermore, a high-dimensional multivariate model requires a very large parameter matrix. As an alternative, one might decompose the multivariate data into independent components and model each individually. In 1985, Lins used Principal Component Analysis (PCA): the five scores, i.e., the data decomposed from the original data, were taken and modeled individually. One of the five scores was modeled with an AR(2) model while the others were modeled with AR(1) models. From the time series analysis using the scores of the five components, he noted that "principal component time series might provide a relatively simple and meaningful alternative to conventional large MARMA models". This study is inspired by that observation to develop a multivariate simulation model. The multivariate simulation model suggested here uses Principal Component Analysis (PCA) and Independent Component Analysis (ICA). Three modeling steps are applied for simulation (see the sketch after this entry). (1) PCA is used to decompose the correlated multivariate data into uncorrelated data, while ICA decomposes the data into independent components. Here, the autocorrelation structure of the decomposed data is still dominant, inherited from the data in the original domain. (2) Each component is resampled by block bootstrapping or the K-nearest neighbor method. (3) The resampled components are brought back to the original domain. From the suggested approach one might expect that (a) the simulated data differ from the historical data, (b) no data transformation is required (in the case of ICA), and (c) a complex system can be decomposed into independent components and modeled individually. The models with PCA and ICA are compared using various statistics such as basic statistics (mean, standard deviation, skewness, autocorrelation), reservoir-related statistics, and kernel density estimates.

  • PDF
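A minimal Python sketch of the three-step idea, using PCA only (ICA would require an additional library such as scikit-learn's FastICA), is given below: decorrelate with PCA, resample each score with a moving-block bootstrap so its autocorrelation is roughly kept, and rotate back to the original domain. The block length and the toy data are assumptions, not the study's settings.

```python
# Hedged sketch: (1) PCA decorrelation, (2) per-component moving-block bootstrap,
# (3) rotation back to the original domain, preserving lag-0 cross-correlation.
import numpy as np

def block_bootstrap(series, block_len, rng):
    """Moving-block bootstrap of a single series, preserving short-run dependence."""
    n = len(series)
    starts = rng.integers(0, n - block_len + 1, size=int(np.ceil(n / block_len)))
    out = np.concatenate([series[s:s + block_len] for s in starts])
    return out[:n]

def simulate_pca_block_bootstrap(data, block_len=24, seed=0):
    """data: (n_obs, n_sites) multivariate series; returns one synthetic replicate."""
    rng = np.random.default_rng(seed)
    mean = data.mean(axis=0)
    centered = data - mean
    cov = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)             # PCA via eigendecomposition
    scores = centered @ eigvecs                        # uncorrelated components
    sim_scores = np.column_stack(
        [block_bootstrap(scores[:, k], block_len, rng) for k in range(scores.shape[1])]
    )
    return sim_scores @ eigvecs.T + mean               # back to the original domain

# Toy usage: two correlated AR(1)-like "streamflow" series; the synthetic series
# should roughly reproduce the historical lag-0 cross-correlation.
rng = np.random.default_rng(3)
noise = rng.standard_normal((500, 2)) @ np.array([[1.0, 0.6], [0.0, 0.8]])
data = np.zeros_like(noise)
for t in range(1, 500):
    data[t] = 0.7 * data[t - 1] + noise[t]
synthetic = simulate_pca_block_bootstrap(data)
print(np.corrcoef(data, rowvar=False)[0, 1], np.corrcoef(synthetic, rowvar=False)[0, 1])
```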

Analysis of Hybrid Additive Cellular Automata (하이브리드 가산 셀룰라 오토마타의 분석)

  • Cho, Sung-Jin;Choi, Un-Sook;Kim, Han-Doo;Hwang, Yoon-Hee;Kim, Jin-Gyoung;Kim, Bong-Soo
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2008.05a
    • /
    • pp.221-224
    • /
    • 2008
  • Anghelescu et al. proposed a block cryptosystem based on 8-cell hybrid additive cellular automata with cycle length 8 using state transition rules 51 and 60 (or 102). All states must be divided into cycles of the same length in the state-transition diagram of the cellular automata, but there exist cellular automata in Anghelescu et al.'s cryptosystem that do not satisfy this condition. In this paper we analyze hybrid additive cellular automata and propose an improved method (see the sketch after this entry).

  • PDF
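As a rough illustration of the analysis described above, the following Python sketch enumerates all states of an 8-cell hybrid additive cellular automaton built from rules 51 and 60 (rule 102 is also handled) under an assumed null boundary, and reports the cycle lengths in its state-transition diagram, which is how one checks whether every state lies on a cycle of the same length. The particular rule vector is a hypothetical example, not one from the paper.

```python
# Hedged sketch: enumerate the state-transition diagram of a hybrid additive CA
# and collect the cycle lengths; equal-cycle decomposition holds only if a single
# length is reported. Null boundary and the rule vector are assumptions.
from itertools import product

def step(state, rules):
    """One synchronous update of the hybrid CA (missing neighbours count as 0)."""
    n = len(state)
    nxt = []
    for i, rule in enumerate(rules):
        left = state[i - 1] if i > 0 else 0
        right = state[i + 1] if i < n - 1 else 0
        if rule == 60:        # additive rule 60: left XOR self
            nxt.append(left ^ state[i])
        elif rule == 102:     # additive rule 102: self XOR right
            nxt.append(state[i] ^ right)
        elif rule == 51:      # additive (affine) rule 51: complement of self
            nxt.append(1 ^ state[i])
        else:
            raise ValueError("only additive rules 51, 60 and 102 are handled here")
    return tuple(nxt)

def cycle_lengths(rules):
    """Length of the cycle each initial state returns to (0 if it lies on a tail)."""
    lengths = {}
    for start in product((0, 1), repeat=len(rules)):
        seen = {start}
        current = step(start, rules)
        t = 1
        while current != start and current not in seen:
            seen.add(current)
            current = step(current, rules)
            t += 1
        lengths[start] = t if current == start else 0
    return lengths

rules = (51, 60, 60, 51, 60, 51, 60, 60)   # hypothetical 8-cell hybrid rule vector
lens = cycle_lengths(rules)
print(sorted(set(lens.values())))          # a single value means all states lie on equal cycles
```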

Interference-Aware Radio Resource Allocation in D2D Underlaying LTE-Advanced Networks

  • Xu, Shaoyi;Kwak, Kyung Sup;Rao, Ramesh R.
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.8 no.8
    • /
    • pp.2626-2646
    • /
    • 2014
  • This study presents a joint power and Physical Resource Block (PRB) allocation algorithm to coordinate uplink (UL) interference in device-to-device (D2D) underlaying Long Term Evolution-Advanced (LTE-A) networks. The objective is to find a mechanism that mitigates the UL interference between the two subsystems and maximizes the weighted sum throughput as well. This optimization problem is formulated as a mixed integer nonlinear program (MINLP), which is further decomposed into PRB assignment and transmission power allocation. The scenario of imperfect channel state information (CSI) is also taken into account in our study. Analysis reveals that the proposed PRB allocation strategy is energy efficient and suppresses the interference not only to the LTE-A system but also to the D2D users. In addition, a low-complexity technique is proposed to obtain the optimal power allocation, which resides in one of at most three feasible power vectors. Simulations show that the optimal power allocation combined with the proposed PRB assignment achieves a higher weighted sum throughput than traditional algorithms, even when imperfect CSI is utilized.
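The paper's MINLP decomposition is not reproduced here, but the objective it optimizes can be illustrated with a toy Python sketch: a cellular user and a D2D pair reuse one PRB, their powers couple through mutual interference, and the weighted sum throughput is maximized, here by brute force over a small power grid rather than by the authors' three-candidate derivation. All channel gains, weights, and power limits are assumptions.

```python
# Hedged sketch of the weighted sum-throughput objective with mutual interference
# between a cellular UE and a D2D pair sharing one uplink PRB.
import math
import itertools

def sinr(p_signal, g_signal, p_interf, g_interf, noise=1e-13):
    return (p_signal * g_signal) / (p_interf * g_interf + noise)

def weighted_sum_rate(p_cell, p_d2d, gains, weights=(1.0, 1.0)):
    """gains: (cell->eNB, D2D tx->rx, D2D tx->eNB, cell->D2D rx) linear channel gains."""
    g_c, g_d, g_di, g_ci = gains
    rate_cell = math.log2(1 + sinr(p_cell, g_c, p_d2d, g_di))
    rate_d2d = math.log2(1 + sinr(p_d2d, g_d, p_cell, g_ci))
    return weights[0] * rate_cell + weights[1] * rate_d2d

def best_power_pair(gains, p_max=0.2, levels=20):
    """Brute-force the power grid; the paper instead derives at most three candidates."""
    grid = [p_max * (k + 1) / levels for k in range(levels)]
    return max(itertools.product(grid, grid),
               key=lambda p: weighted_sum_rate(p[0], p[1], gains))

# Toy usage: strong direct links and weak cross links favour transmitting near p_max.
gains = (1e-10, 5e-9, 1e-12, 2e-12)
p_cell, p_d2d = best_power_pair(gains)
print(p_cell, p_d2d, weighted_sum_rate(p_cell, p_d2d, gains))
```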

CONTROL THEORY OF WALSH FUNCTIONS-A SURVEY (WALSH함수와 제어이론)

  • Ahn, Doo-Soo;Lee, Myung-Kyu;Lee, Hae-Ki;Lee, Seung
    • Proceedings of the KIEE Conference
    • /
    • 1991.07a
    • /
    • pp.657-665
    • /
    • 1991
  • Although orthogonal functions were introduced into control theory in the early 1970s, their use was not fully developed. Since Chen and Hsiao introduced the concept of the integral operator in the mid 1970s, orthogonal functions (for example Walsh, block-pulse, Haar, Laguerre, Legendre, Chebyshev, etc.) have been widely applied in system analysis and identification, model reduction, state estimation, optimal control, signal processing, image processing, EEG, ECG, and so on. The reason Walsh functions were introduced into control theory is that the integral of a Walsh function can itself be expanded in the Walsh orthogonal basis; thus, if we transform a given system into an integral equation and introduce Walsh functions, the system's characteristics can be obtained as algebraic expressions. This approach is based on least-squares error, and its result is expressed as piecewise-constant values that are easy to compute and apply (see the sketch after this entry). Walsh functions have been actively studied in the USA, Taiwan, India, China, Europe, and elsewhere, and domestically the author has studied them for 10 years since they were introduced in 1982. This paper reviews the author's work over those 10 years and the efficiency of Walsh functions.

  • PDF
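As a small illustration of the expansion the survey refers to, the following Python sketch builds a Hadamard-ordered Walsh basis and fits a signal to it by least squares (exact here because the basis is orthogonal), giving the piecewise-constant algebraic representation that underlies the integral-operator approach; the basis size and test signal are illustrative assumptions.

```python
# Hedged sketch: expand a sampled signal in a Walsh (Hadamard-ordered) basis.
import numpy as np

def walsh_hadamard(n):
    """Hadamard-ordered Walsh matrix of size n x n (n must be a power of two)."""
    h = np.array([[1.0]])
    while h.shape[0] < n:
        h = np.block([[h, h], [h, -h]])           # Sylvester construction
    return h

def walsh_coefficients(samples):
    """Least-squares (here exact, by orthogonality) Walsh expansion coefficients."""
    n = len(samples)
    w = walsh_hadamard(n)
    return (w @ samples) / n                      # W W^T = n I, so this solves the LS fit

def walsh_reconstruct(coeffs):
    return walsh_hadamard(len(coeffs)).T @ coeffs

# Toy usage: approximate f(t) = t on [0, 1) with 8 Walsh functions; as a function
# of continuous t the expansion is a piecewise-constant (staircase) ramp.
t = (np.arange(8) + 0.5) / 8
coeffs = walsh_coefficients(t)
print(np.round(walsh_reconstruct(coeffs), 4))     # reproduces the sampled ramp exactly
```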