• Title/Summary/Keyword: 예측성능 개선 (improvement of prediction performance)


A Study on Pipelined Architecture with Branch Prediction and Two Paths Strategy (분기 예측과 이중 경로 전략을 결합한 파이프라인 구조에 관한 연구)

  • Ju, Yeong-Sang; Jo, Gyeong-San
    • The Transactions of the Korea Information Processing Society / v.3 no.1 / pp.181-190 / 1996
  • Pipelined architectures improve processor performance by overlapping the execution of several different instructions. Control hazards, however, stall the pipeline and reduce processor performance. To reduce the effect of control hazards caused by branches, we propose a new approach combining branch prediction with a two-path strategy. In addition, we verify the performance improvement of the proposed approach using the system performance metric CPI rather than BEP. (A toy CPI model of these policies is sketched after this entry.)

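The CPI comparison in this abstract can be illustrated with a back-of-the-envelope model. The sketch below is not the paper's simulator: it computes the CPI of a scalar pipeline under three branch-handling policies (stall on every branch, single-path prediction, dual-path fetch), with the branch frequency, prediction accuracy, and penalty cycles chosen as illustrative assumptions.

```python
# Toy CPI model for branch-handling strategies in a scalar pipeline.
# All parameter values are illustrative assumptions, not figures from the paper.

def cpi_stall(branch_freq, branch_penalty):
    """Every branch stalls the pipeline for the full penalty."""
    return 1.0 + branch_freq * branch_penalty

def cpi_predict(branch_freq, accuracy, mispredict_penalty):
    """Correctly predicted branches cost nothing; mispredictions pay the penalty."""
    return 1.0 + branch_freq * (1.0 - accuracy) * mispredict_penalty

def cpi_dual_path(branch_freq, accuracy, mispredict_penalty, resolve_delay):
    """Fetch both paths: a misprediction only costs the (shorter) delay
    until the branch resolves, since the other path is already in flight."""
    penalty = min(mispredict_penalty, resolve_delay)
    return 1.0 + branch_freq * (1.0 - accuracy) * penalty

if __name__ == "__main__":
    bf, acc, pen, resolve = 0.20, 0.90, 5, 2   # assumed values
    print(f"stall     : CPI = {cpi_stall(bf, pen):.3f}")
    print(f"predict   : CPI = {cpi_predict(bf, acc, pen):.3f}")
    print(f"dual path : CPI = {cpi_dual_path(bf, acc, pen, resolve):.3f}")
```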

A GMDH-type performance modeling for FMS with unreliable RAM and LCC (GMDH 방법을 이용한 FMS의 성능 예측 방안의 연구)

  • 황흥석
    • Proceedings of the Korean Operations and Management Science Society Conference / 1995.04a / pp.111-120 / 1995
  • In an integrated manufacturing system, failure, maintenance, and availability play very important roles. The RAM parameters chosen at system design time strongly affect system performance, required cost, and system configuration. When the number of such system-related factors is large or uncertain, predicting system performance becomes very complex. To evaluate the performance of such systems, this study applies GMDH (Group Method of Data Handling)-type modeling, a heuristic method, to the performance evaluation of an FMS. From RAM and machine operation time data, the production rate per unit cycle, the total flow time in the system, and the RAM and LCC of the machines at each workstation were considered as measures of system performance. A program implementing the GMDH algorithm was developed and applied experimentally to performance prediction for an L-shaped bracket manufacturing system. If the input/output data handling procedure is improved, the proposed modeling approach is expected to be highly useful for performance evaluation in the FMS planning and operation stages. (A generic GMDH selection layer is sketched after this entry.)

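For readers unfamiliar with GMDH, the sketch below shows one selection layer of the method: every pair of inputs is fitted with an Ivakhnenko polynomial by least squares and ranked by error on held-out data. It is a generic illustration on synthetic data standing in for the paper's RAM and machining-time inputs, not the developed FMS program.

```python
# Minimal one-layer GMDH (Group Method of Data Handling) sketch using numpy.
# Synthetic data below stands in for the paper's RAM / machining-time inputs.
import itertools
import numpy as np

def fit_pair(x1, x2, y):
    """Least-squares fit of the Ivakhnenko polynomial for one input pair."""
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return coef

def predict_pair(coef, x1, x2):
    A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
    return A @ coef

def gmdh_layer(X_tr, y_tr, X_va, y_va, keep=4):
    """Fit all input pairs, rank by validation RMSE, keep the best few."""
    results = []
    for i, j in itertools.combinations(range(X_tr.shape[1]), 2):
        coef = fit_pair(X_tr[:, i], X_tr[:, j], y_tr)
        pred = predict_pair(coef, X_va[:, i], X_va[:, j])
        results.append((np.sqrt(np.mean((pred - y_va) ** 2)), i, j, coef))
    results.sort(key=lambda r: r[0])
    return results[:keep]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(0, 1, (200, 4))                  # synthetic system inputs
    y = 2 * X[:, 0] * X[:, 1] + X[:, 2] ** 2 + rng.normal(0, 0.05, 200)
    X_tr, X_va, y_tr, y_va = X[:150], X[150:], y[:150], y[150:]
    for err, i, j, _ in gmdh_layer(X_tr, y_tr, X_va, y_va):
        print(f"inputs ({i},{j}): validation RMSE = {err:.4f}")
```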

Performance Improvement of Operand Fetching with the Operand Reference Prediction Cache(ORPC) (오퍼랜드 참조 예측 캐쉬(ORPC)를 활용한 오퍼랜드 페치의 성능 개선)

  • Kim, Heung-Jun; Cho, Kyung-San
    • The Transactions of the Korea Information Processing Society / v.5 no.6 / pp.1652-1659 / 1998
  • To provide performance gains by reducing the operand referencing latency and the data cache bandwidth requirements, we present an operand reference prediction cache (ORPC) that predicts the operand value and address translation during the instruction fetch stage. The prediction is verified at an early stage, which minimizes the performance penalty caused by misprediction. Through trace-driven simulation of six benchmark programs, the performance improvement of the three proposed ORPC structures (ORPC1, ORPC2, ORPC3) is analyzed and validated. (A simplified prediction-cache model is sketched after this entry.)

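The idea of predicting an operand reference at fetch time and verifying it early can be modeled with a small table keyed by the instruction address. The sketch below is a generic last-address/last-value predictor under our own assumptions; the paper's ORPC1/ORPC2/ORPC3 organizations are not specified in the abstract and are not reproduced here.

```python
# Toy operand reference prediction cache: at instruction fetch, a table
# indexed by the instruction PC predicts the operand address/value; the
# prediction is verified when the real operand reference resolves.
# A generic last-value/last-address sketch, not the paper's ORPC designs.

class ORPC:
    def __init__(self):
        self.table = {}          # instruction PC -> (operand addr, value)
        self.hits = self.misses = 0

    def fetch_predict(self, pc):
        """Look up a prediction during the instruction fetch stage."""
        return self.table.get(pc)

    def verify(self, pc, addr, value):
        """Early verification: compare the prediction with the actual reference."""
        if self.table.get(pc) == (addr, value):
            self.hits += 1
        else:
            self.misses += 1
        self.table[pc] = (addr, value)   # train on the actual outcome

if __name__ == "__main__":
    orpc = ORPC()
    # A loop reloading the same scalar has a stable operand reference.
    trace = [(0x400, 0x1000, 7)] * 10 + [(0x404, 0x2000, v) for v in range(5)]
    for pc, addr, val in trace:
        orpc.fetch_predict(pc)
        orpc.verify(pc, addr, val)
    print(f"prediction hits: {orpc.hits}, misses: {orpc.misses}")
```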

Improving dam inflow prediction in LSTM-s2s model with luong attention (Attention 기법을 통한 LSTM-s2s 모델의 댐유입량 예측 개선)

  • Jonghyeok Lee; Yeonjoo Kim
    • Proceedings of the Korea Water Resources Association Conference / 2023.05a / pp.226-226 / 2023
  • Various Long Short-Term Memory (LSTM) methods are being actively applied and developed to predict river discharge, dam inflow, and similar quantities. Recent studies suggest that LSTM performance can be improved with sequence-to-sequence (s2s) and attention techniques. Accordingly, in this study we built an LSTM-s2s model and an LSTM-s2s model augmented with attention, performed inflow prediction with hourly data, and examined the applicability of these models to actual dam operation. For the Soyang River Dam basin, hourly inflow data and synoptic weather station temperature and precipitation data from 2013 to 2020 were split into training, validation, and test sets, and model performance was then evaluated. R2, RRMSE, CC, NSE, and PBIAS were used to determine the optimal sequence length. The analysis showed that the attention-augmented model generally outperformed the plain LSTM-s2s model and was also more accurate at predicting peak values. Both models captured the flow pattern around peak events well but were limited in simulating fine-grained hourly variations. Despite this limitation of hourly prediction, we judge that the LSTM-s2s model with attention can be used for future dam inflow prediction. (A structural sketch of such a model follows this entry.)

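A minimal PyTorch sketch of the model family the abstract describes, an LSTM encoder-decoder with Luong (dot-product) attention producing a multi-step inflow forecast, is given below. Layer sizes, the forecast horizon, and the input features are assumptions for illustration, not the paper's trained configuration.

```python
# Minimal LSTM seq2seq with Luong (dot-product) attention for inflow
# forecasting. Structural sketch under assumed sizes, not the paper's model.
import torch
import torch.nn as nn

class Seq2SeqAttn(nn.Module):
    def __init__(self, n_features, hidden=64, horizon=24):
        super().__init__()
        self.horizon = horizon
        self.encoder = nn.LSTM(n_features, hidden, batch_first=True)
        self.decoder = nn.LSTM(1, hidden, batch_first=True)
        self.out = nn.Linear(hidden * 2, 1)  # [decoder state; context] -> inflow

    def forward(self, x):
        enc_out, (h, c) = self.encoder(x)            # enc_out: (B, T, H)
        dec_in = torch.zeros(x.size(0), 1, 1)        # start token
        preds = []
        for _ in range(self.horizon):
            dec_out, (h, c) = self.decoder(dec_in, (h, c))        # (B, 1, H)
            # Luong dot-product attention over the encoder outputs
            scores = torch.bmm(dec_out, enc_out.transpose(1, 2))  # (B, 1, T)
            weights = torch.softmax(scores, dim=-1)
            context = torch.bmm(weights, enc_out)                 # (B, 1, H)
            y = self.out(torch.cat([dec_out, context], dim=-1))   # (B, 1, 1)
            preds.append(y)
            dec_in = y                                            # feed back
        return torch.cat(preds, dim=1).squeeze(-1)                # (B, horizon)

if __name__ == "__main__":
    model = Seq2SeqAttn(n_features=3, horizon=6)   # e.g. inflow, temp, rain
    x = torch.randn(8, 72, 3)                      # 72 past hourly steps
    print(model(x).shape)                          # torch.Size([8, 6])
```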

Performance Analysis of Pilot Symbol Assisted Multi-Carrier CDMA BPSK System with Space Diversity in Rician Fading Channels (라이시안 페이딩 채널에서 공간 다이버시티를 적용한 Pilot Symbol Assisted Multi-Carrier CDMA BPSK 시스템의 성능 분석)

  • 노재성; 오창헌; 김언곤; 조성준
    • The Journal of Korean Institute of Communications and Information Sciences / v.26 no.6A / pp.913-922 / 2001
  • In a Multi-Carrier CDMA system, imperfect channel estimation and multi-user interference degrade system performance. In this paper, we study the effects of imperfect channel estimation and multi-user interference in a Multi-Carrier CDMA BPSK system that estimates the channel from pilot symbols on the reverse link. We further study the performance improvement obtained by applying maximal-ratio-combining and selection-combining space diversity on the reverse link over Rician fading channels. Numerical results show that the BER performance of the pilot-symbol-assisted Multi-Carrier CDMA BPSK system is very sensitive to the variance of the equivalent noise bandwidth ($\sigma_c^2$) but much less sensitive to the signal-to-noise power ratio of the radio channel. The BER degradation due to imperfect channel estimation and the optimal pilot symbol spacing were obtained through BER calculations for the Multi-Carrier CDMA BPSK system in Rician fading channels. We also found that the BER degradation due to imperfect channel estimation depends on the signal-to-noise power ratio and on the equivalent noise bandwidth of the channel estimation filter. (A pilot-symbol estimation simulation is sketched after this entry.)

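Pilot-symbol-assisted channel estimation itself is easy to demonstrate numerically. The Monte Carlo sketch below transmits BPSK over a correlated Rician channel, estimates the channel at periodic pilot symbols, and interpolates between them; it is a single-carrier illustration with assumed parameters, not the paper's Multi-Carrier CDMA analysis with diversity combining.

```python
# Monte Carlo sketch of pilot-symbol-assisted BPSK over Rician fading:
# pilots every `spacing` symbols, channel estimated by linear interpolation
# between pilots. Parameters are assumed, not taken from the paper.
import numpy as np

rng = np.random.default_rng(1)

def rician_gain(n, k_factor):
    """Slowly varying Rician channel gains (LOS + low-pass-filtered scatter)."""
    los = np.sqrt(k_factor / (k_factor + 1))
    scatter = (rng.normal(size=n) + 1j * rng.normal(size=n)) / np.sqrt(2)
    scatter = np.convolve(scatter, np.ones(32) / np.sqrt(32), mode="same")
    return los + np.sqrt(1 / (k_factor + 1)) * scatter

def ber_pilot_assisted(n=20000, spacing=10, snr_db=8.0, k_factor=5.0):
    h = rician_gain(n, k_factor)
    bits = rng.integers(0, 2, n)
    s = 1.0 - 2.0 * bits                        # BPSK mapping
    s[::spacing] = 1.0                          # known pilot symbols
    sigma = np.sqrt(0.5 / 10 ** (snr_db / 10))
    r = h * s + sigma * (rng.normal(size=n) + 1j * rng.normal(size=n))
    pilot_idx = np.arange(0, n, spacing)
    h_hat_p = r[pilot_idx] / 1.0                # LS estimate at pilots
    h_hat = (np.interp(np.arange(n), pilot_idx, h_hat_p.real)
             + 1j * np.interp(np.arange(n), pilot_idx, h_hat_p.imag))
    data = np.ones(n, bool)
    data[pilot_idx] = False                     # exclude pilot positions
    detected = ((r * np.conj(h_hat)).real < 0).astype(int)
    return np.mean(detected[data] != bits[data])

if __name__ == "__main__":
    for sp in (5, 10, 20):   # wider spacing -> worse channel tracking
        print(f"pilot spacing {sp:2d}: BER = {ber_pilot_assisted(spacing=sp):.4f}")
```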

Performance Improvement of OFDM based DSRC System adopting Selective Pilot Overlay Channel Estimation Scheme (선택형 파일럿 중첩 채널예측기법을 적용한 OFDM 기반 DSRC 시스템의 성능개선)

  • Kwak, Jae-Min
    • Journal of the Korea Institute of Information and Communication Engineering / v.10 no.10 / pp.1863-1868 / 2006
  • In this paper, we propose a communication system model that improves the performance of an OFDM-based DSRC system by adopting a selective pilot overlay channel estimation scheme. Assuming AWGN and fading channel environments, the performance of an OFDM system following the IEEE 802.11p physical layer being standardized for OFDM-based DSRC is obtained, and the performance of the proposed OFDM-based DSRC system adopting the selective pilot overlay channel estimation scheme is compared with that of the conventional system. The simulation results show that the proposed system is superior to the conventional one because it reduces the channel estimation error. (A baseline pilot-based estimation sketch follows this entry.)
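
As a baseline for what any pilot-based scheme must do, the sketch below performs least-squares channel estimation at pilot subcarriers of one OFDM symbol and interpolates across the data subcarriers. The paper's selective pilot overlay scheme is not described in enough detail in the abstract to reproduce, so this is a conventional reference point under assumed parameters only.

```python
# Generic least-squares pilot-based channel estimation for one OFDM symbol:
# estimate the channel at pilot subcarriers, interpolate across data
# subcarriers. A baseline illustration, not the selective overlay scheme.
import numpy as np

rng = np.random.default_rng(2)
N, PILOT_STEP, SNR_DB = 64, 8, 15            # assumed OFDM parameters

# frequency-selective channel: 4-tap random impulse response
h_time = (rng.normal(size=4) + 1j * rng.normal(size=4)) / np.sqrt(8)
H = np.fft.fft(h_time, N)

# QPSK data with known pilots placed every PILOT_STEP subcarriers
X = (1 - 2 * rng.integers(0, 2, N)
     + 1j * (1 - 2 * rng.integers(0, 2, N))) / np.sqrt(2)
pilots = np.arange(0, N, PILOT_STEP)
X[pilots] = 1.0 + 0.0j

sigma = np.sqrt(0.5 / 10 ** (SNR_DB / 10))
Y = H * X + sigma * (rng.normal(size=N) + 1j * rng.normal(size=N))

# LS estimate at pilots, linear interpolation elsewhere
H_ls = Y[pilots] / X[pilots]
k = np.arange(N)
H_hat = np.interp(k, pilots, H_ls.real) + 1j * np.interp(k, pilots, H_ls.imag)

print(f"channel estimation MSE: {np.mean(np.abs(H_hat - H) ** 2):.4f}")
```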

Radar rainfall prediction based on deep learning considering temporal consistency (시간 연속성을 고려한 딥러닝 기반 레이더 강우예측)

  • Shin, Hongjoon; Yoon, Seongsim; Choi, Jaemin
    • Journal of Korea Water Resources Association / v.54 no.5 / pp.301-309 / 2021
  • In this study, we tried to improve the performance of an existing U-Net-based deep learning rainfall prediction model, a structure that can weaken the meaning of time-series order. To this end, a ConvLSTM2D U-Net model that accounts for the temporal consistency of the data was applied, and its accuracy was evaluated against a RainNet model and an extrapolation-based advection model. In addition, to reduce the uncertainty of the training process, learning was performed not only with a single model but also with an ensemble of ten models. The trained neural network rainfall prediction model was optimized to generate 10-minute-ahead predictions from four consecutive frames covering the past 30 minutes. Although the outputs of the deep learning rainfall prediction models show few visually distinct differences, ConvLSTM2D U-Net produced the smallest prediction error and located rainfall relatively accurately. In particular, the ensemble ConvLSTM2D U-Net showed a high CSI, a low MAE, and a narrow error range, predicting rainfall more accurately and more stably than the other models. However, the prediction performance at specific points was very low compared with that over the entire area, a limitation of deep learning rainfall prediction models. This study confirms that a ConvLSTM2D U-Net structure that accounts for temporal change can increase prediction accuracy, but convolutional deep neural network models remain limited by spatial smoothing in strong-rainfall regions and in detailed rainfall prediction. (A compact sketch of such an architecture follows this entry.)
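
A compact Keras sketch of the architecture family named in the abstract, a ConvLSTM2D encoder-decoder with a U-Net-style skip connection mapping four past radar frames to one future field, is shown below. Input size, depths, and filter counts are assumptions; the paper's exact network and ensemble training are not reproduced.

```python
# ConvLSTM2D encoder-decoder in the spirit of a ConvLSTM2D U-Net: ConvLSTM
# layers keep the temporal ordering of radar frames that a plain U-Net would
# treat as stacked channels. Shapes and depths are illustrative assumptions.
import tensorflow as tf
from tensorflow.keras import layers

T, H, W = 4, 128, 128            # four past radar frames (past 30 minutes)

inp = layers.Input(shape=(T, H, W, 1))
# encoder: spatio-temporal features at two resolutions
e1 = layers.ConvLSTM2D(16, 3, padding="same", return_sequences=True)(inp)
p1 = layers.TimeDistributed(layers.MaxPooling2D(2))(e1)
e2 = layers.ConvLSTM2D(32, 3, padding="same", return_sequences=False)(p1)
# decoder: upsample and fuse with the last encoder frame (skip connection)
d1 = layers.UpSampling2D(2)(e2)
skip = layers.Lambda(lambda t: t[:, -1])(e1)          # last time step of e1
d1 = layers.Concatenate()([d1, skip])
d1 = layers.Conv2D(16, 3, padding="same", activation="relu")(d1)
out = layers.Conv2D(1, 1, activation="relu")(d1)      # 10-min-ahead field

model = tf.keras.Model(inp, out)
model.compile(optimizer="adam", loss="mse")
model.summary()
```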

Performance Improvement of IEEE 802.11a WLAN System by Improved Channel Estimation Scheme using Long/Short Training Symbol (Long/Short 훈련심볼을 이용하는 개선된 채널추정기법에 의한 IEEE 802.11a 무선 LAN 시스템의 성능 개선)

  • Kwak, Jae-Min; Jung, Hae-Won; Cho, Sung-Joon; Lee, Hyeong-Ho
    • Journal of Advanced Navigation Technology / v.6 no.3 / pp.203-210 / 2002
  • In this paper, the BER performance of the IEEE 802.11a OFDM WLAN system is obtained by simulation, and it is shown that the proposed modified channel estimation algorithm improves the channel estimation performance of the system. The wireless channel used in the simulation includes AWGN and a delay-spread channel implemented with a TDL model. First, the performance of the OFDM WLAN system is evaluated in an AWGN channel for the data rates and coding rates defined in the standard. Then, imperfect channel estimation in an indoor wireless channel is considered. The performance of the conventional channel estimation scheme, which uses only the two long training symbols, is evaluated, and that of the proposed modified scheme, which additionally uses the eight short training symbols, is compared with it. The simulation results show that the modified channel estimation scheme reduces the channel estimation error and improves channel estimation performance through a noise averaging effect, while keeping the same preamble format as defined in the specification. (The noise-averaging effect is demonstrated numerically after this entry.)

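The noise-averaging effect the abstract credits for the improvement can be checked numerically: averaging the least-squares channel estimates from repeated training symbols reduces the estimation error variance roughly in proportion to the number of symbols averaged. The sketch below demonstrates this on a random channel with assumed parameters; it is not the exact 802.11a long/short-symbol algorithm.

```python
# Noise averaging in preamble-based channel estimation: averaging the LS
# estimates from repeated training symbols shrinks the error variance by
# about the number of symbols averaged. Generic sketch, assumed parameters.
import numpy as np

rng = np.random.default_rng(3)
N, SNR_DB, TRIALS = 64, 10, 2000
sigma = np.sqrt(0.5 / 10 ** (SNR_DB / 10))

def estimation_mse(n_symbols):
    """MSE of an LS channel estimate averaged over repeated training symbols."""
    mse = 0.0
    for _ in range(TRIALS):
        H = (rng.normal(size=N) + 1j * rng.normal(size=N)) / np.sqrt(2)
        X = np.ones(N)                                  # known training symbol
        est = np.zeros(N, complex)
        for _ in range(n_symbols):
            noise = sigma * (rng.normal(size=N) + 1j * rng.normal(size=N))
            est += (H * X + noise) / X                  # per-symbol LS estimate
        est /= n_symbols                                # noise averaging
        mse += np.mean(np.abs(est - H) ** 2)
    return mse / TRIALS

if __name__ == "__main__":
    for n in (2, 10):   # e.g. two long symbols vs. long + short symbols
        print(f"{n:2d} training symbols: MSE = {estimation_mse(n):.5f}")
```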

The Credit Information Feature Selection Method in Default Rate Prediction Model for Individual Businesses (개인사업자 부도율 예측 모델에서 신용정보 특성 선택 방법)

  • Hong, Dongsuk; Baek, Hanjong; Shin, Hyunjoon
    • Journal of the Korea Society for Simulation / v.30 no.1 / pp.75-85 / 2021
  • In this paper, we present a deep neural network-based prediction model that processes and analyzes the corporate and personal credit information of individual business owners as a new method to predict the default rate of individual businesses more accurately. In modeling research across many fields, feature selection techniques have been actively studied as a way to improve performance, especially in predictive models with many features. In this paper, after statistical verification of the macroeconomic indicators (macro variables) and credit information (micro variables) used as input variables in the default rate prediction model, a final feature set that improves prediction performance was additionally identified through the proposed credit information feature selection method. The proposed method is an iterative, hybrid method combining filter-based and wrapper-based approaches: it builds submodels, constructs subsets by extracting the important variables of the best-performing submodels, and determines the final feature set through prediction performance analysis of each subset and of the combined subsets. (A scikit-learn sketch of this loop follows this entry.)
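
The filter-plus-wrapper idea can be sketched with scikit-learn on synthetic data, as below: a mutual-information filter ranks the features, wrapper-style cross-validation scores nested subsets with a submodel, and the important variables of the best submodel form the final set. The paper's credit variables, deep neural network, and exact iteration rules are not reproduced; everything here is an illustrative stand-in.

```python
# Hybrid filter + wrapper feature selection sketch on synthetic data.
# Stand-in for the paper's method: filter ranking -> wrapper scoring of
# nested subsets -> final set from the best submodel's important variables.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import mutual_info_classif
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=600, n_features=30, n_informative=6,
                           random_state=0)

# filter stage: rank all features by mutual information with the label
mi = mutual_info_classif(X, y, random_state=0)
ranked = np.argsort(mi)[::-1]

# wrapper stage: score nested subsets with a submodel, keep the best
best_subset, best_score = None, -np.inf
for k in (5, 10, 15, 20):
    subset = ranked[:k]
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    score = cross_val_score(model, X[:, subset], y, cv=5).mean()
    if score > best_score:
        best_subset, best_score = subset, score

# extract the important variables of the best submodel as the final set
model = RandomForestClassifier(n_estimators=100, random_state=0).fit(
    X[:, best_subset], y)
importance_order = np.argsort(model.feature_importances_)[::-1]
final_set = best_subset[importance_order[:8]]
print(f"best subset size {len(best_subset)}, CV accuracy {best_score:.3f}")
print(f"final feature set: {sorted(final_set.tolist())}")
```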

Simple Recovery Mechanism for Branch Misprediction in Global-History-Based Branch Predictors Allowing the Speculative Update of Branch History (분기 히스토리의 모험적 갱신을 허용하는 전역 히스토리 기반 분기예측기에서 분기예측실패를 위한 간단한 복구 메커니즘)

  • Ko, Kwang-Hyun; Cho, Young-Il
    • Journal of KIISE: Computer Systems and Theory / v.32 no.6 / pp.306-313 / 2005
  • Conditional branch prediction is an important technique for improving processor performance. Branch mispredictions, however, waste a large number of cycles, inhibit out-of-order execution, and waste electric power on mis-speculated instructions. Hence, a branch predictor with higher accuracy is necessary for good processor performance. In global-history-based predictors such as gshare and GAg, many mispredictions come from commit-time update of the history. Previous work on this subject has discussed the need for speculative update of the history and for recovery mechanisms after branch mispredictions. In this paper, we present a simple mechanism for recovering the branch history after a misprediction. The proposed mechanism adds an age_counter to the original predictor and doubles the size of the branch history register. The age_counter counts the number of outstanding branches and is used to recover the branch history register. Simulation results on the SimpleScalar 3.0/PISA tool set and the SPECINT95 benchmarks show that gshare and GAg with the proposed recovery mechanism improved the average prediction accuracy by 2.14% and 9.21%, respectively, and the average IPC by 8.75% and 18.08%, respectively, over the original predictors. (A sketch of the history-recovery bookkeeping follows this entry.)
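
The recovery mechanism the abstract describes, a doubled-width global history register plus an age_counter of outstanding branches, can be captured in a few lines. The sketch below models only the history-register bookkeeping (speculative update, commit, and shift-back recovery) and simplifies away the pattern tables; the widths and the squash-all-younger policy are our assumptions.

```python
# Sketch of the recovery idea for a speculatively updated global history
# register (GHR): keep the GHR twice as wide as the predictor index, count
# outstanding branches with an age_counter, and on a misprediction shift
# the speculative bits back out. Pattern tables are omitted for brevity.

class SpeculativeGHR:
    def __init__(self, index_bits=8):
        self.index_bits = index_bits
        self.width = 2 * index_bits      # doubled history register
        self.ghr = 0
        self.age = 0                     # number of outstanding branches

    def index(self):
        """Low-order history bits used to index the pattern table."""
        return self.ghr & ((1 << self.index_bits) - 1)

    def speculate(self, predicted_taken):
        """Speculative history update at prediction time."""
        self.ghr = ((self.ghr << 1) | predicted_taken) & ((1 << self.width) - 1)
        self.age += 1

    def commit(self):
        """A correctly predicted branch retires."""
        self.age -= 1

    def recover(self, actual_taken):
        """Misprediction: discard all speculative bits, insert the real outcome.
        The doubled width guarantees the older bits are still present."""
        self.ghr >>= self.age            # undo every outstanding speculation
        self.ghr = ((self.ghr << 1) | actual_taken) & ((1 << self.width) - 1)
        self.age = 0                     # younger branches were squashed

if __name__ == "__main__":
    g = SpeculativeGHR(index_bits=4)
    for pred in (1, 1, 0):               # three in-flight predictions
        g.speculate(pred)
    print(f"speculative index: {g.index():04b}, outstanding: {g.age}")
    g.recover(actual_taken=0)            # oldest in-flight branch mispredicted
    print(f"recovered index:   {g.index():04b}, outstanding: {g.age}")
```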