• Title/Summary/Keyword: quantum algorithm


Hybrid Scheduling Algorithm based on DWDRR using Hysteresis for QoS of Combat Management System Resource Control

  • Lee, Gi-Yeop
    • Journal of the Korea Society of Computer and Information / v.25 no.1 / pp.21-27 / 2020
  • In this paper, a hybrid scheduling algorithm is proposed for a CMS (Combat Management System) to improve QoS (Quality of Service), combining DWDRR (Dynamic Weighted Deficit Round Robin) with a priority-based scheduling method. DWDRR, the main proposed scheme, transmits packets by assigning weights according to queue traffic and priority. Simulation shows that the proposed algorithm is efficient only in a particular operating region, so a hybrid of the existing algorithm and the proposed algorithm is proposed, and a hysteresis method is applied to prevent frequent switching between the two schemes. Under the same traffic load, the proposed algorithm shows a lower packet loss rate and delay than the existing algorithm.
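A minimal sketch of the mechanism the abstract describes, assuming a byte-sized packet model; the class names, weight formula, and hysteresis thresholds below are illustrative assumptions, not the paper's parameters.

```python
# DWDRR with backlog/priority-weighted quanta, plus a hysteresis switch between
# a strict-priority mode and the DWDRR mode.
from collections import deque

class DWDRRScheduler:
    def __init__(self, num_queues, base_quantum=1500):
        self.queues = [deque() for _ in range(num_queues)]   # index 0 = highest priority
        self.deficit = [0.0] * num_queues
        self.base_quantum = base_quantum

    def enqueue(self, qid, packet_size):
        self.queues[qid].append(packet_size)

    def _weight(self, qid):
        # Dynamic weight: grows with backlog, scaled down for lower priorities (assumed form).
        backlog = sum(self.queues[qid])
        return (1.0 + backlog / 10_000) / (qid + 1)

    def round_dwdrr(self):
        """One DWDRR round: serve each queue up to its weighted deficit."""
        sent = []
        for qid, q in enumerate(self.queues):
            if not q:
                self.deficit[qid] = 0.0
                continue
            self.deficit[qid] += self.base_quantum * self._weight(qid)
            while q and q[0] <= self.deficit[qid]:
                size = q.popleft()
                self.deficit[qid] -= size
                sent.append((qid, size))
        return sent

    def round_priority(self):
        """Strict priority: always drain the highest-priority non-empty queue first."""
        for qid, q in enumerate(self.queues):
            if q:
                return [(qid, q.popleft())]
        return []

class HybridScheduler(DWDRRScheduler):
    # Hysteresis: switch modes only when total backlog crosses the outer thresholds,
    # so small fluctuations around a single threshold do not cause mode thrashing.
    HIGH, LOW = 50_000, 20_000   # assumed thresholds (bytes)

    def __init__(self, num_queues):
        super().__init__(num_queues)
        self.use_dwdrr = False

    def round(self):
        backlog = sum(sum(q) for q in self.queues)
        if not self.use_dwdrr and backlog > self.HIGH:
            self.use_dwdrr = True
        elif self.use_dwdrr and backlog < self.LOW:
            self.use_dwdrr = False
        return self.round_dwdrr() if self.use_dwdrr else self.round_priority()
```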

Bayesian Multiple Change-Point Estimation for Single Quantum Dot Luminescence Intensity Data (단일 양자점으로부터 발생한 발광세기 변화에 대한 베이지안 다중 변화점 추정)

  • Kim, Jaehee;Kim, Hahkjoon
    • The Korean Journal of Applied Statistics / v.26 no.4 / pp.569-579 / 2013
  • In single-molecule spectroscopy, it is essential to analyze luminescence intensity changes produced by a single molecule. For photon-emission data from a CdSe/ZnS core-shell quantum dot, Bayesian multiple change-point estimation is performed using a gamma prior for the Poisson intensity parameters and a truncated Poisson distribution for the number of change-points.
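As a simplified illustration of the Gamma-Poisson machinery involved (not the paper's truncated-Poisson multiple change-point model), the sketch below computes the posterior over a single change-point location for Poisson counts with conjugate Gamma(a, b) priors on the segment rates; the hyperparameters and synthetic data are assumptions.

```python
import numpy as np
from scipy.special import gammaln

def log_marginal(y, a=1.0, b=1.0):
    """log p(y) for iid Poisson(lambda) counts with lambda ~ Gamma(a, b) (closed form)."""
    n, s = len(y), np.sum(y)
    return (a * np.log(b) - gammaln(a)
            + gammaln(a + s) - (a + s) * np.log(b + n)
            - np.sum(gammaln(np.asarray(y) + 1)))

def changepoint_posterior(y):
    """Posterior over the location of one change-point (uniform prior on location)."""
    y = np.asarray(y)
    logp = np.array([log_marginal(y[:k]) + log_marginal(y[k:])
                     for k in range(1, len(y))])
    logp -= logp.max()
    p = np.exp(logp)
    return p / p.sum()

# Example: photon counts whose rate jumps from ~2 to ~8 halfway through.
rng = np.random.default_rng(0)
y = np.concatenate([rng.poisson(2, 100), rng.poisson(8, 100)])
post = changepoint_posterior(y)
print("MAP change-point index:", post.argmax() + 1)
```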

INVESTIGATION OF REACTOR CONDITION MONITORING AND SINGULARITY DETECTION VIA WAVELET TRANSFORM AND DE-NOISING

  • Kim, Ok-Joo;Cho, Nan-Zin;Park, Chang-Je;Park, Moon-Ghu
    • Nuclear Engineering and Technology / v.39 no.3 / pp.221-230 / 2007
  • Wavelet theory was applied to detect singularities in a reactor power signal. Compared with the Fourier transform, the wavelet transform has localization properties in both space and frequency; therefore, after de-noising, singular points can easily be found using the wavelet transform. To test this approach, reactor power signals were generated using the HANARO (a Korean multi-purpose research reactor) dynamics model, which consists of 39 nonlinear differential equations, and were contaminated with Gaussian noise. Wavelet-transform decomposition and de-noising procedures were applied to these signals. It was possible to detect singular events such as a sudden reactivity change and abrupt changes in intrinsic properties. Thus, this method could be profitably utilized in a real-time system for automatic event recognition (e.g., reactor condition monitoring).
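A rough sketch of the detection idea, assuming PyWavelets is available: soft-threshold de-noising followed by a search for large fine-scale detail coefficients, which mark localized singular events. The wavelet choice, thresholds, and synthetic test signal are assumptions, not the paper's settings.

```python
import numpy as np
import pywt

def denoise(signal, wavelet="db4", level=4):
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745          # noise level estimate
    thr = sigma * np.sqrt(2.0 * np.log(len(signal)))        # universal threshold
    coeffs = [coeffs[0]] + [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(signal)]

def singular_points(signal, wavelet="db4", rel_thr=5.0):
    clean = denoise(signal)
    d1 = pywt.wavedec(clean, wavelet, level=1)[1]           # finest detail band
    flags = np.abs(d1) > rel_thr * (np.abs(d1).mean() + 1e-12)
    return np.flatnonzero(flags) * 2                        # approximate sample index (dyadic)

# Example: a smooth signal with an abrupt step near sample 600, plus Gaussian noise.
rng = np.random.default_rng(1)
t = np.linspace(0, 1, 1024)
x = np.sin(2 * np.pi * 3 * t) + (t > 0.586) * 0.8 + rng.normal(0, 0.05, t.size)
print(singular_points(x))
```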

Power-Based Side Channel Attack and Countermeasure on the Post-Quantum Cryptography NTRU (양자내성암호 NTRU에 대한 전력 부채널 공격 및 대응방안)

  • Jang, Jaewon;Ha, Jaecheol
    • Journal of the Korea Institute of Information Security & Cryptology / v.32 no.6 / pp.1059-1068 / 2022
  • The post-quantum cryptographic algorithm NTRU, which is designed with the computational power of quantum computers in mind, satisfies the required mathematical security level. However, its hardware implementation must also consider side-channel attacks such as power analysis. In this paper, we verify that the private key can be recovered by analyzing the power signal generated during the decryption process of NTRU. Simple Power Analysis (SPA), Correlation Power Analysis (CPA), and Differential Deep Learning Analysis (DDLA) were all applicable for recovering the private key. Shuffling is a basic countermeasure against such power side-channel attacks; nevertheless, we propose a more effective method. The proposed method prevents CPA and DDLA attacks by eliminating the power leakage of the multiplication operations: each coefficient is accumulated first and only additions are performed afterward, rather than accumulating after a multiplication at each index.
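To illustrate the general idea of replacing per-index multiplications with additions, the sketch below contrasts a naive multiply-accumulate convolution with an accumulate-then-add variant, assuming a ternary private key f with coefficients in {-1, 0, 1} and convolution in Z_q[x]/(x^n - 1). This is only an illustration of the principle; it is not claimed to match the paper's exact construction or parameters.

```python
import numpy as np

def conv_naive(c, f, q):
    """Baseline: multiply-then-accumulate at every index (leaks multiplication power)."""
    n = len(c)
    r = np.zeros(n, dtype=np.int64)
    for i in range(n):
        for j in range(n):
            r[(i + j) % n] += c[i] * f[j]        # per-index multiplication
    return r % q

def conv_accumulate_then_add(c, f, q):
    """Accumulate rotated ciphertexts per key sign, then combine with additions only."""
    n = len(c)
    plus = np.zeros(n, dtype=np.int64)
    minus = np.zeros(n, dtype=np.int64)
    for j in range(n):
        rot = np.roll(c, j)                      # c(x) * x^j mod (x^n - 1)
        if f[j] == 1:
            plus += rot                          # no c*f multiplications anywhere
        elif f[j] == -1:
            minus += rot
    return (plus - minus) % q

# Quick self-check on random inputs (toy sizes, not real NTRU parameters).
rng = np.random.default_rng(2)
n, q = 16, 2048
c = rng.integers(0, q, n)
f = rng.integers(-1, 2, n)
assert np.array_equal(conv_naive(c, f, q), conv_accumulate_then_add(c, f, q))
```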

STUDY OF CORE SUPPORT BARREL VIBRATION MONITORING USING EX-CORE NEUTRON NOISE ANALYSIS AND FUZZY LOGIC ALGORITHM

  • CHRISTIAN, ROBBY;SONG, SEON HO;KANG, HYUN GOOK
    • Nuclear Engineering and Technology / v.47 no.2 / pp.165-175 / 2015
  • The application of neutron noise analysis (NNA) to ex-core neutron detector signals for monitoring the vibration characteristics of a reactor core support barrel (CSB) was investigated. Ex-core flux data were generated using a nonanalog Monte Carlo neutron transport method in a simulated CSB model in which the implicit capture and Russian roulette techniques were utilized. First- and third-order beam and shell modes of CSB vibration were modeled based on parallel processing simulation. An NNA module was developed to analyze the ex-core flux data in terms of its time variation, normalized power spectral density, normalized cross-power spectral density, coherence, and phase differences. The data were then analyzed with a fuzzy logic module to determine the vibration characteristics. The ex-core neutron signal fluctuation was directly proportional to the CSB's vibration, observed at 8 Hz and 15 Hz in the beam-mode vibration and at 8 Hz in the shell-mode vibration. The coherence between flux pairs was unity at the vibration peak frequencies. A distinct pattern of phase differences was observed for each of the vibration models. The developed fuzzy logic module successfully recognized the vibration frequencies, modes, orders, directions, and phase differences within 0.4 ms for both the beam- and shell-mode vibrations.
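A sketch of the signal-analysis step named above (not the authors' NNA module): normalized power spectral density, coherence, and phase difference between a pair of detector signals, using standard Welch-type estimators from SciPy. The sampling rate, segment length, and synthetic signals are assumptions.

```python
import numpy as np
from scipy import signal

def nna_pair(x, y, fs):
    f, pxx = signal.welch(x, fs=fs, nperseg=1024)
    _, pxy = signal.csd(x, y, fs=fs, nperseg=1024)
    _, coh = signal.coherence(x, y, fs=fs, nperseg=1024)
    npsd_x = pxx / np.trapz(pxx, f)            # normalized PSD
    phase = np.angle(pxy, deg=True)            # phase difference (degrees)
    return f, npsd_x, coh, phase

# Example: two synthetic detector signals sharing an 8 Hz vibration component,
# 180 degrees out of phase (as for opposite detectors in a beam-mode vibration).
fs = 200.0
t = np.arange(0, 60, 1 / fs)
rng = np.random.default_rng(3)
x = np.sin(2 * np.pi * 8 * t) + 0.5 * rng.standard_normal(t.size)
y = -np.sin(2 * np.pi * 8 * t) + 0.5 * rng.standard_normal(t.size)
f, npsd, coh, phase = nna_pair(x, y, fs)
peak = np.argmax(npsd)
print(f"peak at {f[peak]:.1f} Hz, coherence {coh[peak]:.2f}, phase {phase[peak]:.0f} deg")
```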

A Theoretical Representation of Relaxation Processes in Complex Spin System Using Liouville Space Method

  • Kyunglae Park
    • Bulletin of the Korean Chemical Society / v.14 no.1 / pp.21-29 / 1993
  • For the study of relaxation processes in complex spin systems, a general master equation, which can be used to simulate a vast range of pulse experiments, has been formulated using the Liouville representation of quantum mechanics. The state of a nonequilibrium spin system in a magnetic field is described by a density vector in Liouville space, and the time evolution of the system is followed by applying a linear master operator to the density vector in this space. In this master equation the nuclear spin relaxation due to intramolecular dipolar interaction or randomly fluctuating field interaction is explicitly implemented as a relaxation supermatrix for a strongly coupled two-spin-1/2 system. All of the dynamic information inherent in the spin system is thus contained in the density vector and the master operator. Radiofrequency pulses are applied in the same space through the corresponding unitary rotational supertransformations of the density vector. If the resulting FID is analytically Fourier transformed, the final nonstationary spectrum can be represented by a frequency-dependent spectral vector and an intensity-determining shape vector. The overall algorithm, including the relaxation interactions, is then translated into an ANSI FORTRAN computer program that can simulate a variety of two-dimensional spectra. Furthermore, a new strategy is tested by simulating multiple-quantum signals to differentiate the two types of relaxation interaction.
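For orientation, a generic Liouville-space master equation of the kind described above is written below in our own notation (the paper's specific operators and conventions may differ): the density vector evolves under a coherent Liouvillian plus a relaxation supermatrix, and an RF pulse acts as a unitary supertransformation.

```latex
% |rho>> : density vector in Liouville space;  L = [H, .]/\hbar : coherent Liouvillian;
% R : relaxation supermatrix;  |rho_eq>> : equilibrium density vector (with [H, rho_eq] = 0);
% a pulse with spin-space propagator U acts as the superoperator U \otimes U^*.
\begin{align}
  \frac{d}{dt}\,|\rho(t)\rangle\!\rangle
    &= -\bigl(i\hat{\hat{L}} + \hat{\hat{R}}\bigr)
       \bigl(|\rho(t)\rangle\!\rangle - |\rho_{\mathrm{eq}}\rangle\!\rangle\bigr), \\
  |\rho(t)\rangle\!\rangle
    &= |\rho_{\mathrm{eq}}\rangle\!\rangle
       + e^{-(i\hat{\hat{L}} + \hat{\hat{R}})\,t}
         \bigl(|\rho(0)\rangle\!\rangle - |\rho_{\mathrm{eq}}\rangle\!\rangle\bigr), \\
  |\rho\rangle\!\rangle \;\xrightarrow{\ \text{pulse}\ }\;
    \bigl(U \otimes U^{*}\bigr)\,|\rho\rangle\!\rangle .
\end{align}
```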

Development and validation of a fast sub-channel code for LWR multi-physics analyses

  • Chaudri, Khurrum Saleem;Kim, Jaeha;Kim, Yonghee
    • Nuclear Engineering and Technology / v.51 no.5 / pp.1218-1230 / 2019
  • A sub-channel solver, named Steady and Transient Analyzer for Reactor Thermal hydraulics (START), has been developed using the homogeneous model for the two-phase conditions of light water reactors. The code is developed as a fast and accurate thermal-hydraulic solver for coupled and multi-physics calculations. START has been validated against the NUPEC PWR Sub-channel and Bundle Test (PSBT) database. Steady-state single-channel quality and void fraction, steady-state outlet fluid temperature, and rod-bundle quality and void fraction under both steady-state and transient conditions have been analyzed and compared with experimental values. The results show good accuracy for both steady-state and transient scenarios. Axially varying values of the turbulent mixing coefficient are used depending on the grid-spacer type, which gives better results than a single value of the coefficient. A code-to-code evaluation of the PSBT results shows that START compares well with other industrial codes. The START code has been parallelized with the OpenMP algorithm, and its numerical performance is evaluated on a large whole PWR core; a scaling study shows good parallel performance.
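A small illustration of the homogeneous (no-slip) two-phase relation the abstract refers to: the vapor void fraction obtained from flow quality when both phases move with the same velocity. The property values below are rough assumptions for PWR-like pressure, not values from the paper.

```python
def homogeneous_void_fraction(quality, rho_liquid, rho_vapor):
    """Void fraction alpha for slip ratio S = 1 (homogeneous two-phase model)."""
    if quality <= 0.0:
        return 0.0
    return 1.0 / (1.0 + (1.0 - quality) / quality * (rho_vapor / rho_liquid))

# Example near 15.5 MPa: rho_f ~ 594 kg/m^3, rho_g ~ 102 kg/m^3 (approximate values).
for x in (0.01, 0.05, 0.10, 0.20):
    print(f"x = {x:.2f} -> alpha = {homogeneous_void_fraction(x, 594.0, 102.0):.3f}")
```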

Development of machine learning model for automatic ELM-burst detection without hyperparameter adjustment in KSTAR tokamak

  • Jiheon Song;Semin Joung;Young-Chul Ghim;Sang-hee Hahn;Juhyeok Jang;Jungpyo Lee
    • Nuclear Engineering and Technology / v.55 no.1 / pp.100-108 / 2023
  • In this study, a neural network model inspired by a one-dimensional convolutional U-Net is developed to automatically accelerate edge localized mode (ELM) detection from the big diagnostic data of fusion devices and to increase detection accuracy regardless of the hyperparameter setting. The model recognizes patterns in the input signal and overcomes the problems of existing detection algorithms, such as the prominence algorithm and differential methods, which are highly sensitive to the threshold and signal intensity. To train the model, 10 sets of discharge radiation data from KSTAR are used and sliced into 11091 inputs of length 12 ms, of which 20% are used for validation. According to the receiver operating characteristic curves, the model achieves a positive prediction rate and a true prediction rate of approximately 90% each, which is comparable to the best detection performance of other algorithms using their optimized hyperparameters. The accurate and automatic ELM-burst detection provided by the model can be beneficial for determining plasma properties, such as the ELM frequency, from big data measured in multiple experiments on machines such as KSTAR and ITER. It is also applicable to feature detection in time-series data from other engineering fields.
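A minimal 1-D U-Net-style sketch in PyTorch of the kind of architecture the abstract describes, with a single encoder/decoder level; the channel counts, kernel sizes, sampling rate, and per-sample segmentation-style output are assumptions, not the authors' model.

```python
import torch
import torch.nn as nn

class TinyUNet1D(nn.Module):
    def __init__(self, in_ch=1, base_ch=16):
        super().__init__()
        self.enc = nn.Sequential(
            nn.Conv1d(in_ch, base_ch, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(base_ch, base_ch, kernel_size=7, padding=3), nn.ReLU())
        self.down = nn.MaxPool1d(2)
        self.mid = nn.Sequential(
            nn.Conv1d(base_ch, base_ch * 2, kernel_size=7, padding=3), nn.ReLU())
        self.up = nn.ConvTranspose1d(base_ch * 2, base_ch, kernel_size=2, stride=2)
        self.dec = nn.Sequential(
            nn.Conv1d(base_ch * 2, base_ch, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(base_ch, 1, kernel_size=1))           # per-sample ELM-burst logit

    def forward(self, x):                                   # x: (batch, 1, time)
        e = self.enc(x)
        m = self.mid(self.down(e))
        u = self.up(m)
        return self.dec(torch.cat([u, e], dim=1))           # skip connection

# Shape check on a dummy 12 ms slice, assuming a 100 kHz sampling rate (1200 samples).
model = TinyUNet1D()
logits = model(torch.randn(4, 1, 1200))
print(logits.shape)      # torch.Size([4, 1, 1200])
```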

FPGA-Based Post-Quantum Cryptography Hardware Accelerator Design using High Level Synthesis (HLS 를 이용한 FPGA 기반 양자내성암호 하드웨어 가속기 설계)

  • Haesung Jung;Hanyoung Lee;Hanho Lee
    • Transactions on Semiconductor Engineering / v.1 no.1 / pp.1-8 / 2023
  • This paper presents the design and implementation of CRYSTALS-Kyber, a next-generation post-quantum cryptographic algorithm, as a hardware accelerator on an FPGA using High-Level Synthesis (HLS). We optimized the CRYSTALS-Kyber algorithm using various directives provided by Vitis HLS, configured the AXI interface, and designed a hardware accelerator that can be implemented on an FPGA. We then used the Vivado tool to package the design as an IP block and implement it on the ZYNQ ZCU106 FPGA. Finally, video was recorded and H.264-compressed with Python code in the PYNQ framework, and video encryption and decryption were accelerated using the CRYSTALS-Kyber hardware accelerator implemented on the FPGA.
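A hypothetical host-side sketch of how an HLS-generated accelerator can be driven from the PYNQ framework, as the abstract describes. The bitstream name, IP instance name, AXI-Lite register offsets, and buffer sizes are placeholders assumed for illustration, not values from the paper.

```python
import numpy as np
from pynq import Overlay, allocate

overlay = Overlay("kyber_accel.bit")          # assumed bitstream file name
accel = overlay.kyber_0                       # assumed HLS IP instance name

# Buffers shared with the accelerator over AXI (sizes are illustrative).
msg = allocate(shape=(32,), dtype=np.uint8)   # one 32-byte message block
ct = allocate(shape=(1088,), dtype=np.uint8)  # ciphertext buffer
msg[:] = np.arange(32, dtype=np.uint8)        # dummy data standing in for video bytes
msg.sync_to_device()

# Typical HLS AXI-Lite control flow: write buffer addresses, then set ap_start.
accel.write(0x10, msg.device_address)         # assumed offset of the input pointer
accel.write(0x18, ct.device_address)          # assumed offset of the output pointer
accel.write(0x00, 1)                          # ap_start
while (accel.read(0x00) & 0x2) == 0:          # poll ap_done
    pass
ct.sync_from_device()
print(bytes(ct[:16]).hex())
```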

A methodology to quantify effects of constitutive equations on safety analysis using integral effect test data

  • ChoHwan Oh;Jeong Ik Lee
    • Nuclear Engineering and Technology / v.56 no.8 / pp.2999-3029 / 2024
  • Improving the predictive capability of a nuclear thermal-hydraulic safety analysis code by developing better constitutive equations for individual phenomena has been the general research direction until now. This paper proposes a new method that directly uses complex experimental data obtained from an integral effect test (IET) to improve the constitutive models holistically and simultaneously. The method relies on the sensitivity of a simulation of the IET data to the multiple constitutive equations used during the simulation, and the sensitivity of each individual model determines the direction in which that constitutive model is modified. To develop a robust and generalized method, a clustering algorithm based on an artificial neural network, sample-space size determination using non-parametric statistics, and Latin hypercube sampling are used in combination. The value of the proposed methodology is demonstrated by applying it to the ATLAS DSP-05 IET experiment, and the sensitivity of each observation parameter to the constitutive models is analyzed. The new methodology can be used to improve code predictions of complex IET data by identifying the direction in which the constitutive equations should be modified.
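A sketch of one ingredient named above (not the authors' full methodology): Latin hypercube sampling of multiplicative perturbation factors applied to a set of constitutive models, as one might do before a sensitivity study. The model names and factor ranges are illustrative assumptions.

```python
import numpy as np
from scipy.stats import qmc

models = ["interfacial_friction", "wall_heat_transfer", "interfacial_heat_transfer"]
lower, upper = np.array([0.5, 0.5, 0.5]), np.array([2.0, 2.0, 2.0])   # assumed ranges

sampler = qmc.LatinHypercube(d=len(models), seed=42)
unit_samples = sampler.random(n=100)                      # 100 points in [0, 1)^3
factors = qmc.scale(unit_samples, lower, upper)           # map to multiplier ranges

# Each row is one code run: multiply each constitutive correlation by its factor,
# simulate the IET, and record the observation parameters for sensitivity analysis.
for k, row in enumerate(factors[:3]):
    print(f"case {k}: " + ", ".join(f"{m}={v:.2f}" for m, v in zip(models, row)))
```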