• Title/Summary/Keyword: 입력설계기법 (input design technique)


Enhanced Polynomial Selection Method for GNFS (GNFS를 위한 향상된 다항식 선택 기법)

  • Kim, Suhri;Kwon, Jihoon;Cho, Sungmin;Chang, Nam Su;Yoon, Kisoon;Han, Chang;Park, Young-Ho;Hong, Seokhie
    • Journal of the Korea Institute of Information Security & Cryptology / v.26 no.5 / pp.1121-1130 / 2016
  • The RSA cryptosystem is one of the most widely used public-key cryptosystems. Its security rests on the hardness of factoring large integers, so attempts to factor RSA moduli are ongoing. The General Number Field Sieve (GNFS) is currently the fastest known method for factoring large numbers; CADO-NFS, the well-known public implementation that was used to factor RSA-704, is also based on GNFS. One drawback, however, is that CADO-NFS does not always select the optimal polynomial for the given parameters. In this paper, we analyze the polynomial selection stage of CADO-NFS and propose a modified selection method that uses the Chinese Remainder Theorem and the Euclidean distance. The modified method always selects a polynomial at least as good as the one chosen by the original CADO-NFS and is expected to be useful for factoring RSA-1024.
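As a rough illustration of the polynomial selection problem the abstract refers to, the sketch below generates a degree-d GNFS polynomial by the classical base-m expansion and scores it with the Euclidean norm of its coefficient vector. It is a minimal sketch only: the paper's CRT-based candidate construction and CADO-NFS's actual (Kleinjung-style) search are far more elaborate, and the modulus, degree, and scoring shown are illustrative assumptions.

```python
# Minimal sketch: base-m polynomial generation for GNFS and a Euclidean-norm score.
# Illustration only; the paper's CRT-based search and CADO-NFS's Kleinjung
# algorithm are considerably more involved.

def base_m_polynomial(n: int, degree: int):
    """Pick m ~ n^(1/degree) and expand n in base m, so that
    sum(c[i] * m**i) == n and f(x) = sum(c[i] * x**i) has root m mod n."""
    m = round(n ** (1.0 / degree))
    coeffs, rem = [0] * (degree + 1), n
    for i in range(degree, -1, -1):
        coeffs[i] = rem // m**i
        rem -= coeffs[i] * m**i
    assert sum(c * m**i for i, c in enumerate(coeffs)) == n
    return coeffs, m

def euclidean_score(coeffs):
    """Smaller L2 norm of the coefficient vector ~ smaller polynomial values."""
    return sum(c * c for c in coeffs) ** 0.5

if __name__ == "__main__":
    n = 1000000016000000063        # toy modulus: (10**9 + 7) * (10**9 + 9)
    f, m = base_m_polynomial(n, degree=4)
    print("m =", m, "coeffs =", f, "score =", euclidean_score(f))
```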

An Approach to Constructing an Efficient Entropy Source on Multicore Processor (멀티코어 환경에서 효율적인 엔트로피 원의 설계 기법)

  • Kim, SeongGyeom;Lee, SeungJoon;Kang, HyungChul;Hong, Deukjo;Sung, Jaechul;Hong, Seokhie
    • Journal of the Korea Institute of Information Security & Cryptology / v.28 no.1 / pp.61-71 / 2018
  • In the Internet of Things, where a great many devices connect to one another, cryptographically secure random number generators (RNGs) are essential. In particular, the entropy source, the only non-deterministic part of random number generation, must be equipped with one or more unpredictable noise sources to reach the required security strength, which may in turn require additional hardware to extract the noise. Although dedicated hardware performs better, making the best use of existing resources avoids extra costs such as area and power consumption. In this paper, we propose an entropy source that uses a multi-threaded program and no additional hardware, which makes it easier to implement on lightweight, low-power devices. According to NIST's entropy estimation test suite, the proposed entropy source is secure enough to serve as a source of entropy input.
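A minimal sketch of the general idea of a software-only entropy source on a multicore machine is shown below: worker threads perform unsynchronized busy work, and the scheduling/contention jitter observed by the main thread supplies raw noise bits. This is not the paper's construction; the thread count, sampling method, and bit extraction are assumptions, and any real deployment would still need conditioning and an SP 800-90B-style entropy assessment.

```python
# Minimal sketch: harvesting timing jitter from competing threads as raw noise.
# Not the paper's construction; output must still be assessed and conditioned
# before any cryptographic use.
import threading
import time

def _spin(counter, stop_event):
    # Unsynchronized busy work; scheduling and cache contention perturb timing.
    while not stop_event.is_set():
        counter[0] += 1

def collect_noise_bits(n_bits: int, n_threads: int = 4) -> int:
    counter, stop = [0], threading.Event()
    workers = [threading.Thread(target=_spin, args=(counter, stop)) for _ in range(n_threads)]
    for w in workers:
        w.start()
    bits = 0
    for _ in range(n_bits):
        t0 = time.perf_counter_ns()
        time.sleep(0)                                      # yield to the scheduler
        jitter = time.perf_counter_ns() - t0
        bits = (bits << 1) | ((jitter ^ counter[0]) & 1)   # keep only the LSB
    stop.set()
    for w in workers:
        w.join()
    return bits

if __name__ == "__main__":
    print(hex(collect_noise_bits(256)))
```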

An Efficient Code Expansion from EM to SPARC Code (EM에서 SPARC 코드로 효율적인 코드 확장)

  • Oh, Se-Man;Yun, Young-Shick
    • The Transactions of the Korea Information Processing Society / v.4 no.10 / pp.2596-2604 / 1997
  • ACK has two kinds of backends: code generators (full-fledged backends) and code expanders (fast backends). Code generators produce target code using string pattern matching, while code expanders produce it by macro expansion. ACK translates EM to SPARC code with a code expander: the SPARC code sequence corresponding to each EM instruction is generated and then push-pop optimization is performed. However, this leaves the problem of maintaining a hybrid stack, and the code expander does not pass procedure-call parameters through register windows. The purpose of this paper is to improve SPARC code quality. We suggest a method of SPARC code generation using an EM tree, divided into two phases: an EM tree building phase that creates the tree, and a code expansion phase that translates it into SPARC code. The EM tree is designed so that procedure-call parameters are passed through register windows, and additional information is extracted from the EM code to eliminate the hybrid stack. The method removes many of the disadvantages of the ACK code expander.
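To make the contrast concrete, here is a toy sketch (not ACK itself) that expands a tiny EM-like stack program into SPARC-style register instructions by keeping operands in virtual registers instead of pushing and popping them through a memory stack; register windows, spilling, and real instruction selection are all omitted.

```python
# Toy sketch (not ACK): expanding an "EM-like" stack sequence into SPARC-style
# instructions via a value stack of register names, avoiding memory push/pop.
EM_PROGRAM = [("LOC", 3), ("LOC", 4), ("ADD",), ("LOC", 5), ("ADD",)]  # (3+4)+5

def expand_with_registers(program):
    code, stack, next_reg = [], [], 0
    for op in program:
        if op[0] == "LOC":                      # push a constant
            reg = f"%l{next_reg}"; next_reg += 1
            code.append(f"mov {op[1]}, {reg}")
            stack.append(reg)
        elif op[0] == "ADD":                    # pop two operands, push the sum
            rhs, lhs = stack.pop(), stack.pop()
            code.append(f"add {lhs}, {rhs}, {lhs}")
            stack.append(lhs)
    return code

if __name__ == "__main__":
    print("\n".join(expand_with_registers(EM_PROGRAM)))
```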


Replacement Condition Detection of Railway Point Machines Using Data Cube and SVM (데이터 큐브 모델과 SVM을 이용한 철도 선로전환기의 교체시기 탐지)

  • Choi, Yongju;Oh, Jeeyoung;Park, Daihee;Chung, Yongwha;Kim, Hee-Young
    • Smart Media Journal / v.6 no.2 / pp.33-41 / 2017
  • Railway point machines are actuators that switch trains between routes by driving the switchblades from the current position to the opposite one. Since point failures caused by aging can significantly disrupt railway operations, with potentially disastrous consequences, detecting when a point machine needs replacement is critical. In this paper, we propose a replacement-condition detection method for point machines in railway condition monitoring systems based on electrical current signals. Domestic in-field replacement data are first analyzed and relabeled into "does-not-need-to-be-replaced" and "needs-to-be-replaced" classes by means of OLAP (On-Line Analytical Processing) operations on a multidimensional data cube. The system extracts feature vectors from the incoming current signals with the DWT (Discrete Wavelet Transform), reduces their dimensionality with PCA (Principal Component Analysis), and employs an SVM (Support Vector Machine) for real-time replacement detection. Experimental results with in-field replacement data, including point anomalies, show that the system detects the replacement conditions of railway point machines with accuracy exceeding 98%.
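The sketch below mirrors the feature pipeline named in the abstract (DWT features, PCA reduction, SVM classification) on synthetic stand-in data; the wavelet, statistics, dimensions, and data are assumptions for illustration, not the paper's exact configuration.

```python
# Minimal sketch of the described pipeline: DWT features -> PCA -> SVM.
import numpy as np
import pywt
from sklearn.decomposition import PCA
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def dwt_features(signal: np.ndarray, wavelet: str = "db4", level: int = 4) -> np.ndarray:
    """Concatenate simple statistics of each DWT sub-band into one feature vector."""
    coeffs = pywt.wavedec(signal, wavelet, level=level)
    return np.hstack([[c.mean(), c.std(), np.abs(c).max()] for c in coeffs])

# Synthetic stand-in data: rows = electrical current traces;
# labels 0/1 = "does-not-need-to-be-replaced" / "needs-to-be-replaced".
rng = np.random.default_rng(0)
signals = rng.normal(size=(200, 1024))
labels = rng.integers(0, 2, size=200)

X = np.vstack([dwt_features(s) for s in signals])
model = make_pipeline(StandardScaler(), PCA(n_components=5), SVC(kernel="rbf"))
model.fit(X, labels)
print("training accuracy:", model.score(X, labels))
```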

Calculation of Probabilistic Damage Stability Based on Grid Model (격자모델을 이용한 확률론적 손상복원력 계산의 전산화)

  • Jong-Ho Nam;Won-Don Kim;Kwang-Wook Kim
    • Journal of the Society of Naval Architects of Korea / v.31 no.1 / pp.14-21 / 1994
  • Studies on the stability of damaged ships have been carried out continuously to prevent the frequent damage and sinkings that cause large losses of life and property. For dry cargo ships, continuing losses have led to new probabilistic damage stability legislation: IMO has developed subdivision and damage stability requirements for dry cargo ships based on probabilistic concepts. Calculating probabilistic damage stability is a complicated and iterative job, so the development of computer programs is indispensable. In this research, the probabilistic damage stability calculation according to the new requirements was programmed and the results were compared with those of foreign packages. A new algorithm using a grid model of the transverse section was introduced to reduce the effort of preparing input data for damage scenarios, bringing a significant improvement in efficiency and performance.
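For readers unfamiliar with the probabilistic framework, the attained subdivision index is the sum over damage cases of the damage probability p times the survival probability s. The sketch below only shows that bookkeeping with made-up numbers; computing p and s from the hull and grid model is the actual substance of the paper.

```python
# Minimal sketch of the probabilistic damage stability bookkeeping:
# attained index A = sum over damage cases of p_i * s_i.  The p and s values
# below are invented for illustration; deriving them is the hard part.
from dataclasses import dataclass

@dataclass
class DamageCase:
    name: str
    p: float   # probability of this damage extent
    s: float   # probability of surviving it (0..1)

def attained_index(cases) -> float:
    return sum(c.p * c.s for c in cases)

cases = [
    DamageCase("fore hold", 0.20, 1.00),
    DamageCase("midship hold", 0.35, 0.80),
    DamageCase("engine room", 0.25, 0.60),
    DamageCase("aft peak", 0.10, 1.00),
]
print(f"A = {attained_index(cases):.3f}")   # compare against the required index R
```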


Numerical Web Model for Quality Management of Concrete based on Compressive Strength (압축강도 기반의 콘크리트 품질관리를 위한 웹 전산모델 개발)

  • Lee, Goon-Jae;Kim, Hak-Young;Lee, Hye-Jin;Hwang, Seung-Hyeon;Yang, Keun-Hyeok
    • Journal of the Korea Institute of Building Construction / v.21 no.3 / pp.195-202 / 2021
  • Concrete quality is mainly managed through reliable prediction and control of compressive strength. Although the related industries have built datasets of mixture proportions and compressive strength development, these have not been shared, for reasons including fear of technology leakage, so the cost and effort of quality control have been excessive. This study aimed to develop a web-based numerical model that presents the user with various optimal values, including predicted concrete strength, and to establish a sustainable database (DB) collection system by channeling user-entered data into the DB. The system covers the overall technology related to concrete; in particular, it predicts compressive strength with a mean accuracy of 89.2% using an artificial neural network model trained on extensive DBs.
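A hedged sketch of the strength-prediction component is given below: a small neural-network regressor mapping assumed mixture-proportion features to compressive strength on synthetic data. The feature set, network size, and data are placeholders, not the paper's model or DB.

```python
# Sketch only: neural-network regression of compressive strength from mixture
# proportions, on toy data.  The paper's model is trained on an extensive DB.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# assumed columns: cement, water, fine aggregate, coarse aggregate, admixture (kg/m3)
rng = np.random.default_rng(1)
X = rng.uniform([250, 140, 700, 900, 2], [450, 200, 900, 1100, 8], size=(300, 5))
w_c = X[:, 1] / X[:, 0]                       # water-cement ratio
y = 70 - 60 * w_c + rng.normal(0, 2, 300)     # toy strength relation (MPa)

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(32, 16), max_iter=2000, random_state=0))
model.fit(X, y)
print("predicted strength (MPa):", model.predict([[380, 170, 800, 1000, 4]])[0])
```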

RGB-Depth Camera for Dynamic Measurement of Liquid Sloshing (RGB-Depth 카메라를 활용한 유체 표면의 거동 계측분석)

  • Kim, Junhee;Yoo, Sae-Woung;Min, Kyung-Won
    • Journal of the Computational Structural Engineering Institute of Korea / v.32 no.1 / pp.29-35 / 2019
  • In this paper, a low-cost dynamic measurement system using an RGB-depth camera, the Microsoft Kinect v2, is proposed for measuring the time-varying free-surface motion of liquid dampers used to mitigate building vibration. A series of experimental studies is conducted: performance evaluation and validation of the Kinect v2, real-time monitoring using the Kinect v2 SDK (software development kit), point cloud acquisition of the liquid free surface in 3D space, and comparison with existing video sensing technology. Using the proposed Kinect v2-based measurement system, the dynamic behavior of liquid in a laboratory-scale small tank is experimentally analyzed over a wide frequency range of input excitation.
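The point-cloud step can be illustrated with the standard pinhole back-projection of a depth frame; the sketch below uses placeholder intrinsics rather than calibrated Kinect v2 values, and a synthetic flat depth map in place of a measured liquid surface.

```python
# Minimal sketch of point-cloud generation from a depth frame (pinhole model):
# each pixel (u, v) with depth z maps to (x, y, z) in camera coordinates.
import numpy as np

def depth_to_point_cloud(depth_m: np.ndarray, fx: float, fy: float,
                         cx: float, cy: float) -> np.ndarray:
    """depth_m: HxW depths in metres; returns Nx3 points, invalid pixels dropped."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]

depth = np.full((424, 512), 0.8)          # synthetic flat "surface" 0.8 m away
cloud = depth_to_point_cloud(depth, fx=365.0, fy=365.0, cx=256.0, cy=212.0)
print(cloud.shape)
```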

A Low Power Voltage Controlled Oscillator with Bandwidth Extension Scheme (대역폭 증가 기법을 사용한 저전력 전압 제어 발진기)

  • Lee, Won-Young;Lee, Gye-Min
    • The Journal of the Korea Institute of Electronic Communication Sciences / v.16 no.1 / pp.69-74 / 2021
  • This paper introduces a low-power voltage-controlled oscillator (VCO) with filters consisting of resistors and capacitors. The proposed VCO contains a 5-stage current-mode buffer, and each buffer cell has a resistor-capacitor filter connecting its input and output terminals. The filter adds a zero to the buffer cell; because the zero moves the oscillation condition to higher frequencies, the proposed VCO can generate a high-frequency clock with low power consumption. The circuit was designed in a 0.18 ㎛ CMOS process and consumes 9.83 mW at 2.7 GHz. In our simulation study the proposed VCO achieves 3.64 pJ/Hz, whereas the conventional circuit shows 4.79 pJ/Hz, a 24% reduction in power consumption.
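Two of the reported numbers can be checked directly: the energy per cycle (power divided by oscillation frequency, which is what pJ/Hz amounts to here) and the zero frequency 1/(2πRC) introduced by the filter. The R and C values in the sketch are assumed for illustration; only the 9.83 mW and 2.7 GHz figures come from the abstract.

```python
# Quick check of the reported figures and the RC-zero formula.
from math import pi

power_w = 9.83e-3          # 9.83 mW, from the abstract
freq_hz = 2.7e9            # 2.7 GHz, from the abstract
energy_per_cycle = power_w / freq_hz
print(f"{energy_per_cycle * 1e12:.2f} pJ per cycle")   # ~3.64 pJ, matching the paper

R, C = 2e3, 30e-15         # assumed 2 kOhm and 30 fF, illustrative only
f_zero = 1 / (2 * pi * R * C)
print(f"zero at {f_zero / 1e9:.2f} GHz")
```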

An Efficient LWE-Based Reusable Fuzzy Extractor (효율적인 LWE 기반 재사용 가능한 퍼지 추출기)

  • Kim, Juon;Lee, Kwangsu;Lee, Dong Hoon
    • Journal of the Korea Institute of Information Security & Cryptology / v.32 no.5 / pp.779-790 / 2022
  • A fuzzy extractor is a biometric cryptographic primitive that generates keys from biometric data, whose readings are never exactly the same because of noise, and performs authentication securely without exposing the biometric information. However, if a user registers biometric data on multiple servers, various attacks on the helper data, the public information used to extract the key during authentication, can expose the key. Many studies have therefore been conducted on reusable fuzzy extractors that remain secure when the same person's biometric data is registered on multiple servers. However, in the schemes presented so far the number of key-recovery operations grows with the key length, which makes them inefficient and hard to use in security systems. In this paper, we design an efficient reusable fuzzy extractor based on LWE whose number of authentication operations stays the same or nearly the same as the key length increases, and we show that the proposed algorithm satisfies the reusable security notion defined by Apon et al. [5].
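To show the Gen/Rep interface a fuzzy extractor exposes, the sketch below uses a plain code-offset construction with a repetition code. It is explicitly not the paper's LWE-based scheme, and this basic construction is not reusable-secure, which is exactly the gap reusable fuzzy extractors address.

```python
# Illustrative Gen/Rep sketch with a code-offset construction (repetition code).
# NOT the paper's LWE scheme and NOT reusable-secure; it only shows how helper
# data lets a noisy reading w' reproduce the key derived at enrolment.
import hashlib
import secrets

REP = 5  # each key bit repeated 5 times; corrects up to 2 flips per group

def _encode(bits):            # repetition-code encoder
    return [b for b in bits for _ in range(REP)]

def _decode(bits):            # majority-vote decoder
    return [int(sum(bits[i:i + REP]) > REP // 2) for i in range(0, len(bits), REP)]

def gen(w):
    """Enrolment: returns (key, helper).  w is a list of biometric bits."""
    k = [secrets.randbelow(2) for _ in range(len(w) // REP)]
    helper = [wi ^ ci for wi, ci in zip(w, _encode(k))]
    return hashlib.sha256(bytes(k)).hexdigest(), helper

def rep(w_noisy, helper):
    """Reproduction: recovers the key from a noisy reading close to w."""
    k = _decode([wi ^ hi for wi, hi in zip(w_noisy, helper)])
    return hashlib.sha256(bytes(k)).hexdigest()

w = [secrets.randbelow(2) for _ in range(100)]
key, helper = gen(w)
w_noisy = w.copy(); w_noisy[3] ^= 1; w_noisy[42] ^= 1   # two bit errors
print(rep(w_noisy, helper) == key)                      # True
```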

Hair Classification and Region Segmentation by Location Distribution and Graph Cutting (위치 분포 및 그래프 절단에 의한 모발 분류와 영역 분할)

  • Kim, Yong-Gil;Moon, Kyung-Il
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.22 no.3 / pp.1-8 / 2022
  • Recently, Google MediaPipe presented a neural-network approach to hair segmentation from a single camera input, designed specifically for real-time mobile applications. Although the hair segmentation network is relatively small, it produces a high-quality hair mask that is well suited to AR effects such as realistic hair recoloring. However, it can produce undesirable segmentations for some hair styles or when the image contains noise and holes. In this study, an energy function for the test image is constructed from an estimated prior distribution of hair location and a hair-color likelihood function; it is optimized with the graph cuts algorithm to obtain an initial hair region. Finally, a clustering algorithm and image post-processing techniques are applied to the initial hair region so that the final hair region can be segmented precisely. The proposed method is applied to the MediaPipe hair segmentation pipeline.
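As a stand-in for the graph-cut energy minimization step, the sketch below calls OpenCV's GrabCut (an iterative graph-cut segmentation) with a crude rectangular hair-location prior; the image path, rectangle, and iteration count are assumptions, and the paper's location prior and color likelihood model are not reproduced.

```python
# Hedged sketch: GrabCut as a stand-in for the graph-cut optimization above.
import cv2
import numpy as np

image = cv2.imread("face.jpg")                 # assumed input image path
mask = np.zeros(image.shape[:2], np.uint8)
bgd_model = np.zeros((1, 65), np.float64)
fgd_model = np.zeros((1, 65), np.float64)

# Rough prior: assume hair lies in the upper part of the frame.
h, w = image.shape[:2]
rect = (int(0.1 * w), 0, int(0.8 * w), int(0.5 * h))   # (x, y, width, height)

cv2.grabCut(image, mask, rect, bgd_model, fgd_model, 5, cv2.GC_INIT_WITH_RECT)
hair = np.where((mask == cv2.GC_FGD) | (mask == cv2.GC_PR_FGD), 255, 0).astype(np.uint8)
cv2.imwrite("hair_mask.png", hair)
```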