• Title/Summary/Keyword: Consensus algorithms

A Study on Efficient Data De-Identification Method for Blockchain DID

  • Min, Youn-A
    • International Journal of Internet, Broadcasting and Communication / v.13 no.2 / pp.60-66 / 2021
  • Blockchain is a technology that enables trust-based consensus and verification on a decentralized network. Distributed ID (DID) is based on a decentralized structure in which users have the right to manage their own IDs, and interest in self-sovereign identity authentication has recently been increasing. In this paper, as a method for transparent and safe management of data sovereignty, various approaches to data encryption among the pseudonymization techniques used with blockchains are examined. The public-key technique (homomorphic encryption) offers high flexibility and security because different algorithms are applied to the entire message for encryption and decryption, but its computational efficiency suffers as a result. The hash-function method (MD5) maintains flexibility and is more secure than two-way encryption, but it is exposed to collision attacks. Zero-knowledge proof is a mutual-proof method built on public-key encryption; complex formulas are applied in processes such as personal identification, key distribution, and digital signatures, and the required consensus and verification steps lower its operational efficiency to between O(log N) and O(N²). In this paper, a data-encryption scheme for blockchain DID based on zero-knowledge proof is proposed, together with a one-way encryption method that considers the range and frequency of data use. Based on the content presented, a modified zero-knowledge proof can be processed and data can be handled efficiently.
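
The one-way (hash-based) pseudonymization the abstract discusses can be illustrated with a minimal, hypothetical sketch: a salted one-way digest, using SHA-256 rather than the collision-prone MD5 the paper cites. The DID string and salt below are invented for illustration, not taken from the paper:

```python
import hashlib

def pseudonymize(value: str, salt: str) -> str:
    """One-way pseudonymization: salted SHA-256 digest.

    SHA-256 is used instead of MD5 because MD5 has known collision
    attacks, the threat the abstract mentions. The digest cannot be
    inverted, but the same (value, salt) pair always maps to the
    same pseudonym, so records stay linkable within one service.
    """
    return hashlib.sha256((salt + value).encode("utf-8")).hexdigest()

# Hypothetical DID and per-service salt; different salts give
# unlinkable pseudonyms for the same identity.
pid_a = pseudonymize("did:example:123456", salt="service-A")
pid_b = pseudonymize("did:example:123456", salt="service-B")
```

Using a distinct salt per data-use context is one simple way to respect the "range and frequency of use" idea: the same DID yields unrelated pseudonyms in different contexts.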

An Overview of Blockchain Technology: Concepts, Consensus, Standardization, and Security Threats (블록체인 기술 동향에 관한 연구)

  • Park, Roy C.;Lee, Young Sil
    • Journal of the Institute of Convergence Signal Processing / v.20 no.4 / pp.218-225 / 2019
  • Since the publication of Satoshi Nakamoto's white paper on Bitcoin in 2008, blockchain has been in the spotlight as one of the core technologies of the Fourth Industrial Revolution, with potential uses in various industries beyond simple cryptocurrency. Research and development efforts to utilize blockchain technology are being conducted worldwide, and global blockchain consortia have been formed. In addition, attempts are being made to apply it not only in the financial sector but also in industries such as logistics, distribution, and medical care. However, blockchain technology has not yet reached the level these expectations demand. In this paper, we present a comprehensive overview of blockchain technology, briefly covering its concepts, consensus algorithms, standardization, and security threats.

Reliability-Based Adaptive Consensus Algorithm for Synchronization in a Distributed Network (분산 네트워크에서 단말 간 동기화를 위한 신뢰도 기반의 적응적 컨센서스 알고리즘)

  • Seo, Sangah;Yun, Sangseok;Ha, Jeongseok
    • The Journal of Korean Institute of Communications and Information Sciences / v.42 no.3 / pp.545-553 / 2017
  • This paper investigates a synchronization algorithm for a distributed network that lacks centralized infrastructure. To operate a distributed network, synchronization across the distributed terminals must first be acquired, and hence many distributed synchronization algorithms have been studied in the past. However, most previous studies consider synchronization only in fault-free networks, so when some terminals malfunction, conventional distributed synchronization methods cannot guarantee synchronization. In this paper, we propose a reliability-based adaptive consensus algorithm that effectively acquires synchronization across distributed terminals, and we confirm its performance through numerical simulations.
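
The reliability-weighted consensus idea described above can be sketched as a toy (this is an illustrative model, not the paper's algorithm): each terminal repeatedly moves its clock estimate toward a reliability-weighted average of the others, so a malfunctioning terminal that has been assigned a near-zero weight barely disturbs the honest ones:

```python
def consensus_step(values, weights, step=0.5):
    """One synchronous consensus iteration: node i moves a fraction
    `step` of the way toward the reliability-weighted average of the
    other nodes. `weights[j]` is an (externally estimated) reliability
    score for node j; all names and values here are illustrative."""
    n = len(values)
    new = []
    for i in range(n):
        total_w = sum(weights[j] for j in range(n) if j != i)
        avg = sum(weights[j] * values[j] for j in range(n) if j != i) / total_w
        new.append(values[i] + step * (avg - values[i]))
    return new

# Three honest terminals with clock offsets near 10, and one faulty
# terminal (index 3) whose value is wildly off; its reliability
# weight has been driven close to zero.
vals = [10.0, 10.4, 9.8, 250.0]
w = [1.0, 1.0, 1.0, 0.01]
for _ in range(50):
    vals = consensus_step(vals, w)
# All terminals converge to a common value close to the honest average.
```

With uniform weights the faulty terminal would drag the consensus value far from 10; down-weighting it keeps the agreed value near the honest terminals' offsets, which is the intuition behind reliability-based adaptation.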

Fully Automatic Segmentation of Acute Ischemic Lesions on Diffusion-Weighted Imaging Using Convolutional Neural Networks: Comparison with Conventional Algorithms

  • Ilsang Woo;Areum Lee;Seung Chai Jung;Hyunna Lee;Namkug Kim;Se Jin Cho;Donghyun Kim;Jungbin Lee;Leonard Sunwoo;Dong-Wha Kang
    • Korean Journal of Radiology / v.20 no.8 / pp.1275-1284 / 2019
  • Objective: To develop algorithms using convolutional neural networks (CNNs) for automatic segmentation of acute ischemic lesions on diffusion-weighted imaging (DWI) and to compare them with conventional algorithms, including thresholding-based segmentation. Materials and Methods: Between September 2005 and August 2015, 429 patients presenting with acute cerebral ischemia (training:validation:test set = 246:89:94) were retrospectively enrolled in this study, which was performed under Institutional Review Board approval. Ground-truth segmentations of acute ischemic lesions on DWI were manually drawn under the consensus of two expert radiologists. The CNN algorithms were developed using a two-dimensional U-Net with squeeze-and-excitation blocks (U-Net) and a DenseNet with squeeze-and-excitation blocks (DenseNet), and were compared with conventional algorithms based on DWI and apparent diffusion coefficient (ADC) signal intensity. Performance was assessed using the Dice index with 5-fold cross-validation, and Dice indices were analyzed according to infarct volume (< 10 mL, ≥ 10 mL), number of infarcts (≤ 5, 6-10, ≥ 11), b = 1000 (b1000) signal intensity (< 50, 50-100, > 100), time interval to DWI, and DWI protocol. Results: The CNN algorithms were significantly superior to the conventional algorithms (p < 0.001). Dice indices were 0.85 for U-Net and DenseNet and 0.86 for an ensemble of the two, versus 0.58 for ADC-b1000 and b1000-ADC and 0.52 for the commercial ADC algorithm. The Dice indices for small and large lesions, respectively, were 0.81 and 0.88 with U-Net, 0.80 and 0.88 with DenseNet, and 0.82 and 0.89 with the ensemble. The CNN algorithms showed significant differences in Dice indices according to infarct volume (p < 0.001). Conclusion: The CNN algorithms for automatic segmentation of acute ischemic lesions on DWI achieved Dice indices of 0.85 or greater and outperformed the conventional algorithms.
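
The Dice index used to score the segmentations is simple to state: twice the overlap divided by the total size of the two masks. A minimal sketch over voxel-coordinate sets (the coordinates are toy data, not the study's images):

```python
def dice_index(pred, truth):
    """Dice similarity coefficient: 2|A ∩ B| / (|A| + |B|), computed
    over sets of voxel coordinates. 1.0 means perfect overlap,
    0.0 means no overlap."""
    pred, truth = set(pred), set(truth)
    if not pred and not truth:
        # Convention: two empty masks agree perfectly.
        return 1.0
    return 2 * len(pred & truth) / (len(pred) + len(truth))

# Toy example: the predicted lesion mask shares 3 voxels with a
# 4-voxel ground-truth mask and adds one false positive.
pred = [(0, 0), (0, 1), (1, 0), (2, 2)]
truth = [(0, 0), (0, 1), (1, 0), (1, 1)]
score = dice_index(pred, truth)  # 2*3 / (4+4) = 0.75
```

In practice the masks are binary 3-D arrays rather than coordinate lists, but the formula is the same.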

Blockchain Technology for Combating Deepfake and Protect Video/Image Integrity

  • Rashid, Md Mamunur;Lee, Suk-Hwan;Kwon, Ki-Ryong
    • Journal of Korea Multimedia Society / v.24 no.8 / pp.1044-1058 / 2021
  • Tampered electronic content has multiplied in the last few years, thanks to the emergence of sophisticated artificial intelligence (AI) algorithms. Deepfakes (fake footage, photos, speech, and videos) are a frightening and destructive phenomenon with the capacity to distort facts and damage reputations by presenting a fake reality. Evidence of ownership or authentication of digital material is crucial for combating the influx of fabricated content we face today, yet current solutions lack the capacity to track the history and provenance of digital media. Because of the rise of misrepresentation created by technologies like deepfakes, detection algorithms are required to verify the integrity of digital content. Many real-world scenarios have been claimed to benefit from blockchain's authentication capabilities, but despite the scattered efforts surrounding such remedies, relatively little research has explored where blockchain technology can be used to tackle the deepfake problem. Recent blockchain-based innovations such as smart contracts and Hyperledger Fabric can play a vital role against the manipulation of digital content. The goal of this paper is to summarize and discuss ongoing research on blockchain's capabilities to protect digital-content authentication. We also suggest a blockchain (smart contract)-based framework that can preserve the integrity of original content and thus prevent deepfakes. This study further discusses how blockchain technology can be used more effectively in deepfake prevention and highlights the current state of deepfake video-detection research, including the generation process, various detection algorithms, and existing benchmarks.

Past, Present and Future of Blockchain Technology (블록체인 세대별 기술 동향)

  • Park, J.S.;Park, J.Y.;Choi, S.M.;Oh, J.T.;Kim, K.Y.
    • Electronics and Telecommunications Trends / v.33 no.6 / pp.139-153 / 2018
  • The explosive interest in blockchain triggered by Bitcoin in 2009 is leading to substantial investment in and development of blockchain technology. There is no dispute among experts that blockchain will be the next generation of innovation. However, despite the high expectations, the technology still has limitations. In addition to improving issues such as low transaction throughput, inefficient consensus algorithms, and inflexible governance structures, various problems must be solved for commercialization and full-scale adoption, owing to the trilemma among scalability, security, and decentralization. In this situation, identifying the characteristics of the technology by generation helps establish the core technical requirements and a commercialization blueprint when setting an R&D direction. Therefore, in this article, the development of blockchain technology is divided into generations and analyzed in terms of operational structure, consensus algorithm, governance, scalability, and security.

Selection of Fusion Level for Adolescent Idiopathic Scoliosis Surgery : Selective Fusion versus Postoperative Decompensation

  • Kim, Do-Hyoung;Hyun, Seung-Jae;Kim, Ki-Jeong
    • Journal of Korean Neurosurgical Society / v.64 no.4 / pp.473-485 / 2021
  • Adolescent idiopathic scoliosis (AIS), which is associated with an extensive range of clinical and radiological presentations, is one of the most challenging spinal disorders. The goals of surgery are to correct the deformity in three dimensions and to preserve motion segments while avoiding complications. Despite the ongoing evolution of classification systems and algorithms for the surgical treatment of AIS, there has been considerable debate regarding the selection of an appropriate fusion level in AIS. In addition, there is no consensus regarding the exact description, relationship, and risk factors of coronal decompensation following selective fusion. In this review, we summarize the current concepts of selection of the fusion level for AIS and review the available information about postoperative coronal decompensation.

QSPR analysis for predicting heat of sublimation of organic compounds (유기화합물의 승화열 예측을 위한 QSPR분석)

  • Park, Yu Sun;Lee, Jong Hyuk;Park, Han Woong;Lee, Sung Kwang
    • Analytical Science and Technology / v.28 no.3 / pp.187-195 / 2015
  • The heat of sublimation (HOS) is an essential parameter for resolving environmental problems involving the transfer of organic contaminants to the atmosphere and for assessing the risk of toxic chemicals. Its experimental measurement is time-consuming, expensive, and complicated. In this study, quantitative structure-property relationships (QSPR) were used to develop a simple, predictive model of the heat of sublimation of organic compounds. A population-based forward selection method was applied to select an informative subset of descriptors for the learning algorithms, namely multiple linear regression (MLR) and the support vector machine (SVM). Each individual model and the consensus model were evaluated by internal validation using the bootstrap method and y-randomization, and predictive performance on the external test set was improved by considering the applicability domain. Based on the MLR model, we showed that the heat of sublimation is related to dispersion, hydrogen bonding, electrostatic forces, and dipole-dipole interactions between molecules.
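
The individual-model/consensus-model setup can be sketched in miniature (the descriptor and HOS values below are invented for illustration; the paper's actual models use many selected descriptors and an SVM as well). Two one-descriptor linear models are fit independently and their predictions averaged:

```python
def fit_slr(xs, ys):
    """Closed-form simple linear regression y ≈ a + b*x.
    Stands in for the paper's multi-descriptor MLR model."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    a = my - b * mx
    return lambda x: a + b * x

# Hypothetical training data: HOS (target) against two descriptors.
desc1 = [1.0, 2.0, 3.0, 4.0]      # e.g. a dispersion-related descriptor
desc2 = [0.5, 1.1, 1.4, 2.2]      # e.g. an H-bond-related descriptor
hos   = [20.0, 30.0, 41.0, 52.0]  # target values, purely illustrative

m1, m2 = fit_slr(desc1, hos), fit_slr(desc2, hos)

def consensus(x1, x2):
    """Consensus model: average the individual models' predictions."""
    return 0.5 * (m1(x1) + m2(x2))
```

Averaging several reasonable models tends to cancel their individual errors, which is the rationale for consensus modeling in QSPR work.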

Feature Matching Algorithm Robust To Viewpoint Change (시점 변화에 강인한 특징점 정합 기법)

  • Jung, Hyun-jo;Yoo, Ji-sang
    • The Journal of Korean Institute of Communications and Information Sciences / v.40 no.12 / pp.2363-2371 / 2015
  • In this paper, we propose a new feature-matching algorithm that is robust to viewpoint changes, using the FAST (Features from Accelerated Segment Test) detector and the SIFT (Scale-Invariant Feature Transform) descriptor. The original FAST algorithm produces many unnecessary feature points along image edges; to remove them, we filter the detected points using their principal curvatures. The remaining feature points are described with the SIFT descriptor, and the homography matrix between two images taken from different viewpoints is estimated with RANSAC (RANdom SAmple Consensus) from the matching pairs. To make the matching robust to viewpoint change, each pair is classified by the Euclidean distance between the homography-transformed coordinates of the feature point in the reference image and the coordinates of the corresponding feature point in the other viewpoint image. Experimental results show that the proposed algorithm outperforms conventional feature-matching algorithms while requiring much less computation.
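
RANSAC, the consensus step of the pipeline above, is easiest to see on a toy model: fit a model to a random minimal sample, count the points that agree within a tolerance (the consensus set), and keep the best model found. The sketch below fits a 2-D line instead of a homography, since the idea is identical; the data are invented:

```python
import random

def ransac_line(points, iters=200, tol=0.5, seed=0):
    """Minimal RANSAC sketch with a line model (a homography would
    need a 4-point minimal sample instead of 2): repeatedly fit to a
    random minimal sample and keep the model whose consensus set of
    inliers is largest."""
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        if x1 == x2:
            continue  # vertical sample; skip this minimal fit
        b = (y2 - y1) / (x2 - x1)
        a = y1 - b * x1
        inliers = [(x, y) for x, y in points if abs(y - (a + b * x)) < tol]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = (a, b), inliers
    return best_model, best_inliers

# Eight correct matches lying on y = 2x + 1 plus two gross outliers
# (standing in for mismatched feature pairs).
pts = [(float(x), 2.0 * x + 1.0) for x in range(8)] + [(1.0, 40.0), (5.0, -30.0)]
(a, b), inliers = ransac_line(pts)  # recovers a = 1, b = 2; 8 inliers
```

The same inlier/outlier split is what the paper uses to classify matching pairs: pairs whose reprojection error under the estimated homography is small are kept as correct matches.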

Validation of housekeeping genes as candidate internal references for quantitative expression studies in healthy and nervous necrosis virus-infected seven-band grouper (Hyporthodus septemfasciatus)

  • Krishnan, Rahul;Qadiri, Syed Shariq Nazir;Kim, Jong-Oh;Kim, Jae-Ok;Oh, Myung-Joo
    • Fisheries and Aquatic Sciences / v.22 no.12 / pp.28.1-28.8 / 2019
  • Background: In the present study, we evaluated four commonly used housekeeping genes, viz., actin-β, elongation factor-1α (EF1α), acidic ribosomal protein (ARP), and glyceraldehyde 3-phosphate dehydrogenase (GAPDH), as internal references for quantitative analysis of immune genes in nervous necrosis virus (NNV)-infected seven-band grouper, Hyporthodus septemfasciatus. Methods: Expression profiles of the four genes were estimated in 12 tissues of healthy and infected seven-band grouper. Expression stability was calculated using the delta Ct method and the BestKeeper, NormFinder, and geNorm algorithms; consensus ranking was performed with RefFinder, and statistical analysis was done in GraphPad Prism 5.0. Results: Tissue-specific variations were observed for the four tested housekeeping genes in healthy and NNV-infected seven-band grouper. Fold-change calculations for interferon-1 and Mx expression using the four housekeeping genes as internal references gave different profiles for each tissue. EF1α and actin-β were the most stably expressed genes in the tissues of healthy and NNV-infected fish, respectively, and consensus ranking with RefFinder identified EF1α as the least variable and most stable gene in both healthy and infected animals. Conclusions: These results suggest that EF1α is a better internal reference than the other genes tested during the NNV infection process. This is a pilot study on the validation of reference genes in Hyporthodus septemfasciatus in the context of NNV infection.
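
The comparative delta Ct stability ranking mentioned above can be sketched as follows (gene names match the abstract, but the Ct values are invented): for each candidate gene, take the standard deviation of its Ct difference against every other gene across samples, then average those standard deviations; the most stable gene has the lowest score:

```python
from statistics import mean, stdev

def delta_ct_stability(ct):
    """Comparative delta-Ct method (sketch): a gene whose Ct values
    move in lockstep with the other genes across samples has small,
    stable pairwise Ct differences, hence a low score."""
    scores = {}
    for g in ct:
        sds = []
        for h in ct:
            if h == g:
                continue
            diffs = [a - b for a, b in zip(ct[g], ct[h])]
            sds.append(stdev(diffs))
        scores[g] = mean(sds)
    return scores

# Hypothetical Ct values over four samples (illustrative only).
ct = {
    "EF1a":  [18.0, 18.1, 17.9, 18.2],   # tight spread
    "actb":  [17.0, 17.6, 16.8, 17.9],
    "GAPDH": [20.0, 22.5, 19.0, 23.0],   # large swings → unstable
}
scores = delta_ct_stability(ct)  # GAPDH gets the worst (highest) score
```

BestKeeper, NormFinder, and geNorm use different statistics, and RefFinder aggregates the four rankings; this sketch shows only the delta Ct component.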