• Title/Summary/Keyword: Verification overhead

Verification Algorithm for the Duplicate Verification Data with Multiple Verifiers and Multiple Verification Challenges

  • Xu, Guangwei; Lai, Miaolin; Feng, Xiangyang; Huang, Qiubo; Luo, Xin; Li, Li; Li, Shan
    • KSII Transactions on Internet and Information Systems (TIIS), v.15 no.2, pp.558-579, 2021
  • Cloud storage provides flexible data storage services that let data owners outsource their data remotely and reduce their storage operation and management costs. These outsourced data raise security concerns for the data owner, since the cloud service provider may maliciously delete or corrupt them. Data integrity verification is an important way to check the integrity of outsourced data. However, existing verification schemes only consider the case where a single verifier launches multiple verification challenges, and neglect the verification overhead when multiple verifiers launch challenges at around the same time. In that case, the duplicate data in multiple challenges are verified repeatedly, so verification resources are wasted. We propose a duplicate data verification algorithm based on multiple verifiers and multiple challenges to reduce this overhead. The algorithm dynamically schedules the verifiers' challenges according to verification time, extracts frequent itemsets of duplicate verification data from the challenge sets with the FP-Growth algorithm, and computes batch proofs for the frequent itemsets. Each challenge is then split into two parts, duplicate data and unique data, according to the extraction results, and the proofs of the duplicate and unique data are computed and combined into a complete proof for every original challenge (a sketch of this splitting step follows below). Theoretical analysis and experimental evaluation show that the algorithm reduces the verification cost and preserves the correctness of data integrity verification through flexible batch verification.
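A minimal Python sketch of the challenge-splitting idea described above, assuming each challenge is a set of block indices; the names (`split_challenges`, `prove`) are illustrative, and a simple duplicate count stands in for the paper's FP-Growth frequent-itemset mining:

```python
from collections import Counter

def split_challenges(challenges):
    """Split each verifier's challenge into blocks shared with other
    challenges (proved once, in batch) and blocks unique to that verifier.

    challenges: dict mapping verifier id -> set of challenged block indices.
    """
    counts = Counter(b for blocks in challenges.values() for b in blocks)
    duplicates = {b for b, c in counts.items() if c > 1}   # challenged by more than one verifier
    split = {v: {"dup": blocks & duplicates, "unique": blocks - duplicates}
             for v, blocks in challenges.items()}
    return duplicates, split

def prove(blocks):
    """Placeholder for the real integrity-proof computation over stored blocks."""
    return frozenset(blocks)

challenges = {"V1": {1, 2, 3, 7}, "V2": {2, 3, 8}, "V3": {3, 9}}
dup_blocks, split = split_challenges(challenges)
batch_proof = prove(dup_blocks)                            # computed once, reused by all verifiers
proofs = {v: (batch_proof, prove(s["unique"])) for v, s in split.items()}
print(sorted(dup_blocks), proofs["V1"])
```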

A New Certificate Validation Scheme for Delegating the Digital Signature Verification (디지틀 서명 검증을 위임하기 위한 새로운 인증서 검증 기법)

  • Choi Yeon-Hee; Park Mi-Og; Jun Moon-Seog
    • Journal of Internet Computing and Services, v.4 no.4, pp.53-64, 2003
  • Performing certificate validation in the user-side application imposes considerable overhead on the user's system because validation processing is complex and time-consuming. Most of the time spent on validation goes to digital signature verification, since it requires a cryptographic computation for each certificate on the certification path. In this paper, we propose a new certificate validation scheme using DSVP (Delegated Signature Validation Protocol), which reduces the overhead of user-side certificate validation by delegating digital signature verification to the CAs of the PKI domain. Because DSVP is a protocol performed between a user and CAs, it applies efficiently to a hierarchical PKI and delegates signature verification reliably and safely. As a result, the proposed scheme not only reduces validation overhead by decreasing the cryptographic computation on the user side, but also improves the utilization of CAs by involving them in validation processing (a toy sketch of the delegation flow follows below).
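A toy Python sketch of the delegation idea only, using HMAC stand-ins for real signatures; the class and message names (`DelegatingCA`, `validate_path`) are invented for illustration and do not reflect the actual DSVP message formats:

```python
import hashlib, hmac, os

# Toy "signature" primitives (HMAC stand-ins) just to show the message flow.
def sign(key, data):
    return hmac.new(key, data, hashlib.sha256).digest()

def verify(key, data, sig):
    return hmac.compare_digest(sign(key, data), sig)

class DelegatingCA:
    """Verifies every signature on the certification path on the user's behalf
    and returns a single signed validation response."""
    def __init__(self):
        self.key = os.urandom(32)

    def validate_path(self, path):
        ok = all(verify(issuer_key, cert, sig) for cert, sig, issuer_key in path)
        response = b"VALID" if ok else b"INVALID"
        return response, sign(self.key, response)

# User side: instead of one verification per path certificate, check one response.
ca = DelegatingCA()
root_key, inter_key = os.urandom(32), os.urandom(32)
inter_cert, leaf_cert = b"intermediate-CA-cert", b"end-entity-cert"
path = [(inter_cert, sign(root_key, inter_cert), root_key),
        (leaf_cert,  sign(inter_key, leaf_cert),  inter_key)]
response, resp_sig = ca.validate_path(path)
assert verify(ca.key, response, resp_sig) and response == b"VALID"
```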

A Fast and Exact Verification of Inter-Domain Data Transfer based on PKI

  • Jung, Im-Y.; Eom, Hyeon-Sang; Yeom, Heon-Y.
    • Journal of Information Technology Applications and Management, v.18 no.3, pp.61-72, 2011
  • Trust in data created, processed, and transferred in e-Science environments can be estimated with provenance. The information that forms provenance, which records how data was created and reached its current state, grows as the data evolves, so tracing and verifying this massive provenance in order to trust the data is a heavy burden. How to trust the verification itself is a further issue. This paper proposes a fast and exact verification of inter-domain data transfer and data origin for e-Science environments based on PKI. The verification, called two-way verification, cuts down the overhead of tracking data along the causality represented in the Open Provenance Model by exploiting the domain structure of e-Science environments supported by the Grid Security Infrastructure (GSI). The proposed scheme is easy to apply without extra infrastructure, scales irrespective of the number of provenance records, and is transparent, low-overhead, and cryptographically secure.

A Secure Digital Signature Delegation Scheme using CAs (CA를 이용한 안전한 서명 검증 위임 기법)

  • Choi Yeon-Hee; Park Mi-Og; Jun Moon-Seog
    • Journal of the Korea Institute of Information Security & Cryptology, v.13 no.6, pp.55-65, 2003
  • Performing certificate validation processing in the user-side application induces very considerable overhead because validation processing is complex and time-consuming. In particular, verifying the digital signature over each certificate is the main source of this overhead, since verification requires a cryptographic computation for every certificate on the certificate path. In this paper, we propose a new certificate validation scheme that reduces the overhead of user-side certificate validation processing and improves the utilization of CAs. As a result, the proposed scheme not only reduces the validation overhead by decreasing the cryptographic computation but also improves the utilization of CAs by involving them in the validation processing.

Indirect Branch Target Address Verification for Defense against Return-Oriented Programming Attacks (Return-Oriented Programming 공격 방어를 위한 간접 분기 목적 주소 검증 기법)

  • Park, Soohyun; Kim, Sunil
    • KIPS Transactions on Computer and Communication Systems, v.2 no.5, pp.217-222, 2013
  • Return-Oriented Programming (ROP) is an advanced code-reuse attack, like a return-to-libc attack. ROP attacks chain gadgets found in the program code area to build functionality equivalent to a Turing-complete language. Some previous defenses against ROP attacks incur high performance overhead because of dynamic execution-flow analysis, and can defend against only certain types of ROP attacks. In this paper, we propose Indirect Branch Target Address Verification (IBTAV). IBTAV detects ROP attacks by checking whether the target addresses of indirect branches are valid. Because it verifies the target address of every indirect branch instruction, IBTAV can defend against almost all ROP attacks, and since it does not require dynamic execution-flow analysis, its performance overhead is relatively low. Our evaluation of IBTAV on SPEC CPU 2006 shows less than 15% performance overhead (a simplified sketch of the target-address check follows below).
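A simplified Python simulation of the kind of check IBTAV performs, assuming a precomputed set of legitimate targets (function entries for indirect calls/jumps, addresses just after call sites for returns); the addresses and trace format here are hypothetical, not the paper's mechanism:

```python
# Hypothetical whitelist of legitimate indirect-branch targets.
valid_entries = {0x400100, 0x400200, 0x400300}   # function entry points
valid_returns = {0x400157, 0x40025C}             # addresses immediately after call sites

def check_indirect_branch(kind, target):
    """Return True if the indirect branch goes to a legitimate target."""
    if kind in ("call", "jmp"):
        return target in valid_entries
    if kind == "ret":
        return target in valid_returns
    raise ValueError(f"unknown branch kind: {kind}")

# A toy trace: the last return jumps into the middle of a function (a gadget).
trace = [("call", 0x400200), ("ret", 0x40025C), ("ret", 0x400223)]
for kind, target in trace:
    if not check_indirect_branch(kind, target):
        print(f"possible ROP gadget chain: {kind} to {hex(target)}")
```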

An Analysis on Electrical Property Measurement of Catenary System in Railway (철도 전차선로 전기적 특성 검측 기술 분석)

  • Park, Young; Cho, Yong-Hyeon; Jung, Ho-Sung; Lee, Ki-Won; Gwon, Sam-Yeong
    • Proceedings of the Korean Institute of Electrical and Electronic Material Engineers Conference, 2010.06a, pp.115-115, 2010
  • This paper introduces a measurement system that measures the behavior and electrical characteristics of irregular sections of overhead contact line in real time. For verification, we developed a prototype of the real-time measurement system for irregular-section behavior together with a monitoring system for field tests. The current and temperature of contact wires and messenger wires were measured in real time by applying the system on a commercial KTX line. The developed system therefore makes such data acquisition possible, and because it measures the catenary current, one of the fundamental and key factors, it should be applicable to various areas such as characterizing overhead contact lines for design, speed enhancement, and energy improvement.

The Low-Area of New arc-tangent Look-up Table and A Low-Overhead for CATV Modem Systems

  • Ban, Young-Hoon; Park, Jong-Woo; Cho, Byung-Lok; Song, Jai-Chul
    • Proceedings of the IEEK Conference, 1999.06a, pp.857-860, 1999
  • Storing a burst in memory enables low-overhead QPSK demodulation that removes the preamble otherwise needed for carrier recovery and symbol-timing recovery, and the resulting synchronization performance also improves frame efficiency. In this paper, we propose a new arc-tangent look-up table algorithm that converts the input I and Q data into phase. These I/Q data play an important role in demodulation, and storing the burst in memory keeps the demodulator's overhead low. To evaluate the proposed algorithm and the symbol-timing recovery method, functional simulation and timing verification were performed with the Synopsys VHDL tools (a sketch of the arc-tangent look-up idea follows below).
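A Python sketch of a small arc-tangent look-up table that recovers phase from I/Q samples using octant symmetry, one common way to keep such a table small; the table size and indexing scheme are assumptions, not the paper's design:

```python
import math

# Small arctangent LUT over r = Q/I in [0, 1); other octants and quadrants are
# recovered by symmetry, which keeps the table small (low area in hardware).
LUT_SIZE = 256
ATAN_LUT = [math.atan(i / LUT_SIZE) for i in range(LUT_SIZE)]

def lut_atan(ratio):
    """Look up atan(ratio) for ratio in [0, 1]."""
    return ATAN_LUT[min(int(ratio * LUT_SIZE), LUT_SIZE - 1)]

def phase(i, q):
    """Approximate atan2(q, i) using only the LUT and sign/octant logic."""
    if i == 0 and q == 0:
        return 0.0
    ai, aq = abs(i), abs(q)
    # First octant directly; otherwise use atan(x) = pi/2 - atan(1/x).
    base = lut_atan(aq / ai) if aq <= ai else math.pi / 2 - lut_atan(ai / aq)
    # Quadrant correction from the signs of I and Q.
    if i >= 0:
        return base if q >= 0 else -base
    return math.pi - base if q >= 0 else base - math.pi

print(round(phase(3, 1), 3), round(math.atan2(1, 3), 3))   # LUT approximation vs. exact
```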

Development of an Uplift Measurement System for Overhead Contact Wire using High Speed Camera (고속카메라를 이용한 전차선 압상량 검측 시스템 개발)

  • Park, Young; Cho, Yong-Hyeon; Lee, Ki-Won; Kim, Hyung-Jun; Kim, In-Chol
    • Journal of the Korean Institute of Electrical and Electronic Material Engineers, v.22 no.10, pp.864-869, 2009
  • Measuring contact wire uplift in electric railways is one of the most important tests for accepting the maximum permitted speed of new electric vehicles and pantographs. Contact wire uplift can be measured over short periods as the pantograph passes monitoring stations. In this paper, a high-speed image measurement system and its image processing method are developed to evaluate the dynamic uplift of overhead contact wires caused by the pantograph contact forces of the Korea Tilting Train eXpress (TTX) and the Korea Train eXpress (KTX). The system was implemented with a high-speed CMOS (Complementary Metal Oxide Semiconductor) camera and gigabit Ethernet LAN. Unlike previous systems, this camera-based uplift measurement system is installed at the side of the rail, which makes maintenance convenient. On-field verification was conducted by measuring the uplift caused by the TTX at its operating speeds on the Honam conventional line and a high-speed railway line. The proposed high-speed image measurement system for evaluating dynamic uplift of overhead contact wires shows promise for on-field application to high-speed trains such as the KTX and TTX.

Efficient Public Verification on the Integrity of Multi-Owner Data in the Cloud

  • Wang, Boyang; Li, Hui; Liu, Xuefeng; Li, Fenghua; Li, Xiaoqing
    • Journal of Communications and Networks, v.16 no.6, pp.592-599, 2014
  • Cloud computing enables users to easily store their data and share it with others. Because of the security threats in an untrusted cloud, users are advised to compute verification metadata, such as signatures, on their data to protect its integrity. Many mechanisms allow a public verifier to efficiently audit cloud data integrity without retrieving the entire data from the cloud. However, to the best of our knowledge, none of them considers the efficiency of public verification on multi-owner data, where each data block is signed by multiple owners. In this paper, we propose a novel public verification mechanism that audits the integrity of multi-owner data in an untrusted cloud by taking advantage of multisignatures. With our mechanism, the verification time and the storage overhead of signatures on multi-owner data in the cloud are independent of the number of owners. In addition, we demonstrate the security of our scheme with rigorous proofs. Compared to a straightforward extension of previous mechanisms, our mechanism performs better in experiments (a toy illustration of constant-cost aggregate verification follows below).
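The paper builds on multisignatures so that verification cost does not grow with the number of owners. As a structural illustration only, the following Python toy (insecure parameters, not the paper's construction) aggregates per-owner public keys and signatures in a small discrete-log group so that a single check covers all owners:

```python
import hashlib, secrets

# Tiny toy group (insecure, for structure only): safe prime P = 2Q + 1,
# G generates the subgroup of prime order Q.
P, Q, G = 2027, 1013, 4

def h(msg: bytes) -> int:
    """Hash a message into an exponent modulo the group order."""
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % Q

# Each owner holds a secret key x_i; all owners sign the same block, and keys
# and signatures are aggregated so the verifier performs one check regardless
# of how many owners signed.
owner_keys = [secrets.randbelow(Q - 1) + 1 for _ in range(5)]   # x_i
pubkeys = [pow(G, x, P) for x in owner_keys]                    # y_i = G^x_i

block = b"shared data block"
e = h(block)
agg_sig = sum(x * e for x in owner_keys) % Q                    # s = e * sum(x_i)
agg_pub = 1
for y in pubkeys:
    agg_pub = agg_pub * y % P                                   # Y = prod(y_i)

# One check, independent of the number of owners: G^s == Y^e.
assert pow(G, agg_sig, P) == pow(agg_pub, e, P)
```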

A Privacy-preserving Data Aggregation Scheme with Efficient Batch Verification in Smart Grid

  • Zhang, Yueyu; Chen, Jie; Zhou, Hua; Dang, Lanjun
    • KSII Transactions on Internet and Information Systems (TIIS), v.15 no.2, pp.617-636, 2021
  • This paper presents a privacy-preserving data aggregation scheme that handles multidimensional data, which is rarely addressed in existing smart grid research. We use the Paillier cryptosystem and a blinding-factor technique to encrypt the multidimensional data as a whole, and exploit the homomorphic property of the Paillier cryptosystem to achieve data aggregation. Signatures and efficient batch verification are also applied for data integrity and fast verification; the batch verification requires only two pairing operations. Our scheme supports fault tolerance, so it still works even if some smart meters fail. In addition, we give two extensions: the scheme can compute a fixed user's time-of-use electricity bill, and it can deal effectively and quickly with dynamic user participation. In the security analysis, we prove unforgeability and the security of batch verification in detail, and briefly discuss other security features. Performance analysis shows that our scheme has lower computational complexity and communication overhead than existing schemes (a minimal Paillier aggregation sketch follows below).
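A minimal Python sketch of the Paillier homomorphic-aggregation core with toy primes; the paper additionally packs multidimensional readings with blinding factors and adds signature-based batch verification, which this sketch omits:

```python
import math, secrets

# Minimal Paillier (toy primes, illustration only): additively homomorphic, so
# an aggregator can multiply ciphertexts and the authority decrypts only the sum,
# never an individual meter reading.
p, q = 293, 433                      # toy primes; real deployments use ~1024-bit primes
n, n2 = p * q, (p * q) ** 2
lam = math.lcm(p - 1, q - 1)
mu = pow(lam, -1, n)                 # valid because the generator is g = n + 1

def encrypt(m):
    r = secrets.randbelow(n - 1) + 1
    while math.gcd(r, n) != 1:       # r must be a unit modulo n
        r = secrets.randbelow(n - 1) + 1
    return pow(n + 1, m, n2) * pow(r, n, n2) % n2

def decrypt(c):
    return (pow(c, lam, n2) - 1) // n * mu % n

readings = [17, 42, 8, 23]                        # per-meter measurements
ciphertexts = [encrypt(m) for m in readings]

aggregate = 1
for c in ciphertexts:
    aggregate = aggregate * c % n2                # ciphertext product = plaintext sum

assert decrypt(aggregate) == sum(readings)        # only the total is revealed
```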