• Title/Summary/Keyword: Data Verification

A Rapid Locating Protocol of Corrupted Data for Cloud Data Storage

  • Xu, Guangwei; Yang, Yanbin; Yan, Cairong; Gan, Yanglan
    • KSII Transactions on Internet and Information Systems (TIIS) / v.10 no.10 / pp.4703-4723 / 2016
  • The verification of data integrity is an urgent topic in remote data storage environments, given the wide deployment of cloud data storage services. Many traditional verification algorithms focus on block-oriented verification to resolve disputes over dynamic data integrity between data owners and storage service providers. However, these algorithms pay little attention to the cost of verification and to the users' verification experience: users care more about the availability of the files they access than about individual data blocks. Moreover, the verification cost limits the amount of data that can be checked in each verification. We therefore propose a mixed verification protocol that rapidly locates corrupted files through file-oriented verification, and then identifies the corrupted blocks within those files through block-oriented verification. Theoretical analysis and simulation results demonstrate that, relative to traditional block-oriented verification, the protocol reduces the cost of metadata computation and transmission at the expense of a small additional cost for file-oriented metadata computation and storage at the data owner. Both the chance of extracting corrupted data and the scope of suspicious data are optimized, improving verification efficiency at the same verification cost.
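
As an illustration of this two-phase idea, here is a minimal sketch in Python, using plain SHA-256 hashes as stand-ins for the protocol's verification metadata (the actual scheme's tags and challenge protocol are not reproduced here): a cheap file-level check flags corrupted files, and only those files receive the more expensive block-level pass.

```python
import hashlib

def block_tag(block: bytes) -> bytes:
    # Stand-in for the protocol's block-oriented metadata.
    return hashlib.sha256(block).digest()

def file_tag(blocks) -> bytes:
    # Stand-in for the protocol's file-oriented metadata: one tag per file.
    h = hashlib.sha256()
    for b in blocks:
        h.update(block_tag(b))
    return h.digest()

def locate_corruption(files, tags):
    """files: {name: [block bytes, ...]}; tags: {name: (file_tag, [block_tag, ...])}.
    Returns {file name: [indices of corrupted blocks]}."""
    corrupted = {}
    for name, blocks in files.items():
        ftag, btags = tags[name]
        if file_tag(blocks) == ftag:
            continue  # file-oriented pass: cheap check clears intact files
        # block-oriented pass: run only inside files flagged as corrupted
        corrupted[name] = [i for i, b in enumerate(blocks)
                           if block_tag(b) != btags[i]]
    return corrupted

blocks = [b"alpha", b"beta", b"gamma"]
tags = {"f1": (file_tag(blocks), [block_tag(b) for b in blocks])}
print(locate_corruption({"f1": [b"alpha", b"BETA", b"gamma"]}, tags))  # {'f1': [1]}
```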

Verification Algorithm for the Duplicate Verification Data with Multiple Verifiers and Multiple Verification Challenges

  • Xu, Guangwei; Lai, Miaolin; Feng, Xiangyang; Huang, Qiubo; Luo, Xin; Li, Li; Li, Shan
    • KSII Transactions on Internet and Information Systems (TIIS) / v.15 no.2 / pp.558-579 / 2021
  • Cloud storage provides flexible data storage services that let data owners outsource their data remotely, reducing their storage operation and management costs. Outsourced data raises security concerns for the data owner, since the cloud service provider may maliciously delete or corrupt it. Data integrity verification is an important way to check the integrity of outsourced data. However, existing verification schemes only consider the case in which a single verifier launches multiple verification challenges, and neglect the overhead when multiple verifiers launch challenges at around the same time. In that case, the duplicate data appearing in multiple challenges is verified repeatedly, wasting verification resources. We propose a duplicate-data verification algorithm for multiple verifiers and multiple challenges that reduces this overhead. The algorithm dynamically schedules the verifiers' challenges based on verification time and on the frequent itemsets of duplicate verification data in the challenge sets, found by applying the FP-Growth algorithm, and computes batch proofs for the frequent itemsets. Each challenge is then split into two parts, duplicate data and unique data, according to the results of data extraction. Finally, the proofs of the duplicate data and the unique data are computed and combined to generate a complete proof for every original challenge. Theoretical analysis and experimental evaluation show that the algorithm reduces the verification cost while ensuring the correctness of data integrity verification through flexible batch verification.
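
A minimal sketch of the challenge-splitting step, assuming each challenge is a set of block identifiers; for brevity, a simple frequency count stands in for the FP-Growth frequent-itemset mining, and proof computation is omitted.

```python
from collections import Counter

def split_challenges(challenges):
    """challenges: {verifier_id: set of block ids}. Returns the shared
    (duplicate) blocks and, per challenge, its duplicate/unique split."""
    freq = Counter(b for blocks in challenges.values() for b in blocks)
    duplicate = {b for b, n in freq.items() if n > 1}
    plan = {}
    for vid, blocks in challenges.items():
        plan[vid] = {"duplicate": blocks & duplicate,  # proved once, proof shared
                     "unique": blocks - duplicate}     # proved per challenge
    return duplicate, plan

dup, plan = split_challenges({"v1": {1, 2, 3}, "v2": {2, 3, 4}})
print(dup)                      # {2, 3}: verified once instead of twice
print(plan["v1"], plan["v2"])
```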

Verification Control Algorithm of Data Integrity Verification in Remote Data Sharing

  • Xu, Guangwei; Li, Shan; Lai, Miaolin; Gan, Yanglan; Feng, Xiangyang; Huang, Qiubo; Li, Li; Li, Wei
    • KSII Transactions on Internet and Information Systems (TIIS) / v.16 no.2 / pp.565-586 / 2022
  • Cloud storage's elastic expansibility not only provides flexible services for data owners to store their data remotely, but also reduces the operation and management costs of sharing that data. Data outsourced to the storage space of a cloud service provider also raises concerns about data integrity, and data integrity verification has become an important technology for detecting the integrity of remotely shared data. However, users without access rights who verify the data's integrity impose unnecessary overhead on the data owner and the cloud service provider; in particular, malicious users who constantly launch integrity verifications greatly waste service resources. Since the data owner is a consumer purchasing cloud services, he must bear both the cost of data storage and that of data verification. This paper proposes a verification control algorithm for the integrity verification of remotely outsourced data. It designs an attribute-based encryption verification control algorithm for multiple verifiers. The data owner and the cloud service provider jointly construct a common access structure and generate a verification sentinel with which the authority of verifiers is checked against the access structure. Finally, since the cloud service provider knows neither the access structure nor the sentinel generation operation, it can authenticate only verifiers satisfying the access policy to verify the integrity of the corresponding outsourced data. Theoretical analysis and experimental results show that the proposed algorithm achieves fine-grained access control over multiple verifiers for data integrity verification.
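
The access-structure check might look like the following sketch. This only evaluates a policy tree over a verifier's attributes; the paper's actual construction uses attribute-based encryption and a verification sentinel, neither of which is reproduced here.

```python
def satisfies(policy, attrs: set) -> bool:
    """policy: either an attribute string, or a tuple
    ('AND'|'OR', [subpolicies or attribute strings])."""
    if isinstance(policy, str):
        return policy in attrs
    op, parts = policy
    results = (satisfies(p, attrs) for p in parts)
    return all(results) if op == "AND" else any(results)

# Hypothetical policy: only auditors from the owner's organisation
# or a trusted third-party auditor may verify.
policy = ("AND", ["role:auditor",
                  ("OR", ["org:ownerA", "org:trusted-tpa"])])
print(satisfies(policy, {"role:auditor", "org:trusted-tpa"}))  # True
print(satisfies(policy, {"role:user", "org:ownerA"}))          # False
```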

Energy-Efficient Algorithm for Assigning Verification Tasks in Cloud Storage

  • Xu, Guangwei; Sun, Zhifeng; Yan, Cairong; Shi, Xiujin; Li, Yue
    • KSII Transactions on Internet and Information Systems (TIIS) / v.11 no.1 / pp.1-17 / 2017
  • Mobile cloud computing has become a promising computing platform: it moves users' data to large centralized data centers so that users' mobile devices can access it conveniently. Since the data storage service may not be fully trusted, many public verification algorithms have been proposed to check data integrity. However, these algorithms hardly consider the huge computational burden placed on verifiers with resource-constrained mobile devices when executing verification tasks. We propose an energy-efficient algorithm for assigning verification tasks (EEAVT) that optimizes energy consumption and assigns verification tasks in elastic and customizable ways. The algorithm prioritizes verification tasks according to the expected finish time of the verification, and assigns the number of checked blocks according to each device's residual energy and available operation time. Theoretical analysis and experimental evaluation show that our algorithm not only shortens the verification finish time but also decreases energy consumption, thus improving the efficiency and reliability of the verification.
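
A simplified sketch of the assignment policy described above, with hypothetical device parameters (joules_per_block, seconds_per_block); EEAVT's actual model is more elaborate than this earliest-finish-first greedy pass.

```python
def assign_tasks(tasks, device):
    """tasks: [(task_id, expected_finish_time, blocks_requested)];
    device: residual energy and time budgets plus per-block costs
    (all parameter names here are hypothetical)."""
    # Blocks the device can afford, limited by both energy and time.
    budget = int(min(device["energy_j"] / device["joules_per_block"],
                     device["time_s"] / device["seconds_per_block"]))
    plan = []
    for tid, _finish, want in sorted(tasks, key=lambda t: t[1]):  # earliest finish first
        take = min(want, budget)
        if take == 0:
            break
        plan.append((tid, take))
        budget -= take
    return plan

device = {"energy_j": 50.0, "time_s": 30.0,
          "joules_per_block": 0.5, "seconds_per_block": 0.2}
tasks = [("t1", 12.0, 40), ("t2", 8.0, 80), ("t3", 20.0, 30)]
print(assign_tasks(tasks, device))  # [('t2', 80), ('t1', 20)]; t3 dropped
```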

Design of Integrated Verification Process for Sensing Data Gathering System (센싱 데이터 수집 시스템을 위한 통합검증 프로세스 설계)

  • Kim, Yu-Doo
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2021.10a / pp.305-306 / 2021
  • Systems that gather various kinds of sensing data are designed with considerable complexity, so a verification process for their functions is very important. In this paper we design an integrated verification process for a sensing data gathering system.

Efficient Public Verification on the Integrity of Multi-Owner Data in the Cloud

  • Wang, Boyang; Li, Hui; Liu, Xuefeng; Li, Fenghua; Li, Xiaoqing
    • Journal of Communications and Networks / v.16 no.6 / pp.592-599 / 2014
  • Cloud computing enables users to easily store their data and share it with others. Due to the security threats in an untrusted cloud, users are advised to compute verification metadata, such as signatures, on their data to protect its integrity. Many mechanisms have been proposed to let a public verifier efficiently audit cloud data integrity without retrieving the entire data from the cloud. However, to the best of our knowledge, none of them has considered the efficiency of public verification on multi-owner data, where each block is signed by multiple owners. In this paper, we propose a novel public verification mechanism to audit the integrity of multi-owner data in an untrusted cloud by taking advantage of multisignatures. With our mechanism, the verification time and the storage overhead of signatures on multi-owner data in the cloud are independent of the number of owners. In addition, we demonstrate the security of our scheme with rigorous proofs. Compared to the straightforward extension of previous mechanisms, our mechanism shows better performance in experiments.
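
To see why an aggregate check can be independent of the number of owners, consider this deliberately insecure toy scheme (not the paper's multisignature construction): each owner signs the same block, the signatures are summed into one value, and a single exponentiation check verifies the aggregate against the product of the public keys.

```python
import hashlib, secrets

P = 2**127 - 1   # a Mersenne prime; toy-sized, far too small for real security
G = 3

def keygen():
    sk = secrets.randbelow(P - 1)
    return sk, pow(G, sk, P)   # (secret key, public key)

def sign(sk, msg: bytes) -> int:
    h = int.from_bytes(hashlib.sha256(msg).digest(), "big") % (P - 1)
    return (sk * h) % (P - 1)  # INSECURE toy signature (leaks sk); illustration only

def verify_aggregate(pks, agg_sig, msg: bytes) -> bool:
    h = int.from_bytes(hashlib.sha256(msg).digest(), "big") % (P - 1)
    agg_pk = 1
    for pk in pks:
        agg_pk = (agg_pk * pk) % P
    # One modular check, regardless of how many owners signed the block.
    return pow(G, agg_sig, P) == pow(agg_pk, h, P)

msg = b"data block 17"
keys = [keygen() for _ in range(5)]                    # five co-owners
agg = sum(sign(sk, msg) for sk, _ in keys) % (P - 1)   # constant-size aggregate
print(verify_aggregate([pk for _, pk in keys], agg, msg))  # True
```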

One-stop Platform for Verification of ICT-based Environmental Monitoring Sensor Data (ICT 기반 환경모니터링 센서 데이터 검증을 위한 원스탑 플랫폼)

  • Chae, Minah; Cho, Jae Hyuk
    • Journal of Platform Technology / v.9 no.1 / pp.32-39 / 2021
  • Existing environmental measuring devices focus mainly on electromagnetic-wave and eco-friendly product certification and durability testing, while sensor reliability and measurement data are verified chiefly through performance evaluation at type approval and registration, acceptance tests, initial calibration, and periodic tests. The proposed platform establishes an ICT-based reliability verification system for environmental monitoring sensors that supports not only performance evaluation of each target sensor but also verification of sensor data reliability. A sensor board was produced to collect environmental sensor data, and a service system for evaluating and verifying sensor and data reliability was standardized. In addition, to evaluate and verify sensor data reliability, a prototype of a sensor data monitoring platform using LoRa communication was built and tested in smart cities. To analyze the data received through the system, an optimization algorithm was developed using machine learning. On this basis, a sensor big-data analysis system is established for reliability verification, and the foundation for an integrated evaluation and verification system is provided.
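
As a rough illustration of what a data-reliability screen might check, here is a minimal sketch with assumed rules (range bounds and a maximum step between consecutive samples); the platform's actual verification rules and ML-based analysis are not described in the abstract.

```python
def plausible(readings, lo, hi, max_step):
    """Minimal reliability screen (assumed checks, not the platform's actual
    rules): each value must lie in [lo, hi] and must not jump by more than
    max_step between consecutive samples."""
    ok = [lo <= readings[0] <= hi]
    for prev, cur in zip(readings, readings[1:]):
        ok.append(lo <= cur <= hi and abs(cur - prev) <= max_step)
    return ok

pm10 = [41.2, 43.0, 42.5, 180.0, 44.1]   # one implausible spike
print(plausible(pm10, lo=0.0, hi=600.0, max_step=25.0))
# [True, True, True, False, False]
```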

On Speaker Adaptations with Sparse Training Data for Improved Speaker Verification

  • Ahn, Sung-Joo; Kang, Sun-Mee; Ko, Han-Seok
    • Speech Sciences / v.7 no.1 / pp.31-37 / 2000
  • This paper concerns effective speaker adaptation methods for solving the over-training problem in speaker verification, which frequently occurs when a speaker is modeled with sparse training data. While various speaker adaptation methods have been applied to speech recognition, they have not yet been formally considered in speaker verification. This paper proposes speaker adaptation methods that combine MAP and MLLR adaptation, both used successfully in speech recognition, and applies them to speaker verification. Experimental results show that a speaker verification system using weighted MAP and MLLR adaptation outperforms conventional speaker models without adaptation by a factor of up to 5. These results show that the speaker adaptation method achieves significantly better performance even when only sparse training data are available for speaker verification.
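
A sketch of one plausible weighted combination, interpolating a MAP-adapted mean with an MLLR-transformed mean for a single Gaussian component; the paper's exact weighting scheme is not specified in the abstract, so all names and the weight here are illustrative.

```python
import numpy as np

def combine_map_mllr(mu_si, mu_map, A, b, w=0.5):
    """Interpolate a MAP-adapted mean with an MLLR-transformed mean.
    mu_si: speaker-independent Gaussian mean; mu_map: MAP-adapted mean;
    (A, b): MLLR affine transform; w: interpolation weight (illustrative)."""
    mu_mllr = A @ mu_si + b                 # MLLR: affine transform of the SI mean
    return w * mu_map + (1.0 - w) * mu_mllr # weighted MAP + MLLR combination

mu_si = np.array([0.20, -1.00, 0.70])
mu_map = np.array([0.25, -0.90, 0.65])
A, b = 1.1 * np.eye(3), np.array([0.05, 0.00, -0.02])
print(combine_map_mllr(mu_si, mu_map, A, b, w=0.6))
```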

A Privacy-preserving Data Aggregation Scheme with Efficient Batch Verification in Smart Grid

  • Zhang, Yueyu; Chen, Jie; Zhou, Hua; Dang, Lanjun
    • KSII Transactions on Internet and Information Systems (TIIS) / v.15 no.2 / pp.617-636 / 2021
  • This paper presents a privacy-preserving data aggregation scheme that handles multidimensional data; notably, multidimensional data is rarely addressed in smart grid research. We use the Paillier cryptosystem and a blinding-factor technique to encrypt the multidimensional data as a whole, and exploit the homomorphic property of the Paillier cryptosystem to achieve data aggregation. Signatures and efficient batch verification are also applied in our scheme for data integrity and quick verification, and the batch verification requires only 2 pairing operations. Our scheme also supports fault tolerance, meaning it still works even if some smart meters fail. In addition, we give two extensions of the scheme: one computes a fixed user's time-of-use electricity bill, and the other handles dynamic user populations effectively and quickly. In the security analysis, we prove the unforgeability of the scheme and the security of batch verification in detail, and briefly discuss other security features. Performance analysis shows that our scheme has lower computational complexity and communication overhead than existing schemes.
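
The homomorphic-aggregation core can be sketched with textbook Paillier, shown below with toy parameters (the scheme's blinding factors, multidimensional packing, signatures, and batch verification are omitted): multiplying ciphertexts yields an encryption of the sum of the plaintexts, so an aggregator never sees individual readings.

```python
import math, random

# Tiny textbook Paillier (toy primes; real deployments use >= 2048-bit keys).
p, q = 293, 433
n, n2 = p * q, (p * q) ** 2
lam = math.lcm(p - 1, q - 1)
g = n + 1                                  # standard choice of generator

def encrypt(m):
    r = random.randrange(1, n)
    while math.gcd(r, n) != 1:
        r = random.randrange(1, n)
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c):
    def L(u): return (u - 1) // n
    mu = pow(L(pow(g, lam, n2)), -1, n)    # modular inverse of L(g^lam mod n^2)
    return (L(pow(c, lam, n2)) * mu) % n

readings = [13, 7, 21]                     # per-meter measurements
aggregate = 1
for m in readings:
    aggregate = (aggregate * encrypt(m)) % n2   # homomorphic addition
print(decrypt(aggregate))                  # 41 = 13 + 7 + 21
```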

A Study on Duplication Verification of Public Library Catalog Data: Focusing on the Case of G Library in Busan (공공도서관 목록데이터의 중복검증에 관한 연구 - 부산 지역 G도서관 사례를 중심으로 -)

  • Min-geon Song; Soo-Sang Lee
    • Journal of Korean Library and Information Science Society / v.55 no.1 / pp.1-26 / 2024
  • The purpose of this study is to derive an integration plan for bibliographic records by applying a duplicate verification algorithm to item-based catalogs in public libraries. To this end, G Library, recently opened in Busan, was selected. After collecting OPAC data from G Library through web crawling, multipart monographs of Korean literature (KDC 800) were selected and the KERIS duplicate verification algorithm was applied. After two rounds of data correction based on the verification results, the duplicate verification rate increased by 2.74 percentage points, from 95.53% to 98.27%. Even after data correction, 24 books judged similar or inconsistent were identified as belonging to other published editions that had received separate ISBNs, such as revised editions or hardcover copies. This confirms that the duplicate verification rate can be improved by correcting catalog data, and that the KERIS duplicate verification algorithm can serve as a tool for converting duplicate item-based records from public libraries into manifestation-based records.
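
For illustration, a hypothetical weighted field comparison in the spirit of duplicate verification (this is not the actual KERIS algorithm, whose fields and weights differ): exact ISBN agreement dominates, so records of different editions with separate ISBNs score low even when title and author match, mirroring the 24 cases found above.

```python
def duplicate_score(rec_a, rec_b):
    """Hypothetical weighted comparison of two catalog records (illustrative
    fields and weights only): exact ISBN match dominates; title and author
    contribute partial weight."""
    weights = {"isbn": 0.5, "title": 0.3, "author": 0.2}
    score = 0.0
    for field, w in weights.items():
        a, b = rec_a.get(field, ""), rec_b.get(field, "")
        if a and a == b:
            score += w
    return score

a = {"isbn": "9788912345678", "title": "소설집 1", "author": "홍길동"}
b = {"isbn": "9788987654321", "title": "소설집 1", "author": "홍길동"}
print(duplicate_score(a, b))  # 0.5 -> likely a different edition (separate ISBN)
```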