• Title/Summary/Keyword: data storage security architecture


A Four-Layer Robust Storage in Cloud using Privacy Preserving Technique with Reliable Computational Intelligence in Fog-Edge

  • Nirmala, E.;Muthurajkumar, S.
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.14 no.9
    • /
    • pp.3870-3884
    • /
    • 2020
  • The proposed Four-Layer Robust Storage in Cloud (FLRSC) architecture involves a host server, local host, and edge devices in addition to Virtual Machine Monitoring (VMM). The goal is to protect the privacy of data stored at edge devices. The computational intelligence (CI) part of the algorithm distributes blocks of data across three layers; blocks are partially encoded and forwarded to the next layer for decoding using hash and Reed-Solomon algorithms. VMM uses a snapshot algorithm to detect intrusion. The proposed system is compared with the Tiang Wang method to validate the efficiency of secure data transfer, so security is evaluated against the indexed efficiency. The study integrates communication between local host software and nearby edge devices through different channels, verifying snapshots with the Lamport mechanism to ensure integrity and security at the software level while reducing latency. It also provides a thorough understanding of data communication at the software level with VMM. The performance evaluation and feasibility study of security in FLRSC against the three-layered approach is demonstrated over 232 blocks of data with 98% accuracy. Practical implications and contributions to the growing knowledge base are highlighted along with directions for further research.
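The hash-plus-erasure-coding block distribution described in the abstract can be sketched roughly as follows. This is a minimal illustration, not the FLRSC implementation: it assumes SHA-256 for the per-block hash and substitutes a single XOR parity block for the paper's Reed-Solomon coding, since a full Reed-Solomon codec is beyond a short sketch.

```python
import hashlib
from functools import reduce

BLOCK_SIZE = 4  # illustrative; real systems use far larger blocks

def split_blocks(data: bytes) -> list[bytes]:
    """Partition data into fixed-size blocks, zero-padding the last one."""
    blocks = [data[i:i + BLOCK_SIZE] for i in range(0, len(data), BLOCK_SIZE)]
    blocks[-1] = blocks[-1].ljust(BLOCK_SIZE, b"\x00")
    return blocks

def xor_parity(blocks: list[bytes]) -> bytes:
    """Single parity block: a stand-in for Reed-Solomon redundancy."""
    return reduce(lambda a, b: bytes(x ^ y for x, y in zip(a, b)), blocks)

def encode_layers(data: bytes) -> dict:
    """Spread (block, SHA-256 digest) pairs round-robin over three layers."""
    blocks = split_blocks(data)
    layers = {0: [], 1: [], 2: []}
    for i, b in enumerate(blocks):
        layers[i % 3].append((b, hashlib.sha256(b).hexdigest()))
    layers["parity"] = xor_parity(blocks)
    return layers

def verify_layers(layers: dict) -> bool:
    """Recompute each block's digest; any mismatch flags tampering."""
    return all(hashlib.sha256(b).hexdigest() == h
               for k in (0, 1, 2) for b, h in layers[k])
```

A real deployment would replace the XOR parity with a proper Reed-Solomon codec so that multiple lost blocks can be reconstructed, not just one.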

MyData Cloud: Secure Cloud Architecture for Strengthened Control Over Personal Data (MyData Cloud: 개인 정보 통제 강화를 위한 안전한 클라우드 아키텍쳐 설계)

  • Seungmin Heo;Yonghee Kwon;Beomjoong Kim;Kiseok Jeon;Junghee Lee
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.34 no.4
    • /
    • pp.597-613
    • /
    • 2024
  • MyData is an approach to personal data management that grants data subjects the right to decide how their data are used and where they are provided. With the explicit consent of the subjects, service providers can collect scattered data from data sources and offer personalized services based on the collected data. In existing service models, personal data saved in data storage can be shared with the data processors of service providers or third parties. However, once personal data are transferred to third-party processors, it is difficult for data subjects to trace and control them. Therefore, this paper proposes a cloud model in which both the data storage and the data processor are located within a single cloud, ensuring that data do not leave the cloud.

Universal Description of Access Control Systems

  • Karel Burda
    • International Journal of Computer Science & Network Security
    • /
    • v.24 no.8
    • /
    • pp.43-53
    • /
    • 2024
  • Access control systems are used to control people's access to assets. In practice, assets are either tangible (e.g. goods, cash) or data. To handle tangible assets, a person must physically access the space in which the assets are located (e.g. a room or a building). Access control systems for this case have been known since antiquity and are based either on mechanical locks or on certificates. In the middle of the 20th century, systems based on electromagnetic phenomena appeared. In the second half of the same century, the need to control access to data also arose. And since data can also be accessed via a computer network, it became necessary to control not only persons' access to the areas housing data storage, but also their electronic communication with these storage facilities. The different types of the above systems have developed separately and more or less independently. This paper provides an overview of the current status of the different types of systems, showing that they are converging technologically through the use of electronics, computing, and computer communication. Furthermore, the terminology and architecture of these systems are expanded in the article to allow a unified description. The article also describes the most common configurations of access control systems.

The Security and Privacy Issues of Fog Computing

  • Sultan Algarni;Khalid Almarhabi;Ahmed M. Alghamdi;Asem Alradadi
    • International Journal of Computer Science & Network Security
    • /
    • v.23 no.4
    • /
    • pp.25-31
    • /
    • 2023
  • Fog computing extends cloud computing by using edge devices to provide computing, data storage, communication, management, and control services. Because it has a decentralised infrastructure that can integrate with cloud computing and provide real-time data analysis, it is an emerging approach serving multidisciplinary domains and a variety of applications, such as the IoT, Big Data, and smart cities. This study provides an overview of the security and privacy concerns of fog computing. It also examines its fundamentals and architecture as well as current trends, challenges, and potential methods of overcoming issues in fog computing.

Low Power Security Architecture for the Internet of Things (사물인터넷을 위한 저전력 보안 아키텍쳐)

  • Yun, Sun-woo;Park, Na-eun;Lee, Il-gu
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2021.10a
    • /
    • pp.199-201
    • /
    • 2021
  • The Internet of Things (IoT) is a technology that can organically connect people and things without time and space constraints by using communication network technology and sensors, and transmit and receive data in real time. IoT devices used across industry face limitations such as device size, memory capacity, and data transmission performance, so managing power consumption is important to make effective use of limited battery capacity. Prior research improves power efficiency by lightening the encryption module's security algorithm, but at the cost of weakened security. In this study, we propose a low-power security architecture that can utilize high-performance security algorithms in the IoT environment. It provides both high security and power efficiency by executing relatively complex security modules in low-power environments only when inspection results indicate that threat detection is required.
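The trigger-on-inspection idea, running the expensive security module only when a cheap screen raises a flag, can be sketched as below. This is a hedged illustration, not the paper's implementation: it assumes a byte-entropy screen as the lightweight inspection and full HMAC-SHA-256 verification as the heavyweight module.

```python
import hashlib
import hmac
import math
from collections import Counter

KEY = b"shared-device-key"  # illustrative provisioning, not from the paper

def cheap_screen(payload: bytes) -> bool:
    """Lightweight inspection: flag payloads whose byte entropy is
    unusually high (a crude stand-in for real threat detection)."""
    counts = Counter(payload)
    entropy = -sum((c / len(payload)) * math.log2(c / len(payload))
                   for c in counts.values())
    return entropy > 6.0  # looks random: worth a full check

def heavy_verify(payload: bytes, tag: bytes) -> bool:
    """Expensive module: full HMAC-SHA-256 tag verification."""
    expected = hmac.new(KEY, payload, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

def process(payload: bytes, tag: bytes) -> bool:
    """Run the heavy module only when the cheap screen raises a flag,
    saving power on the (common) benign path."""
    if not cheap_screen(payload):
        return True  # accepted on the low-power path
    return heavy_verify(payload, tag)
```

The design trade-off is exactly the one the abstract names: the complex module stays available for high security, but the battery only pays for it when the inspection result demands it.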


Security and Privacy Protection of Vehicle-To-Grid Technology for Electric Vehicle in Smart Grid Environment (스마트 그리드환경에서 전기자동차 양방향 충전기술의 보안과 개인정보 보호에 관한 연구)

  • Lee, Sunguk
    • The Journal of the Convergence on Culture Technology
    • /
    • v.6 no.1
    • /
    • pp.441-448
    • /
    • 2020
  • With the help of Vehicle-to-Grid (V2G) technology, the battery of an electric vehicle can serve as a distributed energy resource and as energy storage in a smart grid environment. Because the V2G network supports two-way communication among all components, security vulnerabilities and privacy issues can arise. This paper analyzes the architecture, privacy-sensitive data, security vulnerabilities, and security requirements of a V2G system. Furthermore, an efficient architecture and operating scheme for the V2G system are proposed. The scheme uses a symmetric cryptosystem and a hash algorithm to support privacy preservation and mutual authentication.
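A symmetric-key mutual authentication of the kind the abstract describes can be sketched as a challenge-response exchange. This is a generic illustration using HMAC-SHA-256 over random nonces, not the paper's actual protocol; the shared key is assumed to be provisioned out of band.

```python
import hashlib
import hmac
import os

SHARED_KEY = b"ev-grid-shared-secret"  # hypothetical pre-shared key

def respond(key: bytes, challenge: bytes) -> bytes:
    """Prove knowledge of the shared key without revealing it."""
    return hmac.new(key, challenge, hashlib.sha256).digest()

def mutual_auth(claimed_key: bytes) -> bool:
    """Each side challenges the other with a fresh nonce; both must
    answer with the correct keyed digest for authentication to pass."""
    ev_nonce, grid_nonce = os.urandom(16), os.urandom(16)
    grid_ok = hmac.compare_digest(respond(claimed_key, ev_nonce),
                                  respond(SHARED_KEY, ev_nonce))
    ev_ok = hmac.compare_digest(respond(claimed_key, grid_nonce),
                                respond(SHARED_KEY, grid_nonce))
    return grid_ok and ev_ok
```

Because only keyed digests of fresh nonces cross the wire, an eavesdropper learns neither the key nor anything replayable, which is the privacy-preservation property the scheme targets.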

Verification Control Algorithm of Data Integrity Verification in Remote Data sharing

  • Xu, Guangwei;Li, Shan;Lai, Miaolin;Gan, Yanglan;Feng, Xiangyang;Huang, Qiubo;Li, Li;Li, Wei
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.16 no.2
    • /
    • pp.565-586
    • /
    • 2022
  • Cloud storage's elastic expansibility not only provides flexible services for data owners to store their data remotely, but also reduces the storage, operation, and management costs of data sharing. Outsourcing data to the storage space of a cloud service provider, however, raises concerns about data integrity, and data integrity verification has become an important technology for checking the integrity of remotely shared data. Verification launched by users without data access rights imposes unnecessary overhead on the data owner and the cloud service provider; in particular, malicious users who constantly launch data integrity verification greatly waste service resources. Since the data owner is a consumer purchasing cloud services, the owner bears both the cost of data storage and that of data verification. This paper proposes a verification control algorithm for the integrity verification of remotely outsourced data. It designs an attribute-based encryption verification control algorithm for multiple verifiers. The data owner and the cloud service provider jointly construct a common access structure and generate a verification sentinel used to check verifiers' authority against that structure. Finally, since the cloud service provider knows neither the access structure nor the sentinel generation operation, it can authenticate only verifiers satisfying the access policy to verify the integrity of the corresponding outsourced data. Theoretical analysis and experimental results show that the proposed algorithm achieves fine-grained access control over multiple verifiers for data integrity verification.
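The authority check can be sketched in a much-simplified form. This illustration replaces attribute-based encryption with a plain attribute-set policy and an HMAC "sentinel", so it shows only the control flow; in the paper the provider authenticates verifiers without ever learning the access structure, a property this toy version does not reproduce.

```python
import hashlib
import hmac

OWNER_KEY = b"owner-secret"  # hypothetical key held by the data owner

def make_sentinel(policy: frozenset[str]) -> bytes:
    """Owner derives a sentinel committing to the access structure."""
    encoded = ",".join(sorted(policy)).encode()
    return hmac.new(OWNER_KEY, encoded, hashlib.sha256).digest()

def authorize_verifier(verifier_attrs: set[str], policy: frozenset[str],
                       sentinel: bytes) -> bool:
    """Allow a verification challenge only if the verifier's attributes
    satisfy the policy that the sentinel commits to."""
    if not hmac.compare_digest(make_sentinel(policy), sentinel):
        return False  # presented policy does not match the commitment
    return policy <= verifier_attrs  # every required attribute present
```

The point of the sentinel in the paper is the same as the commitment here: an unauthorized verifier cannot substitute a weaker policy, so integrity checks from users without access rights are rejected before they consume resources.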

Optimal Video Streaming Based on Delivery Information Sharing in Hybrid CDN/P2P Architecture

  • Lee, Jun Pyo;Lee, Won Joo;Lee, Kang-Ho
    • Journal of the Korea Society of Computer and Information
    • /
    • v.23 no.9
    • /
    • pp.35-42
    • /
    • 2018
  • In this paper, we propose an optimal streaming service method based on a Hybrid CDN/P2P architecture. Recent video streaming relies on Content Delivery Network (CDN) techniques based on proxy servers, the end nodes located close to users. However, since a CDN has fixed network bandwidth and data exchange among CDN nodes is not smooth, it is difficult to avoid traffic congestion and guarantee video service quality. In the hybrid CDN/P2P network, a data selection technique keeps only the data that are expected to be requested repeatedly, in order to guarantee QoS for users on the efficiently used, limited bandwidth. To locate user-requested data, the technique effectively searches the storage information of the CDN and P2P nodes, stores new video information, and computes a deletion priority based on the likelihood of future requests as needed. Therefore, the streaming service scheme proposed in this paper can effectively improve the quality of video streaming services on the network.
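The deletion-priority idea, evicting the cached item least likely to be requested again, can be sketched as a score-based policy. The request-probability estimate used here (hit count decayed by age) is an assumption for illustration, not the paper's formula.

```python
def evict_candidate(cache: dict[str, dict]) -> str:
    """Return the key of the cached item with the lowest
    expected-request score; that item is deleted first.

    Each cache entry tracks `requests` (hit count) and `last_tick`
    (time of the most recent request); the score decays with age, so
    rarely and long-ago requested items are evicted before popular,
    recently requested ones.
    """
    def score(meta: dict, now: int) -> float:
        age = now - meta["last_tick"]
        return meta["requests"] / (1 + age)

    now = max(m["last_tick"] for m in cache.values())
    return min(cache, key=lambda k: score(cache[k], now))
```

In a hybrid CDN/P2P setting this score would also be shared between nodes, so that a block evicted at one proxy can still be located at a peer that retains it.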

A Cache Privacy Protection Mechanism based on Dynamic Address Mapping in Named Data Networking

  • Zhu, Yi;Kang, Haohao;Huang, Ruhui
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.12 no.12
    • /
    • pp.6123-6138
    • /
    • 2018
  • Named data networking (NDN) is a new network architecture designed for the next-generation Internet. Router-side content caching is one of the key features of NDN; it can reduce redundant transmission, accelerate content distribution, and alleviate congestion. However, it also introduces several security problems, an important one being cache privacy leakage: by measuring content retrieval time, an adversary can infer neighboring users' interest in private content. Focusing on this problem, we propose a cache privacy protection mechanism (named CPPM-DAM) that uses a Bloom filter to distinguish legitimate users from adversaries. An optimization of the storage cost is further provided to make the mechanism more practical. Simulation results in ndnSIM show that CPPM-DAM effectively protects cache privacy.
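A Bloom filter of the kind the mechanism relies on can be sketched as follows. The parameters (m = 1024 bits, k = 3 hash functions) are illustrative, not those used in CPPM-DAM.

```python
import hashlib

class BloomFilter:
    """Compact set membership: no false negatives, and a false-positive
    rate tunable via the bit-array size m and hash count k."""

    def __init__(self, m: int = 1024, k: int = 3):
        self.m, self.k = m, k
        self.bits = 0  # m-bit array stored as one integer

    def _positions(self, item: str):
        """Derive k bit positions from salted SHA-256 digests."""
        for i in range(self.k):
            digest = hashlib.sha256(f"{i}:{item}".encode()).digest()
            yield int.from_bytes(digest[:8], "big") % self.m

    def add(self, item: str) -> None:
        for p in self._positions(item):
            self.bits |= 1 << p

    def __contains__(self, item: str) -> bool:
        return all(self.bits >> p & 1 for p in self._positions(item))
```

A router would insert legitimate users' identifiers at registration time; requests whose identifier is definitely absent from the filter can then be answered on a path that does not reveal cache hit timing.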

Verification Algorithm for the Duplicate Verification Data with Multiple Verifiers and Multiple Verification Challenges

  • Xu, Guangwei;Lai, Miaolin;Feng, Xiangyang;Huang, Qiubo;Luo, Xin;Li, Li;Li, Shan
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.15 no.2
    • /
    • pp.558-579
    • /
    • 2021
  • Cloud storage provides flexible services for data owners to remotely outsource their data, and reduces data storage operation and management costs. Outsourced data raise security concerns for the data owner because the cloud service provider may maliciously delete or corrupt them, and data integrity verification is an important way to check outsourced data integrity. However, existing verification schemes only consider the case of a single verifier launching multiple verification challenges, and neglect the overhead of challenges launched by multiple verifiers at around the same time. In that case, the duplicate data in multiple challenges are verified repeatedly, so verification resources are consumed in vain. We propose a duplicate data verification algorithm based on multiple verifiers and multiple challenges to reduce the verification overhead. The algorithm dynamically schedules the verifiers' challenges based on verification time and the frequent itemsets of duplicate verification data in the challenge sets, found by applying the FP-Growth algorithm, and computes batch proofs for the frequent itemsets. The challenges are then split into two parts, duplicate data and unique data, according to the extraction results. Finally, the proofs of the duplicate data and the unique data are computed and combined to generate a complete proof for every original challenge. Theoretical analysis and experimental evaluation show that the algorithm reduces the verification cost and ensures the correctness of data integrity verification through flexible batch data verification.
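The split into duplicate and unique data can be sketched with simple counting. This stand-in uses per-block counts across challenge sets rather than full FP-Growth frequent-itemset mining, and the batch proof is reduced to a single digest, so it illustrates only the deduplication step, not the paper's proof construction.

```python
import hashlib
from collections import Counter

def split_challenges(challenges: dict[str, set[int]]):
    """Blocks requested by more than one verifier are 'duplicate';
    they get one shared batch proof instead of one proof per challenge."""
    counts = Counter(b for blocks in challenges.values() for b in blocks)
    duplicate = {b for b, c in counts.items() if c > 1}
    unique = {v: blocks - duplicate for v, blocks in challenges.items()}
    return duplicate, unique

def toy_proof(blocks: set[int]) -> str:
    """Toy stand-in for an integrity proof: a digest over the sorted
    block indices (a real scheme proves possession of block contents)."""
    return hashlib.sha256(str(sorted(blocks)).encode()).hexdigest()
```

Each verifier's complete proof is then assembled from the shared proof of the duplicate part of its challenge plus the proof of its unique part, so duplicate blocks are proved once instead of once per challenge.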