• Title/Summary/Keyword: Cloud Service Providers

Search results: 156

An Efficient Log Data Processing Architecture for Internet Cloud Environments

  • Kim, Julie; Bahn, Hyokyung
    • International Journal of Internet, Broadcasting and Communication, v.8 no.1, pp.33-41, 2016
  • Big data management is becoming an increasingly important issue in both industry and academia within the information science community. One of the important categories of big data generated by software systems is log data. Log data is generally used by service providers to improve their services and can also be used to improve system reliability. In this paper, we propose a novel big data management architecture specialized for log data. The proposed architecture provides a scalable log management system that consists of client- and server-side modules for efficient handling of log data. To support large volumes of simultaneous log data from multiple clients, we adopt the Hadoop infrastructure in the server-side file system for storing and managing log data efficiently. We implement the proposed architecture to support various client environments and validate its efficiency through measurement studies. The results show that the proposed architecture outperforms the existing logging architecture by 42.8% on average. All components of the proposed architecture are implemented with open source software, and the developed prototypes are publicly available.
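The abstract above describes the architecture only at a high level; as a rough illustration of the client-side half, the sketch below buffers log records locally and ships them to a collector endpoint in batches, which a server-side module would then persist to HDFS. The endpoint URL, batch size, and record format are illustrative assumptions, not the paper's actual interface.

```python
import json
import time
import urllib.request

# Hypothetical collector endpoint; the real server-side module would
# persist incoming batches to HDFS (e.g. one file per client per hour).
COLLECTOR_URL = "http://log-collector.example.com/ingest"
BATCH_SIZE = 100          # flush after this many records (assumed)
FLUSH_INTERVAL = 5.0      # ... or after this many seconds (assumed)


class BufferedLogClient:
    """Client-side module: buffer records locally, ship them in batches."""

    def __init__(self):
        self._buffer = []
        self._last_flush = time.monotonic()

    def log(self, level, message):
        self._buffer.append({"ts": time.time(), "level": level, "msg": message})
        if (len(self._buffer) >= BATCH_SIZE
                or time.monotonic() - self._last_flush > FLUSH_INTERVAL):
            self.flush()

    def flush(self):
        if not self._buffer:
            return
        body = json.dumps(self._buffer).encode("utf-8")
        req = urllib.request.Request(
            COLLECTOR_URL, data=body,
            headers={"Content-Type": "application/json"})
        urllib.request.urlopen(req, timeout=2.0)   # one round trip per batch
        self._buffer.clear()
        self._last_flush = time.monotonic()


# Usage: many clients can log concurrently; batching keeps the number of
# round trips (and small writes on the Hadoop side) low.
logger = BufferedLogClient()
logger.log("INFO", "service started")
```

Batching is the part that matters for the paper's scalability argument: many clients producing small records can be absorbed as a few large writes on the server-side Hadoop file system.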

A Rapid Locating Protocol of Corrupted Data for Cloud Data Storage

  • Xu, Guangwei; Yang, Yanbin; Yan, Cairong; Gan, Yanglan
    • KSII Transactions on Internet and Information Systems (TIIS), v.10 no.10, pp.4703-4723, 2016
  • The verification of data integrity is an urgent topic in remote data storage environments, given the wide deployment of cloud data storage services. Many traditional verification algorithms focus on block-oriented verification to resolve disputes over dynamic data integrity between data owners and storage service providers. However, these algorithms pay little attention to the verification cost and the users' verification experience; users care more about the availability of the files they access than about individual data blocks. Moreover, the verification cost limits how much data can be checked in each verification. Therefore, we propose a mixed verification protocol that rapidly locates corrupted files through file-oriented verification and then identifies the corrupted blocks within those files through block-oriented verification. Theoretical analysis and simulation results demonstrate that the protocol reduces the cost of metadata computation and transmission relative to traditional block-oriented verification, at the expense of a small additional cost for file-oriented metadata computation and storage at the data owner. Both the probability of extracting corrupted data and the scope of suspicious data are optimized, improving verification efficiency under the same verification cost.
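The protocol itself relies on cryptographic verification tags and a challenge-response exchange with the storage server; the sketch below strips that away and keeps only the two-stage structure, using plain SHA-256 hashes, an assumed 4 KB block size, and toy in-memory files.

```python
import hashlib

BLOCK_SIZE = 4096  # assumed block size for illustration


def block_hashes(data: bytes):
    return [hashlib.sha256(data[i:i + BLOCK_SIZE]).hexdigest()
            for i in range(0, len(data), BLOCK_SIZE)]


def file_tag(data: bytes) -> str:
    # File-oriented metadata: one aggregate tag over all block hashes.
    return hashlib.sha256("".join(block_hashes(data)).encode()).hexdigest()


def locate_corruption(files_now: dict, file_tags: dict, block_tags: dict):
    """Stage 1: compare cheap file tags; stage 2: only for files that fail,
    compare per-block hashes to pinpoint the corrupted blocks."""
    report = {}
    for name, data in files_now.items():
        if file_tag(data) == file_tags[name]:
            continue                      # file intact, no block-level work
        bad = [i for i, h in enumerate(block_hashes(data))
               if i >= len(block_tags[name]) or h != block_tags[name][i]]
        report[name] = bad
    return report


# Usage with toy data: the owner stores tags up front, the verifier checks later.
original = {"a.bin": b"A" * 10000, "b.bin": b"B" * 10000}
tags = {n: file_tag(d) for n, d in original.items()}
btags = {n: block_hashes(d) for n, d in original.items()}
tampered = {"a.bin": original["a.bin"], "b.bin": b"B" * 5000 + b"X" + b"B" * 4999}
print(locate_corruption(tampered, tags, btags))   # -> {'b.bin': [1]}
```

Only files whose cheap file-level tag fails are subjected to the more expensive block-level pass, which is where the claimed savings come from.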

e-Transformation Strategy of Data Integration Model : Long-Term Care Agency Case (데이터 통합 모델 기반 e-Transformation 전략 : 장기요양기관 사례)

  • Um, Hyemi
    • Journal of Information Technology Applications and Management, v.28 no.3, pp.23-30, 2021
  • Korea currently provides long-term care benefits for the elderly with impaired functioning, but most service providers are private businesses. With the super-aged era just around the corner, quality management of care services is now required. In this study, we examine the case of 'A company', which operates long-term care institutions and attempted to make voluntary changes ahead of social demands. The company treated the social demand for quality management as an opportunity rather than a threat and established an integrated database across its centers. First, the company processed its data and built a cloud-based database system. Second, it automatically linked data from existing systems to make data utilization more efficient. Third, it pursued visualization to make the data easier to use. This allowed the company to make data-driven strategic decisions internally, which is expected to increase sales by securing new customers and pioneering new markets. The case is also significant in that it can provide a best practice for the long-term care industry.

Methods to Apply GoF Design Patterns in Service-Oriented Computing (서비스 지향 컴퓨팅을 위한 GoF 디자인 패턴 적용 기법)

  • Kim, Moon-Kwon; La, Hyun-Jung; Kim, Soo-Dong
    • The KIPS Transactions: Part D, v.19D no.2, pp.187-202, 2012
  • As a representative reuse paradigm, the theme of Service-Oriented Computing (SOC) is largely centered on publishing and subscribing to reusable services. Here, SOC is used as a term that encompasses service-oriented architecture and cloud computing. Service providers can generate high profits with reusable services, and service consumers can develop their applications with less time and effort by reusing those services. Design patterns (DPs) are reusable methods for resolving commonly occurring design problems and for providing design structures that follow the open/closed principle. However, since DPs were mainly proposed for building object-oriented systems and there are clear differences between the object-oriented paradigm and SOC, it is challenging to apply DPs to SOC design problems. Hence, DPs need to be customized from two perspectives: service providers should be able to design services that are highly reusable and reflect the unique characteristics of SOC, and service consumers should be able to develop their target applications by reusing and customizing services as quickly as possible. Therefore, we propose a set of DPs customized to SOC. With the proposed DPs, we believe that service providers can effectively develop highly reusable services and service consumers can efficiently adapt services for their applications.
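The paper's customized patterns are not reproduced in the abstract, so the sketch below only illustrates the general flavor: a GoF Adapter recast for a provider/consumer setting, where the consumer codes against a published service contract and a provider-specific SDK (hypothetical here) is wrapped behind it.

```python
from abc import ABC, abstractmethod


class StorageService(ABC):
    """Target interface the consumer codes against (the published contract)."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...


class LegacyBlobAPI:
    """Hypothetical provider SDK with its own, incompatible interface."""

    def __init__(self):
        self._blobs = {}

    def upload_blob(self, container: str, name: str, payload: bytes):
        self._blobs[(container, name)] = payload

    def download_blob(self, container: str, name: str) -> bytes:
        return self._blobs[(container, name)]


class LegacyBlobAdapter(StorageService):
    """GoF Adapter in an SOC setting: maps the published service contract onto
    a specific provider, so consumers can switch providers without changing
    application code."""

    def __init__(self, api: LegacyBlobAPI, container: str = "default"):
        self._api = api
        self._container = container

    def put(self, key: str, data: bytes) -> None:
        self._api.upload_blob(self._container, key, data)

    def get(self, key: str) -> bytes:
        return self._api.download_blob(self._container, key)


# Consumer code depends only on StorageService, not on any provider SDK.
svc: StorageService = LegacyBlobAdapter(LegacyBlobAPI())
svc.put("report.txt", b"quarterly numbers")
print(svc.get("report.txt"))
```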

Data hiding in partially encrypted HEVC video

  • Xu, Dawen
    • ETRI Journal, v.42 no.3, pp.446-458, 2020
  • In this study, an efficient scheme for hiding data directly in partially encrypted versions of High Efficiency Video Coding (HEVC) videos is proposed. The content owner uses a stream cipher to selectively encrypt some HEVC-CABAC bin strings in a format-compliant manner. Then, the data hider embeds the secret message into the encrypted HEVC video using a specific coefficient-modification technique. Consequently, the scheme can be used in third-party computing environments (more generally, cloud computing). For security and privacy purposes, service providers cannot access the visual content of the host video. As the coefficients are only slightly modified, the quality of the decrypted video remains satisfactory. The encrypted and marked bitstreams meet the requirements of format compatibility and have the same bit rate. At the receiving end, data extraction can be performed in either the encrypted or the decrypted domain, which suits different application scenarios. Several standard video sequences with different resolutions and contents were used for the experimental evaluation.
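None of the HEVC-CABAC details are given in the abstract, so the sketch below is a deliberately simplified stand-in: a toy keystream XORed over a selected bin string, and data embedding by adjusting the parity of nonzero coefficients. A real system would use a standard stream cipher and operate on actual quantized transform coefficients of the encrypted bitstream.

```python
import hashlib


def keystream(key: bytes, n: int) -> bytes:
    """Toy keystream (SHA-256 in counter mode); a real system would use a
    standard stream cipher such as AES-CTR."""
    out, counter = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:n]


def selective_encrypt(syntax_bins: bytes, key: bytes) -> bytes:
    # Format-compliant selective encryption: XOR only the chosen bin string.
    ks = keystream(key, len(syntax_bins))
    return bytes(b ^ k for b, k in zip(syntax_bins, ks))


def embed_bits(coeffs, bits):
    """Embed one secret bit per nonzero coefficient by adjusting its parity;
    the data hider never needs to see the plaintext video."""
    out, i = [], 0
    for c in coeffs:
        if c != 0 and i < len(bits):
            if abs(c) % 2 != bits[i]:
                c += 1 if c > 0 else -1   # smallest change toward target parity
            i += 1
        out.append(c)
    return out


def extract_bits(coeffs, n):
    return [abs(c) % 2 for c in coeffs if c != 0][:n]


# Usage on toy data standing in for bin strings and quantized coefficients.
bins = bytes([0b1011, 0b0110, 0b0001])
enc = selective_encrypt(bins, b"owner-key")
assert selective_encrypt(enc, b"owner-key") == bins   # XOR is its own inverse

coeffs = [0, 3, -2, 0, 5, 7, -1, 0]
secret = [1, 0, 1, 1, 0]
marked = embed_bits(coeffs, secret)
assert extract_bits(marked, len(secret)) == secret
```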

Unsupervised learning with hierarchical feature selection for DDoS mitigation within the ISP domain

  • Ko, Ili; Chambers, Desmond; Barrett, Enda
    • ETRI Journal, v.41 no.5, pp.574-584, 2019
  • A new Mirai variant found recently was equipped with a dynamic update ability, which raises the level of difficulty for DDoS mitigation. The continuous development of 5G technology and the increasing number of Internet of Things (IoT) devices connected to the network pose serious threats to cyber security. Therefore, researchers have tried to develop better DDoS mitigation systems. However, the majority of existing models provide centralized solutions, deploying the system on additional servers at the host site, on the cloud, or at third-party locations, which may introduce latency. Since Internet service providers (ISPs) are the link between the Internet and users, deploying the defense system within the ISP domain is the ideal place to deliver an efficient solution. To cope with the dynamic nature of the new DDoS attacks, we used an unsupervised artificial neural network to develop a hierarchical two-layered self-organizing map equipped with a twofold feature selection for DDoS mitigation within the ISP domain.
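The hierarchical two-layer structure and the twofold feature selection are specific to the paper; the sketch below shows only the underlying building block, a minimal self-organizing map in NumPy trained on synthetic flow features, where sparsely hit map units far from the dense "normal" region would flag suspicious traffic.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic flow features (e.g. packets/s, bytes/s, unique dst ports), scaled to [0, 1].
normal = rng.normal(0.3, 0.05, size=(500, 3))
attack = rng.normal(0.8, 0.05, size=(50, 3))
data = np.clip(np.vstack([normal, attack]), 0, 1)


def train_som(data, grid=(6, 6), iters=2000, lr0=0.5, sigma0=2.0):
    """Minimal self-organizing map: each iteration pulls the best-matching
    unit and its grid neighbours toward a randomly drawn sample."""
    h, w = grid
    weights = rng.random((h, w, data.shape[1]))
    ii, jj = np.meshgrid(np.arange(h), np.arange(w), indexing="ij")
    for t in range(iters):
        x = data[rng.integers(len(data))]
        d = np.linalg.norm(weights - x, axis=2)
        bi, bj = np.unravel_index(np.argmin(d), d.shape)
        lr = lr0 * np.exp(-t / iters)
        sigma = sigma0 * np.exp(-t / iters)
        neigh = np.exp(-((ii - bi) ** 2 + (jj - bj) ** 2) / (2 * sigma ** 2))
        weights += lr * neigh[..., None] * (x - weights)
    return weights


def bmu(x, weights):
    """Best-matching unit (map coordinates) for one flow vector."""
    d = np.linalg.norm(weights - x, axis=2)
    return np.unravel_index(np.argmin(d), d.shape)


weights = train_som(data)
print("normal flow ->", bmu(data[0], weights))
print("attack flow ->", bmu(data[-1], weights))
```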

Home Screen Adaptive Next Generation Broadcasting Service using MSA-ABR (MSA-ABR을 이용한 홈 스크린 적응형 차세대 방송 서비스 연구)

  • Mariappan, Vinayagam; Lee, Minwoo; Lee, Seungyoun; Lee, Junghoon; Lee, Juyoung; Lim, Yunsik; Cha, Jaesang
    • Journal of Satellite, Information and Communications, v.11 no.3, pp.37-42, 2016
  • In today's highly complex video and broadcast operations, broadcasters are constantly challenged to reliably deliver low-latency, high-quality video to multiscreen audiences, both on-air and online. Adaptive bitrate (ABR) protocols enable Internet video delivery to a wide range of multiscreen devices. However, video quality is often marginal and would prove unacceptable for valued linear broadcast content delivered to the big screen today. Advances in media processing for ABR enable service providers to take control and offer quality-managed linear video services to all screens in the home, including the big screen, over a single unified IP video infrastructure. This paper proposes a Multiscreen-Assisted ABR (MSA-ABR) delivery management system that uses cloud-based multicast-assisted ABR for a broadcast facility, performing routing of contribution content and online publishing services within a virtual, centralized cloud infrastructure.
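The MSA-ABR system is described only architecturally, so the sketch below illustrates just the client-side ABR decision it builds on: picking a rendition from a bitrate ladder using measured throughput and buffer level. The ladder values and thresholds are assumed for illustration, not taken from the paper.

```python
# Hypothetical bitrate ladder (kbps) for a linear channel, highest first.
LADDER_KBPS = [8000, 4500, 2500, 1200, 600]

SAFETY = 0.8            # use only 80% of measured throughput (assumed)
LOW_BUFFER_S = 6.0      # below this buffer level, be conservative (assumed)


def pick_rendition(throughput_kbps: float, buffer_s: float) -> int:
    """Simple throughput-and-buffer ABR rule: take the highest rendition the
    connection can sustain; step down one level when the buffer runs low."""
    budget = throughput_kbps * SAFETY
    candidates = [b for b in LADDER_KBPS if b <= budget] or [LADDER_KBPS[-1]]
    choice = candidates[0]
    if buffer_s < LOW_BUFFER_S and choice != LADDER_KBPS[-1]:
        choice = LADDER_KBPS[LADDER_KBPS.index(choice) + 1]
    return choice


# Usage: a healthy connection vs. a congested one with a draining buffer.
print(pick_rendition(throughput_kbps=9000, buffer_s=20.0))  # -> 4500
print(pick_rendition(throughput_kbps=2000, buffer_s=3.0))   # -> 600
```

In the multicast-assisted setting the same ladder decision is made, but popular renditions are delivered once over multicast inside the home network instead of as per-device unicast streams.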

A design of GPU container co-execution framework measuring interference among applications (GPU 컨테이너 동시 실행에 따른 응용의 간섭 측정 프레임워크 설계)

  • Kim, Sejin; Kim, Yoonhee
    • KNOM Review, v.23 no.1, pp.43-50, 2020
  • As General-Purpose computing on Graphics Processing Units (GPGPU) has recently come to play an essential role in high-performance computing, several cloud service providers offer GPU services. Most cluster orchestration platforms in container-based cloud environments allocate an integer number of GPUs to a job and do not allow a GPU node to be shared with other jobs. In this case, the resource utilization of a GPU node may be low if a job does not intensively require many cores or a large amount of GPU memory. GPU virtualization brings opportunities to realize kernel concurrency and to share resources. However, performance may vary depending on the characteristics of the applications running concurrently and the interference among them caused by resource contention on a node. This paper proposes a GPU container co-execution framework, built on the container orchestration platform Kubernetes, that creates and executes multiple servers in order to measure the interference that can occur when GPU resources are shared. Performance changes under different scheduling policies were investigated by executing several jobs on the GPU. The results show that optimal scheduling is not possible when only GPU memory and compute usage are considered. The interference caused by co-executing applications is measured using the framework.
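The framework runs on Kubernetes, but its core measurement can be sketched without the cluster machinery: run a job alone, run it again co-located with a background job on the same GPU, and report the slowdown ratio. The container images and the direct use of docker below are placeholders for the pods the framework would actually schedule.

```python
import subprocess
import time
from concurrent.futures import ThreadPoolExecutor

# Hypothetical containerized GPU workloads; in the actual framework these
# would be pods scheduled by Kubernetes onto the same GPU node.
JOB_A = ["docker", "run", "--rm", "--gpus", "all", "workload-a:latest"]
JOB_B = ["docker", "run", "--rm", "--gpus", "all", "workload-b:latest"]


def timed_run(cmd) -> float:
    """Run one containerized job to completion and return its wall-clock time."""
    start = time.monotonic()
    subprocess.run(cmd, check=True)
    return time.monotonic() - start


def interference(job, background) -> float:
    """Slowdown of `job` when co-executed with `background` on a shared GPU,
    relative to running it alone (1.0 means no interference)."""
    solo = timed_run(job)
    with ThreadPoolExecutor(max_workers=2) as pool:
        co = pool.submit(timed_run, job)
        bg = pool.submit(timed_run, background)
        co_located = co.result()
        bg.result()   # surface any failure of the background job
    return co_located / solo


if __name__ == "__main__":
    ratio = interference(JOB_A, JOB_B)
    print(f"co-execution slowdown of workload A: {ratio:.2f}x")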

Development of Wireless Base Station Remote Monitoring System Using IoT Based on Cloud Server (클라우드 서버 기반 IoT를 이용한 무선기지국 원격 감시시스템 개발)

  • Lee, Yang-weon; Kim, Chul-won
    • Journal of the Korea Institute of Information and Communication Engineering, v.22 no.6, pp.849-854, 2018
  • Radio base stations are widely distributed across large areas and are therefore difficult to manage. Unmanned base stations deep in the mountains are hard to reach in an emergency. Major telephone service providers only monitor incoming and outgoing traffic remotely, while the small local partners responsible for maintaining the actual facilities do not possess such technology, so each site is checked through field visits. In this study, we apply a particle filter to process and smooth the raw sensor data and confirm that the accuracy of the sensor data is improved. The integrated system monitors temperature, humidity, fire status, and power operation at a wide range of radio base stations in real time and operates well. All base station statuses can be monitored from a remote office through a cloud server over the Internet.
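The abstract does not give the particle filter's parameters, so the sketch below is a generic bootstrap particle filter smoothing a synthetic noisy temperature signal; the particle count and noise levels are assumptions made for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "true" base-station temperature and noisy sensor readings.
true_temp = 25 + np.cumsum(rng.normal(0, 0.05, size=200))
readings = true_temp + rng.normal(0, 1.0, size=200)

N = 500                 # number of particles (assumed)
PROCESS_STD = 0.1       # assumed random-walk process noise
MEASURE_STD = 1.0       # assumed sensor noise


def particle_filter(readings):
    """Bootstrap particle filter: propagate, weight by measurement likelihood,
    resample, and report the weighted mean as the smoothed estimate."""
    particles = rng.normal(readings[0], MEASURE_STD, size=N)
    estimates = []
    for z in readings:
        particles += rng.normal(0, PROCESS_STD, size=N)           # predict
        weights = np.exp(-0.5 * ((z - particles) / MEASURE_STD) ** 2)
        weights /= weights.sum()                                   # update
        estimates.append(np.dot(weights, particles))               # estimate
        idx = rng.choice(N, size=N, p=weights)                     # resample
        particles = particles[idx]
    return np.array(estimates)


smoothed = particle_filter(readings)
raw_err = np.mean(np.abs(readings - true_temp))
pf_err = np.mean(np.abs(smoothed - true_temp))
print(f"mean abs error: raw {raw_err:.2f}, filtered {pf_err:.2f}")
```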

Analyzing OTT Interactive Content Using Text Mining Method (텍스트 마이닝으로 OTT 인터랙티브 콘텐츠 다시보기)

  • Sukchang Lee
    • The Journal of the Convergence on Culture Technology, v.9 no.5, pp.859-865, 2023
  • In a situation where service providers are increasingly focusing on content development due to the intense competition in the OTT market, interactive content that encourages active participation from viewers is garnering significant attention. In response to this trend, research on interactive content is being conducted more actively. This study aims to analyze interactive content through text mining techniques, with a specific focus on online unstructured data. The analysis includes deriving the characteristics of keywords according to their weight, examining the relationship between OTT platforms and interactive content, and tracking changes in the trends of interactive content based on objective data. To conduct this analysis, detailed techniques such as 'Word Cloud', 'Relationship Analysis', and 'Keyword Trend' are used, and the study also aims to derive meaningful implications from these analyses.
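The study's crawled corpus is not available, so the sketch below uses a few placeholder posts to show the three analyses in miniature: keyword weights (the input to a word cloud), co-occurrence counts as a simple relationship analysis, and a per-year keyword trend.

```python
from collections import Counter
from itertools import combinations

# Placeholder documents standing in for crawled online posts, tagged by year.
docs = [
    (2021, "netflix interactive content bandersnatch choice viewer"),
    (2022, "interactive drama ott platform viewer choice"),
    (2023, "ott interactive content game story branching viewer"),
]

# Keyword weights (term frequency) -- the basis of a word cloud.
tf = Counter(word for _, text in docs for word in text.split())
print("top keywords:", tf.most_common(5))

# Relationship analysis: how often two keywords appear in the same document.
pairs = Counter()
for _, text in docs:
    for a, b in combinations(sorted(set(text.split())), 2):
        pairs[(a, b)] += 1
print("strongest pairs:", pairs.most_common(3))

# Keyword trend: frequency of a target keyword per year.
trend = Counter(year for year, text in docs
                for w in text.split() if w == "interactive")
print("trend for 'interactive':", dict(sorted(trend.items())))
```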