• Title/Summary/Keyword: cloud computing systems

Cost-Aware Scheduling of Computation-Intensive Tasks on Multi-Core Server

  • Ding, Youwei;Liu, Liang;Hu, Kongfa;Dai, Caiyan
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.12 no.11
    • /
    • pp.5465-5480
    • /
    • 2018
  • Energy-efficient task scheduling on a multi-core server is a fundamental issue in green cloud computing. Multi-core processors are widely used in mobile devices, personal computers, and servers. Existing energy-efficient task scheduling methods chiefly focus on reducing the energy consumption of the processor itself and assume that the cores of the processor are controlled independently. However, the cores of some processors on the market are divided into several voltage islands, in each of which the cores must operate at the same setting, and the cost of the server includes not only the energy cost of the processor but also the energy of the server's other components and the cost of user waiting time. In this paper, we propose a cost-aware scheduling algorithm, ICAS, for computation-intensive tasks on a multi-core server. Tasks are first allocated to cores, the optimal frequency of each core is then computed, and the frequency of each voltage island is finally determined. Experimental results show that the cost of ICAS is much lower than that of the existing method.
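
A minimal sketch (not the paper's ICAS implementation, and ignoring its full cost model) of the three-step structure the abstract describes: allocate tasks to cores, compute each core's ideal frequency, then fix one frequency per voltage island since cores in an island must share a setting. Task sizes, frequency levels, and the deadline below are made up for illustration.

```python
def schedule(task_cycles, islands, freqs, deadline):
    # Step 1 -- greedy allocation: give the next-largest task to the least-loaded core.
    cores = [c for island in islands for c in island]
    load = {c: 0.0 for c in cores}
    for cycles in sorted(task_cycles, reverse=True):
        c = min(load, key=load.get)
        load[c] += cycles

    # Step 2 -- ideal per-core frequency: the slowest discrete level that still
    # finishes the core's workload within the deadline.
    ideal = {}
    for c in cores:
        needed = load[c] / deadline                      # required cycles per second
        ideal[c] = min((f for f in freqs if f >= needed), default=max(freqs))

    # Step 3 -- island frequency: cores in a voltage island share one setting,
    # so each island runs at the maximum ideal frequency of its cores.
    island_freqs = [max(ideal[c] for c in island) for island in islands]
    return load, island_freqs

# Toy run: six tasks (cycle counts), two 2-core islands, three frequency levels (Hz), 1 s deadline.
print(schedule([2e9, 1e9, 1.5e9, 0.5e9, 1e9, 2e9],
               [[0, 1], [2, 3]], [1.0e9, 2.0e9, 3.0e9], 1.0))
```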

A Study on the Factors Influencing the Performance of FinTech Platform (핀테크 플랫폼의 성과에 영향을 미치는 요인 연구)

  • Xian, Feng Si;Um, Hyemi
    • Journal of Information Technology Applications and Management
    • /
    • v.28 no.2
    • /
    • pp.1-16
    • /
    • 2021
  • In recent years, as IT technologies such as cloud computing and mobile payment have evolved and the number of Internet users has increased, the Internet financial market has become intelligent, mobile, and platform-based. This study examines the impact of platform system characteristics and users' psychological characteristics on the performance of fintech platforms. The results of this study are as follows. Information quality affected trust and commitment, service quality affected commitment only, and system quality affected trust and commitment. Perceived risk affected trust and commitment, while perceived benefit affected only trust and had an insignificant relationship with commitment. Trust was shown to have a significant relationship with commitment, and both trust and commitment affected performance. In the validation of mediation effects, trust showed a partial mediating effect between information quality, system quality, perceived risk, and perceived benefit on the one hand and performance on the other; there was no mediation effect between service quality and performance. Commitment showed a partial mediating effect between information quality, service quality, system quality, and perceived risk and performance, and no mediating effect between perceived benefit and performance. This study identifies the main factors that affect the performance of fintech platforms and can serve as a useful foundation for increasing platform performance in the future.
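
The mediation findings rest on testing indirect effects. The sketch below is not the study's analysis; it shows one common way such a test is run, a bootstrap confidence interval for the indirect effect a*b, on synthetic data. Variable names and effect sizes are assumptions.

```python
import numpy as np

def indirect_effect(x, m, y):
    a = np.polyfit(x, m, 1)[0]                     # effect of X on the mediator M
    X = np.column_stack([np.ones_like(x), x, m])
    b = np.linalg.lstsq(X, y, rcond=None)[0][2]    # effect of M on Y, controlling for X
    return a * b

def bootstrap_ci(x, m, y, n_boot=2000, seed=0):
    rng = np.random.default_rng(seed)
    n = len(x)
    effects = []
    for _ in range(n_boot):
        idx = rng.integers(0, n, n)                # resample cases with replacement
        effects.append(indirect_effect(x[idx], m[idx], y[idx]))
    return np.percentile(effects, [2.5, 97.5])     # mediation is supported if the CI excludes 0

# Synthetic data with a genuine X -> M -> Y path plus a direct X -> Y path (partial mediation).
rng = np.random.default_rng(1)
x = rng.normal(size=200)
m = 0.6 * x + rng.normal(scale=0.5, size=200)
y = 0.5 * m + 0.3 * x + rng.normal(scale=0.5, size=200)
print(bootstrap_ci(x, m, y))
```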

On the Scaling of Drone Imagery Platform Methodology Based on Container Technology

  • Phitchawat Lukkanathiti;Chantana Chantrapornchai
    • Journal of Information Processing Systems
    • /
    • v.20 no.4
    • /
    • pp.442-457
    • /
    • 2024
  • We studied the scaling issues of an open-source drone imagery platform called WebODM. Processing drone images is known to be resource-intensive because of the many preprocessing and post-processing steps involved, such as image loading, orthophoto generation, georeferencing, texturing, and meshing. By default, WebODM allocates one node for processing. We explored methods to expand the platform's capability to handle many processing requests, which should be beneficial to platform designers. Our primary objective was to enhance WebODM's performance to support concurrent users through the use of container technology. We modified the original process to scale tasks vertically and horizontally using a Kubernetes cluster. The scaling approaches proved effective in handling more concurrent users. The response time per active thread and the number of responses per second were measured. Compared to the original WebODM, our modified version sometimes had a response time up to 1.9% longer. Nonetheless, processing throughput improved by up to 101% over the original WebODM, with some differences in the drone image processing results. Finally, we discuss integration with infrastructure as code to automate the scaling.
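
A minimal sketch, not the paper's modified process, of horizontal scaling for WebODM's processing nodes on Kubernetes: the replica count of the processing Deployment follows the task queue. The deployment name, namespace, per-node capacity, and bounds are illustrative assumptions; in practice the same effect could come from a HorizontalPodAutoscaler or, as the paper suggests, infrastructure-as-code tooling.

```python
import math
import subprocess

DEPLOYMENT = "nodeodm"         # hypothetical name of the processing-node Deployment
NAMESPACE = "webodm"           # hypothetical namespace
TASKS_PER_NODE = 2             # assumed capacity of one processing node
MIN_REPLICAS, MAX_REPLICAS = 1, 8

def desired_replicas(queued_tasks: int) -> int:
    """Scale out with the processing queue, bounded by the cluster's limits."""
    wanted = math.ceil(queued_tasks / TASKS_PER_NODE)
    return max(MIN_REPLICAS, min(MAX_REPLICAS, wanted))

def scale(queued_tasks: int) -> None:
    n = desired_replicas(queued_tasks)
    # Patch the Deployment's replica count; an autoscaler could do the same automatically.
    subprocess.run(["kubectl", "scale", f"deployment/{DEPLOYMENT}",
                    f"--replicas={n}", "-n", NAMESPACE], check=True)

if __name__ == "__main__":
    scale(queued_tasks=7)      # requests ceil(7 / 2) = 4 replicas
```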

3-D Hetero-Integration Technologies for Multifunctional Convergence Systems

  • Lee, Kang-Wook
    • Journal of the Microelectronics and Packaging Society
    • /
    • v.22 no.2
    • /
    • pp.11-19
    • /
    • 2015
  • As CMOS device scaling has stalled, three-dimensional (3-D) integration allows Moore's law to be extended to higher density, higher functionality, higher performance, and more diverse materials and devices at lower cost. 3-D integration has many benefits, such as increased multi-functionality, performance, and data bandwidth, reduced power, a small form factor, and reduced packaging volume, because it vertically stacks multiple materials, technologies, and functional components such as processor, memory, sensors, logic, analog, and power ICs into one stacked chip. Anticipated applications start with memory, handheld devices, and high-performance computers, and extend especially to multifunctional convergence systems such as cloud networking for the Internet of Things, exascale computing for big-data servers, electrical vehicle systems for future automobiles, radioactivity safety systems, energy harvesting systems, and wireless implantable medical systems, enabled by flexible heterogeneous integration involving CMOS, MEMS, sensors, and photonic circuits. However, heterogeneous integration of different functional devices faces many technical challenges because the devices vary in size, thickness, and substrate, having been fabricated by different technologies. This paper describes new 3-D heterogeneous integration technologies developed by Tohoku University for multifunctional convergence systems: chip self-assembly stacking, 3-D heterogeneous opto-electronics integration, and backside TSV fabrication. The paper introduces a high-speed-sensing, highly parallel image sensor system, fabricated with these 3-D heterogeneous integration technologies, comprising a 3-D stacked image sensor with extremely fast signal sensing and processing and a 3-D stacked microprocessor with self-test and self-repair functions for autonomous driving assistance.

Next-generation Sequencing for Environmental Biology - Full-fledged Environmental Genomics around the Corner (차세대 유전체 기술과 환경생물학 - 환경유전체학 시대를 맞이하여)

  • Song, Ju Yeon;Kim, Byung Kwon;Kwon, Soon-Kyeong;Kwak, Min-Jung;Kim, Jihyun F.
    • Korean Journal of Environmental Biology
    • /
    • v.30 no.2
    • /
    • pp.77-89
    • /
    • 2012
  • With the advent of the genomics era powered by DNA sequencing technologies, life science is being transformed significantly, and biological research and development have been accelerated. Environmental biology concerns the relationships among living organisms and their natural environment, which constitute the global biogeochemical cycle. As the sustainability of ecosystems depends on biodiversity, examining the structure and dynamics of the biotic constituents and fully grasping their genetic and metabolic capabilities are pivotal. High-speed, high-throughput next-generation sequencing can be applied to barcoding organisms, whether thriving or endangered, and to decoding their whole-genome information. Furthermore, the diversity and full gene complement of a microbial community can be elucidated and monitored through metagenomic approaches. With regard to human welfare, the microbiomes of various human habitats, such as the gut, skin, mouth, stomach, and vagina, have been and are being scrutinized. To keep pace with the rapid increase in sequencing capacity, various bioinformatic algorithms and software tools, some of which utilize supercomputers and cloud computing, are being developed for the processing and storage of massive data sets. Environmental genomics will be the major force in understanding the structure and function of ecosystems in nature, as well as in preserving, remediating, and bioprospecting them.

Big Data Meets Telcos: A Proactive Caching Perspective

  • Bastug, Ejder;Bennis, Mehdi;Zeydan, Engin;Kader, Manhal Abdel;Karatepe, Ilyas Alper;Er, Ahmet Salih;Debbah, Merouane
    • Journal of Communications and Networks
    • /
    • v.17 no.6
    • /
    • pp.549-557
    • /
    • 2015
  • Mobile cellular networks are becoming increasingly complex to manage, while classical deployment/optimization techniques and current solutions (i.e., cell densification, acquiring more spectrum, etc.) are cost-ineffective and thus seen as stopgaps. This calls for the development of novel approaches that leverage recent advances in storage/memory, context-awareness, and edge/cloud computing, and fall into the framework of big data. However, big data is itself another complex phenomenon to handle and comes with its notorious 4Vs: velocity, veracity, volume, and variety. In this work, we address these issues in the optimization of 5G wireless networks via the notion of proactive caching at the base stations. In particular, we investigate the gains of proactive caching in terms of backhaul offloading and request satisfaction, while tackling the large amount of available data for content popularity estimation. In order to estimate content popularity, we first collect users' mobile traffic data from several base stations of a Turkish telecom operator over intervals of hours. Then, an analysis is carried out locally on a big data platform, and the gains of proactive caching at the base stations are investigated via numerical simulations. It turns out that several gains are possible depending on the level of available information and the storage size. For instance, with 10% of content ratings and 15.4 Gbytes of storage (87% of the total catalog size), proactive caching achieves 100% request satisfaction and offloads 98% of the backhaul when considering 16 base stations.
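
A toy sketch, not the paper's big-data pipeline, of the caching idea: estimate content popularity from observed requests, fill the base-station cache greedily within the storage budget, and report the request-satisfaction and backhaul-offloading ratios the abstract quotes. The contents, sizes, and request trace are invented.

```python
from collections import Counter

def proactive_cache(requests, sizes, storage_budget):
    popularity = Counter(requests)                       # estimated content popularity
    cache, used = set(), 0.0
    for content, _ in popularity.most_common():          # most popular first
        if used + sizes[content] <= storage_budget:
            cache.add(content)
            used += sizes[content]
    satisfaction = sum(r in cache for r in requests) / len(requests)
    offloaded = sum(sizes[r] for r in requests if r in cache)
    total = sum(sizes[r] for r in requests)
    return cache, satisfaction, offloaded / total        # cache, request satisfaction, backhaul offloading

# Toy trace: three contents with skewed popularity and a cache that fits two of them.
requests = ["a"] * 6 + ["b"] * 3 + ["c"]
sizes = {"a": 1.0, "b": 1.0, "c": 1.0}
print(proactive_cache(requests, sizes, storage_budget=2.0))
```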

Methods to Apply GoF Design Patterns in Service-Oriented Computing (서비스 지향 컴퓨팅을 위한 GoF 디자인 패턴 적용 기법)

  • Kim, Moon-Kwon;La, Hyun-Jung;Kim, Soo-Dong
    • The KIPS Transactions:PartD
    • /
    • v.19D no.2
    • /
    • pp.187-202
    • /
    • 2012
  • As a representative reuse paradigm, Service-Oriented Computing (SOC) is largely centered on publishing and subscribing to reusable services; here, SOC is used as a term encompassing service-oriented architecture and cloud computing. Service providers can produce high profits with reusable services, and service consumers can develop their applications with less time and effort by reusing those services. Design Patterns (DPs) are a set of reusable methods for resolving commonly occurring design problems and providing design structures that deal with those problems while following the open/closed principle. However, since DPs were mainly proposed for building object-oriented systems and there are clear differences between the object-oriented paradigm and SOC, it is challenging to apply DPs to SOC design problems. Hence, DPs need to be customized with two aspects in mind: enabling service providers to design services that are highly reusable and reflect their unique characteristics, and enabling service consumers to develop their target applications by reusing and customizing services as quickly as possible. Therefore, we propose a set of DPs customized to SOC. With the proposed DPs, we believe that service providers can effectively develop highly reusable services and service consumers can efficiently adapt services for their applications.
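
As an illustration of the kind of customization the paper argues for (not one of its actual patterns), the sketch below adapts the GoF Proxy to a service-oriented setting: the consumer programs against the published service interface, and a proxy adds a remote-invocation concern (a tiny cache) without changing consumer code. All class and method names are hypothetical.

```python
from abc import ABC, abstractmethod

class QuoteService(ABC):
    """The reusable service interface published by the provider."""
    @abstractmethod
    def get_quote(self, symbol: str) -> float: ...

class RemoteQuoteService(QuoteService):
    """Stands in for the real remote service endpoint."""
    def get_quote(self, symbol: str) -> float:
        print(f"remote call for {symbol}")       # a network call in a real system
        return 42.0

class CachingServiceProxy(QuoteService):
    """GoF Proxy: same interface, with an added service-side concern (caching)."""
    def __init__(self, target: QuoteService):
        self._target = target
        self._cache: dict[str, float] = {}

    def get_quote(self, symbol: str) -> float:
        if symbol not in self._cache:
            self._cache[symbol] = self._target.get_quote(symbol)
        return self._cache[symbol]

# The consumer depends only on QuoteService, so the proxy can be swapped in
# without changing application code, in line with the open/closed principle the paper cites.
service: QuoteService = CachingServiceProxy(RemoteQuoteService())
service.get_quote("ACME")   # remote call
service.get_quote("ACME")   # served from the proxy's cache
```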

A Novel Compressed Sensing Technique for Traffic Matrix Estimation of Software Defined Cloud Networks

  • Qazi, Sameer;Atif, Syed Muhammad;Kadri, Muhammad Bilal
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.12 no.10
    • /
    • pp.4678-4702
    • /
    • 2018
  • Traffic matrix estimation has long attracted the attention of researchers for better network management and future planning. With the high traffic loads brought by cloud computing platforms and software-defined-networking-based tunable routing and traffic management algorithms on the Internet, it is more necessary than ever to be able to predict current and future traffic volumes on the network. For large networks, this origin-destination traffic prediction problem takes the form of a large under-constrained and under-determined system of equations with a dynamic measurement matrix. Previously, researchers relied on the assumption that the measurement (routing) matrix is stationary, which makes those schemes unsuitable for modern software-defined networks. In this work, we present our Compressed Sensing with Dynamic Model Estimation (CS-DME) architecture suitable for modern software-defined networks. Our main contributions are: (1) we formulate an approach in which the measurement matrix in the compressed sensing scheme can be accurately and dynamically estimated through a reformulation of the problem based on traffic demands. (2) By inspecting its eigenspectrum on two real-world datasets, we show that a formulation using a dynamic measurement matrix based on instantaneous traffic demands may be used instead of a stationary binary routing matrix and is better suited to modern software-defined networks whose routing constantly evolves. (3) We also show that dynamically linking this compressed measurement matrix with the measured parameters leads to acceptable estimation of origin-destination (OD) traffic flows, with results only marginally worse than those of other state-of-the-art schemes that rely on fixed measurement matrices. (4) Furthermore, using this compressed reformulation, we present a new strategy for selecting vantage points for the most efficient traffic matrix estimation, through a secondary compression technique based on a subset of link measurements. Experimental evaluation of the proposed technique on the real-world Abilene and GEANT datasets shows that it is practical for use in modern software-defined networks. Further, the performance of the scheme is compared with recent state-of-the-art techniques proposed in the research literature.
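
A simplified sketch, not CS-DME itself: traffic matrix estimation posed as the under-determined system y = A x (link loads = routing matrix times OD flows), solved with plain iterative soft-thresholding (ISTA) to favour sparse OD vectors. The routing matrix here is random and static purely for illustration, whereas the paper's contribution is estimating the measurement matrix dynamically from traffic demands.

```python
import numpy as np

def ista(A, y, lam=0.1, n_iter=500):
    """Minimise 0.5 * ||A x - y||^2 + lam * ||x||_1 subject to x >= 0."""
    L = np.linalg.norm(A, 2) ** 2                  # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        grad = A.T @ (A @ x - y)
        x = np.maximum(x - (grad + lam) / L, 0.0)  # gradient step plus soft threshold
    return x

rng = np.random.default_rng(0)
n_links, n_od = 30, 100                            # far fewer measurements than unknowns
A = (rng.random((n_links, n_od)) < 0.2).astype(float)                   # toy 0/1 routing matrix
x_true = np.zeros(n_od)
x_true[rng.choice(n_od, 8, replace=False)] = rng.uniform(10, 100, 8)    # sparse OD flows
y = A @ x_true
x_hat = ista(A, y)
print("relative error:", np.linalg.norm(x_hat - x_true) / np.linalg.norm(x_true))
```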

Edge-Centric Metamorphic IoT Device Platform for Efficient On-Demand Hardware Replacement in Large-Scale IoT Applications (대규모 IoT 응용에 효과적인 주문형 하드웨어의 재구성을 위한 엣지 기반 변성적 IoT 디바이스 플랫폼)

  • Moon, Hyeongyun;Park, Daejin
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.24 no.12
    • /
    • pp.1688-1696
    • /
    • 2020
  • The paradigm of Internet of Things (IoT) systems is changing from cloud-based to edge-based systems to avoid the delays caused by network congestion and server overload and the security issues of data transmission. However, edge-based IoT systems have fatal weaknesses, such as a lack of performance and flexibility, due to various limitations. To improve performance, application-specific hardware can be implemented in the edge device, but because its function is fixed, performance gains are limited to those specific applications. This paper introduces an edge-centric metamorphic IoT (mIoT) platform that can use a variety of hardware through on-demand partial reconfiguration despite the limited hardware resources of the edge device, thereby increasing the device's performance and flexibility. According to the experimental results, the edge-centric mIoT platform, which executes the reconfiguration algorithm at the edge, reduced the number of server accesses by up to 82.2% compared to previous studies in which the reconfiguration algorithm was executed on the server.
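
A hypothetical sketch of the edge-side decision that drives the reported drop in server accesses: the edge keeps a small local store of partial bitstreams and contacts the server only when a requested hardware function is missing. The function names, slot count, and LRU policy are assumptions, not the paper's algorithm.

```python
class EdgeReconfigurator:
    def __init__(self, local_slots=2):
        self.local_slots = local_slots      # partial-reconfiguration bitstreams kept at the edge
        self.local_store = []               # cached bitstreams, least recently used first
        self.server_accesses = 0

    def run_task(self, hw_function: str) -> None:
        if hw_function in self.local_store:
            self.local_store.remove(hw_function)        # refresh LRU position
        else:
            self.server_accesses += 1                   # fetch the bitstream from the server
            if len(self.local_store) >= self.local_slots:
                self.local_store.pop(0)                 # evict the least recently used bitstream
        self.local_store.append(hw_function)
        # ...partial reconfiguration of the FPGA region and task execution would happen here...

edge = EdgeReconfigurator()
tasks = ["fft", "aes", "fft", "fft", "aes", "cnn", "fft"]
for task in tasks:
    edge.run_task(task)
print("server accesses:", edge.server_accesses, "of", len(tasks), "tasks")
```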

An Enhancement of The Enterprise Security for Access Control based on Zero Trust (제로 트러스트 기반 접근제어를 위한 기업 보안 강화 연구)

  • Lee, Seon-A;Kim, Beomseok;Lee, Hyein;Park, Wonhyung
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.26 no.2
    • /
    • pp.265-270
    • /
    • 2022
  • With the advent of the Fourth Industrial Revolution, the paradigm of finance is also changing. As remote work has become more common due to cloud computing and the coronavirus pandemic, the work environment has changed and attack techniques have become more intelligent and advanced, so companies should adopt new security models to further strengthen their current security systems. Zero trust security, whose core concept is to doubt and trust nothing by default, increases security by monitoring all networks, enforcing strict authentication, and granting access requesters only minimal access rights. In addition, we explain the use of NAC and EDR to identify subjects and data and thereby strengthen the access control of a zero-trust-based security system, together with strict identity authentication through MFA. Therefore, this paper introduces a zero trust security solution that strengthens existing security systems and presents the direction and rationale for introducing it in the financial sector.
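
A minimal sketch, not the paper's solution, of a deny-by-default zero trust check combining the controls the abstract names: MFA for identity, an NAC/EDR-style device posture signal, and least-privilege resource scoping. All fields, resources, and roles are illustrative.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user: str
    mfa_passed: bool          # strict identity authentication (MFA)
    device_compliant: bool    # NAC/EDR-style posture check on the requesting device
    resource: str
    roles: frozenset

# Least-privilege mapping: each resource lists the only roles allowed to touch it.
POLICY = {
    "payments-db": {"payments-admin"},
    "hr-portal": {"hr-staff"},
}

def authorize(req: AccessRequest) -> bool:
    """Every request is re-verified; anything not explicitly allowed is denied."""
    if not req.mfa_passed or not req.device_compliant:
        return False
    allowed_roles = POLICY.get(req.resource, set())
    return bool(allowed_roles & req.roles)

print(authorize(AccessRequest("alice", True, True, "payments-db",
                              frozenset({"payments-admin"}))))   # True
print(authorize(AccessRequest("bob", True, False, "payments-db",
                              frozenset({"payments-admin"}))))   # False: device not compliant
```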