• Title/Summary/Keyword: Computing time

Task Scheduling on Cloudlet in Mobile Cloud Computing with Load Balancing

  • Poonam;Suman Sangwan
    • International Journal of Computer Science & Network Security
    • /
    • v.23 no.10
    • /
    • pp.73-80
    • /
    • 2023
  • The recent growth in the use of mobile devices has contributed to increased computing and storage requirements. Cloud computing has been used over the past decade to cater to computational and storage needs over the internet. However, the use of mobile applications such as Augmented Reality (AR), M2M communications, V2X communications, and the Internet of Things (IoT) led to the emergence of mobile cloud computing (MCC). Data from mobile devices is offloaded to and computed on the cloud, removing the limitations associated with mobile devices. However, the delays induced by the location of data centers led to the birth of edge computing technologies. In this paper, we discuss one of these edge computing technologies, the cloudlet. A cloudlet brings the cloud close to the end user, reducing delay and response time. An algorithm is proposed for scheduling tasks on a cloudlet by taking the load of each VM into account. Simulation results indicate that the proposed algorithm provides 12% and 29% improvements over EMACS and QRR, respectively, while balancing the load.
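
As a rough editorial illustration of the kind of load-aware VM selection the abstract describes (not the authors' actual algorithm), here is a minimal Python sketch; the VM capacities and task lengths are hypothetical.

```python
# Illustrative sketch only: a generic "assign to least-loaded VM" heuristic,
# not the algorithm proposed in the paper. Capacities/task lengths are made up.
from dataclasses import dataclass, field

@dataclass
class VM:
    vm_id: int
    mips: float                      # processing capacity
    assigned: list = field(default_factory=list)

    def load(self) -> float:
        # Estimated load = total assigned work normalized by capacity.
        return sum(self.assigned) / self.mips

def schedule(tasks, vms):
    """Assign each task to the VM with the lowest current load estimate."""
    for length in tasks:
        target = min(vms, key=lambda vm: vm.load())
        target.assigned.append(length)
    return {vm.vm_id: round(vm.load(), 2) for vm in vms}

if __name__ == "__main__":
    vms = [VM(0, 1000), VM(1, 2000), VM(2, 1500)]
    tasks = [400, 900, 300, 1200, 700, 500]
    print(schedule(vms=vms, tasks=tasks))   # {0: 1.1, 1: 0.7, 2: 1.0}
```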

MS Load Balancing Algorithm in Cloud Computing

  • Ankita Gupta;Ranu Lal Chouhan
    • International Journal of Computer Science & Network Security
    • /
    • v.24 no.9
    • /
    • pp.157-161
    • /
    • 2024
  • Cloud computing has become an important technology for distributed and parallel computing. It allows resources, software packages, information, storage, and many different applications to be shared on user demand at any time and in any place, and it provides extensive capacity for computing and storage. Services are offered to users on a pay-as-you-go model. Despite these facilities, several problems remain, including resource discovery, fault tolerance, load balancing, and security; of these, load balancing is the main challenge. Many techniques are used to distribute the workload, or tasks, equally across servers. This paper covers cloud computing, cloud computing architecture, virtualization, and the MS load balancing technique, which provides enhanced load balancing.
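
The MS technique itself is not detailed in the abstract; as a generic illustration of distributing tasks evenly across servers, a simple round-robin dispatcher might look like the following (hypothetical task and server names).

```python
# Generic illustration of spreading tasks evenly across servers (round-robin);
# this is NOT a description of the paper's MS load balancing technique.
from itertools import cycle
from collections import defaultdict

def round_robin(tasks, servers):
    """Hand out tasks to servers in strict rotation."""
    assignment = defaultdict(list)
    for task, server in zip(tasks, cycle(servers)):
        assignment[server].append(task)
    return dict(assignment)

print(round_robin(["t1", "t2", "t3", "t4", "t5"], ["s1", "s2", "s3"]))
# {'s1': ['t1', 't4'], 's2': ['t2', 't5'], 's3': ['t3']}
```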

A Non-Scan DFT Scheme for RTL Circuit Datapath (RTL 회로의 데이터패스를 위한 비주사 DFT 기법)

  • Chang, Hoon;Yang, Sun-Woong;Park, Jae-Heung;Kim, Moon-Joon;Shim, Jae-Hun
    • Journal of the Institute of Electronics Engineers of Korea SD
    • /
    • v.41 no.2
    • /
    • pp.55-65
    • /
    • 2004
  • In this paper, an efficient non-scan DFT method for datapaths described at the RT level is proposed. The proposed non-scan DFT method improves the testability of datapaths based on hierarchical testability analysis, regardless of the datapath width. It guarantees higher fault efficiency and faster test pattern generation than previous methods, with little hardware overhead. The experimental results show the superiority of the proposed method over the scan method in terms of test pattern generation time, test application time, and area overhead.

Deep Learning-Based Dynamic Scheduling with Multi-Agents Supporting Scalability in Edge Computing Environments (멀티 에이전트 에지 컴퓨팅 환경에서 확장성을 지원하는 딥러닝 기반 동적 스케줄링)

  • JongBeom Lim
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.12 no.9
    • /
    • pp.399-406
    • /
    • 2023
  • Cloud computing has evolved to support an edge computing architecture that combines a fog management layer with edge servers. The main reason it has received much attention is the low communication latency it offers for real-time IoT applications. At the same time, various cloud task scheduling techniques based on artificial intelligence have been proposed. AI-based cloud task scheduling techniques perform better than existing methods but incur relatively high scheduling time. In this paper, we propose deep learning-based dynamic scheduling with multiple agents that supports scalability in edge computing environments. The proposed method achieves lower scheduling time than previous AI-based scheduling techniques. To show its effectiveness, we compare the performance of the previous and proposed methods in a scalable experimental environment. The results show that our method supports real-time IoT applications with low scheduling time and performs better in terms of the number of completed cloud tasks in a scalable experimental environment.
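
To give a sense of what policy-based placement looks like in code, here is a hypothetical sketch in which a learned scoring function ranks edge servers per task; the fixed weights and feature names are stand-ins, and the paper's trained deep network and multi-agent mechanism are not reproduced.

```python
# Hypothetical illustration of policy-based placement: a learned scoring
# function ranks edge servers for each task. The weights below are stand-ins
# for a trained model, not the paper's network.
WEIGHTS = [-0.8, -0.05, 1.2]   # weights for [queue_length, latency_ms, cpu_free]

def score(features):
    """Higher score = more attractive placement (simple linear policy)."""
    return sum(w * f for w, f in zip(WEIGHTS, features))

def place(task_id, servers):
    """Choose the server whose feature vector maximizes the policy score."""
    best = max(servers, key=lambda name: score(servers[name]))
    return task_id, best

servers = {
    "edge-1": [4, 12.0, 0.3],   # queue length, latency (ms), free CPU fraction
    "edge-2": [1, 30.0, 0.7],
    "edge-3": [2, 8.0, 0.5],
}
print(place("task-42", servers))   # ('task-42', 'edge-3')
```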

Design of A new Algorithm by Using Standard Deviation Techniques in Multi Edge Computing with IoT Application

  • HASNAIN A. ALMASHHADANI;XIAOHENG DENG;OSAMAH R. AL-HWAIDI;SARMAD T. ABDUL-SAMAD;MOHAMMED M. IBRAHM;SUHAIB N. ABDUL LATIF
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.17 no.4
    • /
    • pp.1147-1161
    • /
    • 2023
  • The Internet of Things (IoT) requires a new processing model that allows scalability in cloud computing while reducing the time delay caused by data transmission within a network. Such a model can be achieved by using resources closer to the user, i.e., by relying on edge computing (EC). The amount of IoT data also grows with the number of IoT devices. However, building such a flexible model within a heterogeneous environment is difficult in terms of resources. Moreover, the increasing demand for IoT services necessitates shortening delay and response time by achieving effective load balancing. IoT devices are expected to generate huge amounts of data within short periods; they will be dynamically deployed, and IoT services will be placed on EC devices or cloud servers to minimize resource costs while meeting the latency and quality of service (QoS) constraints of IoT applications. EC is an emerging solution to the data processing problem in IoT. In this study, we improve the load balancing process and distribute resources fairly to tasks, which in turn improves QoS in the cloud and reduces processing time and, consequently, response time.
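
Since the title names a standard-deviation technique, here is a minimal sketch of using the standard deviation of node loads as a balance criterion, shifting work from the most to the least loaded node until the deviation drops below a threshold; the threshold, step size, and load values are assumptions, not the paper's parameters.

```python
# Illustrative sketch, not the paper's exact algorithm: treat the standard
# deviation of node loads as the imbalance measure and rebalance until it
# falls below a threshold.
import statistics

def balance(loads, threshold=5.0, step=1.0, max_moves=1000):
    """Shift `step` load units from the max to the min node while stdev > threshold."""
    loads = list(loads)
    for _ in range(max_moves):
        if statistics.pstdev(loads) <= threshold:
            break
        hi = loads.index(max(loads))
        lo = loads.index(min(loads))
        loads[hi] -= step
        loads[lo] += step
    return loads, statistics.pstdev(loads)

print(balance([90, 40, 55, 20]))   # loads converge toward the mean
```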

A Concurrency Control Method using Optimistic Control in Mobile Computing DB Environment (모바일 컴퓨팅 데이터베이스 환경에서의 낙관적 제어기법을 이용한 동시성제어기법)

  • Cho Sung-Je
    • Journal of the Korea Society of Computer and Information
    • /
    • v.11 no.2 s.40
    • /
    • pp.131-143
    • /
    • 2006
  • The rapid growth of mobile communication technology has driven the expansion of mobile internet services, and mobile real-time transactions in particular carry much weight among mobile applications. Current mobile transaction services have serious problems that hinder their development, such as low bandwidth, handover, expensive charging systems, and poor response time, yet there is an increasing demand for various mobile applications that process transactions in mobile computing environments. A mobile host computing system therefore demands a new concurrency control method that uses bandwidth efficiently and improves bottlenecks and transaction response time. This study proposes an efficient concurrency control method for a mobile computing environment. Existing approaches use two-phase locking, in which many clients cannot use the same segment simultaneously, so useless waiting time increases. Unlike existing methods, the proposed method allows transactions that access different data within the same segment to proceed, minimizing useless waiting time and improving system concurrency. The algorithm of the proposed concurrency control method is also presented.
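
As an editorial illustration of the general idea (optimistic validation at data-item rather than segment granularity), the sketch below lets two transactions that touch different items in the same segment both commit; it is a generic optimistic scheme, not the paper's exact protocol.

```python
# Simplified optimistic concurrency control: a transaction commits only if no
# committed transaction changed an item it read. Generic sketch, not the
# paper's protocol; item names are hypothetical.

class OptimisticTxn:
    committed_versions = {}          # item -> version, shared "database" state

    def __init__(self):
        self.read_set = {}           # item -> version seen at read time
        self.write_set = {}

    def read(self, item):
        version = OptimisticTxn.committed_versions.get(item, 0)
        self.read_set[item] = version
        return version

    def write(self, item, value):
        self.write_set[item] = value

    def commit(self):
        # Validation: abort if any read item changed since it was read.
        for item, seen in self.read_set.items():
            if OptimisticTxn.committed_versions.get(item, 0) != seen:
                return False
        for item in self.write_set:
            OptimisticTxn.committed_versions[item] = \
                OptimisticTxn.committed_versions.get(item, 0) + 1
        return True

# Two transactions touching *different* items in the same segment both commit.
t1, t2 = OptimisticTxn(), OptimisticTxn()
t1.read("seg1:a"); t1.write("seg1:a", 10)
t2.read("seg1:b"); t2.write("seg1:b", 20)
print(t1.commit(), t2.commit())     # True True
```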

EXECUTION TIME AND POWER CONSUMPTION OPTIMIZATION in FOG COMPUTING ENVIRONMENT

  • Alghamdi, Anwar;Alzahrani, Ahmed;Thayananthan, Vijey
    • International Journal of Computer Science & Network Security
    • /
    • v.21 no.1
    • /
    • pp.137-142
    • /
    • 2021
  • The Internet of Things (IoT) paradigm is at the forefront of present and future research activities. The huge amount of sensing data from IoT devices that needs to be processed is increasing dramatically in volume, variety, and velocity. In response, cloud computing has been used to handle the challenges of collecting, storing, and processing these jobs. Fog computing is a model used to support cloud computing by performing pre-processing jobs close to the end user, realizing low latency, lower power consumption on the cloud side, and high scalability. However, some resources in a fog network may not be suitable for certain kinds of jobs, or the number of requests may exceed capacity, so it is more efficient to reduce the number of jobs sent to the cloud; when other fog resources are idle, it is better to federate them rather than forward jobs to the cloud server. This issue affects the performance of the fog environment when dealing with big data applications or applications that are sensitive to processing time. This research builds a fog topology job scheduling (FTJS) scheme to schedule the incoming jobs generated by IoT devices and to discover all available fog nodes with their capabilities. A fog topology job placement algorithm is also introduced to deploy jobs onto appropriate resources in the network effectively. Finally, compared with the first come, first served (FCFS) scheduling technique, the overall execution time is reduced significantly, by approximately 20%, and the energy consumption on the cloud side is reduced by 18%.
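
To illustrate why capability-aware placement can beat a naive dispatch on heterogeneous fog nodes, here is a toy comparison; the job lengths, node speeds, and both placement rules are assumptions (the "naive" rule is only a rough stand-in for FCFS, and the "aware" rule is not the FTJS algorithm itself).

```python
# Toy comparison: naive "send to the next free node" dispatch vs. an
# earliest-finish-time rule on heterogeneous fog nodes. All numbers are made up.

def makespan(jobs, speeds, pick):
    """Dispatch jobs in arrival order; return the time the last node finishes."""
    busy_until = [0.0] * len(speeds)
    for length in jobs:
        i = pick(busy_until, speeds, length)
        busy_until[i] += length / speeds[i]
    return max(busy_until)

naive = lambda busy, speeds, length: busy.index(min(busy))          # ignores speed
aware = lambda busy, speeds, length: min(                           # earliest finish
    range(len(speeds)), key=lambda i: busy[i] + length / speeds[i])

jobs = [8, 3, 10, 2, 7, 5]
speeds = [1.0, 2.0, 4.0]            # heterogeneous node capabilities
print(makespan(jobs, speeds, naive), makespan(jobs, speeds, aware))  # 8.0 5.75
```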

A CMOS Analog Front End for a WPAN Zero-IF Receiver

  • Moon, Yeon-Kug;Seo, Hae-Moon;Park, Yong-Kuk;Won, Kwang-Ho;Lim, Seung-Ok;Kang, Jeong-Hoon;Park, Young-Choong;Yoon, Myung-Hyun;Yoo, June-Jae;Kim, Seong-Dong
    • Proceedings of the IEEK Conference
    • /
    • 2005.11a
    • /
    • pp.769-772
    • /
    • 2005
  • This paper describes a low-voltage and low-power channel selection analog front end with continuous-time low-pass filters and a highly linear programmable-gain amplifier (PGA). The filters were realized as balanced Gm-C biquadratic filters to achieve low current consumption. High linearity and a constant wide bandwidth are achieved by using a new transconductance (Gm) cell. The PGA has a voltage gain varying from 0 to 65 dB while maintaining a constant bandwidth. A filter tuning circuit that requires an accurate time base but no external components is presented. With a 1-Vrms differential input and output, the filter achieves -85 dB THD and a 78 dB signal-to-noise ratio. Both the filter and PGA were implemented in a 0.18-um 1P6M n-well CMOS process. They consume 3.2 mW from a 1.8 V power supply and occupy an area of 0.19 mm².

Design of Secure Log System in Cloud Computing Environment (클라우드 컴퓨팅 환경에서의 안전한 로그 시스템 설계)

  • Lee, Byung-Do;Shin, Sang Uk
    • Journal of Korea Multimedia Society
    • /
    • v.19 no.2
    • /
    • pp.300-307
    • /
    • 2016
  • Cloud computing, which provides elastic computing services, is more complex than existing computing systems. Accordingly, it has become increasingly important to maintain the stability and reliability of the computing system, and troubleshooting and real-time monitoring to address these challenges are essential. These goals require handling log data, a task that may be more difficult in a cloud computing environment than in a traditional logging system. In addition, there are further challenges in ensuring the admissibility of the collected log data in court. In this paper, we design a secure logging service that provides management and reliability of log data in a cloud computing environment, and we analyze the proposed system.
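
One common way to make collected logs tamper-evident (a property relevant to admissibility) is to hash-chain the entries; the sketch below shows that generic idea and is not necessarily the logging design proposed in the paper.

```python
# Generic hash-chained log: each record's hash covers the previous record's
# hash, so any modification or reordering breaks verification. Illustrative
# only; not the paper's specific design.
import hashlib, json, time

def append_entry(chain, message):
    """Append a log record whose hash covers the previous record's hash."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    record = {"ts": time.time(), "msg": message, "prev": prev_hash}
    record["hash"] = hashlib.sha256(
        json.dumps(record, sort_keys=True).encode()).hexdigest()
    chain.append(record)
    return record

def verify(chain):
    """Recompute every hash; a modified or reordered entry breaks the chain."""
    prev_hash = "0" * 64
    for record in chain:
        body = {k: record[k] for k in ("ts", "msg", "prev")}
        expected = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if record["prev"] != prev_hash or record["hash"] != expected:
            return False
        prev_hash = record["hash"]
    return True

log = []
append_entry(log, "vm-17 started")
append_entry(log, "admin login from 10.0.0.5")
print(verify(log))            # True
log[0]["msg"] = "tampered"
print(verify(log))            # False
```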

Web-Based KNHANES System in Cloud Computing

  • Park, Mi-Yeon;Park, Pil-Sook;Kim, Guk-Boh;Park, Jin-Yong;Jeong, Gu-Beom
    • Journal of Korea Multimedia Society
    • /
    • v.17 no.3
    • /
    • pp.353-363
    • /
    • 2014
  • Cloud computing is an internet-based technology that provides services in a virtualized IT environment and allows users to add or remove hardware or software resources at their discretion. Since cloud computing can construct virtually integrated environments out of multiple local computing environments, it can deliver a wide range of information services. In addition, state organizations strive to build cloud computing environments because of the reduced cost of introducing systems and the reduced time to build and provide IT services. This study suggests a web-based cloud computing system to be applied to the Korean National Health and Nutrition Examination Survey (KNHANES) by the Ministry of Health and Welfare, Republic of Korea.