• Title/Summary/Keyword: Edge Network

Search Results: 802

Frequency Modularized Deinterlacing Using Neural Network (신경회로망을 이용한 주파수 모듈화된 deinterlacing)

  • 우동헌;엄일규;김유신
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.28 no.12C
    • /
    • pp.1250-1257
    • /
    • 2003
  • Images are generally divided into two kinds of regions: edge regions and flat regions. Low-frequency components dominate in flat regions, whereas high-frequency components are important in edge regions, so a deinterlacing algorithm that considers the characteristics of each region can be more efficient. In this paper, an image is divided into edge and flat regions by its local variance, and a frequency-modularized neural network is assigned to each region. With this structure, each modularized network learns only its own region intensively and avoids the learning complexity caused by data from the other region. Using local AC data as the network input prevents the degradation of learning performance caused by the average intensity values of the image, which disturb effective learning. In simulations, the proposed method shows improved performance compared with previous algorithms.
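
A minimal sketch of the region split described above, assuming a grayscale field stored as a NumPy array; the window size and variance threshold are illustrative choices, not values from the paper. Pixels whose local variance exceeds the threshold are treated as edge region, the rest as flat region, and subtracting the local mean leaves the local AC data that would feed each modular network.

```python
import numpy as np
from scipy.ndimage import uniform_filter

def classify_regions(field, window=5, var_threshold=100.0):
    """Split a grayscale field into edge/flat regions by local variance.

    Returns a boolean mask: True where the local variance exceeds the
    threshold (edge region), False elsewhere (flat region)."""
    field = field.astype(np.float64)
    local_mean = uniform_filter(field, size=window)
    local_sq_mean = uniform_filter(field * field, size=window)
    local_var = local_sq_mean - local_mean ** 2
    return local_var > var_threshold

def local_ac(field, window=5):
    """Subtract the local mean (DC) so only AC data feeds each modular network."""
    field = field.astype(np.float64)
    return field - uniform_filter(field, size=window)
```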

Semantic Network Analysis of Online News and Social Media Text Related to Comprehensive Nursing Care Service (간호간병통합서비스 관련 온라인 기사 및 소셜미디어 빅데이터의 의미연결망 분석)

  • Kim, Minji;Choi, Mona;Youm, Yoosik
    • Journal of Korean Academy of Nursing
    • /
    • v.47 no.6
    • /
    • pp.806-816
    • /
    • 2017
  • Purpose: As comprehensive nursing care service has gradually expanded, it has become necessary to explore the various opinions about it. The purpose of this study is to explore the large amount of text data regarding comprehensive nursing care service extracted from online news and social media by applying semantic network analysis. Methods: The web pages of the Korean Nurses Association (KNA) News, major daily newspapers, and Twitter were crawled by searching for the keyword 'comprehensive nursing care service' using Python. A morphological analysis was performed using KoNLPy. Nodes in the 'comprehensive nursing care service' cluster were selected, and frequency, edge weight, and degree centrality were calculated and visualized with Gephi for the semantic network. Results: A total of 536 news pages and 464 tweets were analyzed. In the KNA News and major daily newspapers, 'nursing workforce' and 'nursing service' ranked highly in frequency, edge weight, and degree centrality. On Twitter, the most frequent nodes were 'National Health Insurance Service' and 'comprehensive nursing care service hospital.' The nodes with the highest edge weights were 'national health insurance,' 'wards without caregiver presence,' and 'caregiving costs,' and 'National Health Insurance Service' was highest in degree centrality. Conclusion: This study illustrates how unstructured big data can be used for a nursing issue: semantic network analysis of various media sources reveals the diverse perspectives surrounding the nursing community. Applying semantic network analysis to online big data on various nursing issues would help in exploring opinions for formulating and implementing nursing policies.
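
A minimal sketch of the co-occurrence step behind such a semantic network, assuming the morphological analysis (KoNLPy in the study) has already reduced each document to a list of nouns; the token lists below are illustrative placeholders. Node frequency, edge weight (co-occurrence count), and degree centrality are computed here with networkx rather than Gephi, which the authors used for visualization.

```python
from collections import Counter
from itertools import combinations

import networkx as nx

# Illustrative tokenized documents (the study extracted nouns with KoNLPy).
documents = [
    ["comprehensive_nursing_care_service", "nursing_workforce", "nursing_service"],
    ["comprehensive_nursing_care_service", "national_health_insurance", "caregiving_costs"],
    ["nursing_workforce", "nursing_service", "national_health_insurance"],
]

frequency = Counter(word for doc in documents for word in doc)

G = nx.Graph()
for doc in documents:
    for u, v in combinations(sorted(set(doc)), 2):  # co-occurrence within a document
        if G.has_edge(u, v):
            G[u][v]["weight"] += 1                  # edge weight = co-occurrence count
        else:
            G.add_edge(u, v, weight=1)

centrality = nx.degree_centrality(G)
for node in sorted(G.nodes, key=centrality.get, reverse=True):
    print(node, frequency[node], centrality[node])
```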

Performance analysis of local exit for distributed deep neural networks over cloud and edge computing

  • Lee, Changsik;Hong, Seungwoo;Hong, Sungback;Kim, Taeyeon
    • ETRI Journal
    • /
    • v.42 no.5
    • /
    • pp.658-668
    • /
    • 2020
  • In edge computing, most procedures, including data collection, data processing, and service provision, are handled at edge nodes rather than in the central cloud. This decreases the processing burden on the central cloud, enabling fast responses to end-device service requests as well as reducing bandwidth consumption. However, edge nodes have limited computing, storage, and energy resources for computation-intensive tasks such as deep neural network (DNN) inference. In this study, we analyze the effect of models with single and multiple local exits on DNN inference in an edge-computing environment. Our test results show that, at every exit point, a single-exit model performs better than a multi-exit model with respect to the number of locally exited samples, inference accuracy, and inference latency. These results indicate that higher accuracy can be achieved with less computation when a single-exit model is adopted. In an edge computing infrastructure, it is therefore more efficient to adopt a DNN model with only one or a few exit points to provide a fast and reliable inference service.
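
A minimal PyTorch sketch of a single local exit, under assumptions not taken from the paper (the layer sizes, a 32x32 RGB input, a single-image batch, and the confidence threshold are placeholders): the edge node runs the early layers and answers locally when the softmax confidence of the local exit is high enough, otherwise it forwards the intermediate features to the cloud-side remainder of the model.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class EdgePart(nn.Module):
    """Early layers plus one local exit, intended to run on the edge node."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.local_exit = nn.Linear(16 * 16 * 16, num_classes)  # assumes 32x32 input

    def forward(self, x):
        feats = self.features(x)
        return feats, self.local_exit(feats.flatten(1))

class CloudPart(nn.Module):
    """Remaining layers, intended to run in the central cloud."""
    def __init__(self, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, feats):
        return self.classifier(self.features(feats).flatten(1))

def infer(x, edge, cloud, threshold=0.8):
    """Single-image inference: exit locally if the local exit is confident."""
    feats, local_logits = edge(x)
    conf, pred = F.softmax(local_logits, dim=1).max(dim=1)
    if conf.item() >= threshold:                         # confident: answer at the edge
        return pred.item(), "edge"
    return cloud(feats).argmax(dim=1).item(), "cloud"    # otherwise offload the features
```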

An Edge Removal Algorithm for the Reliability Evaluation of Directed Communication Networks (방향성 통신망의 신뢰도 계정에 관한 에지제거 알고리즘)

  • 임윤구;오영환
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.13 no.1
    • /
    • pp.63-73
    • /
    • 1988
  • In this paper, an algorithm is proposed to evaluate the source-to-terminal reliability, i.e., the probability that a source node can communicate with a terminal node, in a probabilistic directed graph. Using Satyanarayana's factoring theorem (7), the original graph is partitioned into two reduced graphs obtained by contracting and by deleting an edge connected to the source node of the probabilistic directed graph. The edge removal proposed in this paper and the general series-parallel reductions can then be applied to the reduced graphs. This edge reduction is applied recursively to the reduced graphs until the source node is connected to the terminal node by a single edge. A computer program that can evaluate the source-to-terminal reliability of complex and large networks has also been developed.
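
A minimal sketch of the factoring recursion R(G) = p_e R(G·e) + (1 - p_e) R(G - e) that underlies the algorithm, for a probabilistic directed graph given as a dictionary of edge operating probabilities; contraction is represented simply by fixing an edge's probability to 1. The paper's edge-removal step and the series-parallel reductions, which keep the recursion tractable for large networks, are omitted here.

```python
def st_reliability(edges, s, t):
    """Source-to-terminal reliability of a probabilistic directed graph.
    edges: dict mapping a directed edge (u, v) to its operating probability."""

    def reachable(edge_set, src, dst):
        adj = {}
        for u, v in edge_set:
            adj.setdefault(u, []).append(v)
        seen, stack = {src}, [src]
        while stack:
            u = stack.pop()
            if u == dst:
                return True
            for v in adj.get(u, []):
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
        return False

    # Base cases: s-t already joined by certain edges, or disconnection is certain.
    if reachable({e for e, p in edges.items() if p == 1.0}, s, t):
        return 1.0
    if not reachable({e for e, p in edges.items() if p > 0.0}, s, t):
        return 0.0

    # Factor on an uncertain edge e: R = p_e * R(e works) + (1 - p_e) * R(e fails).
    e = next(e for e, p in edges.items() if 0.0 < p < 1.0)
    works, fails = dict(edges), dict(edges)
    works[e] = 1.0
    del fails[e]
    return edges[e] * st_reliability(works, s, t) + (1 - edges[e]) * st_reliability(fails, s, t)

# Two independent s->t paths: 1 - (1 - 0.9*0.9) * (1 - 0.8*0.8) = 0.9316
edges = {("s", "a"): 0.9, ("a", "t"): 0.9, ("s", "b"): 0.8, ("b", "t"): 0.8}
print(st_reliability(edges, "s", "t"))
```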


A Survey of Computational Offloading in Cloud/Edge-based Architectures: Strategies, Optimization Models and Challenges

  • Alqarni, Manal M.;Cherif, Asma;Alkayal, Entisar
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.15 no.3
    • /
    • pp.952-973
    • /
    • 2021
  • In recent years, mobile devices have become an essential part of daily life, and more and more applications are supported on them thanks to edge computing, an emerging architecture that provides computing, storage, and networking capabilities for mobile devices. In edge computing, heavy tasks are offloaded to edge nodes to relieve the computational burden on the mobile side. However, offloading computational tasks may incur extra energy consumption and delays due to network congestion and server queues, so offloading decisions must be optimized to minimize time, energy, and payment costs. In this article, different offloading models are examined to identify the offloading parameters that need to be optimized. The paper investigates and compares several optimization techniques used to optimize offloading decisions, specifically Swarm Intelligence (SI) models, since they are well suited to the distributed nature of edge computing. Furthermore, based on the literature review, the study concludes that a Cuckoo Search Algorithm (CSA) in an edge-based architecture is a good solution for balancing energy consumption, time, and cost.
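
A minimal sketch of the kind of weighted cost model that such offloading decisions optimize, combining time, energy, and payment terms; all parameters, weights, and the single-task decision rule below are illustrative assumptions, and in the surveyed work a metaheuristic such as cuckoo search would explore these decisions jointly for many tasks.

```python
from dataclasses import dataclass

@dataclass
class Task:
    cycles: float        # CPU cycles required
    data_bits: float     # input data to upload when offloading

@dataclass
class Device:
    cpu_hz: float        # local CPU frequency
    power_compute: float # W while computing locally
    power_tx: float      # W while transmitting
    uplink_bps: float    # uplink rate to the edge node

def local_cost(task, dev, w_time=0.5, w_energy=0.5):
    t = task.cycles / dev.cpu_hz
    e = dev.power_compute * t
    return w_time * t + w_energy * e

def offload_cost(task, dev, edge_hz, price_per_cycle=0.0,
                 w_time=0.5, w_energy=0.5, w_pay=0.0):
    t_tx = task.data_bits / dev.uplink_bps      # upload delay
    t_exec = task.cycles / edge_hz              # execution on the edge server
    e_tx = dev.power_tx * t_tx                  # device only pays transmission energy
    pay = price_per_cycle * task.cycles
    return w_time * (t_tx + t_exec) + w_energy * e_tx + w_pay * pay

task = Task(cycles=2e9, data_bits=8e6)
dev = Device(cpu_hz=1e9, power_compute=2.0, power_tx=1.0, uplink_bps=10e6)
decision = "offload" if offload_cost(task, dev, edge_hz=10e9) < local_cost(task, dev) else "local"
print(decision)
```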

A Sufferage offloading tasks method for multiple edge servers

  • Zhang, Tao;Cao, Mingfeng;Hao, Yongsheng
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.16 no.11
    • /
    • pp.3603-3618
    • /
    • 2022
  • The choice of offloading method matters when there are multiple mobile nodes and multiple edge servers. In such an environment, mobile nodes connect to edge servers over links with different bandwidths, so offloading a task takes different amounts of time and energy depending on the server. Considering the system load of the edge servers and the attributes of the tasks (the number of instructions, file sizes, deadlines, and so on), the energy-aware offloading problem becomes difficult in our mobile edge environment (MCE). Most past work offloads a task simply to wherever it consumes less energy; however, a task can end up needing more energy when the preferred edge servers are already overloaded, and such methods ignore the influence of the current scheduling decision on future tasks. In this paper, we first execute a job locally when running it on the mobile device (MD) consumes less energy. Each remaining task is submitted to the mobile server with the highest bandwidth efficiency, where bandwidth efficiency is defined by the sending ratio, the receiving ratio, and their related power consumption. We sort the tasks in descending order of the ratio between the energy consumed on the mobile server node and on the MD. We then define a 'sufferage' value over the energy consumption of offloading a task to the different mobile servers, and the task selects the mobile server with the largest sufferage. Simulations show that our method reduces the execution time and the related energy consumption while keeping the number of uncompleted tasks low.
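
A minimal sketch of a sufferage-style selection step, under assumptions not taken from the paper: the energy table is a placeholder, a task's 'sufferage' is taken here as the gap between its best and second-best energy over the candidate edge servers, and server-load updates between assignments are omitted.

```python
def sufferage_schedule(energy):
    """energy[task][server]: estimated energy to run `task` on `server`.

    Repeatedly picks the unscheduled task whose gap between its best and
    second-best server energy (its 'sufferage') is largest, and assigns it
    to its best server.  A sketch: server-load updates are omitted."""
    assignment = {}
    remaining = set(energy)
    while remaining:
        best_task, best_gap, best_server = None, -1.0, None
        for task in remaining:
            costs = sorted(energy[task].items(), key=lambda kv: kv[1])
            gap = (costs[1][1] - costs[0][1]) if len(costs) > 1 else float("inf")
            if gap > best_gap:
                best_task, best_gap, best_server = task, gap, costs[0][0]
        assignment[best_task] = best_server
        remaining.remove(best_task)
    return assignment

energy = {
    "t1": {"edge_a": 3.0, "edge_b": 9.0},
    "t2": {"edge_a": 4.0, "edge_b": 5.0},
}
print(sufferage_schedule(energy))   # t1 is assigned first: it suffers more if edge_a is taken
```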

GAIN-QoS: A Novel QoS Prediction Model for Edge Computing

  • Jiwon Choi;Jaewook Lee;Duksan Ryu;Suntae Kim;Jongmoon Baik
    • Journal of Web Engineering
    • /
    • v.21 no.1
    • /
    • pp.27-52
    • /
    • 2021
  • With the recent increase in the number of network-connected devices, the number of edge computing services that provide similar functions has grown, so it is important to recommend an optimal edge computing service based on quality of service (QoS). In the real world, however, QoS data suffer from a cold-start problem: invocation records are highly sparse, which makes it difficult to recommend a suitable service to the user. Previous work applied deep learning techniques to address this problem or used context information to extract deep features between users and services, but the edge computing environment was not considered. Our goal is to predict QoS values in real edge computing environments with improved accuracy. To this end, we propose the GAIN-QoS technique. It clusters services based on their location information, calculates the distance between services and users in each cluster, and brings in the QoS values of users within a certain distance. We then apply a Generative Adversarial Imputation Nets (GAIN) model and perform QoS prediction based on the reconstructed user-service invocation matrix. When the matrix density is low, GAIN-QoS outperforms other techniques, and the distance between service and user only slightly affects performance. Thus, compared with other methods, the proposed method can significantly improve the accuracy of QoS prediction for edge computing, which suffers from the cold-start problem.
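
A minimal sketch of the neighborhood-building step described above, assuming planar coordinates, a hypothetical distance radius, and scikit-learn's KMeans for the location clustering; the GAIN imputation that follows in the actual model is not shown.

```python
import numpy as np
from sklearn.cluster import KMeans

def build_neighbourhood_matrix(qos, user_loc, service_loc, n_clusters=3, radius=5.0):
    """qos: float (n_users, n_services) matrix with np.nan for missing invocations.
    user_loc, service_loc: (n, 2) planar coordinates.

    Keeps, for each service, only the QoS values of users lying within
    `radius` of that service's cluster centre; everything else stays
    missing and would be imputed by GAIN afterwards."""
    km = KMeans(n_clusters=n_clusters, n_init=10).fit(service_loc)
    centres = km.cluster_centers_[km.labels_]          # one centre per service
    out = np.full_like(qos, np.nan)
    for j in range(qos.shape[1]):
        dist = np.linalg.norm(user_loc - centres[j], axis=1)
        near = dist <= radius
        out[near, j] = qos[near, j]
    return out
```

The retained entries form the denser sub-matrix that the GAIN model would then impute and use for prediction.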

A Study on the Build of Equipment Predictive Maintenance Solutions Based on On-device Edge Computer

  • Lee, Yong-Hwan;Suh, Jin-Hyung
    • Journal of the Korea Society of Computer and Information
    • /
    • v.25 no.4
    • /
    • pp.165-172
    • /
    • 2020
  • In this paper, we propose an equipment predictive-maintenance solution built on on-device edge computing and big data analysis. Edge computing is a distributed computing paradigm that places computation and storage where they are needed, addressing problems such as the transmission delays that occur in today's smart factories when data is sent to a central center for processing. However, even when edge computing is applied in practice, the growing number of devices at the network edge still pushes large amounts of data toward the data center, driving the network toward its bandwidth limits; despite improvements in network technology, this does not guarantee the acceptable transfer speeds and response times that many applications require. By integrating hardware that can accommodate these requirements with factory management and control technology, the proposed solution supports intelligent facility management that can contribute to productivity growth in facility preservation and the smart factory industry, and it provides a basis for developing an AI-based facility predictive-maintenance analysis tool that can apply deep learning to big data in the future.
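
As an illustration of the bandwidth argument above (not the authors' solution), the sketch below shows an on-device pre-filter that keeps a rolling baseline of a sensor signal and forwards only windows that deviate from it, so far less raw data travels to the data center; the window size and z-score threshold are placeholders.

```python
from collections import deque
from statistics import mean, pstdev

class EdgePrefilter:
    """Keep a rolling baseline of a sensor signal on the device and forward
    only windows that deviate from it; a sketch with illustrative thresholds."""

    def __init__(self, window=32, baseline=256, z_threshold=3.0):
        self.window, self.z_threshold = window, z_threshold
        self.baseline = deque(maxlen=baseline)
        self.buffer = []

    def push(self, sample):
        """Returns the window if it looks anomalous and should be uploaded."""
        self.buffer.append(sample)
        if len(self.buffer) < self.window:
            return None
        window, self.buffer = self.buffer, []
        m = mean(window)
        if len(self.baseline) >= 2:
            base_m, base_s = mean(self.baseline), pstdev(self.baseline) or 1e-9
            if abs(m - base_m) / base_s > self.z_threshold:
                return window               # anomalous: send upstream
        self.baseline.append(m)             # normal: only update the local baseline
        return None
```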

A study of energy consumption and savings potential in wired network equipment (유선 네트워크 장비의 에너지 소모량과 절약 잠재성 연구)

  • Kim, Ki-Young;Suh, Yu-Hwa
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.14 no.12
    • /
    • pp.6469-6477
    • /
    • 2013
  • As the Internet has grown, the energy consumption and GHG emissions caused by Internet use have become issues in recent years. However, interest in greening the Internet has so far focused on edge devices, and there is a lack of deeper studies on the energy wasted by excessive network connectivity and on the savings potential of wired network equipment. This study presents the background and rationale for studying the energy efficiency of wired networks in terms of the environment, the economy, and energy resources. The energy consumption and savings potential of wired network equipment are estimated, the major factors of energy consumption are analyzed based on these data, and directions for future studies of the Internet are presented.
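
A back-of-the-envelope sketch of the kind of estimate described above; the power draws, idle fraction, and sleep behaviour are illustrative placeholders rather than the paper's measurements.

```python
def annual_energy_kwh(active_watts, idle_watts, idle_fraction, units=1):
    """Annual consumption of `units` devices that idle `idle_fraction` of the time."""
    hours = 24 * 365
    per_unit = (active_watts * (1 - idle_fraction) + idle_watts * idle_fraction) * hours / 1000
    return per_unit * units

# Illustrative figures only: an access switch drawing 150 W active, 110 W when idle ports sleep.
always_on = annual_energy_kwh(150, 150, idle_fraction=0.7)   # no power management
with_sleep = annual_energy_kwh(150, 110, idle_fraction=0.7)  # idle ports sleep
print(f"savings per switch: {always_on - with_sleep:.0f} kWh/year")
```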

A Load Based Weight Multicasting Technique Design for efficient Multimedia Contents Delivery (효율적인 멀티미디어 컨텐츠 전송을 위한 부하 가중치 멀티캐스팅 기법의 설계)

  • Lee, Seo-Jeong;Kim, Seon-Ho
    • The Journal of Society for e-Business Studies
    • /
    • v.9 no.3
    • /
    • pp.277-288
    • /
    • 2004
  • Multimedia content transmission must cope with the large size and non-formal nature of the content, and various multicasting technologies have been researched to address these issues. This paper proposes a technique for building multicast routes for the safe and reliable transmission of multimedia content. Each network server node is assigned a weight with respect to its communication load, computed from that server's communication load with the other servers, which yields low-delay routing across two or more edge servers of a content delivery network. We present the weighted inter-server routing technique and analyze the network performance improvement resulting from reduced network traffic and delay.
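
A minimal sketch of the idea, under assumptions not taken from the paper: each inter-server link is costed by the summed communication load of its endpoints, and the multicast route set toward the receiving edge servers is taken as the union of lowest-cost paths from the source, computed with networkx.

```python
import networkx as nx

def load_weighted_multicast_tree(links, load, source, receivers):
    """links: iterable of (u, v) inter-server links.
    load: dict mapping server -> current communication load.

    Builds a weighted graph where a link's cost is the summed load of its
    endpoints, then returns the union of lowest-cost paths from `source`
    to every receiving edge server (a simple multicast-tree sketch)."""
    G = nx.Graph()
    for u, v in links:
        G.add_edge(u, v, weight=load[u] + load[v])
    tree = nx.Graph()
    for r in receivers:
        path = nx.shortest_path(G, source, r, weight="weight")
        tree.add_edges_from(zip(path, path[1:]))
    return tree

links = [("origin", "s1"), ("origin", "s2"), ("s1", "edge1"), ("s2", "edge1"), ("s2", "edge2")]
load = {"origin": 1, "s1": 5, "s2": 2, "edge1": 1, "edge2": 1}
tree = load_weighted_multicast_tree(links, load, "origin", ["edge1", "edge2"])
print(sorted(tree.edges))   # routes avoid the heavily loaded server s1
```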
