• Title/Summary/Keyword: Edge-Cloud Systems

Distributed Edge Computing for DNA-Based Intelligent Services and Applications: A Review (딥러닝을 사용하는 IoT빅데이터 인프라에 필요한 DNA 기술을 위한 분산 엣지 컴퓨팅기술 리뷰)

  • Alemayehu, Temesgen Seyoum; Cho, We-Duke
    • KIPS Transactions on Computer and Communication Systems, v.9 no.12, pp.291-306, 2020
  • Nowadays, Data-Network-AI (DNA)-based intelligent services and applications have become a reality, providing a new dimension of services that improve the quality of life and the productivity of businesses. Artificial intelligence (AI) can enhance the value of IoT data (data collected by IoT devices), and the Internet of Things (IoT) promotes the learning and intelligence capability of AI. To extract insights from massive volumes of IoT data in real time using deep learning, processing capability needs to be available in the IoT end devices where the data is generated. However, deep learning requires significant computational resources that may not be available at the IoT end devices. Such problems have been addressed by transporting bulk data from the IoT end devices to cloud datacenters for processing, but transferring IoT big data to the cloud incurs prohibitively high transmission delay and raises major privacy concerns. Edge computing, where distributed computing nodes are placed close to the IoT end devices, is a viable solution to meet the high-computation and low-latency requirements and to preserve the privacy of users. This paper provides a comprehensive review of the current state of leveraging deep learning within edge computing to unleash the potential of IoT big data generated from IoT end devices. We believe this review will contribute to the development of DNA-based intelligent services and applications. It describes the different distributed training and inference architectures of deep learning models across multiple nodes of the edge computing platform. It also presents the different privacy-preserving approaches to deep learning in the edge computing environment and the various application domains where deep learning on the network edge can be useful. Finally, it discusses open issues and challenges in leveraging deep learning within edge computing.
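
A minimal sketch of one distributed-inference pattern the review above surveys: splitting a model so early layers run on the IoT end device and the remaining layers run on a nearby edge node, so only a compact intermediate activation crosses the network. The layer sizes, split point, and random weights are illustrative assumptions, not an architecture from the paper.

```python
# Split (device/edge) DNN inference sketch with a tiny MLP and hypothetical weights.
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(64, 16)), np.zeros(16)   # layer executed on the IoT device
W2, b2 = rng.normal(size=(16, 16)), np.zeros(16)   # layers executed on the edge node
W3, b3 = rng.normal(size=(16, 4)), np.zeros(4)

def device_forward(x):
    """First layer runs on the end device; only the smaller activation is transmitted."""
    return np.maximum(x @ W1 + b1, 0.0)

def edge_forward(h):
    """Remaining layers run on the nearby edge node."""
    h = np.maximum(h @ W2 + b2, 0.0)
    return h @ W3 + b3

sensor_reading = rng.normal(size=(1, 64))      # e.g. a feature vector from a sensor
activation = device_forward(sensor_reading)    # 64 -> 16 values: less data to send
logits = edge_forward(activation)              # inference completed at the edge
print(logits.shape)
```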

Resource Management in 5G Mobile Networks: Survey and Challenges

  • Chien, Wei-Che; Huang, Shih-Yun; Lai, Chin-Feng; Chao, Han-Chieh
    • Journal of Information Processing Systems, v.16 no.4, pp.896-914, 2020
  • With the rapid growth of network traffic, the large number of connected devices, and higher-level application services, the traditional network is facing several challenges. In addition to improving the current network architecture and hardware specifications, effective resource management is a key development trend of 5G. Although many potential technologies have been proposed to solve some of the 5G challenges, such as multiple-input multiple-output (MIMO), software-defined networking (SDN), network functions virtualization (NFV), edge computing, and millimeter-wave, research studies in 5G continue to enrich its functions and move toward B5G mobile networks. In this paper, focusing on the resource allocation issues of 5G core networks and radio access networks, we address the latest technological developments and discuss the current challenges for resource management in 5G.

Energy-Aware Data-Preprocessing Scheme for Efficient Audio Deep Learning in Solar-Powered IoT Edge Computing Environments (태양 에너지 수집형 IoT 엣지 컴퓨팅 환경에서 효율적인 오디오 딥러닝을 위한 에너지 적응형 데이터 전처리 기법)

  • Yeontae Yoo; Dong Kun Noh
    • IEMEK Journal of Embedded Systems and Applications, v.18 no.4, pp.159-164, 2023
  • Solar energy-harvesting IoT devices prioritize maximizing the utilization of collected energy, due to the periodic recharging nature of solar energy, rather than minimizing energy consumption. Meanwhile, research on edge AI, which performs machine learning near the data source instead of in the cloud, is being actively conducted for reasons such as data confidentiality and privacy, response time, and cost. One such research area involves performing various audio AI applications using audio data collected from multiple IoT devices in an IoT edge computing environment. However, in most studies, IoT devices only transmit sensed data to the edge server, and all processes, including data preprocessing, are performed on the edge server. This not only leads to overload issues on the edge server but also causes network congestion by transmitting data that is unnecessary for learning. On the other hand, if data preprocessing is delegated to each IoT device to address this issue, it leads to another problem: increased blackout time due to energy shortages in the devices. In this paper, we aim to alleviate the problem of increased blackout time in devices while mitigating the issues of server-centric edge AI environments by determining where the data is preprocessed based on the energy state of each IoT device. In the proposed method, IoT devices perform the preprocessing process, which includes sound discrimination and noise removal, and transmit the result to the server only when more energy is available than the threshold required for the basic operation of the device.
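
A minimal sketch of the energy-adaptive placement decision described above: the device preprocesses audio locally (crude sound discrimination and denoising stand-ins here) only when its residual energy exceeds what basic operation requires, and otherwise forwards the raw frame so the edge server does the preprocessing. All thresholds, costs, and filters are hypothetical placeholders.

```python
import numpy as np

E_BASIC = 50.0        # energy reserved for basic device operation (arbitrary units)
E_PREPROCESS = 20.0   # assumed cost of local preprocessing

def contains_sound(frame, power_threshold=0.01):
    return float(np.mean(frame ** 2)) > power_threshold      # crude discrimination

def denoise(frame, window=5):
    kernel = np.ones(window) / window
    return np.convolve(frame, kernel, mode="same")            # crude smoothing

def handle_frame(frame, residual_energy, send):
    """Decide where preprocessing happens based on the device's energy state."""
    if residual_energy - E_PREPROCESS > E_BASIC:
        if contains_sound(frame):
            send("preprocessed", denoise(frame))               # transmit useful data only
        return residual_energy - E_PREPROCESS
    send("raw", frame)                                         # offload preprocessing
    return residual_energy

# Example: a noisy frame handled by a device with ample harvested energy.
rng = np.random.default_rng(1)
handle_frame(rng.normal(scale=0.2, size=16000), residual_energy=120.0,
             send=lambda kind, data: print(kind, data.shape))
```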

IoT Edge Architecture Model to Prevent Blockchain-Based Security Threats (블록체인 기반의 보안 위협을 예방할 수 있는 IoT 엣지 아키텍처 모델)

  • Yoon-Su Jeong
    • Journal of Internet of Things and Convergence, v.10 no.2, pp.77-84, 2024
  • Over the past few years, IoT edges have begun to emerge based on new low-latency communication protocols such as 5G. However, despite their enormous advantages, IoT edges pose new security threats, requiring new security solutions to address them. In this paper, we propose a cloud environment-based IoT edge architecture model that complements IoT systems. The proposed model applies machine learning to network traffic data extracted from IoT edge devices to prevent security threats in advance. In addition, the proposed model ensures load balancing and security in the access network (edge) by allocating some of the security data to local nodes. It further reduces the load on the access network (edge) and protects vulnerable parts by allocating some data processing and management functions to local nodes within the IoT edge environment. The proposed model virtualizes various IoT functions as a name service and deploys hardware functions and sufficient computational resources to local nodes as needed.
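
The abstract states that machine learning is applied to network traffic data from IoT edge devices to prevent threats in advance, without naming an algorithm; the sketch below substitutes an unsupervised anomaly detector (scikit-learn's IsolationForest) over a hypothetical set of per-flow features.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Hypothetical per-flow features: [packets/s, bytes/packet, distinct destination ports]
normal_traffic = np.column_stack([
    rng.normal(50, 10, 500), rng.normal(300, 40, 500), rng.integers(1, 5, 500)])
detector = IsolationForest(contamination=0.01, random_state=0).fit(normal_traffic)

suspicious_flow = np.array([[900.0, 64.0, 120.0]])   # e.g. scanning-like behavior
print(detector.predict(suspicious_flow))             # -1 => flagged as anomalous
```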

Integrating UAV Remote Sensing with GIS for Predicting Rice Grain Protein

  • Sarkar, Tapash Kumar; Ryu, Chan-Seok; Kang, Ye-Seong; Kim, Seong-Heon; Jeon, Sae-Rom; Jang, Si-Hyeong; Park, Jun-Woo; Kim, Suk-Gu; Kim, Hyun-Jin
    • Journal of Biosystems Engineering, v.43 no.2, pp.148-159, 2018
  • Purpose: Unmanned aerial vehicle (UAV) remote sensing was applied to test various vegetation indices and build prediction models of the protein content of rice for monitoring grain quality and proper management practice. Methods: Image acquisition was carried out using NIR (Green, Red, NIR), RGB, and RE (Blue, Green, Red-edge) cameras mounted on a UAV. Sampling was done synchronously at the geo-referenced points, and GPS locations were recorded. Paddy samples were air-dried to 15% moisture content, then dehulled and milled to 92% milling yield, and the protein content was measured by near-infrared spectroscopy. Results: The artificial neural network showed better performance, with $R^2$ (coefficient of determination) of 0.740, NSE (Nash-Sutcliffe model efficiency coefficient) of 0.733, and RMSE (root mean square error) of 0.187% considering all 54 samples, than the models developed by PR (polynomial regression), SLR (simple linear regression), and PLSR (partial least squares regression). PLSR calibration models showed results almost similar to PR, with 0.663 ($R^2$) and 0.169% (RMSE) for cloud-free samples and 0.491 ($R^2$) and 0.217% (RMSE) for cloud-shadowed samples; however, the validation models performed poorly. This study revealed a highly significant correlation between NDVI (normalized difference vegetation index) and protein content in rice. For the cloud-free samples, the SLR models showed $R^2 = 0.553$ and RMSE = 0.210%, and for the cloud-shadowed samples, $R^2 = 0.479$ and RMSE = 0.225%. Conclusion: There is a significant correlation between spectral bands and grain protein content. Artificial neural networks have a strong advantage in fitting nonlinear problems when a sigmoid activation function is used in the hidden layer. Quantitatively, the neural network model obtained a higher-precision result, with a mean absolute relative error (MARE) of 2.18% and a root mean square error (RMSE) of 0.187%.
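
A minimal sketch of the NDVI-based simple linear regression (SLR) the abstract reports, using the standard definition NDVI = (NIR - Red) / (NIR + Red); the reflectance and protein values below are made up purely for illustration.

```python
import numpy as np

nir = np.array([0.52, 0.48, 0.55, 0.60, 0.45])   # hypothetical NIR reflectance
red = np.array([0.10, 0.12, 0.09, 0.08, 0.14])   # hypothetical red reflectance
protein = np.array([6.8, 6.5, 7.0, 7.3, 6.2])    # hypothetical grain protein (%)

ndvi = (nir - red) / (nir + red)                  # standard NDVI definition
slope, intercept = np.polyfit(ndvi, protein, deg=1)   # SLR: protein ~ NDVI
pred = slope * ndvi + intercept
rmse = float(np.sqrt(np.mean((pred - protein) ** 2)))
r2 = 1.0 - np.sum((protein - pred) ** 2) / np.sum((protein - protein.mean()) ** 2)
print(f"R^2 = {r2:.3f}, RMSE = {rmse:.3f}%")
```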

An Efficient Software Defined Data Transmission Scheme based on Mobile Edge Computing for the Massive IoT Environment

  • Kim, EunGyeong; Kim, Seokhoon
    • KSII Transactions on Internet and Information Systems (TIIS), v.12 no.2, pp.974-987, 2018
  • This paper presents a novel and efficient data transmission scheme based on mobile edge computing for massive IoT environments, which must support various types of services and devices. Based on an accurate and precise synchronization process, it maximizes data transmission throughput and consistently maintains a flow's latency. To this end, the proposed efficient software-defined data transmission scheme (ESD-DTS) configures and utilizes synchronization zones in accordance with four usage cases, which are end node-to-end node (EN-EN), end node-to-cloud network (EN-CN), end node-to-Internet node (EN-IN), and edge node-to-core node (EdN-CN), and it transmits data according to the required service attributes, which are divided into three groups (low-end, medium-end, and high-end). In addition, the ESD-DTS provides a specific data transmission method, operated by a buffer threshold value, for the low-end group, and it effectively accommodates massive numbers of IoT devices. By doing this, the proposed scheme not only supports high, medium, and low quality of service but also complies with various 5G usage scenarios. The essential difference between the previous schemes and the proposed scheme is that the existing schemes handle each packet only to provide high quality and bandwidth, whereas the proposed scheme introduces synchronization zones for various types of services to manage the efficiency of each service flow. Performance evaluations show that the proposed scheme outperforms the previous schemes in terms of throughput, control message overhead, and latency. Therefore, the proposed ESD-DTS is very suitable for upcoming 5G networks in a variety of massive IoT environments supporting mobile edge computing (MEC).
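
A minimal sketch of the buffer-threshold transmission described for the low-end group: readings accumulate locally and are sent as one batch only when the buffer crosses a threshold. The threshold value, message format, and send() transport are assumptions, not the paper's specification.

```python
from collections import deque

class LowEndBuffer:
    def __init__(self, threshold, send):
        self.threshold = threshold        # flush when this many readings are queued
        self.send = send                  # transport callback (e.g. an MQTT publish)
        self.queue = deque()

    def push(self, reading):
        self.queue.append(reading)
        if len(self.queue) >= self.threshold:
            batch = list(self.queue)      # aggregate many small messages into one
            self.queue.clear()
            self.send(batch)

buf = LowEndBuffer(threshold=4, send=lambda batch: print("flush", batch))
for value in [21.5, 21.6, 21.4, 21.7, 21.8]:
    buf.push(value)                       # the 4th push triggers one batched send
```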

A Study on RFID System Based on Cloud (클라우드 기반 RFID 시스템에 관한 연구)

  • Lee, Cheol-Seung
    • The Journal of the Korea institute of electronic communication sciences, v.15 no.6, pp.1145-1150, 2020
  • After the Davos Forum, the recent 4th Industrial Revolution has become an area of interest to countries around the world. Among the technologies of the 4th Industrial Revolution, the ubiquitous computing environment requires a convergence of various devices, networks, and software technologies, and RFID technology, which identifies objects within the IoT technology field, is applied across industries and provides a competitive edge. Systems to which RFID technology is applied are used in various industrial fields; in particular, it is used efficiently for accurate inventory management and SCM management in the field of distribution and logistics. If the RFID system is built in a cloud-based environment, it will be possible to secure reliability in distribution management, considering an effective logistics management system and economic feasibility. This study examines an RFID system in a cloud computing environment that reduces the cost of operating and maintaining an application server, improving economic feasibility and reliability.

Comparison of Search Performance of SQLite3 Database by Linux File Systems (Linux File Systems에 따른 SQLite3 데이터베이스의 검색 성능 비교)

  • Choi, Jin-Oh
    • Journal of the Korea Institute of Information and Communication Engineering, v.26 no.1, pp.1-6, 2022
  • Recently, IoT sensors are often used to produce stream data locally, which is then provided to edge computing applications. The data produced in bulk are stored in the mobile device's database for real-time processing and then synchronized with the server when needed. Many mobile databases, such as CloudScape, DB2 Everyplace, ASA, and PointBase Mobile, have been developed to support these applications, and the most widely used database on Linux is SQLite3. In this paper, we focus on the performance required for synchronization with the server. The search performance of SQLite3 queries was compared and analyzed according to the type of Linux file system in which the database is stored. Performance differences were thus checked for each file system across various search query types, and criteria for choosing the more appropriate Linux file system for index-based and table-scan environments were prepared and presented.
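
A minimal sketch of the kind of measurement the paper performs: timing the same SQLite3 search query against copies of a database stored on different Linux file systems. The schema, query, and mount paths below are hypothetical; only the timing pattern is illustrated.

```python
import sqlite3, time

def time_query(db_path, query, repeats=100):
    con = sqlite3.connect(db_path)
    start = time.perf_counter()
    for _ in range(repeats):
        con.execute(query).fetchall()
    elapsed = time.perf_counter() - start
    con.close()
    return elapsed / repeats

# e.g. the same database copied onto ext4 / xfs / btrfs mount points
for path in ["/mnt/ext4/sensor.db", "/mnt/xfs/sensor.db", "/mnt/btrfs/sensor.db"]:
    try:
        avg = time_query(path, "SELECT * FROM readings WHERE value > 10.0")
        print(path, f"{avg * 1e3:.2f} ms/query")
    except sqlite3.OperationalError as exc:   # path or table may not exist locally
        print(path, "skipped:", exc)
```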

Development of an intelligent edge computing device equipped with on-device AI vision model (온디바이스 AI 비전 모델이 탑재된 지능형 엣지 컴퓨팅 기기 개발)

  • Kang, Namhi
    • The Journal of the Institute of Internet, Broadcasting and Communication, v.22 no.5, pp.17-22, 2022
  • In this paper, we design a lightweight embedded device that can support intelligent edge computing, and show that the device quickly detects objects in real time in images input from a camera. The proposed system can be applied to environments without pre-installed infrastructure, such as intelligent video control systems for industrial sites or military areas, or video security systems mounted on autonomous vehicles such as drones. On-device AI (artificial intelligence) technology is increasingly required for the widespread application of intelligent vision recognition systems. Offloading computation from an image data acquisition device to a nearby edge device enables fast service with fewer network and system resources than AI services performed in the cloud. In addition, it is expected to be safely applied to various industries, as it reduces the attack surface vulnerable to various hacking attacks and minimizes the disclosure of sensitive data.
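
The paper's specific on-device vision model is not described in this abstract, so the sketch below substitutes a lightweight OpenCV Haar-cascade detector as a stand-in to show the capture, local inference, and result-annotation loop running entirely on the device; only the detection results would need to leave it.

```python
import cv2

# Lightweight stand-in detector bundled with opencv-python (not the paper's model).
cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)                      # the device's attached camera

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    boxes = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in boxes:                 # annotate detections locally
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    cv2.imshow("on-device detection", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()
```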

The Characteristics and Predictability of Convective System Based on GOES-9 Observations during the Summer of 2004 over East Asia (정지기상위성의 밝기온도로 분석한 2004년 동아시아지역에서 발생한 여름철 대류 시스템의 특성과 그 예측 가능성)

  • Baek, Seon-Kyun; Choi, Young-Jean; Chung, Chu-Yong; Cho, Chun-Ho
    • Atmosphere, v.16 no.3, pp.225-234, 2006
  • Convective systems propagate eastward with a persistent pattern in longitude-time space. The characteristic structure and fluctuation of convective systems are helpful in determining their predictability. In this study, a convective index (CI) was defined as the difference between GOES-9 window and water vapor channel brightness temperatures, following Mosher (2001). The temporal-spatial scales and variational characteristics of summer convective systems in East Asia were then analyzed. It is found that the average moving speed of the convective systems is about 14 m/s, which is much faster than that of summertime low-pressure systems. Their average duration is about 12 hours, and the average length of the cloud streak is about 750 km. These characteristics are consistent with results from other studies. Although the convective systems are forced by the synoptic system and mostly develop on the eastern edge of the Tibetan Plateau, they show a persistent pattern, i.e., the maximum intensity of the convective systems appears as they approach the Korean Peninsula. This consistency of the convective systems, i.e., the eastward propagation, suggests that an intrinsic predictability exists.
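
Written out, the convective index above is a single channel brightness-temperature difference; a sketch of the definition as stated in the abstract, with the sign convention assumed (the abstract only says CI is the difference between the window and water-vapor channel brightness temperatures):

```latex
% CI per the abstract: window minus water-vapor brightness temperature (sign assumed).
\mathrm{CI} \;=\; T_{B}^{\mathrm{IR\,window}} - T_{B}^{\mathrm{WV}}
```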