• Title/Summary/Keyword: Big Sensor Data Stream


Context Inference and Sensor Data Classification of Big Data Stream Environment (빅데이터 스트림 환경에서의 센서 데이터 분류와 상황추론)

  • Ryu, Chang-Kun
    • The Journal of the Korea institute of electronic communication sciences
    • /
    • v.9 no.10
    • /
    • pp.1079-1085
    • /
    • 2014
  • Analysis of a variable, continuous big data stream should ultimately arrive at context awareness. This study presents a novel method for inferring context from the variable data streams produced by sensor motes. To assess the sensor data, we calculated the difference between measured values within each time window and determined the belief value of each focal element. Dempster-Shafer evidence theory proved useful for calculating and assessing the situational factors needed for context inference.
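
The abstract does not spell out the combination step, so the following is only a minimal sketch of Dempster's rule of combination applied to belief masses derived from per-window sensor differences; the focal elements and the mass-assignment heuristic below are illustrative assumptions, not the paper's actual formulation.

```python
from itertools import product

def combine(m1, m2):
    """Dempster's rule of combination for two mass functions.
    Masses are dicts mapping frozenset focal elements to belief mass."""
    combined, conflict = {}, 0.0
    for (a, wa), (b, wb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + wa * wb
        else:
            conflict += wa * wb          # mass falling on the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: sources cannot be combined")
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Hypothetical focal elements for one sensor time window.
NORMAL, ALERT = frozenset({"normal"}), frozenset({"alert"})
EITHER = NORMAL | ALERT                   # ignorance: {"normal", "alert"}

def window_mass(delta, threshold=5.0):
    """Assumed heuristic: the larger the change within the time window,
    the more mass is assigned to the 'alert' hypothesis."""
    score = min(abs(delta) / threshold, 1.0)
    return {ALERT: 0.8 * score, NORMAL: 0.8 * (1.0 - score), EITHER: 0.2}

# Combine evidence from two sensor motes observed in the same window.
m = combine(window_mass(delta=6.2), window_mass(delta=1.1))
belief_alert = sum(v for k, v in m.items() if k <= ALERT)   # Bel({"alert"})
print(m, belief_alert)
```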

Study on the Sensor Gateway for Receive the Real-Time Big Data in the IoT Environment (IoT 환경에서 실시간 빅 데이터 수신을 위한 센서 게이트웨이에 관한 연구)

  • Shin, Seung-Hyeok
    • Journal of Advanced Navigation Technology
    • /
    • v.19 no.5
    • /
    • pp.417-422
    • /
    • 2015
  • The scale of an IoT service is determined by the number of sensors, and as the number of sensors grows, so does the amount of data the environment generates. Previous work has studied dynamic buffering to keep networks operating reliably under congestion, as well as stream data processing over connectionless networks. In this study, we propose a sensor gateway for processing big data in the IoT environment. We review RESTful principles for designing the sensor middleware and apply a double-buffer algorithm to process stream data efficiently. Finally, to evaluate the proposed system, we generate big data traffic using an MJPEG stream carried over HTTP on TCP, receive the images with the open-source VLC media player, and compare throughput performance.
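
As a rough illustration of the double-buffer idea the gateway relies on, one buffer absorbs the incoming stream while the other is drained and forwarded, and the two are swapped when the active buffer fills. The class and capacity below are assumptions made for the sketch, not the paper's actual gateway design.

```python
import threading

class DoubleBuffer:
    """Minimal double-buffer: producers append to the active buffer while
    a consumer drains the standby one, so receiving never blocks on forwarding."""
    def __init__(self, capacity=1024):
        self.capacity = capacity
        self.active, self.standby = [], []
        self.lock = threading.Lock()

    def put(self, item):
        with self.lock:
            self.active.append(item)
            return len(self.active) >= self.capacity   # caller may trigger a swap

    def swap_and_drain(self):
        with self.lock:
            self.active, self.standby = self.standby, self.active
            drained, self.standby = self.standby, []
        return drained                                  # forwarded outside the lock

# Hypothetical gateway loop: receive sensor frames, forward a full buffer at once.
buf = DoubleBuffer(capacity=3)
for frame in (b"f1", b"f2", b"f3", b"f4"):
    if buf.put(frame):
        print("forwarding batch:", buf.swap_and_drain())
print("remaining:", buf.swap_and_drain())
```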

Scalable Big Data Pipeline for Video Stream Analytics Over Commodity Hardware

  • Ayub, Umer;Ahsan, Syed M.;Qureshi, Shavez M.
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.16 no.4
    • /
    • pp.1146-1165
    • /
    • 2022
  • A huge amount of data in the form of videos and images is being produced owing to advancements in sensor technology. Using low-performance commodity hardware together with resource-heavy image processing and analysis approaches to extract actionable insights from this data creates a bottleneck for timely decision making. Current GPU-assisted and cloud-based video analysis architectures deliver significant performance gains, but their use is constrained by cost and by extremely complex architecture-level details. In this paper we propose a data pipeline system that uses open-source tools such as Apache Spark, Kafka, and OpenCV running on commodity hardware for distributed video stream and image processing. Experimental results show that our approach eliminates the need for GPU-based hardware and cloud computing infrastructure while achieving efficient video stream processing for face detection with increased throughput, scalability, and better performance.
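
The abstract does not give topic names, frame encoding, or detector parameters, so the following is only a hedged sketch of the general pattern: JPEG frames arrive on a Kafka topic, Spark Structured Streaming distributes them across commodity nodes, and an OpenCV Haar cascade counts faces per frame on the CPU. Running it additionally requires the Spark-Kafka connector package on the cluster.

```python
import cv2
import numpy as np
from pyspark.sql import SparkSession
from pyspark.sql.functions import udf, col
from pyspark.sql.types import IntegerType

spark = SparkSession.builder.appName("video-stream-faces").getOrCreate()

_cascade = None  # loaded lazily, once per executor process

def count_faces(jpeg_bytes):
    """Decode one JPEG frame and count faces with a Haar cascade (CPU only)."""
    global _cascade
    if _cascade is None:
        _cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    img = cv2.imdecode(np.frombuffer(jpeg_bytes, np.uint8), cv2.IMREAD_GRAYSCALE)
    if img is None:
        return 0
    return len(_cascade.detectMultiScale(img, scaleFactor=1.1, minNeighbors=5))

count_faces_udf = udf(count_faces, IntegerType())

# Hypothetical Kafka topic "frames" carrying raw JPEG bytes in the message value.
frames = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "localhost:9092")
          .option("subscribe", "frames")
          .load())

faces = frames.select(count_faces_udf(col("value")).alias("face_count"))

query = faces.writeStream.outputMode("append").format("console").start()
query.awaitTermination()
```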

Data Source Management using weight table in u-GIS DSMS

  • Kim, Sang-Ki;Baek, Sung-Ha;Lee, Dong-Wook;Chung, Warn-Il;Kim, Gyoung-Bae;Bae, Hae-Young
    • Journal of Korea Spatial Information System Society
    • /
    • v.11 no.2
    • /
    • pp.27-33
    • /
    • 2009
  • The emergence of GeoSensors and research on GIS have driven many studies of u-GIS. Disaster applications built on u-GIS can monitor accident areas and help prevent accidents from spreading. Such applications need u-GIS DSMS techniques to acquire and process GeoSensor data and to integrate it with GIS data. The u-GIS DSMS must process large-volume data streams such as spatial and multimedia data, and because of the characteristics of these streams, query processing can be delayed. Moreover, as the input rate of data rises in an area where events occur, network traffic increases. To solve this problem, this paper describes a TRIGGER ACTION clause for continuous queries (CQ) in the u-GIS DSMS environment and proposes data source management. A data source weight table controls GeoSensor (GES) information and incoming data rates; it raises the weight of GES nodes in the disaster area to control their incoming data rate. Consequently, it improves query processing rate and accuracy.
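
The abstract leaves the weight table's schema and update rule unspecified, so the sketch below only illustrates the idea under assumed names: each GeoSensor (GES) source has a weight, weights of sensors inside the disaster area are raised, and the permitted incoming data rate of each source is allocated in proportion to its weight.

```python
class DataSourceWeightTable:
    """Illustrative weight table: allocates a shared input budget (tuples/sec)
    across GeoSensor sources in proportion to their weights."""
    def __init__(self, total_rate):
        self.total_rate = total_rate
        self.weights = {}                      # ges_id -> weight

    def register(self, ges_id, weight=1.0):
        self.weights[ges_id] = weight

    def mark_disaster_area(self, ges_ids, boost=4.0):
        """Raise the weight of sensors located in the event (disaster) area."""
        for ges_id in ges_ids:
            self.weights[ges_id] = self.weights.get(ges_id, 1.0) * boost

    def allowed_rate(self, ges_id):
        total_weight = sum(self.weights.values())
        return self.total_rate * self.weights[ges_id] / total_weight

# Hypothetical use: an event is detected near sensors g2 and g3.
table = DataSourceWeightTable(total_rate=1000)
for ges in ("g1", "g2", "g3", "g4"):
    table.register(ges)
table.mark_disaster_area(["g2", "g3"])
print({g: round(table.allowed_rate(g)) for g in ("g1", "g2", "g3", "g4")})
# g2 and g3 now receive a larger share: {'g1': 100, 'g2': 400, 'g3': 400, 'g4': 100}
```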

Real Time Distributed Parallel Processing to Visualize Noise Map with Big Sensor Data and GIS Data for Smart Cities (스마트시티의 빅 센서 데이터와 빅 GIS 데이터를 융합하여 실시간 온라인 소음지도로 시각화하기 위한 분산병렬처리 방법론)

  • Park, Jong-Won;Sim, Ye-Chan;Jung, Hae-Sun;Lee, Yong-Woo
    • Journal of Internet Computing and Services
    • /
    • v.19 no.4
    • /
    • pp.1-6
    • /
    • 2018
  • In smart cities, data from various kinds of sensors are collected and processed to provide smart services to citizens. One such service is a noise information service that builds noise maps from sensor data collected by ubiquitous sensor networks. This paper presents a method for generating three-dimensional (3D) noise maps for smart cities in real time. Building a noise map requires converging heterogeneous data, including large geographical-information images and massive sensor data. Producing such a 3D noise map in real time means processing the stream data from the ubiquitous sensor networks and performing the convergence operation in real time, both of which are challenging. We developed our own methodology for real-time distributed and parallel processing and, based on it, a real-time 3D noise map generation system built on open-source software. In this paper we introduce the version of the system that uses Apache Storm. We evaluated its performance on a cloud computing platform and confirmed that the system works properly, performs well, and can produce the 3D noise maps in real time; the performance evaluation results are also reported.
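
The paper builds its system on Apache Storm, but the abstract does not describe the topology, so rather than guess at the authors' spouts and bolts, the sketch below shows the underlying per-window computation in plain Python: noise readings are partitioned across workers and aggregated per map grid cell in parallel. The grid size and all names are assumptions.

```python
from collections import defaultdict
from multiprocessing import Pool

GRID = 10.0   # assumed grid-cell size in metres for the noise map

def partial_aggregate(readings):
    """Aggregate one partition of (x, y, dB) readings into per-cell sums/counts."""
    sums, counts = defaultdict(float), defaultdict(int)
    for x, y, db in readings:
        cell = (int(x // GRID), int(y // GRID))
        sums[cell] += db
        counts[cell] += 1
    return dict(sums), dict(counts)

def merge(partials):
    """Combine per-worker partials into one average-noise-per-cell map."""
    sums, counts = defaultdict(float), defaultdict(int)
    for s, c in partials:
        for cell in s:
            sums[cell] += s[cell]
            counts[cell] += c[cell]
    return {cell: sums[cell] / counts[cell] for cell in sums}

if __name__ == "__main__":
    # Hypothetical window of readings, split into 4 partitions for 4 workers.
    window = [(i % 37 * 3.1, i % 23 * 2.7, 50 + i % 20) for i in range(10_000)]
    chunks = [window[i::4] for i in range(4)]
    with Pool(4) as pool:
        noise_map = merge(pool.map(partial_aggregate, chunks))
    print(len(noise_map), "grid cells in this window")
```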

A Study on the Data Collection Methods based Hadoop Distributed Environment (하둡 분산 환경 기반의 데이터 수집 기법 연구)

  • Jin, Go-Whan
    • Journal of the Korea Convergence Society
    • /
    • v.7 no.5
    • /
    • pp.1-6
    • /
    • 2016
  • Many studies have recently been carried out on technologies for using and analyzing big data, and government agencies and companies are increasingly adopting Hadoop as a processing platform for big data analysis. Alongside this interest in processing and analysis, data collection technology has become a major issue, yet it has received far less study than analysis techniques. Therefore, in this paper we build a big data analysis platform on a Hadoop cluster and collect structured data from relational databases through Apache Sqoop. In addition, we provide a system that uses Apache Flume to collect unstructured data, such as sensor readings, web application data files, and log files, as streams. Data collected through this convergence can serve as the basic material for big data analysis.
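
For the structured-data path the abstract names Apache Sqoop; the sketch below merely wraps a standard `sqoop import` invocation from Python, with a placeholder JDBC URL, table name, and HDFS target directory rather than values from the paper. The unstructured path (Flume agents streaming sensor and log data into HDFS) is configured declaratively and is not sketched here.

```python
import subprocess

def import_table(jdbc_url, table, target_dir, mappers=4):
    """Pull one relational table into HDFS with Sqoop (placeholder arguments)."""
    subprocess.run(
        ["sqoop", "import",
         "--connect", jdbc_url,           # e.g. jdbc:mysql://dbhost/sensordb
         "--table", table,
         "--target-dir", target_dir,      # HDFS directory for the imported rows
         "-m", str(mappers)],             # number of parallel map tasks
        check=True)

# Hypothetical source database and table; adjust to the actual environment.
import_table("jdbc:mysql://dbhost/sensordb", "sensor_readings", "/data/sensor_readings")
```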

A Query Preprocessing Tool for Performance Improvement in Complex Event Stream Query Processing (복합 이벤트 스트림 질의 처리 성능 개선을 위한 질의 전처리 도구)

  • Choi, Joong-Hyun;Cho, Eun-Sun;Lee, Kang-Woo
    • KIISE Transactions on Computing Practices
    • /
    • v.21 no.8
    • /
    • pp.513-523
    • /
    • 2015
  • Complex event processing systems, now useful in real-life domains, efficiently process streams of continuous events such as sensor data from IoT systems. However, these systems still handle some types of queries poorly, so programmers must take care; for instance, they give little detailed guidance for choosing the efficient query among several queries with nearly the same meaning. In this paper, we propose a query preprocessing tool for event stream processing systems that helps programmers by giving them hints to improve performance whenever a query falls into a format that is likely to perform badly. We expect the proposed module to significantly increase productivity when writing programs for which debugging, testing, and performance tuning are not straightforward.
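
The abstract does not list the concrete query shapes the tool flags, so the rules below are purely hypothetical illustrations of the preprocessing idea: scan an EPL-like continuous query for constructs that tend to be expensive and emit a hint pointing to an equivalent but cheaper formulation.

```python
import re

# Hypothetical (pattern, hint) rules; the paper's tool and real CEP engines
# would flag different constructs, these only illustrate the approach.
RULES = [
    (re.compile(r"select\s+\*", re.I),
     "project only the event fields you need instead of 'select *'"),
    (re.compile(r"every\s+\w+=\w+\(\)\s*->", re.I),
     "unbounded 'every ... ->' patterns can accumulate state; add a time window"),
    (re.compile(r"where\s+.*\(\s*select", re.I | re.S),
     "move the correlated subquery into a join or a named window if possible"),
]

def preprocess(query):
    """Return performance hints for one continuous query string."""
    return [hint for pattern, hint in RULES if pattern.search(query)]

q = "select * from SensorEvent.win:time(30 sec) where temperature > 80"
for hint in preprocess(q):
    print("hint:", hint)
```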

Flood Disaster Prediction and Prevention through Hybrid BigData Analysis (하이브리드 빅데이터 분석을 통한 홍수 재해 예측 및 예방)

  • Ki-Yeol Eom;Jai-Hyun Lee
    • The Journal of Bigdata
    • /
    • v.8 no.1
    • /
    • pp.99-109
    • /
    • 2023
  • Recently, not only Korea but also the rest of the world has experienced constant disasters such as typhoons, wildfires, and heavy rain. The property damage caused by typhoons and heavy rain in South Korea alone has exceeded 1 trillion won. These disasters cause significant loss of life and property damage, and recovery takes a considerable amount of time; in addition, the government's contingency funds are insufficient for the current situation. To prevent and respond effectively to these issues, accurate data must be collected and analyzed in real time. However, delays and data loss can occur depending on the environment where the sensors are located, the status of the communication network, and the receiving servers. In this paper, we propose a two-stage hybrid situation analysis and prediction algorithm that can analyze accurately even under such communication network conditions. In the first stage, river and stream water-level data are collected from diverse sensors of different types, filtered, refined, and stored in a big data store, and an AI rule-based inference algorithm is applied to analyze the crisis alert level. If the rainfall exceeds a certain threshold but the level remains below the alert level of interest, a second stage of deep learning image analysis is performed to determine the final crisis alert level.
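
The abstract only outlines the two-stage logic, so the names and thresholds in the sketch below are assumptions made for illustration: stage one applies rule-based inference to filtered water-level and rainfall readings, and only when rainfall exceeds its trigger while the rule-based level stays below the alert level of interest is a placeholder deep learning image classifier invoked to decide the final alert level.

```python
ALERT_LEVELS = ["normal", "watch", "warning", "severe"]   # assumed ordering

def stage1_rule_based(water_level_m, rainfall_mm_h):
    """Assumed rule table mapping sensor readings to a provisional alert level."""
    if water_level_m > 5.0 or rainfall_mm_h > 50:
        return "severe"
    if water_level_m > 3.5 or rainfall_mm_h > 30:
        return "warning"
    if water_level_m > 2.0 or rainfall_mm_h > 15:
        return "watch"
    return "normal"

def stage2_image_model(cctv_frame):
    """Placeholder for the deep learning image analysis; a real system would
    run an inundation classifier on river CCTV frames here."""
    return "warning"   # dummy output for the sketch

def crisis_alert(water_level_m, rainfall_mm_h, cctv_frame,
                 rainfall_trigger=20, level_of_interest="warning"):
    level = stage1_rule_based(water_level_m, rainfall_mm_h)
    # Escalate to image analysis only when rainfall is high but the
    # rule-based level is still below the alert level of interest.
    if (rainfall_mm_h > rainfall_trigger
            and ALERT_LEVELS.index(level) < ALERT_LEVELS.index(level_of_interest)):
        level = stage2_image_model(cctv_frame)
    return level

print(crisis_alert(water_level_m=1.8, rainfall_mm_h=25, cctv_frame=None))
```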