• Title/Summary/Keyword: data stream technology

Generation and Interpretation of data stream for position data of objects synchronized with video (비디오와 동기화된 물체의 위치정보 표현 data stream 생성 및 해석기 구현)

  • Na, Hee-Joo;Kim, Jung-Hwan;Jung, Moon-Ryul
    • Proceedings of the Korean Society of Broadcast Engineers Conference
    • /
    • 2005.11a
    • /
    • pp.249-254
    • /
    • 2005
  • This paper presents a method for generating a data stream that represents the position information of a specific object synchronized to a specific point in a video during a digital broadcast program, and an interpreter that decodes that position information at the corresponding time. Current commercial stream generators fail to properly produce the stream-event occurrence times recommended by the digital broadcasting standards, or the reference values used for decoding in the set-top box. Likewise, applications (Xlets) running on the set-top box cannot properly read time values such as the STC (System Time Clock), PCR (Program Clock Reference), and NPT (Normal Play Time). Moreover, the current digital broadcasting standards are limited in their ability to provide information about a specific object within the video. This paper therefore describes a method for generating a data stream that represents the position information of an object synchronized to a specific point in a video, and an application that processes the synchronized data, for the production of various interactive digital broadcast programs.
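
The interpreter side of this idea can be illustrated with a toy sketch: given position samples keyed by NPT, an application looks up (and interpolates) the object position at playback time. The sample format and names below are illustrative assumptions, not the paper's actual DSM-CC stream-event format.

```python
from bisect import bisect_right

# Toy position data stream: (npt_ms, x, y) samples keyed by Normal Play Time.
POSITION_STREAM = [
    (0,    100, 200),
    (1000, 150, 210),
    (2000, 220, 230),
]

def position_at(npt_ms, stream=POSITION_STREAM):
    """Interpret the stream: linearly interpolate the object position
    at the given NPT, as a set-top-box application (Xlet) might."""
    times = [t for t, _, _ in stream]
    i = bisect_right(times, npt_ms)
    if i == 0:                       # before the first sample
        return stream[0][1:]
    if i == len(stream):             # after the last sample
        return stream[-1][1:]
    (t0, x0, y0), (t1, x1, y1) = stream[i - 1], stream[i]
    f = (npt_ms - t0) / (t1 - t0)
    return (x0 + f * (x1 - x0), y0 + f * (y1 - y0))
```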

Multi-dimensional Query Authentication for On-line Stream Analytics

  • Chen, Xiangrui;Kim, Gyoung-Bae;Bae, Hae-Young
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.4 no.2
    • /
    • pp.154-173
    • /
    • 2010
  • Database outsourcing is unavoidable in the near future. In the scenario of data stream outsourcing, the data owner continuously publishes the latest data and associated authentication information through a service provider. Clients may register queries with the service provider and verify the correctness of the results using the additional authentication information. Research on On-line Stream Analytics (OLSA) is motivated by extending data cube technology to provide higher multi-level abstraction over low-level-abstracted data streams. Existing work on OLSA fails to consider database outsourcing, while previous work on stream authentication does not support OLSA. To close this gap and solve the problem of OLSA query authentication when outsourcing data streams, we propose MDAHRB and MDAHB, two multi-dimensional authentication approaches. They are based on the general data model for OLSA, the stream cube. First, we improve the data structure of the H-tree, which is used to store the stream cube. Then, we design and implement two authentication schemes based on the improved H-trees, the HRB- and HB-trees, in accordance with the mainstream query authentication framework for database outsourcing. Along with a cost model analysis, consistent with state-of-the-art cost metrics, an experimental evaluation is performed on a real data set. The results show that both MDAHRB and MDAHB are feasible for authenticating OLSA queries, while MDAHRB is more scalable.
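
The core mechanism behind such query authentication frameworks is an authenticated data structure: the owner signs a digest of the data, the provider returns answers with a verification path, and the client re-derives the digest. A minimal sketch of that general idea using a plain Merkle hash tree follows; the paper's HRB-/HB-tree structures are not reproduced here.

```python
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(leaves):
    """Owner side: fold leaf hashes pairwise up to a single (signed) root."""
    level = [h(leaf) for leaf in leaves]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate last node if odd
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def merkle_proof(leaves, index):
    """Service provider side: sibling hashes needed to re-derive the root."""
    level = [h(leaf) for leaf in leaves]
    proof = []
    while len(level) > 1:
        if len(level) % 2:
            level.append(level[-1])
        sib = index ^ 1
        proof.append((level[sib], sib < index))   # (sibling hash, is-left?)
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
        index //= 2
    return proof

def verify(leaf, proof, root):
    """Client side: check a query answer against the owner's root digest."""
    node = h(leaf)
    for sibling, is_left in proof:
        node = h(sibling + node) if is_left else h(node + sibling)
    return node == root
```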

Uncertainty quantification of PWR spent fuel due to nuclear data and modeling parameters

  • Ebiwonjumi, Bamidele;Kong, Chidong;Zhang, Peng;Cherezov, Alexey;Lee, Deokjung
    • Nuclear Engineering and Technology
    • /
    • v.53 no.3
    • /
    • pp.715-731
    • /
    • 2021
  • Uncertainties are calculated for pressurized water reactor (PWR) spent nuclear fuel (SNF) characteristics. The deterministic code STREAM is currently being used as an SNF analysis tool to obtain isotopic inventory, radioactivity, decay heat, and neutron and gamma source strengths. The SNF analysis capability of STREAM was recently validated; however, an uncertainty analysis had yet to be conducted. To estimate the uncertainty due to nuclear data, STREAM is used to perturb nuclear cross section (XS) and resonance integral (RI) libraries produced by NJOY99. The perturbation of XS and RI involves the stochastic sampling of ENDF/B-VII.1 covariance data. To estimate the uncertainty due to modeling parameters (fuel design and irradiation history), surrogate models are built based on polynomial chaos expansion (PCE), and variance-based sensitivity indices (i.e., Sobol' indices) are employed to perform global sensitivity analysis (GSA). The calculation results indicate that the uncertainties of SNF characteristics due to modeling parameters are also very important and can contribute significantly alongside the uncertainties due to nuclear data. In addition, the surrogate model offers a computationally efficient approach, with significantly reduced computation time, to accurately evaluate the uncertainties of SNF integral characteristics.
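
The variance-based sensitivity measure used here, the first-order Sobol' index S_i = V[E[Y|X_i]] / V[Y], can be estimated by plain Monte Carlo. The sketch below uses a Saltelli-style estimator on a toy linear model with uniform inputs; it is a stand-in for illustration only, not the paper's PCE surrogate or the STREAM code.

```python
import random

def first_order_sobol(f, dim, n=20000, seed=0):
    """Estimate first-order Sobol' indices S_i = V[E[Y|X_i]] / V[Y]
    by Monte Carlo (Saltelli-style estimator) with U(0,1) inputs."""
    rng = random.Random(seed)
    A = [[rng.random() for _ in range(dim)] for _ in range(n)]
    B = [[rng.random() for _ in range(dim)] for _ in range(n)]
    yA = [f(x) for x in A]
    yB = [f(x) for x in B]
    mean = sum(yA) / n
    var = sum((y - mean) ** 2 for y in yA) / n
    indices = []
    for i in range(dim):
        # C_i: columns from B, except column i taken from A
        yC = [f(b[:i] + [a[i]] + b[i + 1:]) for a, b in zip(A, B)]
        s_i = (sum(ya * yc for ya, yc in zip(yA, yC)) / n
               - (sum(yA) / n) * (sum(yB) / n)) / var
        indices.append(s_i)
    return indices

# Toy model: Y = 3*X1 + X2, so analytically S1 = 9/10 and S2 = 1/10.
S = first_order_sobol(lambda x: 3 * x[0] + x[1], dim=2)
```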

Stream Data Processing based on Sliding Window at u-Health System (u-Health 시스템에서 슬라이딩 윈도우 기반 스트림 데이터 처리)

  • Kim, Tae-Yeun;Song, Byoung-Ho;Bae, Sang-Hyun
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.4 no.2
    • /
    • pp.103-110
    • /
    • 2011
  • Accurate and efficient management of the digital data measured by sensors is necessary in a u-health system. It is not efficient for a sensor network to store massive input stream data in a database and process them at the same time. We propose a method to improve the processing performance for multidimensional stream data arriving continuously from multiple sensors. We process queries based on a sliding window for efficient input stream handling, build multiple query plans using the MJoin method, and reduce the stored data using a backpropagation algorithm. As a result, we obtained an efficient reduction of about 18.3% in database size using 14,324 data sets.
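
The basic building block here, a sliding window over a sensor stream, can be sketched in a few lines: keep only the most recent readings and maintain a running aggregate incrementally. This is a minimal illustration of windowed processing; the paper's MJoin planning and backpropagation-based reduction are not reproduced.

```python
from collections import deque

class SlidingWindow:
    """Count-based sliding window: keeps only the most recent `size`
    readings and maintains a running sum so the aggregate is O(1)."""
    def __init__(self, size):
        self.size = size
        self.buf = deque()
        self.total = 0.0

    def push(self, value):
        self.buf.append(value)
        self.total += value
        if len(self.buf) > self.size:       # expire the oldest reading
            self.total -= self.buf.popleft()

    def average(self):
        return self.total / len(self.buf) if self.buf else None
```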

Java-based Embedded System Design and Implementation for Real-time Stream Data Processing (Java 기반 실시간 센서 데이터스트림처리 및 임베디드 시스템 구현)

  • Kim, Hyu-Chan;Ko, Wan-Ki;Park, Sang-Yeol
    • Journal of Korea Society of Digital Industry and Information Management
    • /
    • v.4 no.2
    • /
    • pp.1-12
    • /
    • 2008
  • A home network is a technology that enables monitoring, control, and mutual recognition among arbitrary home network devices in a residence. Currently, home networks coexist in a disorderly fashion with heterogeneous networks such as entertainment and residential electronics networks. Given the high expectations for home network systems, such as interoperation among various devices, a unifying technology is required to meet those expectations conveniently. This paper describes the design and implementation of a Java-based embedded middleware for processing sensor data streams in real time, through which Java applications and interoperable products can be developed with convenient interfaces to handle the various sensors that generate real-time data streams on the Java platform.

Development of an Event Stream Processing System for the Vehicle Telematics Environment

  • Kim, Jong-Ik;Kwon, Oh-Cheon;Kim, Hyun-Suk
    • ETRI Journal
    • /
    • v.31 no.4
    • /
    • pp.463-465
    • /
    • 2009
  • In this letter, we present an event stream processing system that can evaluate a pattern query for a data sequence with predicates. We propose a pattern query language and develop a pattern query processing system. In our system, we propose novel techniques for run-time aggregation and negation processing and apply our system to stream data generated from vehicles to monitor unusual driving patterns.
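
A pattern query with negation of the kind described (a sequence of events with predicates, where an intervening event cancels the match) can be sketched with a small scan over the stream. The event schema, predicates, and "hard brake then acceleration" pattern below are hypothetical; the letter's actual query language is richer.

```python
def match_pattern(events, start, end, negated):
    """Find `start` followed by `end` with no `negated` event in
    between; each argument is a predicate over a single event."""
    matches = []
    open_starts = []                 # pending start events
    for ev in events:
        if negated(ev):
            open_starts.clear()      # negation cancels pending matches
        if end(ev):
            matches.extend((s, ev) for s in open_starts)
            open_starts.clear()
        if start(ev):
            open_starts.append(ev)
    return matches

# Hypothetical vehicle events: (kind, speed)
stream = [("brake", 60), ("accel", 80), ("turn", 30),
          ("brake", 90), ("accel", 95)]
# Unusual driving pattern: hard brake (speed > 50) then acceleration,
# with no turn in between.
hits = match_pattern(stream,
                     start=lambda e: e[0] == "brake" and e[1] > 50,
                     end=lambda e: e[0] == "accel",
                     negated=lambda e: e[0] == "turn")
```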

Performance Evaluation and Analysis of Multiple Scenarios of Big Data Stream Computing on Storm Platform

  • Sun, Dawei;Yan, Hongbin;Gao, Shang;Zhou, Zhangbing
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.12 no.7
    • /
    • pp.2977-2997
    • /
    • 2018
  • In the big data era, fresh data grows rapidly every day. More than 30,000 gigabytes of data are created every second, and the rate is accelerating. Many organizations rely heavily on real-time streaming, and big data stream computing helps them spot opportunities and risks in real-time big data. Storm, one of the most common online stream computing platforms, has been used for big data stream computing, with response times ranging from milliseconds to sub-seconds. The performance of Storm plays a crucial role in different application scenarios; however, few studies have evaluated it. In this paper, we investigate the performance of Storm under different application scenarios. Our experimental results show that the throughput and latency of Storm are greatly affected by the number of instances of each vertex in the task topology and by the number of available resources in the data center. The fault-tolerant mechanism of Storm works well in most big data stream computing environments. As a result, it is suggested that a dynamic topology, an elastic scheduling framework, and a memory-based fault-tolerant mechanism are necessary for providing high-throughput and low-latency services on the Storm platform.

An Application of Physico-Environmental Evaluation System of Stream - Focusing on urban streams - (하천의 물리 환경성 평가체계의 적용 - 도시하천을 중심으로 -)

  • Jung, Hea-Reyn;Kim, Ki-Heung
    • Journal of the Korean Society of Environmental Restoration Technology
    • /
    • v.20 no.1
    • /
    • pp.55-75
    • /
    • 2017
  • The purpose of this study is to provide basic data for restoring the physical stream environment by analyzing habitat variables, since the habitat environment changes when waterfront spaces are constructed in urban streams. The assessment results for 10 habitat variables (in three divisions) were close to the optimal condition in the reach of the reference stream, where there are no stream-crossing structures or channel alterations. The assessment results for reaches of urban streams improved for water-friendly recreation activities indicated a marginal condition, because the habitat environment had deteriorated due to stream improvement works such as the construction of weirs for water landscaping, stepping stones for walking, low-water and high-water revetments, and high-water channels. In addition, in the case of mid-gradient streams, riffles were infrequent or absent because the intervals between river-crossing structures were short. In the case of mild-gradient streams, the diversity of pools was damaged by the deposition of sludge in the pool upstream of each weir and by the installation of low-water revetments.

WT-Heuristics: An Efficient Filter Operator Ordering Technology in Stream Data Environments (WT-Heuristics: 스트림 데이터 환경에서의 효율적인 필터 연산자 순서화 기법)

  • Min, Jun-Ki
    • The KIPS Transactions:PartD
    • /
    • v.15D no.2
    • /
    • pp.163-170
    • /
    • 2008
  • Due to the proliferation of the Internet and intranets, a new application domain called stream data processing has emerged. Stream data is generated continuously and in real time. In this paper, we focus on the processing of stream data whose characteristics vary unpredictably over time. In particular, we suggest a method called WT-Heuristics which generates an efficient operator execution order. WT-Heuristics determines the operator execution order efficiently since it considers only two adjacent operators in the execution order. Also, our method changes the execution order in response to changes in data characteristics with minimal overhead.
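
For independent filters with known selectivities and per-tuple costs, the classic greedy ordering applies the operator that drops the most tuples per unit cost first, i.e. sorts by (1 − selectivity) / cost descending. The sketch below shows that baseline idea only; WT-Heuristics itself additionally adapts the order at run time by comparing adjacent operators, which is not reproduced here.

```python
def order_filters(filters):
    """Greedy ordering of independent filter operators: highest
    drop-rate-per-unit-cost first."""
    return sorted(filters, key=lambda f: (1 - f["sel"]) / f["cost"],
                  reverse=True)

def expected_cost(ordered):
    """Expected per-tuple cost: each operator only sees tuples that
    survived all earlier operators."""
    cost, pass_rate = 0.0, 1.0
    for f in ordered:
        cost += pass_rate * f["cost"]
        pass_rate *= f["sel"]
    return cost

filters = [
    {"name": "f1", "sel": 0.9, "cost": 1.0},
    {"name": "f2", "sel": 0.1, "cost": 2.0},
    {"name": "f3", "sel": 0.5, "cost": 1.0},
]
best = order_filters(filters)
```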

A Study on Transport Stream Analysis and Parsing Ability Enhancement in Digital Broadcasting and Service (디지털 방송 서비스에서 트랜스포트 스트림 분석 및 파싱 능력 향상에 관한 연구)

  • Kim, Jang-Won
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.10 no.6
    • /
    • pp.552-557
    • /
    • 2017
  • Wired and wireless digital broadcasting has expanded sharply since the advent of high-definition TV in 2010, and the use of duplex (interactive) contents as well as simplex contents has rapidly increased. Our satellite communications system currently adopts DVB, defined by the European digital broadcasting standardization organization, as the standard for domestic data broadcasting, and methods for using selective contents have been studied in various ways along with the development of IPTV. Digital broadcasting multiplexes information into Transport Stream Packets (TSP) in order to send multimedia information such as MPEG-2 video, audio, and data; these streams include detailed information on the TV guide and programs as well as the video and audio information itself. To aid the understanding of such data broadcasting systems, this study implements a TS analyzer that divides a transport stream (TS) into packets in a Linux environment and analyzes and prints them by function; it can help the understanding of TS and enhance stream parsing ability.
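
The first step any such TS analyzer performs is parsing the fixed 4-byte header of each 188-byte packet, whose field layout is defined by ISO/IEC 13818-1. A minimal sketch follows; the sample packet is synthetic, and the study's analyzer is implemented in a Linux environment rather than in this form.

```python
def parse_ts_header(packet: bytes) -> dict:
    """Parse the 4-byte header of one 188-byte MPEG-2 Transport Stream
    packet (ISO/IEC 13818-1)."""
    if len(packet) != 188 or packet[0] != 0x47:   # 0x47 is the sync byte
        raise ValueError("not a valid TS packet")
    return {
        "transport_error":    bool(packet[1] & 0x80),
        "payload_unit_start": bool(packet[1] & 0x40),
        "transport_priority": bool(packet[1] & 0x20),
        "pid":                ((packet[1] & 0x1F) << 8) | packet[2],
        "scrambling_control": (packet[3] >> 6) & 0x03,
        "adaptation_field":   (packet[3] >> 4) & 0x03,
        "continuity_counter": packet[3] & 0x0F,
    }

# Synthetic packet: PID 0x0000 (PAT), payload-unit-start set, counter 5.
pat_packet = bytes([0x47, 0x40, 0x00, 0x15]) + bytes(184)
```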