• Title/Summary/Keyword: Data Requirement

Search results: 1,800

IMAGE DATA CHAIN ANALYSIS FOR SATELLITE CAMERA ELECTRONIC SYSTEM

  • Park, Jong-Euk; Kong, Jong-Pil; Heo, Haeng-Pal; Kim, Young-Sun; Chang, Young-Jun
    • Proceedings of the KSRS Conference / v.2 / pp.791-793 / 2006
  • In the satellite camera, the incoming light is converted to analog electrical signals by electronic components such as CCD (Charge Coupled Device) detectors. The analog signals are amplified, biased, and converted into digital signals (a pixel data stream) in the video processor (A/Ds). The A/D outputs are digitally multiplexed and driven out over differential line drivers (two pairs of wires) to meet the cross-strap requirement. The MSC (Multi-Spectral Camera) on KOMPSAT-2, a LEO spacecraft, will be used to generate observation imagery data in two main channels. The MSC obtains data for high-resolution images by converting incoming light from the earth into a digital stream of pixel data. The video data outputs are then multiplexed, converted into 8-bit bytes, serialized, and transmitted to the NUC (Non-Uniformity Correction) module by the Hotlink data transmitter. In this paper, the video data streams, the video data format, and the image data processing routine for the satellite camera are described in terms of the satellite camera control hardware. An advanced satellite with very high resolution requires a faster and more complex image data chain than this one, so an effective evolution of the current image data chain and a fast video data transmission method are also discussed; an illustrative multiplex-and-pack sketch follows this entry.

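The abstract above outlines a chain in which A/D outputs from several channels are multiplexed, packed into 8-bit bytes, and serialized. The following Python sketch illustrates only that multiplex-and-pack step under stated assumptions (two channels, 10-bit pixel words, MSB-first packing); it is not the MSC or NUC implementation.

```python
from typing import Iterable, List

def multiplex(channels: List[List[int]]) -> List[int]:
    """Interleave pixel words from several A/D channels into one stream.

    Assumes every channel carries the same number of pixel words per line.
    """
    return [word for words in zip(*channels) for word in words]

def to_bytes(words: Iterable[int], word_bits: int = 10) -> bytes:
    """Pack fixed-width pixel words into a plain 8-bit byte stream (MSB first).

    The 10-bit word width is an assumption for illustration only.
    """
    bits = 0
    nbits = 0
    out = bytearray()
    for w in words:
        bits = (bits << word_bits) | (w & ((1 << word_bits) - 1))
        nbits += word_bits
        while nbits >= 8:
            nbits -= 8
            out.append((bits >> nbits) & 0xFF)
    if nbits:                       # pad the final partial byte with zeros
        out.append((bits << (8 - nbits)) & 0xFF)
    return bytes(out)

# Example: two video channels, four 10-bit pixel words each
ch_a = [0x3FF, 0x001, 0x155, 0x2AA]
ch_b = [0x000, 0x3FE, 0x0F0, 0x10F]
stream = to_bytes(multiplex([ch_a, ch_b]))
print(stream.hex())
```

In real camera electronics this packing and framing is done in hardware; the point of the sketch is only how fixed-width pixel words map onto an 8-bit byte stream.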

Software Buffering Technique For Real-time Recording of High Speed Satellite Data

  • Shin, Dong-Seok; Choi, Wook-Hyun; Kim, Moon-Gyu; Park, Won-Kyu
    • Korean Journal of Remote Sensing / v.18 no.3 / pp.147-153 / 2002
  • The real-time reception and recording of down-link mission data from a satellite requires the highest reliability because data lost in the receiving process cannot be recovered. Data receiving and recording systems have moved from sets of dedicated hardware and software components to commercial-off-the-shelf (COTS) components, both to reduce system cost and to make it easier to upgrade the system to handle other satellite data. The use of COTS hardware and middleware components prevents the system developer from correcting or modifying their internal operations; hence, instantaneous performance degradation of the COTS components, which affects reliable data acquisition, must be compensated for by a software algorithm. This paper introduces the instantaneous performance problem of a COTS data recording device that leads to data loss in the real-time reception and recording process, which in turn raises the requirement to modify the conventional data read/write technique. To overcome the data loss caused by combining COTS components with the conventional software technique, a new algorithm called a software buffering technique is proposed. Experiments show that applying the proposed technique results in reliable real-time reception and recording of high-speed serial data; a minimal producer-consumer sketch of this idea follows this entry.
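
As a rough illustration of the software buffering idea in the abstract above, the sketch below decouples a receiving thread from a recording thread with a bounded in-memory queue, so that a momentary slowdown of the recording device does not stall reception. Buffer depth, block size, and the file-like `source`/`sink` objects are assumptions, not the authors' implementation.

```python
import queue
import threading

BLOCK_SIZE = 64 * 1024          # assumed read/write block size
BUFFER_BLOCKS = 256             # in-memory buffer depth; absorbs slow writes

def receiver(source, buf: queue.Queue) -> None:
    """Producer: read blocks from the (serial) source and buffer them in RAM."""
    while True:
        block = source.read(BLOCK_SIZE)
        if not block:
            buf.put(None)       # end-of-stream marker
            return
        buf.put(block)          # blocks only if the buffer is completely full

def recorder(buf: queue.Queue, sink) -> None:
    """Consumer: drain buffered blocks to disk at whatever rate the device allows."""
    while True:
        block = buf.get()
        if block is None:
            return
        sink.write(block)       # a momentary slowdown here does not stall the receiver

def record(source, sink) -> None:
    buf: queue.Queue = queue.Queue(maxsize=BUFFER_BLOCKS)
    t_rx = threading.Thread(target=receiver, args=(source, buf))
    t_wr = threading.Thread(target=recorder, args=(buf, sink))
    t_rx.start(); t_wr.start()
    t_rx.join(); t_wr.join()
```

A deeper buffer absorbs longer device stalls at the cost of memory; sizing it against the worst stall observed on the recording device is the essential design decision.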

Implementing Data warehouse Methodology Architecture: From Metadata Perspective

  • Kim, Sang-Youl; Kim, Tae-Hun
    • International Commerce and Information Review / v.7 no.1 / pp.55-74 / 2005
  • Recently, many enterprises have attempted to construct data warehousing systems for decision support. A data warehouse (DW) is an intelligent store of data that can aggregate vast amounts of information. Building a DW involves two important development issues: (i) the DW itself for the decision making of business users and (ii) the metadata within it. Most DW development methodologies have not considered metadata development, so it is necessary to adopt a methodology that develops a DW and its metadata simultaneously. Metadata is a key to the success of a data warehousing system and is critical for implementing a DW. That is, metadata is crucial documentation for a data warehousing system in which users should be empowered to meet their own information needs: users need to know what data exists, what it represents, where it is located, and how to access it. Furthermore, metadata is used for extracting data and managing the DW. However, metadata efforts have often failed because metadata management has been segregated from the DW development process; metadata must be integrated with the data warehousing system. Without metadata, the decision support offered by the DW remains under the control of technical users. Integrating a data warehouse with its metadata therefore offers an opportunity to create a more adaptive information system. This paper proposes a DW development methodology from a metadata perspective. The proposed methodology consists of five phases: preparatory, requirement analysis, data warehouse (informational database) development, metastore development, and maintenance. To demonstrate the practical usefulness of the methodology, one case is illustrated; a toy metastore sketch follows this entry.

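To make concrete the abstract's claim that metadata should tell users what data exists, what it represents, where it is located, and how to access it, here is a toy metastore sketch; all class and field names are hypothetical and do not reproduce the paper's metastore design.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class MetadataEntry:
    """One business-level description of a warehouse table or data mart."""
    name: str            # what data exists
    description: str     # what it represents, in business terms
    location: str        # where it is located (schema.table, file path, ...)
    access: str          # how to access it (SQL view, API, report name, ...)
    source_systems: List[str] = field(default_factory=list)

class MetaStore:
    """A toy metastore that business users can query by keyword."""
    def __init__(self) -> None:
        self._entries: Dict[str, MetadataEntry] = {}

    def register(self, entry: MetadataEntry) -> None:
        self._entries[entry.name] = entry

    def search(self, keyword: str) -> List[MetadataEntry]:
        kw = keyword.lower()
        return [e for e in self._entries.values()
                if kw in e.name.lower() or kw in e.description.lower()]

store = MetaStore()
store.register(MetadataEntry(
    name="sales_fact",
    description="Daily sales amount by product and store",
    location="dw.sales_fact",
    access="SELECT ... FROM dw.sales_fact",
    source_systems=["POS", "ERP"],
))
print([e.location for e in store.search("sales")])
```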

IoT-Based Health Big-Data Process Technologies: A Survey

  • Yoo, Hyun; Park, Roy C.; Chung, Kyungyong
    • KSII Transactions on Internet and Information Systems (TIIS) / v.15 no.3 / pp.974-992 / 2021
  • Recently, the healthcare field has undergone rapid changes owing to the accumulation of health big data and the development of machine learning. Data mining research in healthcare has characteristics that differ from other data analyses, such as the structural complexity of medical data, the requirement for medical expertise, and the security of personal medical information. Various methods have been implemented to address these issues, including machine learning models and cloud platforms. However, machine learning models present the problem of opaque result interpretation, and cloud platforms require more in-depth research on security and efficiency. To address these issues, this paper surveys recent technologies for Internet-of-Things-based (IoT-based) health big data processing. We present a cloud-based IoT health platform and health big data processing technology that reduce medical data management costs and enhance safety. We also present data mining technology for health-risk prediction, which is the core of healthcare. Finally, we propose a study using explainable artificial intelligence that enhances the reliability and transparency of the decision-making system, often called a black-box model owing to its lack of transparency; a small interpretable-model sketch follows this entry.
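
As a loose illustration of the abstract's point about explainable models for health-risk prediction, the sketch below fits a logistic regression on synthetic data and reads its coefficients as a global explanation of the decision rule; the features, data, and model choice are assumptions for illustration only, not the survey's methods.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Synthetic, illustrative health features (not real medical data)
rng = np.random.default_rng(0)
features = ["age", "bmi", "systolic_bp", "glucose"]
X = rng.normal(size=(500, len(features)))
# Assumed ground truth: risk driven mainly by glucose and blood pressure
logits = 1.5 * X[:, 3] + 1.0 * X[:, 2] - 0.2 * X[:, 0]
y = (logits + rng.normal(scale=0.5, size=500) > 0).astype(int)

# A linear model is one of the simplest "white box" choices:
# its coefficients double as a global explanation of the prediction.
model = LogisticRegression().fit(X, y)
for name, coef in sorted(zip(features, model.coef_[0]), key=lambda t: -abs(t[1])):
    print(f"{name:12s} weight={coef:+.2f}")
```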

RESEARCH OF COMMUNICATION SCHEDULING BETWEEN COMPUTER I/O AND S/W FOR ACQUISITION OF SATELLITE SENSORED DATA

  • Koo, Cheol-Hea; Park, Su-Hyun; Kang, Soo-Yeon; Yang, Koon-Ho; Choi, Sung-Bong
    • Proceedings of the KSRS Conference / v.1 / pp.196-199 / 2006
  • Various communication mechanisms have been developed to acquire meaningful data from sensors. The key requirements during sensor data acquisition are determinism and reduced time dependency. This is a fundamental level of satellite data management, in which the satellite operation software controls data acquisition from sensors and subsystems. The satellite operation software has various other software modules to run in addition to data acquisition, so unnecessary time delay in performing the acquisition must be minimized. As a result, the interrupt method may be preferred to the polling method, because it reduces the design restrictions when implementing the data acquisition logic. Possible problems with the interrupt method, such as interrupt latency caused by delayed interrupt processing, are analyzed. In this paper, a communication mechanism that uses interrupts to interface the satellite computer with a subsidiary subsystem is presented, and the time dependency between software scheduling and data acquisition is analyzed; a toy interrupt-versus-polling sketch follows this entry.

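The interrupt-versus-polling trade-off discussed in the abstract can be mimicked in a small host-side simulation: a device thread "interrupts" by invoking a registered callback that only enqueues the sample, leaving the main loop free for other modules, whereas a polling design would spin on a status check. The sample period, callback mechanism, and counts are assumptions; this is not the flight software design.

```python
import queue
import threading
import time

class SensorDevice:
    """Toy device: produces a sample every 10 ms on its own thread and
    'interrupts' the host by invoking the registered callback (the ISR)."""
    def __init__(self, isr, n_samples: int = 100) -> None:
        self._isr = isr
        self._n = n_samples
        threading.Thread(target=self._run, daemon=True).start()

    def _run(self) -> None:
        for i in range(self._n):
            time.sleep(0.010)           # assumed 10 ms sample period
            self._isr(i)                # plays the role of a hardware interrupt

# Interrupt-style acquisition: the ISR only queues the sample, keeping the
# latency-critical path short; the main loop stays free for other modules.
# A polling design would instead spin here checking a status flag, wasting
# scheduler time and coupling acquisition timing to the software schedule.
samples: queue.Queue = queue.Queue()
SensorDevice(samples.put)

acquired = [samples.get() for _ in range(100)]
print("acquired", len(acquired), "samples without busy polling")
```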

Annotation Method for Reliable Video Data (신뢰성 영상자료를 위한 어노테이션 기법)

  • Yun-Hee Kang; Taeun Kwon
    • Journal of Platform Technology / v.12 no.1 / pp.77-84 / 2024
  • With the recent increase in the use of artificial intelligence, AI TRiSM data management within organizations has become important, and securing data reliability has therefore emerged as an essential requirement for data-based decision-making. Digital content is transmitted over the unreliable Internet to the cloud, where the digital content store is located, and is then used in various applications. When data anomalies are detected, it is difficult for digital content systems to provide a function that checks whether the content has been modified or damaged. In this paper, we design a technique that guarantees the reliability of video data by extending the data annotation function. The designed annotation technique is built as a gRPC-based prototype that handles requests and responses in a web UI and generates a classification label and a Merkle tree for the given video data; a minimal Merkle-tree sketch follows this entry.

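A minimal sketch of the Merkle-tree part of the annotation described above: the video is split into chunks, hashed pairwise up to a root, and the root is stored with the classification label so that later modification of any chunk can be detected. The chunk size and SHA-256 are assumptions, and the gRPC service and web UI of the prototype are omitted.

```python
import hashlib
from typing import List

def sha256(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def merkle_root(chunks: List[bytes]) -> bytes:
    """Compute the Merkle root of a list of video chunks.

    The root acts as a compact fingerprint: any later modification of any
    chunk changes the root, which is how the annotation can flag tampering.
    """
    level = [sha256(c) for c in chunks] or [sha256(b"")]
    while len(level) > 1:
        if len(level) % 2:                   # duplicate the last node on odd levels
            level.append(level[-1])
        level = [sha256(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

def video_annotation(video: bytes, label: str, chunk_size: int = 1 << 20) -> dict:
    """Assumed annotation record: a classification label plus the Merkle root."""
    chunks = [video[i:i + chunk_size] for i in range(0, len(video), chunk_size)]
    return {"label": label, "merkle_root": merkle_root(chunks).hex()}

print(video_annotation(b"\x00" * (3 << 20), label="traffic_camera"))
```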

ANALYSIS OF DYNAMIC PRIORITY QUEUE WITH APPLICATIONS IN ATM NETWORKS

  • Choi, Doo-Il; Lee, Yu-Tae
    • Journal of Applied Mathematics & Informatics / v.7 no.2 / pp.617-627 / 2000
  • ATM networks support diverse traffic types with different service requirements, such as data, voice, video, and image. This paper analyzes a dynamic priority queue designed to satisfy the Quality of Service (QoS) requirements of such traffic. To capture the burstiness of the traffic, the arrivals are assumed to follow a Markovian arrival process (MAP). Performance measures such as loss and delay are derived, and some numerical results illustrate the performance of the system; the standard MAP definition is recalled after this entry.
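
For readers unfamiliar with the arrival model named in the abstract, the standard definition of a Markovian arrival process is recalled below in textbook notation (the paper's specific parameter values are not reproduced). A MAP on m phases is specified by two m x m matrices: D_1 >= 0 contains the rates of phase transitions accompanied by an arrival, while D_0 contains the remaining rates, with nonnegative off-diagonal and negative diagonal entries.

```latex
% D = D_0 + D_1 is the irreducible generator of the underlying phase process;
% \pi is its stationary distribution and \lambda the mean arrival rate.
\[
  D = D_0 + D_1, \qquad \pi D = 0, \quad \pi \mathbf{1} = 1,
  \qquad \lambda = \pi D_1 \mathbf{1}.
\]
```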

The Architecting and Interface Management for Standard and Regulation of Urban Transit System (도시철도시스템의 표준화기준 아키텍쳐 구축 및 인터페이스 관리방안 연구)

  • Lee, Wool-Dong; Chung, Jong-Duk; Kim, In-Goo
    • Proceedings of the KSR Conference / 2011.10a / pp.1235-1240 / 2011
  • An urban transit system consists of complex subsystems such as vehicles, signalling, train control equipment, railway facilities, electricity facilities, station facilities, and information and communications facilities. This study therefore concerns the construction of a management system for such a complex system. Using a systems-engineering tool, Step 2 of the urban transit standardization effort builds the architecture for urban transit standardization, together with requirement management and database construction. The study is expected to provide a more efficient and methodical information system to all users of the standardization.


Analysis of the Characteristics of Nutrients Loading and the Water Purification Function in the Paddy-fields (논의 영양물질 배출부하 특성과 수질정화 기능 분석)

  • Kim Hyoon Soo; Kim Jin Soo; Kim Young Il; Cheong Byeong Ho
    • KCID Journal / v.11 no.1 / pp.36-44 / 2004
  • The objective of this study is to analyze the characteristics of nutrient loading and the water purification function of paddy fields. The study was carried out based on research data from three case studies. 1. Irrigation requirement is m


MRP System with Emergent Lead Time (긴급선행기간을 이용한 MRP 시스템)

  • Nam, Sun-Hee; Yun, Won-Young
    • IE Interfaces / v.4 no.1 / pp.47-61 / 1991
  • This paper develops an MRP system with two types of lead time: an average lead time and an emergent lead time. In the proposed MRP system, material requirements planning is scheduled using the average lead time, and the emergent lead time is used only when the start date of a planned order falls in the past. A Btrieve data management technique and a stack structure are used for the planned-order recalculation procedure, implemented in TURBO PASCAL Version 5.5. An example is also considered; a small scheduling-rule sketch follows this entry.

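A hedged sketch of the scheduling rule described in the abstract above: planned orders are offset from their due date by the average lead time, and the emergent lead time is substituted only when the resulting start date has already passed. The dates and helper name are invented for illustration; this is not the original Turbo Pascal/Btrieve implementation.

```python
from datetime import date, timedelta

def plan_order(due: date, avg_lead_days: int, emergent_lead_days: int,
               today: date) -> tuple[date, str]:
    """Backward-schedule one planned order.

    Offset by the average lead time first; if that start date is already in
    the past, fall back to the (shorter) emergent lead time instead.
    """
    start = due - timedelta(days=avg_lead_days)
    if start >= today:
        return start, "average"
    return due - timedelta(days=emergent_lead_days), "emergent"

today = date(2024, 3, 1)                       # illustrative dates only
print(plan_order(date(2024, 3, 20), avg_lead_days=10, emergent_lead_days=4, today=today))
print(plan_order(date(2024, 3, 5),  avg_lead_days=10, emergent_lead_days=4, today=today))
```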