• Title/Summary/Keyword: Component Utilization (컴포넌트 활용)


A Study on the Implementation of Management System Based on UHD Transmission Contents (UHD 송출 콘텐츠 기반 관리시스템 구현)

  • Kim, Moo Yeon;Jang, Byung Min;Choi, Seong Jhin
    • Journal of Broadcast Engineering / v.24 no.5 / pp.813-826 / 2019
  • This paper is a study on the implementation of MAM (Media Asset Management) to utilize UHD contents as high-quality broadcast material. The implementation method separates MAM roles into content management functions and transmission workflow functions, dividing the workflow, metadata, and system interface related work into a core MAM and a MAM-Ex structure. Through the proposed method, we improved content management by applying a page-menu method to material metadata modification and a template method to the material structure API. In addition, the storage of UHD material and the component server configuration are pooled without any distinction of channels, which minimizes the movement of contents and, together with the protection of broadcast material, enhances the security of UHD transmission assets.
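
The abstract mentions applying the template method to the material structure API but gives no code. The sketch below is only an illustration of what such a template-method interface could look like in Python; all class and method names (MaterialStructureTemplate, UhdMaterialStructure, register) are hypothetical, not from the paper.

```python
from abc import ABC, abstractmethod


class MaterialStructureTemplate(ABC):
    """Template method: the overall registration flow is fixed,
    while format-specific steps are supplied by subclasses."""

    def register(self, material: dict) -> dict:
        # Fixed skeleton of the material-structure API call.
        self.validate(material)
        record = self.build_structure(material)
        record["metadata"] = self.extract_metadata(material)
        return record

    @abstractmethod
    def validate(self, material: dict) -> None: ...

    @abstractmethod
    def build_structure(self, material: dict) -> dict: ...

    @abstractmethod
    def extract_metadata(self, material: dict) -> dict: ...


class UhdMaterialStructure(MaterialStructureTemplate):
    def validate(self, material: dict) -> None:
        if material.get("resolution") != "3840x2160":
            raise ValueError("not a UHD material")

    def build_structure(self, material: dict) -> dict:
        return {"id": material["id"], "essence": material["path"]}

    def extract_metadata(self, material: dict) -> dict:
        return {"title": material.get("title", ""),
                "codec": material.get("codec", "")}


if __name__ == "__main__":
    api = UhdMaterialStructure()
    print(api.register({"id": "M001", "path": "/storage/m001.mxf",
                        "resolution": "3840x2160", "title": "News opening"}))
```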

Compiler Optimization Techniques for The Next Generation Low Power Multibank Memory (차세대 저전력 멀티뱅크 메모리를 위한 컴파일러 최적화 기법)

  • Cho, Doosan
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.21 no.6 / pp.141-145 / 2021
  • Various types of memory architecture have been developed, and various compiler optimization techniques have been studied to use them efficiently. In particular, since memory is a major component that determines performance in mobile computing devices, various optimization techniques have been developed to support it. Recently, much research on hybrid memory architectures has been conducted, so various compiler techniques are being studied to support them. Existing compiler optimization techniques can be used to meet the minimum performance and low-power constraints required by the market. However, references for determining the low-power effect and the degree of performance improvement obtainable with these optimization techniques are not yet properly provided. This study was conducted to provide the experimental results of existing compiler techniques as a reference for the development of multibank memory architectures.

Security Enhancements for Distributed Ledger Technology Systems Based on Open Source (오픈소스 기반 분산원장기술 시스템을 위한 보안 강화 방안)

  • Park, Keundug;Kim, Dae Kyung;Youm, Heung Youl
    • Journal of the Korea Institute of Information Security & Cryptology / v.29 no.4 / pp.919-943 / 2019
  • Distributed ledger technology, which is attracting attention as an emerging technology related to the 4th Industrial Revolution, is implemented as open source based distributed ledger technology systems and widely used for the development of various applications (or services), but the security functions provided by such systems are very insufficient. This paper proposes security enhancements for open source based distributed ledger technology systems. To do so, potential security threats that may occur while running an open source based distributed ledger technology system are identified, and security functional requirements against the identified threats are derived by analyzing legislation and security certification criteria (ISMS-P). In addition, it proposes a method to implement the security functions required for an open source based distributed ledger technology system through analysis of the security functional components of the Common Criteria (CC), an international standard.
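
The methodology the abstract describes is essentially a mapping from identified threats to security functional requirements and on to CC functional components. The Python sketch below illustrates that kind of mapping as a data structure; the two threat entries are invented examples, not the paper's actual threat list, although the referenced component identifiers (FIA_UAU.2, FIA_UID.2, FDP_SDI.2) are real CC Part 2 components.

```python
# Illustrative threat -> requirement -> CC-component mapping (hypothetical rows).
THREAT_MAP = [
    {
        "threat": "Unauthorized node joins the ledger network",
        "requirement": "Mutual authentication of participating nodes",
        "cc_components": ["FIA_UAU.2", "FIA_UID.2"],
    },
    {
        "threat": "Tampering with smart-contract code before deployment",
        "requirement": "Integrity verification of deployed code",
        "cc_components": ["FDP_SDI.2"],
    },
]


def components_for(threat_keyword: str) -> list[str]:
    """Collect the CC components mapped to threats matching a keyword."""
    return [c for row in THREAT_MAP
            if threat_keyword.lower() in row["threat"].lower()
            for c in row["cc_components"]]


if __name__ == "__main__":
    print(components_for("node"))   # -> ['FIA_UAU.2', 'FIA_UID.2']
```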

Estimation of relative evaluation effort ratios for each EALs in CC 2.3 and CC 3.1 (CC 2.3과 CC 3.1의 보증수준별 상대적 평가업무량 배율 추정)

  • Kou, Kab-Seung;Kim, Young-Soo;Lee, Gang-Soo
    • Journal of the Korea Institute of Information Security & Cryptology / v.17 no.4 / pp.61-74 / 2007
  • In the Common Criteria evaluation scheme, the sponsor and the evaluator should estimate the cost and duration of an IT security system evaluation when contracting the evaluation project. In this paper, we analyzed and partly reused the results of studies conducted in 2003 and 2005, and empirically estimated the relative evaluation effort ratios among the evaluation assurance levels (EAL1~EAL7) in CC v2.3 and CC v3.1. We also estimated the ratios from the 'developer action elements', the adjusted 'content and presentation of evidence elements', and the 'evaluator action elements' of each assurance component. In particular, we used the ratios of effort for each 'evaluator action element' that were obtained from actual evaluators at KISA in 2003. Our results will be useful for TOE sponsors as well as evaluation project managers who must estimate evaluation cost and duration for a specific EAL and type of TOE under a new CC v3.1 based evaluation scheme.
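
The end product of such a study is a table of effort ratios normalized to a baseline assurance level. The short sketch below only illustrates that normalization step; the per-EAL totals are made-up placeholders, not the figures the paper derived from the KISA data.

```python
def relative_effort_ratios(effort_by_eal: dict[str, float]) -> dict[str, float]:
    """Normalize total evaluation effort per EAL against the lowest level (EAL1)."""
    base = effort_by_eal["EAL1"]
    return {eal: round(total / base, 2) for eal, total in effort_by_eal.items()}


if __name__ == "__main__":
    # Illustrative totals only; the paper estimates these from assurance-component data.
    sample = {"EAL1": 10.0, "EAL2": 18.0, "EAL3": 27.0, "EAL4": 41.0,
              "EAL5": 60.0, "EAL6": 85.0, "EAL7": 120.0}
    print(relative_effort_ratios(sample))
```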

Study on AR/VR Model Generation Techniques Using Piping Isometric Drawing Files (배관 ISO도면 파일 기반 AR/VR모델 생성 기법 연구)

  • Lee, Jung-Min;Lee, Kyung-Ho;Kim, Yang-Ouk;Han, Young-Soo
    • Journal of the Computational Structural Engineering Institute of Korea / v.34 no.1 / pp.19-24 / 2021
  • This paper presents a method to generate three-dimensional AR/VR models using the information in Isogen data files (IDFs). An IDF is an output file produced by ISOGEN that contains piping isometric drawings. A piping isometric drawing is used for pipeline installation in the shipyard; therefore, the drawing describes assembly information with symbolic features, not with detailed geometric features. An IDF specifies the relationships between piping routes and components with three-dimensional points and tag information, as well as the bill of materials of a pipeline. The key idea of this paper is that AR/VR models can be generated from the piping route point data and piping component tag information in real time, without any conversion to standard data exchange file formats such as STP, IGES, and SAT. This paper describes the IDF data structure and suggests a geometry generation process using IDF data and parametric functions.
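
To make the route-points-to-geometry idea concrete, the sketch below assumes an already-parsed IDF reduced to an ordered list of 3D points with a component tag per leg, and turns consecutive points into cylinder parameters an AR/VR scene could consume. The names (RoutePoint, pipe_segments) and the simple cylinder parametrization are illustrative assumptions, not the paper's implementation.

```python
from dataclasses import dataclass
import math


@dataclass
class RoutePoint:
    x: float
    y: float
    z: float
    tag: str = "PIPE"  # component tag taken from the IDF (e.g. ELBOW, VALVE)


def pipe_segments(route: list[RoutePoint], diameter: float) -> list[dict]:
    """Turn consecutive route points into cylinder parameters for an AR/VR scene."""
    segments = []
    for a, b in zip(route, route[1:]):
        length = math.dist((a.x, a.y, a.z), (b.x, b.y, b.z))
        segments.append({
            "start": (a.x, a.y, a.z),
            "end": (b.x, b.y, b.z),
            "length": length,
            "radius": diameter / 2,
            "tag": b.tag,
        })
    return segments


if __name__ == "__main__":
    route = [RoutePoint(0, 0, 0), RoutePoint(0, 0, 2000, "ELBOW"),
             RoutePoint(1500, 0, 2000, "VALVE")]
    for seg in pipe_segments(route, diameter=100.0):
        print(seg)
```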

Development of Simulation Architecture Framework for Simulation Based Acquisition (모의기반획득을 위한 시뮬레이션 아키텍처 프레임워크 개발)

  • Cho, Kyu-Tae;Shim, Jun-Yong;Lee, Yong-Heon;Lee, Seung-Young;Kim, Sae-Hwan
    • Journal of the Korea Society for Simulation / v.19 no.3 / pp.81-89 / 2010
  • Modeling and simulation technology is now used in various fields. Especially in the military field, Simulation-Based Acquisition (SBA) is recognized as an essential policy. To carry out SBA effectively, modeling and simulation techniques should be applied across the whole life cycle of weapon system development, and a simulation architecture framework that provides reusability and interoperability is needed. Through reusability and interoperability, the cost of constructing an integrated collaborative environment for simulation based acquisition can be minimized. In this study, we define the requirements and issues for enhancing reusability and interoperability, and propose a simulation framework, including its structural design, as a solution. The proposed simulation framework provides common functions for building simulators from reusable units and a structure that can easily be changed for the user's purpose. In addition, we present the results of applying the simulation framework to our project.
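
The "reusable units plus common functions" idea can be pictured as components sharing one interface that a framework schedules each tick. The Python sketch below is only an analogy under that assumption; SimComponent, SimulationFramework, and the placeholder models are hypothetical names, not the framework described in the paper.

```python
from abc import ABC, abstractmethod


class SimComponent(ABC):
    """A reusable simulation unit the framework can compose freely."""

    @abstractmethod
    def step(self, t: float, dt: float) -> None: ...


class MotionModel(SimComponent):
    def __init__(self) -> None:
        self.position = 0.0

    def step(self, t: float, dt: float) -> None:
        self.position += 10.0 * dt  # constant-speed placeholder dynamics
        print(f"[{t:4.1f}s] position={self.position:.1f} m")


class RadarModel(SimComponent):
    def step(self, t: float, dt: float) -> None:
        print(f"[{t:4.1f}s] radar scan")


class SimulationFramework:
    """Common scheduling function shared by every simulator built on it."""

    def __init__(self, dt: float = 0.5) -> None:
        self.dt = dt
        self.components: list[SimComponent] = []

    def add(self, component: SimComponent) -> None:
        self.components.append(component)

    def run(self, duration: float) -> None:
        t = 0.0
        while t < duration:
            for c in self.components:
                c.step(t, self.dt)
            t += self.dt


if __name__ == "__main__":
    sim = SimulationFramework()
    sim.add(MotionModel())
    sim.add(RadarModel())
    sim.run(duration=1.5)
```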

Adaptive Burst Size-based Loss Differentiation for Transmitting Massive Medical Data in Optical Internet (광 인터넷에서 대용량 의학 데이터 전송을 위한 적응형 버스트 길이 기반 손실 차등화 기법)

  • Lee, Yonggyu
    • Journal of Digital Convergence / v.20 no.3 / pp.389-397 / 2022
  • As the use of the Internet in the medical area grows, a new technology to transmit massive medical data effectively is required. In the optical internet, all OBS nodes are equipped with fiber delay lines, which are hardware components. These components are dimensioned under certain optimal traffic conditions, which means that if the conditions change, the components should be replaced. Therefore, this article proposes a new service differentiation algorithm that keeps using the previously installed components even when the conditions vary. When traffic conditions change, the algorithm dynamically recalculates the threshold value used to decide the length of data bursts. By doing so, regardless of the changes, the algorithm can maintain the service differentiation between classes without replacing any fiber delay lines. With the algorithm, loss-sensitive medical data can be transferred reliably.
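
A minimal sketch of the adaptive-threshold idea, assuming the node keeps a sliding window of recently observed burst lengths and re-derives the length threshold from that window; the quantile rule and the class names here are assumptions for illustration, not the algorithm defined in the paper.

```python
from collections import deque
from statistics import quantiles


class AdaptiveBurstThreshold:
    def __init__(self, window: int = 1000, split: float = 0.8) -> None:
        self.samples: deque[int] = deque(maxlen=window)
        self.split = split          # fraction of bursts classified as "short"
        self.threshold = 64_000     # initial burst-length threshold in bytes

    def observe(self, burst_len: int) -> None:
        """Record a burst length and re-estimate the threshold when enough data exists."""
        self.samples.append(burst_len)
        if len(self.samples) >= 100:
            cuts = quantiles(self.samples, n=100)
            self.threshold = int(cuts[int(self.split * 100) - 1])

    def classify(self, burst_len: int) -> str:
        # Which service class maps to short vs. long bursts is left to the
        # differentiation scheme; this only applies the adaptive threshold.
        return "short" if burst_len <= self.threshold else "long"
```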

Development of Pollutant Load Estimation System for Hydrologic Component, WAPLE4 (수문컴포넌트별 오염부하 산정이 가능한 WAPLE4의 개발)

  • Jeong, Yeon Ji;Jeong, Yeon Seok;Lee, Seo Ro;Yang, Dong Seok;Lee, Gwan Jae;Choi, Yong Hun;Lim, Kyoung Jae
    • Proceedings of the Korea Water Resources Association Conference / 2022.05a / pp.192-192 / 2022
  • Streamflow consists of baseflow and direct runoff, and since baseflow accounts for most of the streamflow during the dry season, separating direct runoff from baseflow is important. In addition, T-N and T-P are water quality parameters strongly affected by baseflow, so the pollutant loads attributable to baseflow and direct runoff must be analyzed accurately. Therefore, to estimate the pollutant load of baseflow, WAPLE 3 was developed to improve on the shortcomings of the existing WAPLE 2; WAPLE 3 separates baseflow using the pass-1 value of the Baseflow filter program (BFlow), which follows the inflection point of the falling limb of the flow curve, and estimates the baseflow pollutant load. WAPLE 3 used 0.925, the optimal value suggested by Nathan and McMahon, as the filter parameter that determines how much baseflow is separated from streamflow. However, because the proportion of baseflow in streamflow varies with topography, rainfall, and other factors, WAPLE 4 was developed to overcome this limitation. WAPLE 4 lets the user change the filter parameter, so baseflow and pollutant loads are estimated with the flow variation caused by rainfall taken into account, improving the accuracy of the results. In addition, WAPLE 4 uses the Numeric Integration (NI) method, which is well suited to estimating pollutant loads during rainfall events, so that the pollutant loads of direct runoff and baseflow and the flow-weighted mean concentration (FWMC) can be calculated. The results of this study are expected to serve as basic data for establishing flow-related policies such as the total pollutant load management system and baseflow management.
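
A minimal sketch of the two calculations the abstract names: a one-parameter baseflow digital filter of the kind BFlow pass 1 applies (Lyne-Hollick form, default parameter 0.925 after Nathan and McMahon), with the parameter exposed to the user as WAPLE 4 allows, plus a flow-weighted mean concentration (FWMC). This is illustrative Python, not the WAPLE 4 source code, and the sample series is invented.

```python
def separate_baseflow(flow: list[float], alpha: float = 0.925) -> list[float]:
    """Split streamflow into baseflow using a single-pass Lyne-Hollick digital filter."""
    quick = 0.0
    baseflow = []
    for i, q in enumerate(flow):
        if i > 0:
            quick = alpha * quick + (1 + alpha) / 2 * (q - flow[i - 1])
        quick = min(max(quick, 0.0), q)   # keep the quickflow estimate physical
        baseflow.append(q - quick)
    return baseflow


def fwmc(flow: list[float], conc: list[float]) -> float:
    """Flow-weighted mean concentration: sum(C*Q) / sum(Q)."""
    total_flow = sum(flow)
    return sum(c * q for c, q in zip(conc, flow)) / total_flow if total_flow else 0.0


if __name__ == "__main__":
    q = [1.2, 1.1, 5.4, 9.8, 6.2, 3.1, 1.8, 1.4]            # streamflow, m^3/s
    tp = [0.05, 0.05, 0.32, 0.41, 0.22, 0.12, 0.07, 0.05]   # T-P concentration, mg/L
    print("baseflow:", [round(b, 2) for b in separate_baseflow(q)])
    print("FWMC:", round(fwmc(q, tp), 3), "mg/L")
```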


Improvement of Class Reuse at Sensor Network System Based on TinyOS Using CATL Model and Facade Pattern (CATL 모델과 Facade 패턴을 이용한 TinyOS 기반 센서네트워크 시스템 클래스 재사용 개선)

  • Baek, Jeong-Ho;Lee, Hong-Ro
    • Journal of the Korean Association of Geographic Information Studies / v.15 no.2 / pp.46-56 / 2012
  • Recently, when software architecture is designed, the efficiency of reuse is emphasized. A reusable design can raise the quality of GIS software and reduce maintenance costs. Because the object-oriented GoF design patterns provide class hierarchies that can be applied repeatedly, their importance is emphasized even more, and this way of designing GIS software can be applied to various application systems. A multiple distributed sensor network system has a complex structure in which each sensor network node performs different functions, and the sensor nodes and server are designed as a combination of many classes. Furthermore, such a sensor network system may grow into an even more complex system depending on the particular purpose of the software designer. This paper designs the CATL model by applying the Facade pattern, which can enhance the efficiency of reuse according to the attributes and behaviors of classes, in order to implement the complicated structure of a multiple distributed sensor network system based on TinyOS. Therefore, our object-oriented GIS design pattern model is expected to be useful for the design, update, and maintenance of new systems by packaging the attributes and behaviors of classes in complex sensor network systems.
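
As a reminder of what the Facade pattern buys in this setting, the sketch below hides several sensor-node subsystem classes behind one entry point. The class names (SensorDriver, RadioLink, SensorNodeFacade) are illustrative placeholders, not the CATL model classes themselves.

```python
class SensorDriver:
    def read(self) -> float:
        return 23.5  # placeholder measurement


class PacketBuilder:
    def build(self, node_id: int, value: float) -> dict:
        return {"node": node_id, "value": value}


class RadioLink:
    def send(self, payload: dict) -> None:
        print("radio ->", payload)


class SensorNodeFacade:
    """Single entry point hiding the subsystem classes from client code."""

    def __init__(self, node_id: int) -> None:
        self.node_id = node_id
        self._sensor = SensorDriver()
        self._packet = PacketBuilder()
        self._radio = RadioLink()

    def sample_and_report(self) -> None:
        value = self._sensor.read()
        self._radio.send(self._packet.build(self.node_id, value))


if __name__ == "__main__":
    # Client code touches only the facade, so subsystem classes stay reusable.
    SensorNodeFacade(node_id=7).sample_and_report()
```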

A Study on Data Adjustment and Quality Enhancement Method for Public Administrative Dataset Records in the Transfer Process-Based on the Experiences of Datawarehouses' ETT (행정정보 데이터세트 기록 이관 시 데이터 보정 및 품질 개선 방법 연구 - 데이터웨어하우스 ETT 경험을 기반으로)

  • Yim, Jin-Hee;Cho, Eun-Hee
    • The Korean Journal of Archival Studies / no.25 / pp.91-129 / 2010
  • As public administration grows more heavily reliant on information systems, researchers seek various ways to manage and utilize the dataset records accumulated in public information systems. Data adjustment and quality enhancement of public administrative dataset records may be needed while transferring them to an archive system or a sharing server. The purpose of this paper is to present data adjustment and quality enhancement methods for public administrative dataset records, referring to the ETT procedures and construction methods of data warehouses. It suggests seven typical examples and processing methods of data adjustment and quality enhancement, which should be reviewed during dataset record transfer: (1) verification of quantity and data domain, (2) code conversion for consistent code values, (3) making components from combined information, (4) deciding the precision of date data, (5) standardization of data, (6) comment information about code values, and (7) capturing of metadata. This paper establishes data adjustment and quality enhancement requirements for dataset record transfer, which can be used as data quality requirements for administrative information systems that produce datasets.
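
Two of the listed steps, domain verification and code conversion, are easy to picture as a small ETT-style routine. The Python sketch below is only an illustration under that assumption; the column names, code table, and status domain are hypothetical examples, not taken from the paper.

```python
# Hypothetical lookup tables for the illustration.
DEPT_CODE_MAP = {"01": "GENERAL_AFFAIRS", "02": "FINANCE", "1": "GENERAL_AFFAIRS"}
VALID_STATUS = {"OPEN", "CLOSED", "PENDING"}


def adjust_record(record: dict) -> tuple[dict, list[str]]:
    """Return the adjusted record plus a list of quality issues found."""
    issues = []
    adjusted = dict(record)

    # (1) verification of data domain: flag values outside the allowed set.
    if adjusted.get("status") not in VALID_STATUS:
        issues.append(f"status out of domain: {adjusted.get('status')!r}")

    # (2) code conversion for a consistent code value.
    dept = adjusted.get("dept_code", "")
    if dept in DEPT_CODE_MAP:
        adjusted["dept_code"] = DEPT_CODE_MAP[dept]
    else:
        issues.append(f"unknown dept_code: {dept!r}")

    return adjusted, issues


if __name__ == "__main__":
    row = {"doc_id": "2010-0001", "dept_code": "1", "status": "open"}
    fixed, problems = adjust_record(row)
    print(fixed)      # dept_code normalized to 'GENERAL_AFFAIRS'
    print(problems)   # lowercase 'open' flagged as out of domain
```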