• Title/Summary/Keyword: code mapping

171 search results

Multi-unit Level 2 probabilistic safety assessment: Approaches and their application to a six-unit nuclear power plant site

  • Cho, Jaehyun;Han, Sang Hoon;Kim, Dong-San;Lim, Ho-Gon
    • Nuclear Engineering and Technology
    • /
    • v.50 no.8
    • /
    • pp.1234-1245
    • /
    • 2018
  • The risk of multi-unit nuclear power plant (NPP) sites has received considerable critical attention recently. However, current probabilistic safety assessment (PSA) procedures and computer codes do not support multi-unit PSA, because the traditional PSA structure is mostly used to quantify single-unit NPP risk. The main purpose of this study is to develop a multi-unit Level 2 PSA method and apply it to a six-unit OPR1000 site at full power. The multi-unit Level 2 PSA method consists of three steps: (1) developing the single-unit Level 2 PSA; (2) extracting the mapping data from plant damage states to source term categories; and (3) combining the multi-unit Level 1 PSA results with the mapping fractions. By applying the developed method to the six-unit OPR1000 site, site containment failure probabilities were quantified for loss of ultimate heat sink, loss of off-site power, tsunami, and seismic events.
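
The combining step (3) is essentially a weighted sum: each plant-damage-state frequency from the multi-unit Level 1 PSA is split across source term categories by the mapping fractions. A minimal numerical sketch of that step is shown below; the state names, frequencies, and fractions are hypothetical and not taken from the paper.

```python
# Hypothetical sketch: combine Level 1 plant-damage-state (PDS) frequencies
# with PDS -> source-term-category (STC) mapping fractions.
# All identifiers and numbers are illustrative, not from the paper.

pds_frequency = {          # per-reactor-year frequencies of plant damage states
    "PDS-1": 2.0e-6,
    "PDS-2": 5.0e-7,
}

mapping_fraction = {       # conditional split of each PDS into STCs (rows sum to 1)
    "PDS-1": {"STC-A": 0.7, "STC-B": 0.3},
    "PDS-2": {"STC-A": 0.1, "STC-B": 0.9},
}

stc_frequency = {}
for pds, freq in pds_frequency.items():
    for stc, fraction in mapping_fraction[pds].items():
        stc_frequency[stc] = stc_frequency.get(stc, 0.0) + freq * fraction

print(stc_frequency)   # e.g. {'STC-A': 1.45e-06, 'STC-B': 1.05e-06}
```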

Program Translation from Conventional Programming Source to Java Bytecode (기존 프로그래밍 원시코드에서 자바 바이트 코드로의 변환)

  • Jeon-Geun Kang;Haeng-Kon Kim
    • Journal of the Korea Computer Industry Society
    • /
    • v.3 no.8
    • /
    • pp.963-980
    • /
    • 2002
  • Software reengineering research seeks solutions to the problem of maintaining existing systems. Reengineering means developing software from existing systems through reverse engineering and forward engineering, and one of its most important concepts is composition, the restructuring of existing objects. Is there a compiler that can compile a program written in a traditional procedural language (like C or Pascal) and generate Java bytecode, rather than an executable that runs only on the machine it was compiled for (such as an a.out file on a Unix machine)? Such a compiler would be very handy for today's computing environment of heterogeneous networks. In this paper we present a software system that does this job at the binary-to-binary level. It takes the compiled binary code of a procedural language and translates it into Java bytecode. To do this, we first translate into an assembler code called Jasmin [7], a human-readable representation of Java bytecode; the Jasmin assembler then converts it into real Java bytecode. The system is not a compiler because it does not start at the source level. We believe this kind of translator is even more useful than a compiler, because most of the executable code available for sharing does not come with source programs. Of course, it works only if the format of the executable binary code is known. The translation process consists of three major stages: (1) an analysis stage that identifies the language constructs in the given binary code, (2) an initialization stage where variables and objects are located, classified, and initialized, and (3) a mapping stage that maps the given binary code into Jasmin assembler code, which is then converted to Java bytecode.
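
The third (mapping) stage can be pictured as a table-driven rewrite from analysed binary instructions into Jasmin mnemonics. The sketch below is purely illustrative: the toy input format, opcode names, and templates are invented and are not the paper's actual translator.

```python
# Illustrative sketch only: map a toy, already-analysed instruction stream
# into Jasmin assembler text, mirroring the paper's third stage.
# The input format and opcode names here are hypothetical.

toy_binary = [
    ("push_const", 2),
    ("push_const", 3),
    ("add", None),
    ("store_local", 0),
]

JASMIN_TEMPLATES = {
    "push_const":  "ldc {0}",
    "add":         "iadd",
    "store_local": "istore {0}",
}

def map_to_jasmin(instructions):
    lines = [".method public static demo()V", ".limit stack 2", ".limit locals 1"]
    for op, arg in instructions:
        lines.append("    " + JASMIN_TEMPLATES[op].format(arg))
    lines += ["    return", ".end method"]
    return "\n".join(lines)

print(map_to_jasmin(toy_binary))   # the output text would then go to the Jasmin assembler
```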

A Study on Validation for Mapping of Gas Detectors at a BTX Plant (BTX 공정에서 Gas Detector Mapping 적정성 검토에 관한 연구)

  • Seo, Ji Hye;Han, Man Hyoeng;Kim, Il Kwon;Chon, Young Woo
    • Journal of the Korean Society of Safety
    • /
    • v.32 no.5
    • /
    • pp.168-178
    • /
    • 2017
  • In order to prevent major chemical accidents, plants that install and operate facilities handling hazardous chemicals have had to submit an Off-site Consequence Analysis since 2015, following a series of leak accidents. Many chemical plants choose gas detectors as mitigation equipment for early detection of gas vapor. Gas detectors can be placed by two approaches: Code-based Design (CBD) and Performance-based Design (PBD). CBD places detectors where gas is expected to accumulate and near expected leak locations, according to legal standards and technical guidelines, but placement based on such rules and vapor-density information alone may fail to detect some releases. PBD comprises two methods, the Geographic Coverage Method and the Scenario-based Method; the Scenario-based Method was proposed to make up for the limitations of the Geographic Coverage Method and derives an optimum detector placement by considering leak locations, leak rates, leak directions, and so on. However, the domestic placement guidelines refer only to CBD. This study therefore compares the existing detector placement obtained from the domestic CBD with the placement locations, coverages, and numbers of gas detectors obtained with the Scenario-based Method, and discusses measures for early detection of the vapor cloud of interest and for suitable detector placement to prevent chemical accidents. The Phast software was selected to simulate vapor cloud dispersion and predict the consequences. Two cases were considered: a leak hole size of 8 mm from API, the largest accident hole size below 24.5 mm, and a normal leak hole size of 1.8 mm from the KOSHA Guide. Detect3D was used to locate gas detectors efficiently and to compare the CBD and PBD results. Current domestic methods for placing gas detectors do not consider risk; they depend on code-based rules alone, so the resulting placements do not let personnel recognize which risks are tolerable or intolerable. The Scenario-based Method, by contrast, analyzes the estimated leak range by simulating dispersion and can therefore distinguish tolerable risks. It is thus expected that detectors can be placed more reasonably by setting objectives and roles flexibly according to the situation in a specific plant.
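
At its core, the Scenario-based Method scores candidate detector locations by how many simulated leak scenarios they can detect. The sketch below illustrates only that idea: it approximates each dispersed cloud as a circle around the leak point, whereas the study itself uses Phast dispersion results and Detect3D for placement, and all coordinates and radii here are invented.

```python
# Hypothetical sketch of scenario-coverage scoring for detector placement.
# Leak clouds are crudely approximated as circles around the leak point;
# all coordinates and radii are illustrative.
import math

leak_scenarios = [  # (x, y, effective cloud radius in metres)
    (0.0, 0.0, 6.0),
    (12.0, 5.0, 4.0),
    (20.0, 0.0, 5.0),
]

candidate_detectors = [(3.0, 1.0), (14.0, 4.0), (30.0, 10.0)]

def covered(detector, scenario):
    x, y = detector
    sx, sy, r = scenario
    return math.hypot(x - sx, y - sy) <= r

for det in candidate_detectors:
    hits = sum(covered(det, s) for s in leak_scenarios)
    print(det, "covers", hits, "of", len(leak_scenarios), "scenarios")
```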

Design and Implementation of a Framework for Automatically Generating Control and Monitoring Software

  • Yoo, Dae-Sung;Sim, Min-Suck;Park, Sung-Ghue;Kim, Jong-Hwan;Yi, Myeong-Jae
    • Institute of Control, Robotics and Systems (ICROS): Conference Proceedings
    • /
    • 2004.08a
    • /
    • pp.932-935
    • /
    • 2004
  • In this paper, we present a framework that makes it easy to develop, modify, maintain, and extend control and monitoring software for any kind of instrument. The framework is composed of three XML documents (IID, MAP, and CMIML) and two tools (Virtual Instrument Wizard and Generator). Interface information about the behaviors and states of instruments is written in the IID. Mapping information between the interface information in the IID and the API of a real instrument driver is written in the MAP. Finally, information about the control and monitoring software itself is written in the CMIML. The IID, MAP, and CMIML use the XML format to give the framework common usage and platform independence. The VI Wizard generates CMIML (a platform-independent intermediate document) from the IID and existing CMIML, and the Generator automatically generates the source code of the control and monitoring software (platform-dependent code) from the CMIML and MAP. The suggested framework, which automatically generates GUI-based control and monitoring software, supports easy development and maintenance, and reusability is increased by reusing the platform-independent software description documents.
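
The Generator step amounts to reading a platform-independent instrument description plus an API mapping and emitting driver-delegating code. The sketch below is a hypothetical miniature of that idea; the XML element names, the mapping table, and the emitted class are invented and do not reflect the actual IID/MAP/CMIML schemas.

```python
# Hypothetical sketch of the Generator step: read a platform-independent
# description (CMIML-like) plus an API mapping (MAP-like) and emit stub code.
# The XML element and attribute names here are invented for illustration.
import xml.etree.ElementTree as ET

cmiml = ET.fromstring("""
<instrument name="PowerSupply">
  <command name="setVoltage"/>
  <command name="readCurrent"/>
</instrument>
""")

api_map = {"setVoltage": "ps_set_voltage", "readCurrent": "ps_read_current"}

def generate(doc, mapping):
    cls = doc.get("name")
    lines = [f"class {cls}Controller:"]
    for cmd in doc.findall("command"):
        drv = mapping[cmd.get("name")]
        lines.append(f"    def {cmd.get('name')}(self, *args):")
        lines.append(f"        return driver.{drv}(*args)  # delegate to the real driver")
    return "\n".join(lines)

print(generate(cmiml, api_map))   # emits a small controller class as text
```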

Mapping Items of Functioning Questionnaires into the International Classification of Functioning, Disability and Health: Stroke

  • Song, Ju-Min;Lee, Hae-Jung
    • The Journal of Korean Physical Therapy
    • /
    • v.28 no.5
    • /
    • pp.341-347
    • /
    • 2016
  • Purpose: The aim of the study was to examine the items of questionnaires commonly used to measure the functioning status of persons with stroke and to map them to the International Classification of Functioning, Disability and Health (ICF). Methods: Eighty-six patients with stroke were recruited from 12 medical institutes. Each item of the Modified Barthel Index (MBI), Stroke Impact Scale (SIS), Mini-Mental State Examination (MMSE), and SF-36 was examined and its concept compared with the ICF. Concept linking was performed independently by 10 health professionals. A field test was performed to assess the correlation between the scales and their linked ICF category sets. Results: Eleven items of the MBI were linked to 14 ICF categories, 27 items of the MMSE were linked to 10 ICF categories, and 60 items of the SIS were linked to 35 ICF categories. Agreement between professionals in linking was high: 97.5% for the MBI items, and 78.0%, 78.0%, and 74.8% for the MMSE, SIS, and SF-36, respectively. In the field test a strong relationship was observed between the measurement scales and their linked ICF code sets (r=-0.76 for SIS, r=-0.78 for MBI, r=-0.47 for MMSE), whereas no relationship was found between the SF-36 and its ICF code set (r=-0.06). Conclusion: Items of the SIS, MMSE, and MBI can be linked to ICF categories, and linking the concepts of clinical tools to the ICF could be helpful for clinical data standardization.
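
Conceptually, the field test pairs each scale's score with a score over its linked ICF category set and checks their correlation. The sketch below is a toy illustration of that workflow only; the item-to-ICF links, patient scores, and the use of a plain Pearson coefficient are assumptions, not data or methods from the study.

```python
# Toy illustration: link questionnaire items to ICF categories and correlate
# total scale scores with scores over the linked ICF set.
# Item names, ICF codes, and all scores below are invented.

item_to_icf = {
    "MBI_feeding":  "d550",   # eating
    "MBI_dressing": "d540",   # dressing
    "MBI_walking":  "d450",   # walking
}

# total scale score and linked-ICF-set score per (hypothetical) patient
scale_scores = [55, 70, 40, 90, 65]
icf_scores   = [3,  2,  4,  1,  2]   # higher = more impairment, hence a negative r

def pearson(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

print(sorted(set(item_to_icf.values())))            # the linked ICF category set
print(round(pearson(scale_scores, icf_scores), 2))  # expected to be negative
```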

Simulation, design optimization, and experimental validation of a silver SPND for neutron flux mapping in the Tehran MTR

  • Saghafi, Mahdi;Ayyoubzadeh, Seyed Mohsen;Terman, Mohammad Sadegh
    • Nuclear Engineering and Technology
    • /
    • v.52 no.12
    • /
    • pp.2852-2859
    • /
    • 2020
  • This paper deals with the simulation-based design optimization and experimental validation of the characteristics of an in-core silver Self-Powered Neutron Detector (SPND). The optimized dimensions of the SPND are determined by combining Monte Carlo simulations and analytical methods. As a first step, the Monte Carlo transport code MCNPX is used to follow the trajectory and fate of the neutrons emitted from an external source; this simulation seamlessly integrates various phenomena, including neutron slowing-down and shielding effects. The expected number of beta particles and their energy spectrum following a neutron capture reaction in the silver emitter are then fetched from the TENDL database through the JANIS software interface and combined with the data from the first step to yield the origin and spectrum of the source electrons. Finally, the MCNPX transport code is used for the Monte Carlo calculation of the ballistic current of beta particles in the various regions of the SPND, from which the output current and the maximum insulator thickness to avoid breakdown are determined. The optimum SPND design is then manufactured and experimental tests are conducted. The calculated design parameters of the detector are found to be in good agreement with the experimental results.
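
The end result of the simulation chain is an estimate of the detector current from the capture rate in the emitter and the fraction of betas that reach the collector. The back-of-envelope sketch below shows only that final arithmetic with assumed values; it is not the MCNPX/TENDL workflow used in the paper.

```python
# Back-of-envelope sketch (not the paper's MCNPX/TENDL workflow): estimate the
# steady-state SPND current as the capture rate in the silver emitter times the
# fraction of betas collected. All values below are assumed for illustration.

PHI      = 1.0e13      # thermal neutron flux, n/cm^2/s (assumed)
SIGMA    = 63.3e-24    # natural silver thermal capture cross-section, cm^2 (approx.)
N_AG     = 5.85e22     # silver atom density, atoms/cm^3
VOLUME   = 0.05        # emitter volume, cm^3 (assumed)
ESCAPE_F = 0.3         # fraction of betas collected (assumed ballistic factor)
E_CHARGE = 1.602e-19   # coulombs per collected beta

capture_rate = PHI * SIGMA * N_AG * VOLUME        # captures per second
current = capture_rate * ESCAPE_F * E_CHARGE      # amperes

print(f"capture rate ~ {capture_rate:.3e} /s, current ~ {current:.3e} A")
```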

FPGA Mapping Incorporated with Multiplexer Tree Synthesis (멀티플렉서 트리 합성이 통합된 FPGA 매핑)

  • Kim, Kyosun
    • Journal of the Institute of Electronics and Information Engineers
    • /
    • v.53 no.4
    • /
    • pp.37-47
    • /
    • 2016
  • The practical constraints of commercial FPGAs, whose slice structure contains dedicated wide-function multiplexers, are incorporated into one of the most advanced FPGA mapping algorithms based on the AIG (And-Inverter Graph), one of the best logic representations in academia. As the first step of the mapping process, cuts are enumerated as intermediate structures; the cuts that can be mapped to the multiplexers are then recognized. Without any increase in complexity, the delay and area of the multiplexers as well as the LUTs are calculated after checking the requirements for tree construction, such as symmetry and the depth limit, against the dynamically changing mapping of neighboring nodes. In addition, the root positions of multiplexer trees are identified from the RTL code and annotated on the AIG as AOs (Auxiliary Outputs). A new AIG embedding the multiplexer tree structures, which are intentionally synthesized by Shannon expansion at the AOs, is overlapped with the optimized AIG, and a lossless synthesis technique employing FRAIGs (Functionally Reduced AIGs) is applied. The proposed approach and techniques are validated by implementing them and applying them to two RISC processor examples, which yielded 13~30% area reduction and up to 32% delay reduction. The research will be extended to take into account the constraints of the dedicated hardware for carry chains.
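
Recognizing a cut that can be mapped to a dedicated 2:1 multiplexer boils down to checking whether the cut function is a select of two data inputs under a control input, i.e. its Shannon cofactors with respect to the select signal equal the two data inputs. The sketch below checks this on a truth table for 3-input functions; it is a conceptual illustration, not the paper's AIG-based implementation.

```python
# Conceptual sketch: decide whether a 3-input cut function is a 2:1 MUX,
# i.e. f(s, a, b) = a if s else b, by checking its truth table.
# Inputs are ordered (s, a, b); this is not the paper's code.
from itertools import product

def truth_table(f):
    return tuple(f(s, a, b) for s, a, b in product((0, 1), repeat=3))

def is_mux(table):
    # f must reduce to 'a' when s = 1 and to 'b' when s = 0 for every (a, b)
    for i, (s, a, b) in enumerate(product((0, 1), repeat=3)):
        expected = a if s else b
        if table[i] != expected:
            return False
    return True

mux  = lambda s, a, b: a if s else b
and3 = lambda s, a, b: s & a & b

print(is_mux(truth_table(mux)))    # True  -> map to a dedicated MUX
print(is_mux(truth_table(and3)))   # False -> map to a LUT instead
```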

Adaptive-learning Code Allocation Technique for Improving Dimming Level and Reducing Flicker in Visible Light Communication (가시광통신에서 Dimming Level 향상 및 Flicker 감소를 위한 적응-학습 코드할당 기법)

  • Lee, Kyu-Jin;Han, Doo-Hee
    • Journal of Convergence for Information Technology
    • /
    • v.12 no.2
    • /
    • pp.30-36
    • /
    • 2022
  • In this paper, we propose a technique to improve the dimming level and reduce the flicker of lighting when the lighting and communication functions of a visible light communication system are used at the same time. Visible light communication must satisfy both communication and lighting requirements, but the existing data-code method reduces the brightness of the entire lighting, which degrades lighting performance and causes flicker. To solve this problem, we propose an adaptive-learning code allocation technique that allocates binary codes to the transmitted characters and optimizes the allocated codes according to the frequency of occurrence of the alphabet symbols in the character strings. In this way, codes are allocated so that a continuous 'OFF' pattern does not occur while the maximum dimming level of each character string is maintained, allowing the system to fulfill its lighting role as well as its communication function. The performance evaluation shows that the frequency of occurrence of '1' increased significantly without significantly affecting overall communication performance, while the frequency of consecutive '0's decreased, indicating that the lighting performance of the system was greatly improved.
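
The allocation idea can be pictured as a greedy assignment: the most frequently transmitted characters receive the fixed-length codewords containing the most '1' bits, which raises the average dimming level of the stream. The sketch below illustrates only this greedy core with invented parameters; the paper's adaptive-learning and flicker-avoidance details are not reproduced.

```python
# Hypothetical greedy sketch of frequency-aware code allocation: the most
# frequent characters get the fixed-length codewords with the most '1' bits.
# Codeword width and the sample message are invented.
from collections import Counter
from itertools import product

def allocate(text, width=4):
    # candidate codewords sorted by number of '1' bits, densest first
    codewords = sorted(("".join(bits) for bits in product("01", repeat=width)),
                       key=lambda w: w.count("1"), reverse=True)
    freq = Counter(text)
    symbols = [sym for sym, _ in freq.most_common()]
    return {sym: code for sym, code in zip(symbols, codewords)}

message = "hello vlc"
table = allocate(message)
stream = "".join(table[ch] for ch in message)
print(table)
print("dimming level:", round(stream.count("1") / len(stream), 2))
```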

Fluid-Structure Interaction Analysis of High Aspect Ratio Wing for the Prediction of Aero-elasticity (유체-구조 연계 해석기법을 이용한 세장비가 큰 비행체 날개의 공탄성 해석)

  • Lee, Ki-Du;Lee, Young-Shin;Lee, Dae-Yearl;Lee, In-Won
    • Journal of the Korean Society for Aeronautical & Space Sciences
    • /
    • v.38 no.6
    • /
    • pp.547-556
    • /
    • 2010
  • For the safety of aircraft and the accuracy of bombs, many companies have researched new adaptive-kit concepts for flying bombs. For long-distance flight, a deployable high-aspect-ratio wing is normally used because of the limited available volume. The probabilities of large elastic deformation and flutter increase as the stiffness of a high-aspect-ratio wing decreases. In this paper, a coupled computational fluid dynamics and computational structural dynamics methodology is applied to predict the aerodynamic characteristics. FLUENT and ABAQUS are used to calculate the fluid and structural dynamics, respectively. A code bridge based on compactly supported radial basis functions was developed to perform the interpolation and mapping. Differences between the rigid-body and fluid-structure interaction analyses appear in the aerodynamic characteristics owing to the structural deformation, and a small successive vibration was observed in the interaction analysis.
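
The code bridge transfers data between non-matching CFD and structural meshes by radial basis function interpolation. The 1-D sketch below uses the compactly supported Wendland C2 function to map assumed structural displacements onto CFD surface points; the node locations, values, and support radius are illustrative, not the paper's wing model.

```python
# Minimal 1-D sketch of RBF data transfer between non-matching meshes,
# using the compactly supported Wendland C2 function. Node locations and
# values are illustrative, not from the paper's wing model.
import numpy as np

def wendland_c2(r, radius):
    q = np.clip(r / radius, 0.0, 1.0)
    return (1.0 - q) ** 4 * (4.0 * q + 1.0)

structural_nodes = np.array([0.0, 0.5, 1.0, 1.5, 2.0])
displacements    = np.array([0.0, 0.02, 0.07, 0.15, 0.26])   # assumed deflections
cfd_nodes        = np.linspace(0.0, 2.0, 9)
support_radius   = 1.5

# Solve for RBF weights on the structural mesh, then evaluate on the CFD mesh
A = wendland_c2(np.abs(structural_nodes[:, None] - structural_nodes[None, :]),
                support_radius)
weights = np.linalg.solve(A, displacements)
B = wendland_c2(np.abs(cfd_nodes[:, None] - structural_nodes[None, :]),
                support_radius)
mapped = B @ weights

print(np.round(mapped, 4))   # displacements interpolated onto the CFD surface mesh
```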

Implementation of C++ IDL Compiler (C++ IDL 컴파일러 구현)

  • Park, Chan-Mo;Lee, Joon
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.5 no.5
    • /
    • pp.970-976
    • /
    • 2001
  • In this paper, the IDL CFE provided by SunSoft is used to take IDL definitions as input and parse them, and omniORB3 is introduced to supply the functionality of the ORB. Sun's CFE produces an AST after parsing the input; the nodes of the AST are instances of classes derived from the CFE classes. As the compiler back end visits the nodes of the AST using the iterator class UTL_ScopeActiveIterator, it dumps the output code, and two files are generated during processing. The code-generation routines are invoked by BE_produce.cc, and code is produced while visiting the root of the AST, idl_global->root(). The dump* functions that emit code are called according to the type of each node. In this paper, the C++ mapping of IDL definitions is tested and produces the same results as omniidl, which is provided with omniORB3, and the generated code behaves correctly on omniORB3. In the future, we are interested in optimizing the performance of the marshalling code generated by the IDL compiler.
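
The back-end pattern described above, iterating over AST scope nodes and dispatching to a per-node-type dump routine, is sketched below in a language-agnostic way (the actual back end is C++ code built on SunSoft's CFE and UTL_ScopeActiveIterator). The node classes and the emitted C++ text are invented for illustration.

```python
# Language-agnostic sketch of the back-end pattern: walk an AST scope and
# dispatch to a per-node-type dump routine. Node classes and the emitted
# C++ text are invented; this is not the paper's compiler code.

class Interface:
    def __init__(self, name, operations):
        self.name, self.operations = name, operations

class Operation:
    def __init__(self, name, ret="void"):
        self.name, self.ret = name, ret

def dump_interface(node):
    print(f"class {node.name} : public virtual CORBA::Object {{")
    for op in node.operations:
        dump_operation(op)
    print("};")

def dump_operation(node):
    print(f"  virtual {node.ret} {node.name}() = 0;")

root = [Interface("Echo", [Operation("echoString", "char*")])]
for node in root:                 # iterate the (toy) AST scope
    dump_interface(node)          # dispatch by node type
```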
