• Title/Summary/Keyword: Library Application

A Reduced-Boron OPR1000 Core Based on the BigT Burnable Absorber

  • Yu, Hwanyeal;Yahya, Mohd-Syukri;Kim, Yonghee
    • Nuclear Engineering and Technology
    • /
    • v.48 no.2
    • /
    • pp.318-329
    • /
    • 2016
  • Reducing the critical boron concentration in a commercial pressurized water reactor core offers many advantages in terms of safety and economics. This paper presents a preliminary investigation of a reduced-boron pressurized water reactor core designed to achieve a clearly negative moderator temperature coefficient at hot zero power using the newly proposed "Burnable absorber-Integrated Guide Thimble" (BigT) absorbers. The reference core is based on a commercial OPR1000 equilibrium configuration. The reduced-boron OPR1000 configuration was determined by simply replacing the commercial gadolinia-based burnable absorbers with the optimized BigT-loaded design. The equilibrium cores in this study were searched directly via repetitive Monte Carlo depletion calculations until convergence. The results demonstrate that, with the same fuel management scheme as in the reference core, application of the BigT absorbers can effectively reduce the critical boron concentration at the beginning of cycle by about 65 ppm. More crucially, the analyses indicate promising potential for the reduced-boron OPR1000 core with the BigT absorbers, as its moderator temperature coefficient at the beginning of cycle is clearly more negative and all other vital neutronic parameters are within practical safety limits. All simulations were completed using the Serpent Monte Carlo code with the ENDF/B-VII.0 library.
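
The equilibrium-core search mentioned in the abstract above (repeating cycle depletion and reload until the cycle-to-cycle state stops changing) can be pictured with a minimal sketch. The functions deplete_one_cycle and reload_core, the convergence metric, and the tolerance below are hypothetical placeholders, not the authors' actual Serpent workflow.

    # Minimal sketch of an equilibrium-cycle search by repeated depletion.
    # deplete_one_cycle() and reload_core() stand in for the Monte Carlo
    # depletion and fuel-shuffling steps; both are hypothetical placeholders.
    def search_equilibrium_core(initial_core, deplete_one_cycle, reload_core,
                                tol=1e-3, max_cycles=20):
        """Repeat cycle depletion + reload until the end-of-cycle state converges."""
        core = initial_core
        previous_eoc = None
        for cycle in range(max_cycles):
            eoc = deplete_one_cycle(core)      # end-of-cycle burnup distribution
            if previous_eoc is not None:
                # converged when the end-of-cycle burnup map stops changing
                change = max(abs(a - b) for a, b in zip(eoc, previous_eoc))
                if change < tol:
                    return core, cycle
            previous_eoc = eoc
            core = reload_core(eoc)            # discharge/shuffle/refuel for next cycle
        return core, max_cycles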

A PARALLEL PRECONDITIONER FOR GENERALIZED EIGENVALUE PROBLEMS BY CG-TYPE METHOD

  • MA, SANGBACK;JANG, HO-JONG
    • Journal of the Korean Society for Industrial and Applied Mathematics
    • /
    • v.5 no.2
    • /
    • pp.63-69
    • /
    • 2001
  • In this study, we are concerned with computing, in parallel, a few of the smallest eigenvalues and their corresponding eigenvectors of the eigenvalue problem Ax = λBx, where A is symmetric and B is symmetric positive definite. Both A and B are large and sparse. Recently, iterative algorithms based on the optimization of the Rayleigh quotient have been developed, and the CG scheme for the optimization of the Rayleigh quotient has proven to be a very attractive and promising technique for computing a few extreme eigenvalues of large sparse eigenproblems. As in the case of a system of linear equations, successful application of the CG scheme to eigenproblems also depends on the preconditioning technique: a proper choice of preconditioner significantly improves the convergence of the CG scheme. The idea underlying the present work is a parallel computation of Multi-Color Block SSOR preconditioning for the CG optimization of the Rayleigh quotient, together with deflation techniques. Multi-coloring is a simple technique for obtaining parallelism of order n, where n is the dimension of the matrix, and Block SSOR is a symmetric preconditioner that is expected to minimize interprocessor communication thanks to the blocking. We implemented the method on the CRAY T3E with 128 nodes, adopting the MPI (Message Passing Interface) library for interprocessor communication. The test problems were drawn from discretizations of partial differential equations by finite difference methods.
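
As a small, hedged illustration of preconditioned Rayleigh-quotient optimization for Ax = λBx (not the authors' parallel CG scheme with deflation and Multi-Color Block SSOR), the sketch below uses SciPy's LOBPCG solver with a simple Jacobi (diagonal) preconditioner on a finite-difference test matrix; the matrix sizes and preconditioner choice are illustrative assumptions.

    # Hedged sketch: smallest eigenpairs of A x = lambda B x via preconditioned
    # Rayleigh-quotient optimization (LOBPCG). A Jacobi preconditioner stands in
    # for the paper's parallel Multi-Color Block SSOR preconditioner.
    import numpy as np
    import scipy.sparse as sp
    from scipy.sparse.linalg import LinearOperator, lobpcg

    n = 1000
    # A: 1-D finite-difference Laplacian (symmetric), B: SPD identity for simplicity
    A = sp.diags([-1, 2, -1], [-1, 0, 1], shape=(n, n), format="csr")
    B = sp.identity(n, format="csr")

    # Diagonal (Jacobi) preconditioner approximating A^{-1}
    inv_diag = 1.0 / A.diagonal()
    M = LinearOperator((n, n), matvec=lambda x: inv_diag * x)

    rng = np.random.default_rng(0)
    X0 = rng.standard_normal((n, 4))           # seek the 4 smallest eigenpairs
    eigvals, eigvecs = lobpcg(A, X0, B=B, M=M, largest=False, tol=1e-8, maxiter=200)
    print(eigvals)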

Design and Implementation of a Secure Communication API Using OpenSSL (OpenSSL을 이용한 보안 통신 API의 설계 및 구현)

  • Jung In-sung;Shin Yong-tae
    • Journal of Internet Computing and Services
    • /
    • v.4 no.5
    • /
    • pp.87-96
    • /
    • 2003
  • An additional mechanism is required to set up a secure connection among communicating parties in the Internet environment. Each entity should send and receive encrypted and hashed data to guarantee data integrity, and a mutual authentication procedure should be performed using a secure communication protocol. Although OpenSSL, which implements TLS, is used by many developers and its stability and performance are proven, it is difficult to use because of its large size. Therefore, this paper designs and implements a secure communication API that users can employ easily by reworking the OpenSSL library API. We demonstrated a real application of the implemented API with a client/server case that supports secure communication.
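
The paper's simplified API wraps OpenSSL in C; as a rough analogue only, the hedged sketch below shows how a thin wrapper over Python's ssl module (itself built on OpenSSL) can hide certificate loading and handshake details behind two helper functions. The function names and certificate file paths are illustrative assumptions, not the paper's API.

    # Hedged sketch: a thin "easy to use" wrapper around OpenSSL-backed TLS,
    # analogous in spirit to the simplified API described above (not the paper's C API).
    import socket, ssl

    def secure_server(host, port, certfile, keyfile):
        """Return a TLS-wrapped listening socket; cert/key paths are illustrative."""
        ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
        ctx.load_cert_chain(certfile=certfile, keyfile=keyfile)
        raw = socket.create_server((host, port))
        return ctx.wrap_socket(raw, server_side=True)

    def secure_client(host, port, cafile):
        """Connect to the server with certificate verification enabled."""
        ctx = ssl.create_default_context(cafile=cafile)
        raw = socket.create_connection((host, port))
        return ctx.wrap_socket(raw, server_hostname=host)

    # Usage (illustrative):
    #   srv = secure_server("0.0.0.0", 8443, "server.crt", "server.key")
    #   conn, addr = srv.accept(); data = conn.recv(4096); conn.sendall(b"ok")
    #   cli = secure_client("example.org", 8443, "ca.crt"); cli.sendall(b"hello")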

A Study on Obsolescence and Weeding by Citation Analysis - Application to Economics - (인용분석(引用分析)을 통한 문헌(文獻)의 이용률 감소현상(減少現象) 및 장서폐기(藏書廢棄) 연구 - 경제학(經濟學) 분야를 중심으로 -)

  • Shin, Eun-Ja
    • Journal of Information Management
    • /
    • v.24 no.4
    • /
    • pp.1-23
    • /
    • 1993
  • The purpose of this study is to investigate and analyze the obsolescence of documents according to their dates, types, and places of publication. The main results are as follows: the half-life of monographs is 12.09 years, while those of articles and research papers are 9.68 and 8.93 years, respectively. Moreover, documents are most often cited by researchers within two years of their publication. Last but not least, the estimated weeding points for monographs, articles, and research papers, assuming that the weeding point is reached when the accumulated citation rate reaches 90%, are 40.15, 32.15, and 29.65 years, respectively.
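
A quick consistency check of these figures, under the usual negative-exponential obsolescence model (our assumption, not stated in the abstract): if citations decay with half-life t½, the accumulated citation rate reaches 90% after t½ × log₂10 ≈ 3.32 × t½ years, since 1 − 2^(−t/t½) = 0.9 gives t/t½ = log₂10. With the reported half-lives this yields 12.09 × 3.32 ≈ 40.2, 9.68 × 3.32 ≈ 32.2, and 8.93 × 3.32 ≈ 29.7 years, in line with the weeding points quoted above.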

A Design of HAS-160 Processor for Smartcard Application (스마트카드용 HAS-160 프로세서 설계)

  • Kim, Hae-ju;Shin, Kyung-Wook
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2009.10a
    • /
    • pp.913-916
    • /
    • 2009
  • This paper describes a hardware design of a hash processor that implements the HAS-160 algorithm adopted as a Korean standard. To achieve high-speed operation with a small area, the arithmetic operation is implemented using a hybrid structure of 5:3 and 3:2 carry-save adders and a carry-select adder. The HAS-160 processor synthesized with a 0.35-µm CMOS cell library has 17,600 gates. It computes a 160-bit hash code from a 512-bit message block in 82 clock cycles and achieves a throughput of 312 Mbps at a 50-MHz clock frequency with a 3.3-V supply.
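
The quoted throughput follows directly from these figures: one 512-bit block per 82 cycles at 50 MHz gives 512 × 50×10⁶ / 82 ≈ 312 Mbit/s. The 3:2 carry-save stage mentioned above can also be illustrated at word level with a short sketch; the operand width and test values are illustrative, and the paper's actual design is in hardware, not software.

    # Hedged sketch of a 3:2 carry-save step: it compresses three operands into a
    # sum word and a carry word without propagating carries; a final
    # carry-propagate adder (a carry-select adder in the design above) resolves s + cy.
    def csa_3to2(a, b, c, width=32):
        mask = (1 << width) - 1
        s = (a ^ b ^ c) & mask                              # bitwise sum
        cy = (((a & b) | (a & c) | (b & c)) << 1) & mask    # majority carries, shifted left
        return s, cy

    s, cy = csa_3to2(0x12345678, 0x9ABCDEF0, 0x0F0F0F0F)
    assert (s + cy) & 0xFFFFFFFF == (0x12345678 + 0x9ABCDEF0 + 0x0F0F0F0F) & 0xFFFFFFFF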

Development of an Evacuation Time Calculation Program for Passenger Ships Based on IMO Guidelines, MSC.1/Circ.1238 (IMO 피난지침 기반의 여객선 탈출시간 계산 프로그램 개발)

  • Choi, Jin;Kim, Soo-Young;Shin, Sung-Chul;Kang, Hee-Jin;Park, Beom-Jin
    • Journal of the Society of Naval Architects of Korea
    • /
    • v.47 no.5
    • /
    • pp.719-724
    • /
    • 2010
  • Thousands of passengers and crew are on board a cruise ship, which contains many cabins and large public spaces such as atria and theaters. Accidents at sea can therefore easily cause great loss of life and damage to property. To improve the safety of passenger ships, the IMO proposed MSC.1/Circ.1238, guidelines for evacuation analysis, in October 2007 and recommended its use. However, the guidelines are difficult to apply because ship designers need to extract many pieces of information from CAD drawings, such as the widths and lengths of stairs and corridors, and then manually calculate the evacuation time. In this paper, for practical application of the guidelines, an evacuation time calculation program is developed using the AutoCAD .NET API library and the C# language.
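
As a rough illustration of the kind of computation such a tool automates (not the actual MSC.1/Circ.1238 coefficients and not the authors' program), the sketch below sums walking times over corridor and stair segments whose lengths and widths would otherwise be read off the CAD drawings by hand. The walking speeds and the sample route are illustrative assumptions.

    # Hedged sketch: summing travel times along one escape route from segment
    # geometry. Speeds and the sample route are assumptions for illustration only.
    from dataclasses import dataclass

    @dataclass
    class Segment:
        kind: str          # "corridor" or "stair"
        length_m: float
        width_m: float

    WALK_SPEED = {"corridor": 1.2, "stair": 0.7}   # m/s, assumed values

    def travel_time(route):
        """Sum of length/speed over all segments of one escape route (seconds)."""
        return sum(seg.length_m / WALK_SPEED[seg.kind] for seg in route)

    route = [Segment("corridor", 24.0, 1.4),
             Segment("stair", 6.0, 1.2),
             Segment("corridor", 18.0, 1.8)]
    print(f"travel time: {travel_time(route):.1f} s")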

The Study of Structure and Application of EAD (EAD의 구조와 적용에 관한 연구)

  • Kang, So-Youn
    • The Korean Journal of Archival Studies
    • /
    • no.8
    • /
    • pp.181-211
    • /
    • 2003
  • The purpose of this study is to reveal the context within which EAD was developed, to review the elements and structure of EAD version 1.0, and to introduce EAD as a new standard for encoded archival finding aids in Korea. Encoded Archival Description (EAD) was developed in 1993 in order to facilitate the exchange of ISAD(G) descriptive information. EAD is currently administered and maintained jointly by the Society of American Archivists and the United States Library of Congress. While development was initiated in the United States, international interest and contributions are increasing. EAD is an encoding standard designed specifically for marking up the information contained in archival finding aids. From its inception, EAD was based on SGML, and, with the release of EAD version 1.0 in 1998, it is also compliant with XML in order to facilitate easier Internet access to SGML-encoded finding aids. EAD is the first tool to preserve the multilevel, hierarchical description manifest in finding aids by providing structures in which to describe entire record collections and increasingly smaller subcomponents thereof, such as series, subseries, folders, and even items. Archival institutions can form an EAD consortium and create a union database of EAD finding aids for geographically dispersed collections. The EAD DTD provides a flexible way for archives to convert finding aids that exist in paper form into electronic documents or to create new finding aids in electronic form.
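
To make the multilevel structure concrete, the sketch below builds a skeletal EAD-style finding aid (collection → series → file) with Python's standard library. The element names follow commonly used EAD tags (ead, eadheader, archdesc, did, dsc, c01/c02 with level attributes), but the record content itself is invented for illustration and is not drawn from the paper.

    # Hedged sketch: a skeletal EAD-style finding aid showing multilevel description.
    # All titles and identifiers below are invented examples.
    import xml.etree.ElementTree as ET

    ead = ET.Element("ead")
    header = ET.SubElement(ead, "eadheader")
    ET.SubElement(header, "eadid").text = "example-0001"            # invented id
    arch = ET.SubElement(ead, "archdesc", level="collection")
    ET.SubElement(ET.SubElement(arch, "did"), "unittitle").text = "Example Papers"

    dsc = ET.SubElement(arch, "dsc")
    series = ET.SubElement(dsc, "c01", level="series")
    ET.SubElement(ET.SubElement(series, "did"), "unittitle").text = "Correspondence"
    folder = ET.SubElement(series, "c02", level="file")
    ET.SubElement(ET.SubElement(folder, "did"), "unittitle").text = "Letters, 1950-1955"

    print(ET.tostring(ead, encoding="unicode"))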

Information Needs of Korean Immigrant Mothers in the United States for Their Children's College Preparation

  • Yoon, JungWon;Taylor, Natalie;Kim, Soojung
    • Journal of Information Science Theory and Practice
    • /
    • v.6 no.4
    • /
    • pp.54-64
    • /
    • 2018
  • This study aims to understand the information needs of Korean immigrant mothers in the United States for their high school children's college preparation. A content analysis was conducted for the messages posted to a "motherhood" forum on the MissyUSA website. In total, 754 posts were analyzed in terms of a child's grade, college preparation stage, type of post, and topic of post. The study found that there is a range of information needed at different stages in a child's education. Many of the demonstrated information needs showed similarities to those of other immigrant groups, but there were also community-specific themes, such as an emphasis on STEM (science, technology, engineering, and math) and standardized tests. The forum was mainly used for factual questions, not emotional support. We concluded that the findings of the study would help researchers in understanding immigrant information needs for the college application process and how information professionals and educators could combine the needs of different ethnic groups to create customized services for them.

Neutronics design of VVER-1000 fuel assembly with burnable poison particles

  • Tran, Hoai-Nam;Hoang, Van-Khanh;Liem, Peng Hong;Hoang, Hung T.P.
    • Nuclear Engineering and Technology
    • /
    • v.51 no.7
    • /
    • pp.1729-1737
    • /
    • 2019
  • This paper presents a neutronics design of a VVER-1000 fuel assembly using burnable poison particles (BPPs) for controlling excess reactivity and the pin-wise power distribution. The advantage of using BPPs is that the thermal conductivity of a BPP-dispersed fuel pin can be improved. Numerical calculations have been conducted to optimize the BPP parameters using the MVP code and the JENDL-3.3 data library. The results show that by using Gd₂O₃ particles with a diameter of 60 µm and a packing fraction of 5%, the burnup reactivity curve and pin-wise power distribution approximate those of the reference design. To minimize the power peaking factor (PPF), the total BP amount was distributed over a larger number of fuel rods. Optimization was conducted for the number of BPP-dispersed rods, their distribution, and the BPP diameter and packing fraction. Two assembly models consisting of 18 BPP-dispersed rods were selected. A diameter of 300 µm and a packing fraction of 3.33% were determined so that the burnup reactivity curve approximates that of the reference one, while the PPF can be decreased from 1.167 to 1.105 and 1.113, respectively. Application of BPPs to compensate for the reduction of soluble boron content to 50% and 0% is also investigated.
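
One way to see what the diameter/packing-fraction trade-off means geometrically (our illustration, not a calculation from the paper) is the particle number density N = 6f/(πd³) for spherical particles of diameter d at packing fraction f: the two parameter sets quoted above differ by roughly a factor of 190 in particles per unit fuel volume.

    # Hedged illustration: particle number density vs. diameter and packing fraction.
    # The two parameter sets mirror those quoted in the abstract, purely as geometry.
    import math

    def particles_per_mm3(packing_fraction, diameter_um):
        d_mm = diameter_um / 1000.0
        return 6.0 * packing_fraction / (math.pi * d_mm ** 3)

    print(particles_per_mm3(0.05, 60))     # ~442 particles per mm^3
    print(particles_per_mm3(0.0333, 300))  # ~2.4 particles per mm^3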

A Universal Analysis Pipeline for Hybrid Capture-Based Targeted Sequencing Data with Unique Molecular Indexes

  • Kim, Min-Jung;Kim, Si-Cho;Kim, Young-Joon
    • Genomics & Informatics
    • /
    • v.16 no.4
    • /
    • pp.29.1-29.5
    • /
    • 2018
  • Hybrid capture-based targeted sequencing is being used increasingly for genomic variant profiling in tumor patients. Unique molecular index (UMI) technology has recently been developed and helps to increase the accuracy of variant calling by minimizing polymerase chain reaction biases and sequencing errors. However, the analysis of UMI-adopted targeted sequencing data differs slightly from the methods used for other types of omics data, and its variant-calling pipeline is still being optimized by various study groups for their own purposes. Because of this fragmented, lab-specific usage of tools, our group built an analysis pipeline intended for broad application to targeted sequencing studies generated with different methods. First, we generated hybrid capture-based data using genomic DNA extracted from tumor tissues of colorectal cancer patients. Sequencing libraries were prepared and pooled together, and an 8-plexed capture library was processed through the enrichment step before 150-bp paired-end sequencing on the Illumina HiSeq series. For the analysis, we evaluated several published tools, focusing mainly on the compatibility of the input and output of each tool. Finally, our laboratory built an analysis pipeline specialized for UMI-adopted data. Through this pipeline, we were able to estimate on-target rates and obtain filtered consensus reads for more accurate variant calling. These results suggest the potential of our analysis pipeline for precisely examining the quality and efficiency of the conducted experiments.
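
The UMI-based consensus step that such a pipeline relies on can be pictured with a minimal sketch: reads are grouped by (UMI, mapping position) and a per-position majority base is taken, so PCR duplicates and isolated sequencing errors collapse into one consensus read. The grouping key and majority rule below are a simplification of what dedicated UMI tools do, not the group's actual pipeline.

    # Hedged sketch: collapse reads sharing the same UMI and mapping position
    # into a single consensus read by per-base majority vote.
    from collections import Counter, defaultdict

    def consensus_reads(reads):
        """reads: iterable of (umi, position, sequence) tuples with equal-length
        sequences within a group; returns {(umi, position): consensus_sequence}."""
        groups = defaultdict(list)
        for umi, pos, seq in reads:
            groups[(umi, pos)].append(seq)
        consensus = {}
        for key, seqs in groups.items():
            cons = "".join(Counter(col).most_common(1)[0][0] for col in zip(*seqs))
            consensus[key] = cons
        return consensus

    reads = [("ACGTAC", 1000, "ATGGCA"),
             ("ACGTAC", 1000, "ATGGCA"),
             ("ACGTAC", 1000, "ATGACA"),   # one read with a sequencing error
             ("TTAGGC", 1000, "ATGGCT")]
    print(consensus_reads(reads))
    # {('ACGTAC', 1000): 'ATGGCA', ('TTAGGC', 1000): 'ATGGCT'}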