• Title/Summary/Keyword: research data repository

Search Result 175, Processing Time 0.028 seconds

A Study on National Research Data Repository Management: Focusing on the Metadata of Representative Domestic and International Research Data Repository Websites

  • An, Byoung-Goon;Park, Kyu-Ri;Shim, Jiwoo;Cho, Hyungmin
    • Proceedings of the Korean Society for Information Management Conference / 2018.08a / pp.117-126 / 2018
  • This study examined the research data management systems of representative national repositories in Korea and abroad, with respect to their role as scholarship support systems. To this end, the status of the research data metadata provided on the websites of representative research institutions in Korea, the United States, the United Kingdom, Australia, and the Netherlands was surveyed and analyzed through content analysis. Specifically, research data were classified into ten categories, such as 'monographs' and 'reports', according to the KRM expression classification criteria, and the application of metadata to each category was analyzed. Based on the results, problems in research data management as a scholarship support system and improvements for standards development were presented.

A Study on Developing a National Research Data Repository: Focusing on Domestic and International Cases

  • Park, Kyu-Ri;An, Byoung-Goon
    • Proceedings of the Korean Society for Information Management Conference / 2017.08a / pp.33-38 / 2017
  • This study starts from the argument that research data is a public good whose management and sharing should be led by the state. Through representative domestic and international cases of research data repository construction, it examines the current status of research data management and sharing, with the aim of identifying directions for improving the National Research Foundation of Korea's Korean Research Memory (KRM), the representative repository for the humanities and social sciences in Korea. Comparing the research data management of the United States, the United Kingdom, and Australia with that of KRM, the biggest difference is that KRM lacks education for raising awareness of the importance of research data management and training in writing DMPs and metadata for actual deposits, and thus fails to provide an environment that encourages researchers' active participation. The adoption of international standards for long-term preservation and the shortage of dedicated staff were also identified as major points for improvement.

Continuous Query over Business Event Streams in EPCIS Middleware

  • Piao, Yong-Xu;Hong, Bong-Hee;Park, Jeak-Wan;Kim, Gi-Hong
    • Proceedings of the Korea Information Processing Society Conference / 2008.05a / pp.718-720 / 2008
  • This paper focuses on continuous queries in EPC Information Services (EPCIS) middleware, a component of RFID systems. EPCIS can be viewed as a data stream system with a repository. In this work, continuous queries are implemented in two query execution models: a standing query model and a traditional query execution model in which the continuous query runs over the database periodically. Furthermore, a balancing strategy is presented to determine which implementation model is suitable for a given query. Finally, the work is concluded and research topics for future work are identified.
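The two execution models contrasted in the abstract can be sketched in a few lines: a push-based standing query evaluated as each event arrives, versus a periodic re-scan of the buffered store. This is an illustrative toy, not the EPCIS API; the event fields and function names are assumptions:

```python
from collections import deque

repository = deque()  # buffered store of business events (EPCIS-style)

def standing_query(event, predicate, emit):
    """Standing-query model: evaluate the predicate as each event arrives."""
    repository.append(event)
    if predicate(event):
        emit(event)

def periodic_query(predicate):
    """Traditional model: periodically re-run the query over the stored events."""
    return [e for e in repository if predicate(e)]

# Example: continuously watch for 'OBSERVE' events at a given read point.
is_dock_observation = lambda e: e["action"] == "OBSERVE" and e["readPoint"] == "dock-1"

matches = []
for ev in [{"action": "OBSERVE", "readPoint": "dock-1"},
           {"action": "ADD", "readPoint": "dock-2"}]:
    standing_query(ev, is_dock_observation, matches.append)

print(len(matches))                               # pushed to the subscriber immediately: 1
print(len(periodic_query(is_dock_observation)))   # same result via a periodic scan: 1
```

The trade-off the paper's balance strategy addresses is visible even here: the standing query pays a predicate check per event, while the periodic query pays a full scan per run.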

Role of Project Owner in OSS Project: Based on Impression Formation and Social Capital Theory

  • Lee, Saerom;Baek, Hyunmi;Jahng, Jungjoo
    • The Journal of Society for e-Business Studies / v.21 no.2 / pp.23-46 / 2016
  • With the increasing socio-economic value of open collaboration over the Internet, successfully managing open source software development projects has become significantly important. Most previous research has focused on various factors that influence project performance, but studies on how project owners, recognized as "leaders", affect project outcomes are very limited. This research investigates how the individual and governance characteristics of an owner influence project performance, based on impression formation and social capital theory. We collected data on 611 repositories and their owners from the open source development platform GitHub, and constructed the knowledge sharing network of each repository using social network analysis. Hierarchical regression analysis shows that a leader who discloses a great deal of personal information, or who actively follows and communicates with other developers, has a positive impact on project performance. A leader with high centrality in the knowledge sharing network also positively affects project performance. On the other hand, a leader who is highly willing to accept external knowledge, or who is recognized as an expert in the community with a large number of followers, has a negative impact on project performance. This research may serve as a useful guideline not only for future open source software projects but also for the effective management of other types of open collaboration.
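The network-centrality measure mentioned in the abstract can be illustrated with a minimal computation over a toy knowledge-sharing network. The paper does not specify its exact metric; this sketch assumes normalized degree centrality, and the node names are illustrative:

```python
from collections import defaultdict

def degree_centrality(edges):
    """Normalized degree centrality: a node's neighbor count divided by the
    number of other nodes, as in basic social network analysis."""
    adj = defaultdict(set)
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)
    n = len(adj)
    return {node: len(neigh) / (n - 1) for node, neigh in adj.items()}

# An undirected edge means two contributors exchanged knowledge (e.g. comments).
edges = [("owner", "dev1"), ("owner", "dev2"), ("dev1", "dev2"), ("owner", "dev3")]
c = degree_centrality(edges)
print(c["owner"])  # 1.0 — the owner is connected to every other contributor
```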

Geological Factor Analysis for Evaluating the Long-term Safety Performance of Natural Barriers in Deep Geological Repository System of High-level Radioactive Waste

  • Hyeongmok Lee;Jiho Jeong;Jaesung Park;Subi Lee;Suwan So;Jina Jeong
    • Economic and Environmental Geology / v.56 no.5 / pp.533-545 / 2023
  • In this study, an investigation was conducted on the features, events, and processes (FEP) that could impact the long-term safety of the natural barriers constituting high-level radioactive waste geological repositories. The FEP list was developed utilizing the IFEP list 3.0 provided by the Nuclear Energy Agency (NEA) as foundational data, supplemented by geological investigations and research findings from leading countries in this field. A total of 49 FEPs related to the performance of the natural barrier were identified. For each FEP, detailed definitions, classifications, impacts on long-term safety, significance in domestic conditions, and feasibility of quantification were provided. Moreover, based on the compiled FEP list, three scenarios that could affect the long-term safety of the disposal facility were developed. Geological factors affecting the performance of the natural barrier in each scenario were selected and their relationships were visualized. The constructed FEP list and the visualization of interrelated factors in various scenarios are anticipated to provide essential information for selecting and organizing factors that must be considered in the development of mathematical models for quantitatively evaluating the long-term safety of deep geological repositories. In addition, these findings could be effectively utilized in establishing criteria related to the key performance of natural barriers for the confirmation of repository sites.

Analysis and Modeling of Essential Concepts and Process for Peer-Reviewing Data Paper

  • Sungsoo Ahn;Sung-Nam Cho;Youngim Jung
    • Journal of Korean Library and Information Science Society / v.54 no.3 / pp.321-346 / 2023
  • A data paper describing research data credits the researchers who produced the data, while helping other researchers verify previous research and start new research by reusing the data. With these benefits, publishing data papers and depositing data in public data repositories are increasing. Because a data paper has characteristics that differ from a research paper, a domestic academic society planning to publish data papers faces challenges: it must acquire, in a timely manner, substantial knowledge of data paper structures and templates, peer review policies and processes, and trustworthy data repositories. The scarcity of research and information concerning the critical elements of data papers and the peer-review process makes it difficult to operate data paper review and publication. To address these issues, we propose essential concepts for the data paper and its peer review, including a process model of the peer review, based on an in-depth analysis of the data paper templates, articles, and other guides of five data journals worldwide. Academic societies intending to publish data papers as a new type of paper may establish policies and define a peer-review process by adopting the proposed conceptual models, effectively streamlining preparation for data paper publication.

Review of Instant Release Fractions of Long-lived Radionuclides in CANDU and PWR Spent Nuclear Fuels Under the Geological Disposal Conditions

  • Choi, Heui Joo;Koo, Yang-Hyun;Cho, Dong-Keun
    • Journal of Nuclear Fuel Cycle and Waste Technology (JNFCWT) / v.20 no.2 / pp.231-241 / 2022
  • Several countries, including Korea, are considering the direct disposal of spent nuclear fuel. Published radiological safety assessments for the period after geological repository closure indicate that instant release, rather than congruent release, is the main radiation source. Three recently published Safety Case reports were reviewed, and the IRF values of seven long-lived radionuclides, including relevant experimental results, were compared. According to the literature review, the IRF values of both CANDU and low-burnup PWR spent fuel have been measured experimentally and applied reasonably. In particular, the IRF values of volatile long-lived nuclides, such as 129I and 135Cs, were estimated from the FGR value. Because experimental leaching data on high-burnup spent nuclear fuels are extremely scarce, a mathematical modelling approach proposed by Johnson and McGinnes was applied to domestic high-burnup PWR spent nuclear fuel to derive the IRF values of iodine and cesium. The best estimate of the IRF was 5.5% at a discharge burnup of 55 GWd tHM-1.

An Evaluation of Academic Institutional Repositories in Ghana

  • Kumah, Mariyama Abdulai;Filson, Christopher Kwame
    • International Journal of Knowledge Content Development & Technology / v.12 no.1 / pp.67-83 / 2022
  • This study aims to evaluate selected academic institutional repositories (IRs) in Ghana. Data were collected using observation and interview methods, examining the websites of seven selected academic IRs in Ghana. The findings revealed that the University of Ghana, Legon, leads in record count among the seven IRs examined. DSpace was the preferred software for managing and preserving the digital contents of these IRs. Theses, dissertations, and research articles were the leading contents deposited in the IRs. The majority of the IRs have incorporated RSS (Rich Site Summary, or Really Simple Syndication) feeds, with few using other Web 2.0 features. English was the only interface language used across all the IRs. The interviews revealed that most faculty members did not use the IR very often, and 9 (42.9%) indicated that they had never deposited their materials in the IR. Faculty members also indicated that inadequate ICT connectivity and infrastructure, unreliable power supply, copyright and intellectual property concerns, financial constraints, and inadequate advocacy and training on the importance of IRs were the major challenges academic libraries face in operating IRs in Ghana.

Sample size determination for conducting a pilot study to assess reliability of a questionnaire

  • Mohamad Adam Bujang;Evi Diana Omar;Diana Hui Ping Foo;Yoon Khee Hon
    • Restorative Dentistry and Endodontics / v.49 no.1 / pp.3.1-3.8 / 2024
  • This article is a narrative review that discusses the recommended sample size requirements for designing a pilot study to assess the reliability of a questionnaire. Sample size tables based on the kappa agreement test, the intra-class correlation test, and Cronbach's alpha test have been compiled. For all calculations, type I error (alpha) was set at a maximum of 0.05, and power at a minimum of 80.0%. For the kappa agreement test, intra-class correlation test, and Cronbach's alpha test, the recommended minimum sample sizes based on the ideal effect sizes are at least 15, 22, and 24 subjects, respectively. Allowing for a non-response rate of 20.0%, a minimum sample size of 30 respondents will be sufficient to assess the reliability of a questionnaire. This clear guideline on the minimum sample size for a reliability pilot study eases researchers' preparation, and the study provides justification for a minimum sample of 30 respondents specifically to test the reliability of a questionnaire.
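The 20% non-response allowance in the abstract is simple arithmetic: divide the largest minimum sample size by the expected response proportion and round up. A minimal sketch (the function name is illustrative; the minimum sizes are the values reported in the review):

```python
import math

def adjusted_sample_size(minimum_n: int, nonresponse_rate: float) -> int:
    """Inflate a minimum sample size to allow for expected non-response."""
    return math.ceil(minimum_n / (1.0 - nonresponse_rate))

# Minimum subjects at alpha = 0.05 and 80% power, per reliability test:
minimums = {"kappa": 15, "intraclass_correlation": 22, "cronbach_alpha": 24}

# The largest requirement, inflated for 20% non-response, drives recruitment.
target = adjusted_sample_size(max(minimums.values()), 0.20)
print(target)  # 30 respondents
```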

Dense Sub-Cube Extraction Algorithm for a Multidimensional Large Sparse Data Cube

  • Lee Seok-Lyong;Chun Seok-Ju;Chung Chin-Wan
    • Journal of KIISE: Databases / v.33 no.4 / pp.353-362 / 2006
  • A data warehouse is a data repository that enables users to store large volumes of data and to analyze them effectively. In this research, we investigate an algorithm for constructing a multidimensional data cube, a powerful tool for analyzing the contents of data warehouses and databases. A multidimensional data cube incurs an inevitable retrieval overhead due to its sparsity. In this paper, we propose a dense sub-cube extraction algorithm that identifies dense regions in a large sparse data cube and constructs sub-cubes based on the dense regions found. It reduces the retrieval overhead remarkably by retrieving those small dense sub-cubes instead of scanning the large sparse cube. The algorithm uses bitmap- and histogram-based techniques to extract dense sub-cubes from the data cube, and its effectiveness is demonstrated via an experiment.
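The histogram step described in the abstract can be sketched for a toy 2-D cube: count the filled cells in each slice along every dimension and keep the slices whose fill ratio clears a threshold. The names and threshold are illustrative, and the paper's actual algorithm is multidimensional and also uses bitmaps:

```python
from collections import Counter

def dense_subcube(cells, shape, density_threshold=0.4):
    """Histogram-based sketch: keep the rows and columns whose fill ratio
    meets the threshold, and return the sub-cube they span."""
    rows, cols = Counter(), Counter()
    for (i, j) in cells:
        rows[i] += 1
        cols[j] += 1
    dense_rows = {i for i in range(shape[0]) if rows[i] / shape[1] >= density_threshold}
    dense_cols = {j for j in range(shape[1]) if cols[j] / shape[0] >= density_threshold}
    return {(i, j): v for (i, j), v in cells.items()
            if i in dense_rows and j in dense_cols}

# A sparse 5x5 cube stored as {(i, j): value}, with one isolated outlier cell.
cells = {(0, 0): 5, (0, 1): 3, (1, 0): 2, (1, 1): 7, (4, 4): 1}
sub = dense_subcube(cells, shape=(5, 5))
print(sorted(sub))  # [(0, 0), (0, 1), (1, 0), (1, 1)] — only the dense block survives
```

A query that touches only the dense region can then scan the small sub-cube instead of the full sparse cube, which is the source of the retrieval savings the abstract describes.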