• Title/Summary/Keyword: requirements analysis process (요구사항 분석 프로세스)

Search results: 296 (processing time: 0.024 seconds)

A Study to Improve the Trustworthiness of Data Repositories by Obtaining CoreTrustSeal Certification (CoreTrustSeal 인증 획득을 통한 데이터 리포지토리의 신뢰성 향상을 위한 연구)

  • Hea Lim Rhee;Jung-Ho Um;Youngho Shin;Hyung-jun Yim;Na-eun Han
    • Journal of the Korean Society for Information Management / v.41 no.2 / pp.245-268 / 2024
  • As the recognition of data's value increases, the role of data repositories in managing, preserving, and utilizing data is becoming increasingly important. This study investigates ways to enhance the trustworthiness of data repositories through obtaining CoreTrustSeal (CTS) certification. Trust in data repositories is critical not only for data protection but also for building and maintaining trust between the repository and stakeholders, which in turn affects researchers' decisions on depositing and utilizing data. The study examines the CoreTrustSeal, an international certification for trustworthy data repositories, analyzing its impact on the trustworthiness and efficiency of repositories. Using the example of DataON, Korea's first CTS-certified repository operated by the Korea Institute of Science and Technology Information (KISTI), the study compares and analyzes four repositories that have obtained CTS certification. These include DataON, the Physical Oceanography Distributed Active Archive Center (PO.DAAC) from NASA, Yareta from the University of Geneva, and the DARIAH-DE repository from Germany. The research assesses how these repositories meet the mandatory requirements set by CTS and proposes strategies for improving the trustworthiness of data repositories. Key findings indicate that obtaining CTS certification involves rigorous evaluation of organizational infrastructure, digital object management, and technological aspects. The study highlights the importance of transparent data processes, robust data quality assurance, enhanced accessibility and usability, sustainability, security measures, and compliance with legal and ethical standards. By implementing these strategies, data repositories can enhance their reliability and efficiency, ultimately promoting wider data sharing and utilization in the scientific community.

CIA-Level Driven Secure SDLC Framework for Integrating Security into SDLC Process (CIA-Level 기반 보안내재화 개발 프레임워크)

  • Kang, Sooyoung;Kim, Seungjoo
    • Journal of the Korea Institute of Information Security & Cryptology / v.30 no.5 / pp.909-928 / 2020
  • From the early 1970s, the U.S. government began to recognize that penetration testing could not assure the security quality of products: the vulnerabilities and faults identified can vary with the capabilities of the testing team. In other words, no penetration-testing team can guarantee that a product is free of vulnerabilities, because "no vulnerabilities were found" is not the same as "the product has no vulnerabilities." The U.S. government therefore realized that, to improve the security quality of products, the development process itself had to be managed systematically and strictly, and from the 1980s it began to publish standards for development methodology and evaluation procurement embedding the "security-by-design" concept. Security-by-design means reducing a product's complexity by considering security from the earliest phases of the development life cycle, such as requirements analysis and design, ultimately to achieve product trustworthiness. The concept spread to the private sector from 2002 under the name Secure SDLC, through Microsoft and IBM, and is now used in fields ranging from automotive to advanced weapon systems. The problem, however, is that Secure SDLC standards and guidelines contain only abstract, declarative content, which makes them hard to implement in practice. In this paper, we therefore present a new framework that lets enterprises specify the level of Secure SDLC they want. Our CIA (functional Correctness, safety Integrity, security Assurance)-level-based security-by-design framework combines an evidence-based security approach with the existing Secure SDLC. Using the methodology, one can first quantitatively show the gap in Secure SDLC process level between a company and its competitors. Second, the framework is practical for building a Secure SDLC in the field, because the detailed activities and documents needed to reach the desired level can be derived easily.
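The gap analysis the abstract describes can be illustrated with a minimal sketch. The three axis names come from the paper's CIA acronym, but the numeric levels and example scores below are invented for illustration and are not the paper's actual rating scale.

```python
# Hypothetical illustration of comparing Secure SDLC process levels
# along the CIA axes (functional Correctness, safety Integrity,
# security Assurance). Level numbers here are assumed, not the paper's.

def sdlc_gap(company: dict, competitor: dict) -> dict:
    """Return the per-axis level gap (competitor minus company)."""
    return {axis: competitor[axis] - company[axis] for axis in company}

company = {"correctness": 2, "integrity": 1, "assurance": 3}
competitor = {"correctness": 4, "integrity": 3, "assurance": 3}

gap = sdlc_gap(company, competitor)
# A positive gap marks an axis where the company trails its competitor
# and should derive additional Secure SDLC activities and documents.
behind = [axis for axis, g in gap.items() if g > 0]
```

Such a per-axis comparison is one plausible way to make the "quantitative gap" claim concrete: each axis with a positive gap points to the process areas needing further activities.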

A Study on the Operation Support and Activation of Drone Geospatial Information Service (드론 공간정보 서비스 운영지원 및 활성화에 대한 연구)

  • Ok, Jin-A;Yoo, Soonduck
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.21 no.6 / pp.147-153 / 2021
  • The purpose of this study is to derive strategic suggestions for the direction of Gyeonggi-do's drone-related business, based on a survey of the operation and actual conditions of the drone geospatial information service business as experienced by Gyeonggi-do civil servants. A demand survey of 219 officials in charge of drone-related work was analyzed in four categories: technology-based operation support, project discovery and support, legal and institutional support, and education and public relations. For technology-based operation support, the proposed improvement is to secure service operation efficiency by establishing dedicated staff and a dedicated organization and by recruiting drone-related experts. For project discovery and support: 1) the government should proactively identify prior research projects; 2) legal and institutional support should include services that simplify administration, such as scheduling and permits for drone geospatial data capture; and 3) a legal and institutional review is needed on the scope of, and restrictions on, the use of drone geospatial data. In education and public relations, it is necessary to run an education program on overall drone operation, to hold seminars by field and use case, and to promote adoption through practical application guidelines for processes and systems. A limitation of this study is that the survey subjects were connected to Gyeonggi-do; future work should analyze a broader range of participants.

Design and Implementation of MongoDB-based Unstructured Log Processing System over Cloud Computing Environment (클라우드 환경에서 MongoDB 기반의 비정형 로그 처리 시스템 설계 및 구현)

  • Kim, Myoungjin;Han, Seungho;Cui, Yun;Lee, Hanku
    • Journal of Internet Computing and Services / v.14 no.6 / pp.71-84 / 2013
  • Log data, which record the multitude of information created when operating computer systems, are utilized in many processes, from carrying out computer system inspection and process optimization to providing customized user optimization. In this paper, we propose a MongoDB-based unstructured log processing system in a cloud environment for processing the massive amount of log data of banks. Most of the log data generated during banking operations come from handling a client's business. Therefore, in order to gather, store, categorize, and analyze the log data generated while processing the client's business, a separate log data processing system needs to be established. However, the realization of flexible storage expansion functions for processing a massive amount of unstructured log data and executing a considerable number of functions to categorize and analyze the stored unstructured log data is difficult in existing computer environments. Thus, in this study, we use cloud computing technology to realize a cloud-based log data processing system for processing unstructured log data that are difficult to process using the existing computing infrastructure's analysis tools and management system. The proposed system uses the IaaS (Infrastructure as a Service) cloud environment to provide a flexible expansion of computing resources and includes the ability to flexibly expand resources such as storage space and memory under conditions such as extended storage or rapid increase in log data. Moreover, to overcome the processing limits of the existing analysis tool when a real-time analysis of the aggregated unstructured log data is required, the proposed system includes a Hadoop-based analysis module for quick and reliable parallel-distributed processing of the massive amount of log data. 
Furthermore, because HDFS (Hadoop Distributed File System) stores data by replicating the block units of the aggregated log data, the proposed system offers automatic restore functions so that it can continue operating after recovering from a malfunction. Finally, by establishing a distributed database using the NoSQL-based MongoDB, the proposed system provides an effective way to process unstructured log data. Relational databases such as MySQL have rigid schemas that are ill-suited to unstructured log data; moreover, such strict schemas make it hard to add nodes when rapidly growing data must be distributed across many machines. NoSQL does not offer the complex computations that relational databases provide, but it can easily scale out through node dispersion as data volumes grow rapidly; it is a non-relational database with a structure appropriate for unstructured data. NoSQL data models are usually classified as key-value, column-oriented, or document-oriented. Of these, the proposed system uses MongoDB, a representative document-oriented store with a free schema structure. MongoDB is adopted because its flexible schema makes unstructured log data easy to process, it supports flexible node expansion as data volumes grow, and it provides an Auto-Sharding function that automatically expands storage. The proposed system is composed of a log collector module, a log graph generator module, a MongoDB module, a Hadoop-based analysis module, and a MySQL module.
When the log data generated over the entire client business process of each bank are sent to the cloud server, the log collector module collects and classifies the data by log type and distributes them to the MongoDB module and the MySQL module. The log graph generator module renders the analysis results of the MongoDB module, the Hadoop-based analysis module, and the MySQL module per analysis time and log type, and presents them to the user through a web interface. Log data that require real-time analysis are stored in the MySQL module and served in real time by the log graph generator module. Log data aggregated per unit time are stored in the MongoDB module and plotted according to the user's analysis conditions. The aggregated log data in the MongoDB module are processed in parallel-distributed fashion by the Hadoop-based analysis module. A comparative evaluation against a system that uses only MySQL for inserting log data and running queries demonstrates the proposed system's superior performance, and an optimal chunk size is confirmed through a MongoDB insert-performance evaluation over various chunk sizes.
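The log collector's routing decision described above can be sketched in a few lines. This is a minimal illustration of the described architecture, not the paper's code: the record fields and the "realtime" flag are assumptions.

```python
# Hypothetical sketch of the log collector module's routing rule:
# logs needing real-time analysis go to the MySQL module, while bulk
# unstructured logs go to the MongoDB module for later Hadoop-based
# batch analysis. The record shape below is invented for illustration.

def route_log(record: dict) -> str:
    """Classify a log record and return its destination module."""
    if record.get("realtime"):
        return "mysql"      # real-time analysis path
    return "mongodb"        # schema-free storage, batch analysis path

logs = [
    {"bank": "A", "event": "login", "realtime": True},
    {"bank": "A", "event": "daily_transfer_dump"},
]
destinations = [route_log(r) for r in logs]
```

The design choice mirrors the abstract: a strict-schema store serves latency-sensitive queries, while the document store absorbs arbitrarily shaped records and scales out via sharding.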

Mobile Service Modeling Based on Service Oriented Architecture (서비스 지향 아키텍처 기반의 모바일 서비스 모델링)

  • Chang, Young-Won;Noh, Hye-Min;Yoo, Cheol-Jung
    • Journal of the Institute of Electronics Engineers of Korea SD / v.45 no.2 / pp.140-149 / 2008
  • Recently, the need to access information anywhere at any time has been a driving force behind a variety of mobile applications. As the number of mobile applications increases rapidly, there is growing demand to apply Service-Oriented Architecture (SOA) to them. SOA on mobile offers a systematic way to classify and assess technical realizations of business processes. However, mobile devices can use services only within a severely restricted computing environment: a mobile computer typically offers storage for only a small database, limited data-processing capacity, narrow user input, and a small display. This paper presents an SOA-based adaptation method to overcome these mobile restrictions. To improve mobile efficiency, we analyze mobile application requirements, write service specifications, optimize the design, provide extended use case specifications for use case testing, and test service test cases derived from the service specifications. We discuss mobile application testing that uses SOA as a model for discovering, specifying, integrating, implementing, testing, deploying, and invoking services. Such service use case specification and testing techniques can help developers build cost-efficient and dependable mobile services.
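The step of deriving service test cases from a service specification, mentioned in the abstract above, can be sketched as follows. The specification fields (operation names, sample inputs, expected results) are hypothetical, invented for illustration rather than taken from the paper's notation.

```python
# Hypothetical sketch: derive one test case per (operation, sample)
# pair in a service specification, in the spirit of the abstract's
# "service test cases derived from the service specification".

def derive_test_cases(spec: dict) -> list:
    """Produce a flat list of test-case dicts from a service spec."""
    cases = []
    for op in spec["operations"]:
        for sample in op["samples"]:
            cases.append({
                "operation": op["name"],
                "input": sample["input"],
                "expected": sample["expected"],
            })
    return cases

spec = {
    "service": "MobileOrderService",   # assumed example service
    "operations": [
        {"name": "placeOrder",
         "samples": [{"input": {"item": "X", "qty": 1},
                      "expected": "ok"}]},
    ],
}
cases = derive_test_cases(spec)
```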

An Empirical Study on the Success Factors of Implementing Product Life Cycle Management Systems (제품수명주기관리 시스템 도입의 성공요인에 관한 실증연구)

  • Kim, Jeong-Beom
    • Journal of KIISE: Software and Applications / v.37 no.12 / pp.909-918 / 2010
  • Analyzing Korea's national competitiveness leads to the conclusion that global high-tech enterprises have played a leading role in bringing Korea in line with advanced countries, even though the country lacks natural resources. These companies share the following characteristics. First, they continually accumulate core technologies and know-how through highly competent human resources and well-organized management. Second, they are well structured and equipped with information technology infrastructures such as ERP, SCM, CRM, and PLM; among these, PLM is considered the principal core IT infrastructure in the manufacturing industry. The urgent task of the manufacturing industry today is to develop new products that meet diverse consumer needs and to launch them in time to market, which requires manufacturers to have a product development infrastructure and systems that can upgrade product fulfillment and mass production in a short period. Introducing a PLM system is a core strategic solution for a manufacturer for collaboration, global development, reengineering of the manufacturing system, innovation and efficiency of the manufacturing process, and product quality improvement. The purpose of this study is to analyze the success factors of introducing a PLM system and the effectiveness of its practice. The results of the empirical study are as follows: (1) technical success factors positively impact system quality and user satisfaction; (2) organizational success factors positively impact system quality but not user satisfaction; (3) environmental success factors positively impact system quality and user satisfaction; (4) system quality positively impacts user satisfaction; (5) user satisfaction positively impacts the effectiveness of implementing PLM systems, but system quality does not.