• Title/Summary/Keyword: information needs analysis

Search Results: 2,542

A Linkage Analysis of ISMS-P and GDPR; Focused on Personal Information Protection (ISMS-P와 GDPR의 개인정보보호 부문 연계 분석)

  • Park, Minjung;Yu, Jieun;Chai, Sangmi
    • Journal of Information Technology Services
    • /
    • v.18 no.2
    • /
    • pp.55-73
    • /
    • 2019
  • As the importance of personal information has grown, there have been many efforts to establish new policies, certifications, and laws for administering personal information more effectively and safely. The Korean government has operated the ISMS and PIMS certification systems to assess whether an organization has established and manages an appropriate information security system. However, the need to revise and modify PIMS and ISMS has been raised, since several assessment criteria for information management systems overlap between the two. ISMS-P certification, which combines ISMS and PIMS, was therefore recently introduced. GDPR was established primarily to give individuals control over their personal data and to simplify the regulatory environment for international business by unifying regulation within the EU. This study compares GDPR and ISMS-P, focusing on "personal information", and its expected contributions are as follows. First, it can serve as a criterion by which a firm preparing for ISMS-P can self-evaluate its risk of violating GDPR. Second, it aims to increase understanding of the roles of ISMS-P and GDPR among the various certifications used to assess information security management systems, thereby reducing the cost and burden of obtaining unnecessary certifications. Third, it contributes to the diffusion of ISMS-P, which has been newly implemented in Korea.

Intellectualization of Higher Education: An Information and Communication Model

  • Kaidanovska, Olena;Pymonenko, Mariia;Morklyanyk, Oksana;Iurchyshyn, Oksana;Rakochyi, Yaroslav
    • International Journal of Computer Science & Network Security
    • /
    • v.22 no.11
    • /
    • pp.87-92
    • /
    • 2022
  • Today the system of higher education needs significant reform. Intellectualization of the educational process in HEIs aims to improve the quality of educational services. Intellectual information technologies are information technologies that help a person accelerate the analysis of political, economic, social, and technical situations, as well as the synthesis of management decisions; the basis for mastering them is information and communication technologies. The purpose of this research is to identify the relationship between the introduction of information and communication technologies and the increase in the level of intellectualization of higher education. The article substantiates the expediency of introducing information and communication technologies in order to improve the intellectualization of the educational process in higher education. An empirical study of the variables that characterize the level of intellectualization of higher education was conducted using the proposed techniques. Tendencies characteristic of the pedagogical conditions for implementing the information and communication model in the educational process were revealed. It is shown that the level of intellectualization of higher education depends on the implemented pedagogical conditions, and the effectiveness of the proposed information and communication model is confirmed. Given the data obtained during the study and the limitations that may affect the results, further research on this issue should focus on other variables that characterize the state of intellectualization of the educational process.

The Design and Implementation of Web-Based Integrated Genome Analysis Tools (웹 기반 통합 유전체 분석 시스템의 설계 및 구현)

  • 최범순;이경희;권해룡;조완섭;이충세;김영창
    • Journal of Korea Multimedia Society
    • /
    • v.7 no.3
    • /
    • pp.408-417
    • /
    • 2004
  • The genome analysis process requires several steps involving various software tools. We propose WGAT (Web-based Genome Analysis Tool), which combines several gene analysis tools and provides a graphical user interface for users. Most gene analysis tools are Linux- or Unix-based programs that are difficult for biologists to install and use. Furthermore, files generated during gene analysis frequently require manual transformation before they can serve as input for the next step. Recently developed Web-based tools process only one sequence at a time, so analyzing a large data file requires many repetitive operations. WGAT was developed to support Web-based genome analysis that is both easy to use and fast. Whole-genome data analysis can be performed by running WGAT on a Linux server and submitting sequence data files with various options, so that many steps of the analysis are carried out automatically by the system. Simulation shows that WGAT performs the analysis about 20 times faster when the number of sequence segments is one thousand. (A minimal sketch of this kind of batch automation follows this entry.)

  • PDF
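
The abstract above describes a pipeline that feeds each sequence file through several command-line analysis steps and converts one step's output into the next step's input automatically. The following is a minimal Python sketch of that general pattern, not WGAT's actual implementation: the tool names (`align_tool`, `annotate_tool`), their `-i`/`-o` flags, and the `.fasta`/`.aln`/`.ann` file conventions are assumptions made only for illustration.

```python
import subprocess
from pathlib import Path

# Hypothetical two-step pipeline: align each sequence file, then annotate the
# alignment, feeding each step's output to the next without manual conversion.
ALIGN_CMD = "align_tool"        # assumed CLI tool, stands in for a real aligner
ANNOTATE_CMD = "annotate_tool"  # assumed CLI tool, stands in for an annotator

def run_step(cmd: str, infile: Path, outfile: Path, *options: str) -> None:
    """Run one analysis step as an external command."""
    subprocess.run([cmd, *options, "-i", str(infile), "-o", str(outfile)], check=True)

def analyze_all(seq_dir: str, out_dir: str) -> None:
    """Send every sequence file in seq_dir through the whole pipeline."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    for seq in sorted(Path(seq_dir).glob("*.fasta")):
        aligned = out / f"{seq.stem}.aln"
        annotated = out / f"{seq.stem}.ann"
        run_step(ALIGN_CMD, seq, aligned)           # step 1
        run_step(ANNOTATE_CMD, aligned, annotated)  # step 2 reuses step 1 output
        print(f"done: {seq.name} -> {annotated.name}")

if __name__ == "__main__":
    analyze_all("sequences", "results")
```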

Parameter Optimization and Automation of the FLEXPART Lagrangian Particle Dispersion Model for Atmospheric Back-trajectory Analysis (공기괴 역궤적 분석을 위한 FLEXPART Lagrangian Particle Dispersion 모델의 최적화 및 자동화)

  • Kim, Jooil;Park, Sunyoung;Park, Mi-Kyung;Li, Shanlan;Kim, Jae-Yeon;Jo, Chun Ok;Kim, Ji-Yoon;Kim, Kyung-Ryul
    • Atmosphere
    • /
    • v.23 no.1
    • /
    • pp.93-102
    • /
    • 2013
  • The atmospheric transport pathway of an air mass is an important constraint on the chemical properties of the air mass observed at a given location. Such information can be used to understand observed temporal variability in the atmospheric concentrations of long-lived chemical compounds whose sinks and/or sources are related to natural and/or anthropogenic processes at the surface, as well as to perform inversions that constrain the fluxes of such compounds. The Lagrangian particle dispersion model FLEXPART provides a useful tool for estimating detailed particle dispersion during atmospheric transport, a significant improvement over the traditional "single-line" trajectory models that have been widely used. However, those without a modeling background who simply want to create back-trajectory maps may find it challenging to optimize FLEXPART for their needs. In this study, we explain how to set up, operate, and optimize FLEXPART for back-trajectory analysis, and provide automation programs based on the open-source R language. Topics include setting up an "AVAILABLE" file (an index of the input meteorological fields stored on the computer), creating C-shell scripts that initiate FLEXPART runs and store the output in directories designated by date, and processing the FLEXPART output to create figures for a back-trajectory "footprint" (potential emission sensitivity within the boundary layer). Step-by-step instructions are given for an example case of calculating back trajectories for Anmyeon-do, Korea, for January 2011, and one application is demonstrated by interpreting observed variability in the atmospheric $CO_2$ concentration at Anmyeon-do during this period. The back-trajectory modeling information introduced in this study should facilitate and automate the most common back-trajectory calculations in atmospheric research.
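
The paper's own automation is written in R and C-shell; as a language-neutral illustration of the first step mentioned above, the sketch below builds an AVAILABLE-style index of meteorological input files in Python. The file-name pattern (`ENyymmddhh` for fields valid at a given date and hour) and the exact header lines and column widths are assumptions; check them against your input data and the FLEXPART documentation before use.

```python
import re
from pathlib import Path

# Build an AVAILABLE-style index of meteorological input files for FLEXPART.
# File names such as EN11010100 (2011-01-01 00 UTC) are assumed; adjust the
# pattern, header, and column layout to match your data and the FLEXPART docs.
MET_DIR = Path("/data/ecmwf")                              # assumed input directory
PATTERN = re.compile(r"EN(\d{2})(\d{2})(\d{2})(\d{2})$")   # ENyymmddhh

def write_available(met_dir: Path, out_path: Path) -> None:
    lines = ["DATE     TIME         FILENAME", "YYYYMMDD HHMMSS", ""]  # header block
    for f in sorted(met_dir.iterdir()):
        m = PATTERN.match(f.name)
        if not m:
            continue  # skip anything that is not a met field
        yy, mm, dd, hh = m.groups()
        lines.append(f"20{yy}{mm}{dd} {hh}0000      {f.name:<18}ON DISC")
    out_path.write_text("\n".join(lines) + "\n")

if __name__ == "__main__":
    write_available(MET_DIR, Path("AVAILABLE"))
```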

A Study on Costs of Digital Preservation (디지털 보존의 비용요소에 관한 연구)

  • Chung, Hye-Kyung
    • Journal of the Korean Society for Information Management
    • /
    • v.22 no.1 s.55
    • /
    • pp.47-64
    • /
    • 2005
  • To guarantee long-term access to digital material, digital preservation needs to be systematized, and the cost elements of digital preservation should be investigated in detail to secure continued budget support. To meet this need, this paper categorizes digital preservation costs into direct and indirect costs by deriving the common elements used in prior research on the issue. For case analysis, two institutions currently carrying out large-scale digitization, a domestic university library and the National Library of Korea, were selected to analyze the current status of digital preservation and to estimate its cost. The case analysis shows that systematic preservation functions should be performed to guarantee long-term access to digital material, even though basic digital preservation is already being conducted. It is projected that the two libraries would need to invest digital preservation costs amounting to $11.8\%$ and $8.6\%$ of their digitization costs, respectively, every year. However, these figures are very conservative, because the costs of essential preservation functions, such as installing a digital repository and producing metadata, were excluded from the estimation. This shows that digital preservation is a comprehensive activity, linked directly and indirectly to activities ranging from the production of a digital object to access to it, and an essential cost that should be considered from the beginning of a digitization project.
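
As a simple worked illustration of how the ratios above translate into a recurring budget line (the digitization cost figure is hypothetical; only the 11.8% and 8.6% rates come from the abstract):

```latex
% Annual preservation budget as a fraction of digitization cost
C_{\text{preservation}} = r \cdot C_{\text{digitization}}, \qquad r \approx 0.118 \text{ or } 0.086
% Example (hypothetical): C_{digitization} = 100 million KRW, r = 0.118
%   => C_{preservation} of about 11.8 million KRW per year
```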

Problems in Quantification of Adequacy of Academic Library Collections -Critical Analysis of Standards for Academic Libraries in the U.S.- (종합대학 도서관장서의 적정량기준 설정에 관한 고찰 -미국의 종합대학도서관기준을 중심으로-)

  • Chung Young Sun
    • Journal of the Korean Society for Library and Information Science
    • /
    • v.8
    • /
    • pp.183-207
    • /
    • 1981
  • Library standards have been the source of considerable controversy, and many problems are involved in developing standards for university library collections. For evaluation purposes, standards should be precise, quantifiable, and measurable. In the United States, however, standards for academic libraries are largely limited to qualitative statements and principles; quantitative standards, when given, are usually related to the size of the population served by the institution, or the prescribed quantitative objectives are arbitrarily formulated by value judgements. This paper attempts to explain the problems involved in developing quantitative standards for academic library collections. Two problems in formulating the optimal collection size are identified: one is that the concept of collection adequacy is theoretically too weak to cover the diverse situations of university libraries, and the other is the difficulty of quantification and measurement that follows from this lack of a sound concept of adequacy. Nevertheless, quantifying the adequate size of a collection proves useful on the practical level, even if it is not theoretically valid. ACRL, Clapp/Jordan, and Voigt developed formulas or models for setting the optimal collection size of any particular university library, and the main purpose of this study is to analyze these formulas. The ACRL standard was drawn from observation and analysis of statistics on leading library collections; the judgement appears to have been based on the assumption that a high-grade institution is apt to have a good library collection. This study criticizes the ACRL standard for failing to include some determinants of measurement and points out its limitations. In contrast, Clapp/Jordan developed a formula more scientifically, based upon bibliographical sources; it is similarly empirical but has the advantage of bringing into play the elements that make universities diverse in nature. The ACRL and Clapp/Jordan formulas share two major defects: (1) the specific subject needs of the collection are not indicated directly, and (2) the percentage rate of growth is taken as the indicator of the potential utility of a collection. Thus both formulas fail to provide a basis for meaningful evaluation. Voigt further developed a model for determining acquisition rates for currently published materials based on bibliographic techniques; the Voigt model encourages experimentation with different programs and different allocations of input resources, designed to meet the needs of the library's particular population. Standards for university library collections can be formulated in terms of input (the traditional indicators) or, additionally, in terms of output (cost-effectiveness), where cost-effectiveness is expressed as user satisfaction, that is, the ability to provide wanted materials within a reasonable time. A simple quantitative method thus neither covers the diverse situations of university library collections nor measures their effectiveness, and a valid standard cannot be established without further research. (The general form of a Clapp/Jordan-type formula is sketched after this entry.)

  • PDF
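
For readers unfamiliar with the formulas discussed in the entry above, models of the Clapp/Jordan type are linear combinations of institutional characteristics such as faculty, enrolment, and degree fields. The sketch below shows only the general form; the weights are left symbolic because the original published coefficients are not reproduced in this listing.

```latex
% General form of a Clapp/Jordan-type collection-size formula
%   V : recommended minimum number of volumes
%   F : faculty,  S : students
%   U : undergraduate major fields,  M : master's fields,  D : doctoral fields
%   a_0, ..., a_5 : empirically derived weights (symbolic here)
V = a_0 + a_1 F + a_2 S + a_3 U + a_4 M + a_5 D
```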

A Study on the Recognition and Needs Analysis of Community Residents to Reuse Closed Schools as Library (폐교 시설의 효과적인 도서관 활용을 위한 폐교 발생 지역 주민의 인식 및 요구 분석에 관한 연구)

  • Noh, Younghee;Ro, Ji-Yoon
    • Journal of the Korean Society for Library and Information Science
    • /
    • v.53 no.2
    • /
    • pp.91-116
    • /
    • 2019
  • This study conducted a needs analysis concerning the operation and management of closed schools, the direction of their future use, and their utilization, among local residents who are expected to play a key role in the successful use of closed schools as libraries. According to the study, residents in areas where school closures actually occur may actively consider using closed schools when the region lacks libraries or when they hope to see the schools remodeled as libraries, given that they are positive about the future use of closed schools and that remodeling is more economical than building new libraries. Current awareness of the use and operation of closed schools remains at an average level, so it is deemed necessary to improve the direction of the promotion, operation, and reuse of closed schools, in the order of promotional, operational, and physical aspects, by grouping the obstacles to their operation and management. In addition, the preferred types of use for closed school facilities reflect both the current status of cultural facilities and the demands of local residents: closed schools can offer relatively accessible locations, and libraries housed in them can serve as cultural spaces that differ from existing public libraries, for example as exhibition halls and eco-friendly libraries.

A Study on improvement for disaster resilience of the smart city - Mainly on the data analysis in Great East Japan Earthquake (스마트시티의 재해회복력 향상을 위한 고찰 - 동일본 대지진 데이터 분석을 중심으로)

  • Chang, Hye-Jung;Kim, Do-Nyun
    • The Journal of Korea Institute of Information, Electronics, and Communication Technology
    • /
    • v.9 no.4
    • /
    • pp.373-387
    • /
    • 2016
  • Citizens wish to live securely and stably at all times in pursuit of a better life, and this basic human desire is an important foundation for the durability and development of the smart city. In this study, the needs and priorities of citizens for community disaster recovery, from the moment a smart city suffers disaster damage until it returns to a normal environment, are defined through data analysis. Based on this analysis, a method is suggested for providing the immediate support needed by inhabitants of the damaged area and for underpinning the recovery of the community. The 2011 Great East Japan Earthquake is examined as an example, and a smart city plan that could improve the resilience of local citizens and communities through data utilization is studied.

An Analysis of Characteristics of School Library Program Employing Taxonomy and CIPP Model (택소노미와 CIPP 모형을 활용한 학교도서관 우수프로그램의 특징 분석)

  • Kang, Ji Hei
    • Journal of Korean Library and Information Science Society
    • /
    • v.49 no.3
    • /
    • pp.263-282
    • /
    • 2018
  • This study analyzed nineteen best-practice cases (161 sub-programs) of school library media programs. The authors conducted a content analysis based on the Library Media Specialist's Taxonomy and the CIPP Model. In the taxonomy analysis, which reflects the degree of involvement of teacher librarians, Level Seven, indicating a concerted effort to promote the school library program, was the most common (37.6%), and the higher levels (Levels 6 to 10, high involvement of librarians) together accounted for 65.9%. According to the CIPP analysis: (Context) most of the excellent programs were aimed at reading education (41.5%) and library-use education (11.3%), and more than half of the cases (53.8%) were initiated by learners' needs; (Input) the programs employed various teaching materials and were operated at fixed times; (Process) the programs were run in a variety of ways, mainly reading instruction (30.4%) and cultural activities (17.3%); (Product) as outcomes, student and parent satisfaction, student achievement, self-confidence, and self-improvement were reported.

Domain Analysis and Component Extraction for Defence Software (국방 소프트웨어의 도메인 분석과 컴포넌트 추출)

  • Song, Ho-Jin;Choi, Eun-Man;Jeon, Byung-Kook;Kim, Young-Chul
    • The KIPS Transactions:PartD
    • /
    • v.11D no.1
    • /
    • pp.123-132
    • /
    • 2004
  • Defense software lacks interoperability because of a vertical development method that depends heavily on the application area and the development environment. To overcome this lack of reusability and interoperability in the application domain, software development needs to adopt the component concept and follow the trend of the domestic software component industry. This paper covers research topics such as domain analysis and component architecture to improve and extend the reusability and interoperability of defense information systems through two approaches: CBW (Command Based Workflow) analysis and UML component identification.
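
The abstract above names the component concept and command-based workflow only at a high level. As a loose illustration of the general idea, not the paper's actual architecture, the minimal Python sketch below shows an explicitly declared interface for a command-handling component, the kind of boundary that lets a component be reused across systems. All names here are hypothetical.

```python
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class Command:
    """A unit of work flowing through a command-based workflow."""
    name: str
    payload: dict

class CommandComponent(ABC):
    """Explicit component interface: any system that speaks this interface
    can reuse the component, regardless of its internal implementation."""

    @abstractmethod
    def handles(self, command: Command) -> bool:
        """Return True if this component can process the command."""

    @abstractmethod
    def execute(self, command: Command) -> dict:
        """Process the command and return a result record."""

class LogisticsReportComponent(CommandComponent):
    """Hypothetical component extracted from a logistics domain."""

    def handles(self, command: Command) -> bool:
        return command.name == "logistics_report"

    def execute(self, command: Command) -> dict:
        # Illustrative only: summarize the payload it was given.
        return {"status": "ok", "items": len(command.payload.get("items", []))}

def dispatch(command: Command, components: list[CommandComponent]) -> dict:
    """Route a command to the first component that declares it can handle it."""
    for component in components:
        if component.handles(command):
            return component.execute(command)
    raise ValueError(f"no component handles {command.name!r}")

if __name__ == "__main__":
    result = dispatch(Command("logistics_report", {"items": [1, 2, 3]}),
                      [LogisticsReportComponent()])
    print(result)
```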