• Title/Summary/Keyword: Information Processing Process


Extracting Ontology from Medical Documents with Ontology Maturing Process

  • Nyamsuren, Enkhbold;Kang, Dong-Yeop;Kim, Su-Kyoung;Choi, Ho-Jin
    • Proceedings of the Korea Information Processing Society Conference
    • /
    • 2009.04a
    • /
    • pp.50-52
    • /
    • 2009
  • Ontology maintenance is a time-consuming and costly process that requires special skill and knowledge. Properly maintaining an ontology and updating the knowledge in it requires the joint effort of both an ontology engineer and a domain specialist. This is especially true for the medical domain, which is highly specialized. This paper proposes a novel approach for the maintenance and update of existing ontologies in the medical domain. The proposed approach is based on a modified Ontology Maturing Process, originally developed for the web domain, and provides a way to populate a medical ontology with new knowledge obtained from medical documents. This is achieved through natural language processing techniques and highly specialized medical knowledge bases such as the Unified Medical Language System.
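The extraction step the abstract describes can be pictured as matching candidate terms from a document against a concept dictionary. The following is a minimal sketch under that assumption: the tiny dictionary stands in for the UMLS Metathesaurus (the CUIs shown are the standard ones for these concepts), and real systems would use full NLP chunking rather than longest-match lookup.

```python
import re

# Tiny stand-in for a UMLS-style concept dictionary (term -> CUI).
CONCEPT_DICT = {
    "myocardial infarction": "C0027051",
    "hypertension": "C0020538",
    "aspirin": "C0004057",
}

def extract_candidate_concepts(text):
    """Return (term, concept_id) pairs found in the text.

    A real system would use NLP (POS tagging, noun-phrase chunking) and
    the full UMLS Metathesaurus; here we do simple longest-match lookup.
    """
    text = text.lower()
    found = []
    for term in sorted(CONCEPT_DICT, key=len, reverse=True):
        if re.search(r"\b" + re.escape(term) + r"\b", text):
            found.append((term, CONCEPT_DICT[term]))
    return found

doc = "Patient with hypertension was given aspirin after a myocardial infarction."
print(extract_candidate_concepts(doc))
```

Concepts recognized this way would then be candidates for inserting into the ontology during the maturing process.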

NIST Lightweight Cryptography Standardization Process: Classification of Second Round Candidates, Open Challenges, and Recommendations

  • Gookyi, Dennis Agyemanh Nana;Kanda, Guard;Ryoo, Kwangki
    • Journal of Information Processing Systems
    • /
    • v.17 no.2
    • /
    • pp.253-270
    • /
    • 2021
  • In January 2013, the National Institute of Standards and Technology (NIST) announced the CAESAR (Competition for Authenticated Encryption: Security, Applicability, and Robustness) contest to identify authenticated ciphers that are suitable for a wide range of applications. A total of 57 submissions made it into the first round of the competition out of which 6 were announced as winners in March 2019. In the process of the competition, NIST realized that most of the authenticated ciphers submitted were not suitable for resource-constrained devices used as end nodes in the Internet-of-Things (IoT) platform. For that matter, the NIST Lightweight Cryptography Standardization Process was set up to identify authenticated encryption and hashing algorithms for IoT devices. The call for submissions was initiated in 2018 and in April 2019, 56 submissions made it into the first round of the competition. In August 2019, 32 out of the 56 submissions were selected for the second round which is due to end in the year 2021. This work surveys the 32 authenticated encryption schemes that made it into the second round of the NIST lightweight cryptography standardization process. The paper presents an easy-to-understand comparative overview of the recommended parameters, primitives, mode of operation, features, security parameter, and hardware/software performance of the 32 candidate algorithms. The paper goes further by discussing the challenges of the Lightweight Cryptography Standardization Process and provides some suitable recommendations.
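The kind of comparative overview the paper tabulates can be sketched as a small parameter table checked against NIST's submission minimums (at least a 128-bit key, 96-bit nonce, and 64-bit tag, per the call for submissions). The three candidates and their recommended parameters below are taken from the public specifications; the paper's own tables are the authoritative source.

```python
# Recommended parameters (bits) for a few second-round candidates,
# per their public specifications.
CANDIDATES = {
    "ASCON-128":     {"key": 128, "nonce": 128, "tag": 128},
    "GIFT-COFB":     {"key": 128, "nonce": 128, "tag": 128},
    "Grain-128AEAD": {"key": 128, "nonce": 96,  "tag": 64},
}

def meets_nist_minimums(params):
    """NIST's call required key >= 128, nonce >= 96, tag >= 64 bits."""
    return params["key"] >= 128 and params["nonce"] >= 96 and params["tag"] >= 64

for name, p in sorted(CANDIDATES.items()):
    print(name, meets_nist_minimums(p))
```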

Fuzzy Petri-net Approach to Fault Diagnosis in Power Systems Using the Time Sequence Information of Protection System

  • Roh, Myong-Gyun;Hong, Sang-Eun
    • Proceedings of the Institute of Control, Robotics and Systems (ICROS) Conference
    • /
    • 2003.10a
    • /
    • pp.1727-1731
    • /
    • 2003
  • In this paper we propose a backward fuzzy Petri net to diagnose faults in power systems using the time-sequence information of the protection system. As the complexity of power systems increases, especially in cases of multiple faults or incorrect operation of protective devices, fault diagnosis requires new and systematic reasoning methods that improve both accuracy and efficiency. The fuzzy Petri-net models of the protection system comprise the operating process of the protective devices and the fault diagnosis process. The fault diagnosis model, which exploits the nature of fuzzy Petri nets, is developed to overcome the drawbacks of methods that depend on operator knowledge. The proposed method can reduce processing time and increase accuracy compared with traditional methods, and it also supports online processing of real-time data from SCADA (Supervisory Control and Data Acquisition).

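The core fuzzy Petri net mechanic can be illustrated with a one-rule toy: each transition combines the truth degrees of its input places with min() and scales by a certainty factor. This is only the generic forward-firing sketch, not the paper's backward algorithm, and the relay/breaker names and certainty values are hypothetical.

```python
# Minimal fuzzy Petri net rule firing: output degree = min(inputs) * CF,
# with fuzzy OR (max) when several rules feed the same place.
def fire(rules, degrees):
    """One pass of rule firing; returns updated place truth degrees."""
    out = dict(degrees)
    for inputs, output, cf in rules:
        d = min(degrees.get(p, 0.0) for p in inputs) * cf
        out[output] = max(out.get(output, 0.0), d)
    return out

# Hypothetical protection events: beliefs in relay and breaker operation.
degrees = {"relay_A_tripped": 0.9, "breaker_1_opened": 0.8}
rules = [
    (("relay_A_tripped", "breaker_1_opened"), "fault_on_line_L1", 0.95),
]
result = fire(rules, degrees)
print(round(result["fault_on_line_L1"], 3))  # min(0.9, 0.8) * 0.95 = 0.76
```

Backward reasoning, as in the paper, would start from the suspected fault place and trace such rules in reverse, ordered by the protection time sequence.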

Applying document routing mode of information access in nursing diagnosis process (문서 라우팅 기법을 이용한 간호진단 과정에서의 정보접근)

  • Paik Woo-Jin
    • Proceedings of the Korean Society for Information Management Conference
    • /
    • 2006.08a
    • /
    • pp.163-168
    • /
    • 2006
  • The nursing diagnosis process is described as nurses assessing patients' conditions by applying reasoning and looking for patterns that fit the defining characteristics of one or more diagnoses. This process resembles using a typical document retrieval system if we consider the patients' conditions as queries, nursing diagnoses as documents, and the defining characteristics as index terms of the documents. However, in a typical hospital setting there is a small, fixed number of nursing diagnoses and an unbounded variety of patients' conditions. This situation is better suited to the document routing mode of information access, in which incoming documents are matched against a set of archived profiles rather than ad hoc queries against a document collection. In this paper, we describe a ROUting-based Nursing Diagnosis (ROUND) system and its natural language processing-based query processing component, which converts the defining characteristics of nursing diagnoses into query representations.

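The routing mode described above can be sketched as scoring each stored diagnosis profile against the terms of an incoming patient report. The two diagnosis names are real NANDA labels, but the defining-characteristic term sets and the Jaccard scoring are illustrative assumptions, not the ROUND system's actual representation.

```python
# Document-routing sketch: nursing-diagnosis profiles are the stored
# "queries"; each incoming patient report is routed to the profiles
# whose defining characteristics it matches best.
PROFILES = {
    "Ineffective Airway Clearance": {"dyspnea", "cough", "cyanosis"},
    "Acute Pain": {"grimacing", "guarding", "reports pain"},
}

def route(report_terms, threshold=0.3):
    """Score each profile by Jaccard overlap with the report's terms."""
    scored = []
    for diagnosis, chars in PROFILES.items():
        score = len(report_terms & chars) / len(report_terms | chars)
        if score >= threshold:
            scored.append((score, diagnosis))
    return sorted(scored, reverse=True)

print(route({"dyspnea", "cough", "fever"}))  # [(0.5, 'Ineffective Airway Clearance')]
```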

Study of the information processing model in a way of product design method (제품디자인 방법에서의 정보 처리 모델 연구)

  • 조성근
    • Archives of design research
    • /
    • v.16 no.1
    • /
    • pp.289-296
    • /
    • 2003
  • This thesis studies an information model for information collection and systematization in product design methods. In the past, designs were made by the designer's hands, working directly with the material; in today's product design, where the material is converted into information, design can be considered to be carried out essentially through the collection and systematization of information. If product design is regarded as information processing, this means a qualitative change in the product design information, not the quantitative change of information theory. The focus of the study is to seek a way of treating the subject that product design addresses as information rather than as material. Whereas past discussions of product design methods, aimed at solving problems rationally, relied on quantitative, qualitative, and organic methods and on modeling based on the formal process model of analysis, generalization, and evaluation, the way of product design as information is that the product, as a substance, should pass through design information processing, producing alternative plans with an information model that cycles toward a natural order, because the success or failure of future product design depends on information as its material.


Intelligent Missing Persons Index System Implementation Based on OpenCV Image Processing and TensorFlow Deep Learning Image Processing

  • Baek, Yeong-Tae;Lee, Se-Hoon;Kim, Ji-Seong
    • Journal of the Korea Society of Computer and Information
    • /
    • v.22 no.1
    • /
    • pp.15-21
    • /
    • 2017
  • In this paper, we present a solution to the problems caused by using only text-based information as an index element when a commercialized missing-person indexing system indexes missing persons registered in its database. The existing system could not be used for missing-person lookup because it could not formalize the image registered together with the missing person at registration time. To solve these problems, we propose a method that extracts image similarity using OpenCV image processing and TensorFlow deep-learning image processing, turning images of missing persons into meaningful information. To verify the indexing method used in this paper, we built a web server that presents users first with the information most likely to be needed, using an image of the same subject taken in an unconstrained environment as the search element.
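The image-similarity step can be sketched with a plain cosine similarity between feature vectors. The toy "features" below are just normalized histogram values; the paper's pipeline would obtain real features from OpenCV processing and TensorFlow deep-learning models, so everything here is a stand-in.

```python
import math

# Compare two images by the cosine similarity of their feature vectors.
def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

registered = [0.2, 0.5, 0.3, 0.0]    # toy histogram of the registered photo
query      = [0.25, 0.45, 0.3, 0.0]  # toy histogram of the search photo

score = cosine_similarity(registered, query)
print(score > 0.9)  # near-duplicate images score close to 1.0
```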

Zero-knowledge proof algorithm for Data Privacy

  • Min, Youn-A
    • International Journal of Internet, Broadcasting and Communication
    • /
    • v.13 no.2
    • /
    • pp.67-75
    • /
    • 2021
  • With the passage of the three revised data bills, the Personal Information Protection Act was amended to broaden the use of personal information. For industrial development through the efficient and secure use of personal information, the existing anonymization methods need revision. This paper modifies the zero-knowledge proof algorithm among the anonymization methods so that the anonymization calculations take into account the reliability of the service company using the data. In more detail, the ZKP (zero-knowledge proof) formulation used by zk-SNARKs is modified for the pseudonymization of personal information. The core of the proposed algorithm is the addition of user variables and the adjustment of the difficulty level according to the reliability of the data-using organization and the scope of use. Through Setup_p, the additional variable γ can be selectively applied according to the reliability of the user institution, and the degree of agreement of the Witness is adjusted according to the institutional reliability entered through Prove_p. The difficulty of the verification process is adjusted in Verify_p by considering the reliability of the institution. The simulator SimProve likewise refers to the scope of use and the reliability of the input authority. With this proposal, it is possible to increase the reliability and security of the anonymization and distribution of personal information.
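For readers unfamiliar with the underlying primitive, the textbook Schnorr identification protocol illustrates the zero-knowledge idea: proving knowledge of a secret x with y = g^x (mod p) without revealing x. This is only the classic illustration with toy parameters; it is not the paper's reliability-adjusted, zk-SNARK-based scheme.

```python
import secrets

# Schnorr identification with toy parameters; real deployments use
# ~256-bit prime-order groups. Here g = 2 has prime order q = 11 in Z_23*.
p, q, g = 23, 11, 2

x = secrets.randbelow(q - 1) + 1   # prover's secret
y = pow(g, x, p)                   # public key

# Commit - challenge - response
r = secrets.randbelow(q)           # prover's one-time randomness
t = pow(g, r, p)                   # commitment
c = secrets.randbelow(q)           # verifier's challenge
s = (r + c * x) % q                # response reveals nothing about x alone

accepted = pow(g, s, p) == (t * pow(y, c, p)) % p
print(accepted)  # True: g^s = g^(r + c*x) = t * y^c (mod p)
```

The paper's contribution sits on top of such proofs: tuning the setup, proving, and verification difficulty (Setup_p, Prove_p, Verify_p) by the data user's reliability.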

The Mapping Method for Parallel Processing of SAR Data

  • In-Pyo Hong;Jae-Woo Joo;Han-Kyu Park
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.26 no.11A
    • /
    • pp.1963-1970
    • /
    • 2001
  • Analyzing the processing method and setting out the top-level hardware configuration from the main parameters is an essential design step before implementing a SAR processor. This paper assesses the impact of I/O and algorithm structure on parallel processing and suggests a practical mapping method for parallel processing of SAR data. A simulation of the E-SAR processor is also performed to examine the usefulness of the method, and the results are analyzed and discussed.

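A basic form of such a mapping can be sketched as a static, contiguous partition of the raw data's azimuth lines across processors. The parameters are illustrative, not the E-SAR processor's actual configuration, and a real mapping would also weigh the corner-turn I/O between range and azimuth stages.

```python
# Static block mapping: split n_lines azimuth lines evenly across
# n_procs processors, keeping each block contiguous.
def map_blocks(n_lines, n_procs):
    """Assign contiguous [start, end) azimuth-line ranges to processors."""
    base, extra = divmod(n_lines, n_procs)
    blocks, start = [], 0
    for rank in range(n_procs):
        size = base + (1 if rank < extra else 0)
        blocks.append((rank, start, start + size))
        start += size
    return blocks

for rank, lo, hi in map_blocks(n_lines=10000, n_procs=4):
    print(f"proc {rank}: lines [{lo}, {hi})")
```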

A data-flow oriented framework for video-based 3D reconstruction (삼차원 재구성을 위한 Data-Flow 기반의 프레임워크)

  • Kim, Albert
    • Proceedings of the Korea Information Processing Society Conference
    • /
    • 2009.04a
    • /
    • pp.71-74
    • /
    • 2009
  • The data-flow paradigm has been employed in various application areas. It is particularly useful where large data streams must be processed, for example in video and audio processing or in scientific visualization. A video-based 3D reconstruction system must process multiple synchronized video streams, and it exhibits many properties that suit a data-flow approach, since the work naturally divides into a sequence of processing tasks. In this paper we introduce our concept for applying the data-flow approach to a multi-video 3D reconstruction system.
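The "sequence of processing tasks" view can be sketched with generator stages chained into a stream pipeline (source -> scale -> threshold here). The stages are hypothetical toys; in the paper's setting each stage would be a video-processing task run per synchronized camera stream.

```python
# Data-flow sketch: each stage consumes a stream and yields a stream,
# so frames flow through the graph one at a time.
def source(frames):
    for f in frames:
        yield f

def scale(stream, factor):
    for f in stream:
        yield [v * factor for v in f]

def threshold(stream, limit):
    for f in stream:
        yield [v for v in f if v >= limit]

frames = [[1, 5, 9], [2, 4, 8]]
pipeline = threshold(scale(source(frames), 2), limit=8)
print(list(pipeline))  # [[10, 18], [8, 16]]
```

Because each stage is lazy, memory stays bounded even for long streams, which is the property that makes data-flow attractive for multi-view video.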

Statistical Process Control System for Automated Production Lines (자동생산라인에서의 통계적공정관리시스템)

  • Park, Jeong-Kee;Jung, Won
    • Journal of Korea Society of Industrial Information Systems
    • /
    • v.1 no.1
    • /
    • pp.111-125
    • /
    • 1996
  • This paper presents a statistical process control (SPC) system for an electronic parts manufacturing process. In this system, an SPC method is integrated with automated inspection technology in real time, showing how the collected data can be analyzed with SPC to provide process information. Also presented are studies of subpixel image processing technology to improve the accuracy of part measurements, and a cumulative-sum (CUSUM) control chart for fraction defectives. An application of the developed system to a connector manufacturing process, as part of computer-integrated manufacturing (CIM), is presented.

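The CUSUM chart mentioned above can be sketched as a one-sided cumulative sum over defective counts: accumulate deviations above a reference value k and signal when the sum exceeds a decision interval h. The counts, k, and h below are illustrative, not the paper's connector-line values.

```python
# One-sided CUSUM for defective counts: S_i = max(0, S_{i-1} + d_i - k),
# signal the first time S_i exceeds h.
def cusum_defectives(counts, k=2.0, h=5.0):
    """Return (cusum_path, first_signal_index or None)."""
    s, path, signal = 0.0, [], None
    for i, d in enumerate(counts):
        s = max(0.0, s + d - k)
        path.append(s)
        if signal is None and s > h:
            signal = i
    return path, signal

# Defective counts per inspected sample; the process drifts upward.
counts = [1, 2, 1, 3, 4, 5, 6]
path, signal = cusum_defectives(counts)
print(path, signal)  # [0.0, 0.0, 0.0, 1.0, 3.0, 6.0, 10.0] 5
```

Unlike a Shewhart p-chart, the CUSUM accumulates small shifts, which is why it flags the drift at sample 5 even though no single count is extreme.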