• Title/Summary/Keyword: standardization algorithm

122 search results

Ontology Matching Method for Solving Ontology Heterogeneity Issue (온톨로지 이질성 문제를 해결하기 위한 온톨로지 매칭 방법)

  • Hongzhou Duan;Yongju Lee
    • The Journal of the Korea Institute of Electronic Communication Sciences
    • /
    • v.19 no.3
    • /
    • pp.571-576
    • /
    • 2024
  • Ontologies are created by domain experts, but the same content may be expressed differently by each expert owing to differing understandings of domain knowledge. Since ontology standardization is still lacking, multiple ontologies can exist within the same domain, resulting in a phenomenon called ontology heterogeneity. We therefore propose a novel ontology matching method that combines SCBOW (Siamese Continuous Bag Of Words) and BERT (Bidirectional Encoder Representations from Transformers) models to solve the ontology heterogeneity issue. Ontologies are expressed as graphs, and the SimRank algorithm is used to solve the one-to-many problem that can occur in ontology matching (a minimal SimRank sketch follows below). Experimental results showed that our approach improves performance by about 8% over traditional matching algorithms. The proposed method can enhance and refine the alignment technology used in ontology matching.
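
The abstract expresses ontologies as graphs and scores candidate matches with SimRank. A minimal, self-contained SimRank sketch over a toy class hierarchy (the class names are invented, and the SCBOW/BERT embedding stage of the paper is not reproduced here):

```python
# Naive SimRank over a toy ontology graph. in_neighbors maps each class to its
# superclasses (edges point from superclass to subclass, so a node's in-neighbors
# are its parents). Class names and parameters are illustrative only.
from itertools import product

def simrank(in_neighbors, C=0.8, iters=10):
    """in_neighbors: dict node -> list of in-neighbor nodes."""
    nodes = list(in_neighbors)
    sim = {(a, b): 1.0 if a == b else 0.0 for a, b in product(nodes, nodes)}
    for _ in range(iters):
        new = {}
        for a, b in product(nodes, nodes):
            if a == b:
                new[(a, b)] = 1.0
                continue
            Ia, Ib = in_neighbors[a], in_neighbors[b]
            if not Ia or not Ib:
                new[(a, b)] = 0.0
                continue
            total = sum(sim[(i, j)] for i in Ia for j in Ib)
            new[(a, b)] = C * total / (len(Ia) * len(Ib))
        sim = new
    return sim

parents = {
    "Thing": [],
    "Person": ["Thing"], "Organization": ["Thing"],
    "Student": ["Person"], "Employee": ["Person"],
    "University": ["Organization"],
}
sim = simrank(parents)
print(round(sim[("Student", "Employee")], 3))    # 0.8  (shared parent: Person)
print(round(sim[("Student", "University")], 3))  # 0.64 (parents Person vs Organization)
```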

Detection Model Generation System using Learning (학습을 통한 탐지 모델 생성 시스템)

  • 김선영;오창석
    • The Journal of the Korea Contents Association
    • /
    • v.3 no.1
    • /
    • pp.31-38
    • /
    • 2003
  • In this paper, we propose a detection model generation system that uses learning to generate detection models automatically, improving manpower and time efficiency. The proposed system consists of an agent system and a manager system. Detection models are generated by a genetic algorithm and then applied to the existing system as new detection models (a hypothetical sketch of such a loop follows below). According to the experimental results, the proposed learning-based detection model generation is more efficient than an existing intrusion detection system: when a new type of intrusion occurs, the implemented system lowers the false-positive rate and improves the performance of the existing intrusion detection system.
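
A hypothetical sketch of the kind of genetic-algorithm loop the abstract alludes to: candidate detection models are encoded as bit strings and evolved against a fitness score. The encoding, fitness function, and parameters below are invented for illustration, not taken from the paper.

```python
# Toy genetic algorithm: evolve bit strings toward a stand-in "good" detection
# signature. Everything here (TARGET, operators, parameters) is illustrative.
import random

TARGET = [1, 0, 1, 1, 0, 0, 1, 0]            # stand-in for a desirable detection model
POP, GENS, MUT = 20, 50, 0.05

def fitness(ind):                             # number of positions matching the target
    return sum(a == b for a, b in zip(ind, TARGET))

def crossover(p1, p2):                        # single-point crossover
    cut = random.randrange(1, len(p1))
    return p1[:cut] + p2[cut:]

def mutate(ind):                              # flip each gene with probability MUT
    return [1 - g if random.random() < MUT else g for g in ind]

population = [[random.randint(0, 1) for _ in TARGET] for _ in range(POP)]
for _ in range(GENS):
    population.sort(key=fitness, reverse=True)
    parents = population[: POP // 2]          # truncation selection
    children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                for _ in range(POP - len(parents))]
    population = parents + children

best = max(population, key=fitness)
print(best, fitness(best))
```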

Key Management Process in JPWallet of MOSES System (MOSES에서의 JPWallet의 기능과 키 관리 분석)

  • Oh Tae Suk;Kim Yong Cheol;Choi Bum Suk;Choi Jin Soo
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.31 no.1C
    • /
    • pp.74-79
    • /
    • 2006
  • When DRM systems are built on a specific computing platform and coding algorithm, interoperability among them is unlikely. For enhanced compatibility, MOSES has been developed with a structure that can be decomposed into independent modules, allowing interoperability with other DRM systems through IPMP functionality. In MOSES, security in content transactions is provided by JPWallet, which controls licenses through key management. In this paper, we present the structure of JPWallet and how keys are handled between content servers and content-consuming clients. The PDA-based code from the prototype MOSES system has been ported to PC-based code and tested for compatibility. Analysis of JPWallet, the core of MOSES, will contribute to the standardization of domestic IPMP systems compatible with global standards.

Error propagation in 2-D self-calibration algorithm (2차원 자가 보정 알고리즘에서의 불확도 전파)

  • 유승봉;김승우
    • Proceedings of the Korean Society of Precision Engineering Conference
    • /
    • 2003.06a
    • /
    • pp.434-437
    • /
    • 2003
  • Evaluation of the patterning accuracy of e-beam lithography machines requires a high-precision inspection system capable of measuring the true xy-locations of fiducial marks generated by the e-beam machine under test. Fiducial marks are fabricated on a single photomask over the entire working area in the form of equally spaced two-dimensional grids. In performing the evaluation, the principles of self-calibration make it possible to determine the deviations of fiducial marks from their nominal xy-locations precisely, unaffected by the motion errors of the inspection system itself. However, only repeatable motion errors can be eliminated; random motion errors encountered in probing the locations of fiducial marks are not removed. Worse, a random error occurring in the measurement of a single mark propagates and affects the determined locations of other marks, a phenomenon that limits the ultimate calibration accuracy of e-beam machines. In this paper, we describe an uncertainty analysis that investigates how random errors affect the final result of self-calibration of e-beam machines when using an optical inspection system equipped with high-resolution microscope objectives and precision xy-stages. The guide to uncertainty analysis recommended by the International Organization for Standardization is faithfully followed, along with the necessary sensitivity analysis (its core formula is recalled below). The uncertainty analysis reveals that, among the dominant components of the patterning accuracy of e-beam lithography, the rotationally symmetrical component is most significantly affected by random errors, whose propagation becomes more severe in a cascading manner as the number of fiducial marks increases.
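
For reference, the ISO guide the abstract follows (the Guide to the Expression of Uncertainty in Measurement, GUM) combines input uncertainties through sensitivity coefficients; for uncorrelated input quantities the combined standard uncertainty is:

```latex
% Combined standard uncertainty for a measurand y = f(x_1, ..., x_N) with
% uncorrelated inputs; the partial derivatives are the sensitivity
% coefficients c_i referred to in the abstract's sensitivity analysis.
u_c^{2}(y) = \sum_{i=1}^{N} \left( \frac{\partial f}{\partial x_i} \right)^{2} u^{2}(x_i)
           = \sum_{i=1}^{N} c_i^{2}\, u^{2}(x_i)
```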

Preprocessing System for Real-time and High Compression MPEG-4 Video Coding (실시간 고압축 MPEG-4 비디오 코딩을 위한 전처리 시스템)

  • 김준기;홍성수;이호석
    • Journal of KIISE: Computing Practices and Letters
    • /
    • v.9 no.5
    • /
    • pp.509-520
    • /
    • 2003
  • In this paper, we developed a new and robust algorithm for practical and very efficient MPEG-4 video coding. The MPEG-4 video group has developed the video Verification Model (VM), which has evolved over time through core experiments, and in the standardization process MS-FDAM was developed as a reference MPEG-4 coding system based on the ISO/IEC 14496-2 standard document and the VM. However, MS-FDAM has drawbacks for practical MPEG-4 coding and does not provide VOP extraction functionality. In this research, we implemented a preprocessing system for real-time input and VOP extraction for practical content-based MPEG-4 video coding, and also implemented motion detection to achieve a high compression rate of 180:1 (a schematic motion-detection sketch follows below).
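
A schematic frame-differencing motion detector, shown only to illustrate what a preprocessing stage can do before object (VOP) extraction; it is a stand-in, not the paper's actual algorithm, and the toy frames below are invented.

```python
# Flag moving regions by thresholding the per-pixel difference of two gray frames.
import numpy as np

def motion_mask(prev_frame: np.ndarray, curr_frame: np.ndarray, thresh: int = 25) -> np.ndarray:
    """Return a boolean mask of pixels whose gray-level change exceeds thresh."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return diff > thresh

# Toy 8x8 gray frames: a bright object "moves" two columns to the right.
prev = np.zeros((8, 8), dtype=np.uint8)
curr = np.zeros((8, 8), dtype=np.uint8)
prev[2:5, 1:3] = 200
curr[2:5, 3:5] = 200
mask = motion_mask(prev, curr)
print(mask.sum(), "changed pixels")   # nonzero -> motion detected in this region
```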

Assessment and Diagnosis of Attention Deficit Hyperactivity Disorder (ADHD) - Focusing on Behavior Rating Scales - (주의력결핍과잉행동장애의 진단 및 평가 - 행동평정척도들을 중심으로 -)

  • Chang, Gyu-Tae;Han, Yun-Jeong
    • The Journal of Pediatrics of Korean Medicine
    • /
    • v.20 no.2
    • /
    • pp.147-175
    • /
    • 2006
  • Objective: This study investigates methods for the assessment and diagnosis of ADHD, focusing on behavior rating scales. Methods: We reviewed recent publications and papers on ADHD. Results: For the assessment and diagnosis of ADHD, various methods are used, such as interviews with parents, child, and teacher, behavior observation, behavior rating scales, and neuropsychological tests. The structured interview consists of restricted questions and responses and follows a diagnostic algorithm, so it can be used by untrained clinicians. Among the structured interviews, standardization of the Korean version of the K-SADS has been completed. Behavior rating scales, in the form of parent, teacher, and self-report questionnaires, are used for the diagnosis and treatment evaluation of ADHD. They comprise both ADHD-specific scales and broad-band scales designed to screen for various symptoms (including ADHD symptoms). ADHD-specific scales are useful for differential diagnosis, discrimination of subtypes, and treatment evaluation, whereas broad-band scales are useful for preliminary screening. Neuropsychological tests can evaluate attention deficit and its effect on cognitive function and academic performance, and are also used in the diagnosis and treatment evaluation of ADHD. Conclusion: Various methods are used for the assessment and diagnosis of ADHD; behavior rating scales in particular are a useful and simple tool for diagnosis and treatment evaluation.

On the Loading Plan of Container Ship (컨테이너선의 적재계량에 관한 연구)

  • 강기중;이철영
    • Journal of the Korean Institute of Navigation
    • /
    • v.14 no.4
    • /
    • pp.1-15
    • /
    • 1990
  • With increasing ship speed, turnaround and port time become a large percentage of total round-trip time, which has accelerated the introduction of various kinds of modern handling equipment, the standardization of cargoes, and the improvement of ships. Nevertheless, cargo handling remains a drag on the efficient operation of a ship. Similarly, turnaround time at a container port is an important measure of the port's efficiency. To decrease operating costs, the time needed for cargo handling at the ports of call must be minimized, so an optimized Container Loading Plan is necessary, especially given the rapid pace of container operations. For the container loading plan, in this thesis we use the Hungarian method and the branch and bound method to obtain an initial disposition that both maximizes the ship's GM and minimizes the number of shifts of obstructive containers in the yard area (an illustrative assignment example follows below). We then apply a dynamic programming algorithm to obtain the final disposition that minimizes total turnaround time, and finally we analyze the results to check whether the initial disposition is proper.
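
An illustrative use of the Hungarian method for a container-to-slot assignment. The cost matrix below is made up; in the paper the costs would be derived from GM maximization and yard shift minimization, and the branch-and-bound and dynamic-programming stages are not shown.

```python
# Solve a small assignment problem with the Hungarian (Kuhn-Munkres) algorithm.
import numpy as np
from scipy.optimize import linear_sum_assignment

# cost[i, j]: penalty for stowing container i in slot j (hypothetical numbers)
cost = np.array([
    [4, 1, 3],
    [2, 0, 5],
    [3, 2, 2],
])
rows, cols = linear_sum_assignment(cost)
for c, s in zip(rows, cols):
    print(f"container {c} -> slot {s}")
print("total cost:", cost[rows, cols].sum())   # optimal assignment cost = 5
```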

A Hardware Implementation of Whirlpool Hash Function using 64-bit datapath (64-비트 데이터패스를 이용한 Whirlpool 해시 함수의 하드웨어 구현)

  • Kwon, Young-Jin;Kim, Dong-Seong;Shin, Kyung-Wook
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • 2017.10a
    • /
    • pp.485-487
    • /
    • 2017
  • The Whirlpool hash function, adopted as standard ISO/IEC 10118-3 by the international standardization organization, is an algorithm that provides message integrity based on an SPN (Substitution Permutation Network) structure similar to the AES block cipher. In this paper, we describe a hardware implementation of the Whirlpool hash function. The round block is designed with a 64-bit data path, and encryption is performed over 10 rounds. To minimize area, the key expansion and encryption algorithms share the same hardware (the shared round structure is sketched below). The Whirlpool hash function was modeled in Verilog HDL, and simulation was performed with ModelSim to verify correct operation.
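
A schematic sketch of the round-sharing idea in Whirlpool's block cipher W: the same round function drives both the key schedule (keyed by round constants) and the data path (keyed by the round keys), which is why a single hardware round block can serve both. The round body below is a placeholder, not the real SubBytes/ShiftColumns/MixRows/AddRoundKey transform, and the actual design is Verilog hardware rather than Python.

```python
# Structural sketch only: 10 rounds, initial key whitening, and a shared round
# function for key schedule and data path. round_fn is a placeholder.
ROUNDS = 10
RC = [bytes([r]) * 64 for r in range(1, ROUNDS + 1)]    # placeholder round constants

def round_fn(state: bytes, key: bytes) -> bytes:
    """Placeholder for the 512-bit Whirlpool round; only the key addition is real."""
    return bytes(s ^ k for s, k in zip(state, key))

def w_cipher(block: bytes, key: bytes) -> bytes:
    state = bytes(b ^ k for b, k in zip(block, key))    # whitening with K^0
    for r in range(ROUNDS):
        key = round_fn(key, RC[r])                      # key schedule reuses round_fn
        state = round_fn(state, key)                    # data path uses the same block
    return state

digest_block = w_cipher(b"\x00" * 64, b"\x01" * 64)
print(digest_block.hex()[:16], "...")
```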

Distributed Bluetooth Scatternet Formation Protocol (분산형 블루투스 스캐터넷 형성 프로토콜)

  • 손진호;정태명
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.28 no.10A
    • /
    • pp.838-846
    • /
    • 2003
  • In Bluetooth networks, a scatternet is defined as the internetworking of multiple piconets. Currently, Bluetooth standardization does not address the formation of scatternets from piconets. Existing scatternet formation algorithms do not support the features of ad-hoc networks, which degrades system performance when nodes have a certain degree of mobility. Moreover, as scatternet formation becomes complicated, throughput drops and delay increases due to inefficient architectures. In this paper, we propose a distributed scatternet formation scheme for Bluetooth, in which nodes are spread out to form the scatternet. Simulation results show that the proposed algorithm outperforms conventional schemes.

Unsupervised Outpatients Clustering: A Case Study in Avissawella Base Hospital, Sri Lanka

  • Hoang, Huu-Trung;Pham, Quoc-Viet;Kim, Jung Eon;Kim, Hoon;Park, Junseok;Hwang, Won-Joo
    • Journal of Korea Multimedia Society
    • /
    • v.22 no.4
    • /
    • pp.480-490
    • /
    • 2019
  • Nowadays, Electronic Medical Records (EMR) have been implemented at only a few hospitals for the Outpatient Department (OPD). OPD data are diverse, including patient demographics and diseases, so they need to be clustered in order to explore hidden rules and the relationships among the data types in patients' information. In this paper, we propose a novel approach for unsupervised clustering of patients' demographics and diseases in the OPD. First, we collect data from a hospital OPD. Then, we preprocess and transform the data using techniques such as standardization, label encoding, and categorical encoding. After obtaining the transformed data, we run experiments and evaluations to select the best number of clusters and the best clustering algorithm (a minimal pipeline sketch follows below). In addition, we use several tests and measurements to analyze and evaluate cluster tendency, models, and algorithms. Finally, we analyze the results to discover new knowledge, meanings, and rules. The clusters found in this research provide knowledge to medical managers and doctors, who can use it to improve patient management methods, patient arrangement methods, and doctors' abilities. It also serves as a reference for medical data scientists mining OPD datasets.
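
A minimal sketch of the preprocessing-and-clustering pipeline the abstract describes: standardize numeric columns, encode categorical columns (one-hot encoding is used here in place of the paper's label/categorical encoders), and pick the number of clusters by an internal validity score. The column names and toy records are invented, not taken from the Avissawella dataset.

```python
# Standardization + categorical encoding, then k selection by silhouette score.
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.preprocessing import StandardScaler, OneHotEncoder
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score

df = pd.DataFrame({
    "age":       [25, 34, 61, 47, 52, 29, 70, 38],
    "sex":       ["F", "M", "F", "M", "F", "F", "M", "M"],
    "diagnosis": ["flu", "diabetes", "hypertension", "flu",
                  "diabetes", "flu", "hypertension", "diabetes"],
})

prep = ColumnTransformer([
    ("num", StandardScaler(), ["age"]),                          # standardization
    ("cat", OneHotEncoder(handle_unknown="ignore"), ["sex", "diagnosis"]),
])
X = prep.fit_transform(df)

best_k, best_score = None, -1.0
for k in range(2, 5):                                            # try a few cluster counts
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    score = silhouette_score(X, labels)
    if score > best_score:
        best_k, best_score = k, score
print("best k:", best_k, "silhouette:", round(best_score, 3))
```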