• Title/Summary/Keyword: 필터링 유형 (filtering types)

Search Result: 81

Automatic Construction of Alternative Word Candidates to Improve Patent Information Search Quality (특허 정보 검색 품질 향상을 위한 대체어 후보 자동 생성 방법)

  • Baik, Jong-Bum;Kim, Seong-Min;Lee, Soo-Won
    • Journal of KIISE:Software and Applications / v.36 no.10 / pp.861-873 / 2009
  • There are many reasons why information retrieval fails to return appropriate results; allomorphs are one cause of search failure due to keyword mismatch. This research proposes a method for automatically constructing alternative word candidates in order to minimize search failures caused by keyword mismatch. Assuming that two words have similar meanings if they share similar co-occurring words, the proposed method uses the concept of concentration, association word sets, cosine similarity between association word sets, and a confidence-based filtering technique. The performance of the proposed method is evaluated against a manually extracted list of alternative words. Evaluation results show that the proposed method outperforms the context-window-overlapping approach in both precision and recall.
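The core comparison — measuring word similarity by the cosine similarity of their association (co-occurrence) sets — can be sketched as follows. The corpus, window size, and threshold are illustrative assumptions, not the paper's settings, and the paper's concentration and confidence steps are omitted.

```python
from collections import Counter
from math import sqrt

def association_set(corpus, target, window=2):
    """Collect co-occurrence counts for `target` within a +/-window context."""
    assoc = Counter()
    for sentence in corpus:
        for i, word in enumerate(sentence):
            if word == target:
                lo, hi = max(0, i - window), min(len(sentence), i + window + 1)
                for j in range(lo, hi):
                    if j != i:
                        assoc[sentence[j]] += 1
    return assoc

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b.get(w, 0) for w in a)
    na = sqrt(sum(v * v for v in a.values()))
    nb = sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def alternative_candidates(corpus, target, vocab, threshold=0.5):
    """Words whose association sets are cosine-similar to the target's."""
    base = association_set(corpus, target)
    return [w for w in vocab
            if w != target and cosine(base, association_set(corpus, w)) >= threshold]
```

Two words that appear in the same contexts ("car"/"auto") end up with near-identical association sets and thus a cosine near 1, while unrelated words score near 0.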

A Study of Location Correction Algorithm for Pedestrian Location Tracking in Traffic Connective Transferring System (교통 연계 환승 시스템의 보행자 위치 추적을 위한 보정 알고리즘 연구)

  • Jung, Jong-In;Lee, Sang-Sun
    • The Journal of The Korea Institute of Intelligent Transport Systems / v.8 no.2 / pp.149-157 / 2009
  • Tracking technologies that provide real-time, customized information to pedestrians using traffic connection and transfer centers, through various forms of information collection and processing, are under examination. However, wide-range positioning errors cause problems for some services depending on device installation and service location, and many harsh environments make such technologies difficult to apply to traffic linkage and transfer services. In the testbed, Gwangmyoung Station, we obtained results under poor conditions, including large amounts of steel construction and interference from other communication devices; in practice, the conditions of future deployment sites may be worse than Gwangmyoung Station. We therefore studied a location correction algorithm suitable for improving the accuracy of a traffic connection and transfer system. The algorithm is designed around grid coordinates, map matching, coordinate modeling, and Kalman filtering, and its implementation is ongoing. In addition, to prepare for optimization across various transfer center models, we designed a simulator-style algorithm for determining the algorithm's parameters.
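As a minimal illustration of the Kalman-filtering stage, a scalar (one-axis) filter that smooths noisy pedestrian position readings might look like this. The noise variances are assumed values, and the paper's full design additionally combines grid coordinates and map matching.

```python
def kalman_smooth(measurements, process_var=1e-3, meas_var=1.0):
    """Scalar Kalman filter over a nearly-constant position (hypothetical tuning)."""
    x, p = measurements[0], 1.0        # initial state estimate and covariance
    out = [x]
    for z in measurements[1:]:
        p += process_var               # predict: state unchanged, uncertainty grows
        k = p / (p + meas_var)         # Kalman gain
        x += k * (z - x)               # correct with the new measurement
        p *= (1 - k)
        out.append(x)
    return out
```

Because each estimate is a convex combination of the previous estimate and the new reading, the output stays within the range of the measurements while damping the jitter that raw indoor positioning produces.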


Character Region Detection Using Structural Features of Hangul & English Characters in Natural Image (자연영상에서 한글 및 영문자의 구조적 특징을 이용한 문자영역 검출)

  • Oh, Myoung-Kwan;Park, Jong-Cheon
    • Journal of the Korea Academia-Industrial cooperation Society / v.15 no.3 / pp.1718-1723 / 2014
  • We propose a method to detect Hangul and English character regions in natural images using the structural features of Hangul and English characters. First, we extract edge features from the natural image; features that do not satisfy heuristic rules for character features are filtered out, and the remainder are selected as character region candidates. Next, candidate Hangul character regions are merged into single Hangul characters using a Hangul character merging algorithm. Finally, the final character regions are determined by a Hangul character class decision algorithm, while English character regions are detected from the edge features of English characters. Experimental results show that the proposed method can effectively detect character regions in images with complex backgrounds and in various environments, and the performance evaluation shows improved detection of Hangul and English character regions in natural images.
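The heuristic filtering step — discarding extracted features whose geometry cannot belong to a character — could be sketched as below, assuming candidates are represented as bounding boxes. The size and aspect-ratio thresholds are illustrative, not the paper's rules.

```python
def filter_character_candidates(boxes, min_size=8, max_aspect=3.0):
    """Keep bounding boxes (x, y, w, h) whose size and aspect ratio are
    plausible for a single Hangul or Latin character (hypothetical thresholds)."""
    keep = []
    for (x, y, w, h) in boxes:
        if w < min_size or h < min_size:
            continue                      # too small to be a legible character
        aspect = max(w, h) / min(w, h)
        if aspect > max_aspect:
            continue                      # too elongated: likely a scene edge
        keep.append((x, y, w, h))
    return keep
```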

An Automatic Web Page Classification System Using Meta-Tag (메타 태그를 이용한 자동 웹페이지 분류 시스템)

  • Kim, Sang-Il;Kim, Hwa-Sung
    • The Journal of Korean Institute of Communications and Information Sciences / v.38B no.4 / pp.291-297 / 2013
  • Recently, the number of web pages containing various kinds of information has increased drastically with the explosive growth of WWW usage. This has created a need for web page classification, which makes web pages easier to access and allows them to be searched through grouping. Web page classification means classifying the various pages scattered across the web according to document similarity or the keywords the documents contain, and it can be applied to areas such as web page search, group search, and e-mail filtering. However, manual classification cannot handle the tremendous number of web pages on the web, while automatic classification suffers from an accuracy problem: it fails to distinguish, without classification errors, different web pages written in different forms. In this paper, we propose an automatic web page classification system that uses the meta-tags obtainable from web pages in order to solve this inaccurate web page retrieval problem.
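A minimal sketch of meta-tag-based classification, using only the standard library's HTML parser: extract the `keywords`/`description` meta-tags and match them against a category lexicon. The lexicon here is a made-up example, not the paper's feature set.

```python
from html.parser import HTMLParser

class MetaTagExtractor(HTMLParser):
    """Collect <meta name="keywords"/"description"> content from a page."""
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            d = dict(attrs)
            name = (d.get("name") or "").lower()
            if name in ("keywords", "description"):
                self.meta[name] = d.get("content", "")

def classify_by_keywords(html, categories):
    """Assign the first category whose terms appear in the page's meta keywords."""
    parser = MetaTagExtractor()
    parser.feed(html)
    keywords = parser.meta.get("keywords", "").lower()
    for category, terms in categories.items():
        if any(t in keywords for t in terms):
            return category
    return "unknown"
```

Meta-tags are author-supplied summaries, so matching on them sidesteps the noise of full-page text, which is the intuition behind the proposed system.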

Crawlers and Morphological Analyzers Utilize to Identify Personal Information Leaks on the Web System (크롤러와 형태소 분석기를 활용한 웹상 개인정보 유출 판별 시스템)

  • Lee, Hyeongseon;Park, Jaehee;Na, Cheolhun;Jung, Hoekyung
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2017.10a / pp.559-560 / 2017
  • Recently, as the problem of personal information leakage has emerged, studies on data collection and web document classification have been conducted. Existing systems judge only whether personal information exists; because they do not classify documents published under the same name or by the same user, unnecessary data is not filtered out. In this paper, we propose a system that can distinguish data types and homonyms using a crawler and a morphological analyzer to solve this problem. The user collects personal information from the web through the crawler; the collected data is then classified by the morphological analyzer, after which the leaked data can be confirmed. If the system is used repeatedly, more accurate results can be obtained, and users are expected to be provided with customized data.
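The detection step after crawling can be sketched as pattern-based filtering over the collected documents. The two patterns below (a Korean-style mobile number and an e-mail address) are illustrative stand-ins for the morphological analysis the paper performs.

```python
import re

# Hypothetical patterns for two common personal-data types.
PATTERNS = {
    "phone": re.compile(r"\b01[016789]-\d{3,4}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
}

def find_personal_info(documents):
    """Scan crawled documents and report which personal-data types each one leaks."""
    report = []
    for doc_id, text in documents.items():
        found = {kind for kind, pat in PATTERNS.items() if pat.search(text)}
        if found:
            report.append((doc_id, sorted(found)))
    return report
```

Documents that match no pattern are dropped, which is the filtering of "unnecessary data" that the abstract says existing systems lack.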


Design and Implementation of File protection system based on Windows 2000 system (Windows 2000기반의 파일 보호 시스템 설계 및 구현)

  • Lee, Nam-Hun;Yu, Sin-Geun;Sim, Yeong-Cheol
    • The KIPS Transactions:PartC / v.8C no.6 / pp.741-756 / 2001
  • With the development of computer systems, threats against them, including attacks by malicious programs such as viruses and vandals, have increased sharply. Virus vaccines are widely used to thwart these threats, but they have many weaknesses: they cannot guard against unknown threats, and they sometimes cannot detect the existence of malicious programs before those programs produce destructive results. Lacking an efficient security model, existing security programs also raise many false-positive alarms on normal actions. It has therefore become important to develop an improved security program that makes up for the weaknesses of existing computer security programs and detects threats from malicious programs as early as possible. In this paper we describe the design of an improved security model and the implementation of a security program that can filter and handle threats to computer systems at the kernel level in real time.
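Conceptually, a kernel-level file filter grants or denies each file operation against an ordered rule list. The user-space sketch below illustrates only that decision logic; the rules and paths are hypothetical, and the actual system hooks file I/O inside a Windows kernel-mode component.

```python
# Hypothetical first-match-wins policy, evaluated in order.
RULES = [
    {"path_prefix": "C:/Windows/System32", "op": "write", "action": "deny"},
    {"path_prefix": "C:/Users",            "op": "write", "action": "allow"},
]

def filter_file_access(path, op, default="allow"):
    """Return the action of the first rule matching this path and operation."""
    for rule in RULES:
        if path.startswith(rule["path_prefix"]) and rule["op"] == op:
            return rule["action"]
    return default
```

Making the decision before the operation completes is what lets such a filter block a malicious program before it produces destructive results, rather than detecting the damage afterwards.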


A Study on Effective Security Control Model Based on Characteristic of Web Service (웹 서비스 특성 기반 효율적인 보안관제 모델 연구)

  • Lee, Jae-heon;Lee, Sang-Jin
    • Journal of the Korea Institute of Information Security & Cryptology / v.29 no.1 / pp.175-185 / 2019
  • Security control protects IT systems from cyber infringement by deriving valid results from the process of gathering and analyzing various information. Current security control is highly effective because SIEM equipment enables systematic and comprehensive analysis based on large amounts of data, rather than analyzing cyber threat information from fragmentary information alone. Even so, cyber attacks are still analyzed and handled through the manual work of security personnel, which means that even with excellent security equipment, the results vary depending on the person operating it. For organizations operating characteristic web services, including information-provision services, this study presents the basics of security control through the analysis of service-characteristic information, and proposes a model for intensive security control through type discovery and application that enables step-wise analysis and effective filtering. Using this model, attacks can be detected, analyzed, and blocked effectively.
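The step-wise filtering idea can be sketched as a chain of predicates that each event must pass before it is escalated to an analyst. The three stages below are invented examples of service-characteristic filters, not the paper's model.

```python
def build_pipeline(*stages):
    """Chain step-wise filters: an event must pass every stage to be escalated."""
    def run(events):
        for stage in stages:
            events = [e for e in events if stage(e)]
        return events
    return run

# Hypothetical stages for a web-service security control model.
is_external     = lambda e: not e["src_ip"].startswith("10.")
is_web_attack   = lambda e: e["category"] in {"sqli", "xss", "webshell"}
above_threshold = lambda e: e["score"] >= 7

triage = build_pipeline(is_external, is_web_attack, above_threshold)
```

Each stage discards a class of irrelevant alerts cheaply, so the manual analysis the abstract describes is concentrated on the small residue that survives every filter.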

Detection Models and Response Techniques of Fake Advertising Phishing Websites (가짜 광고성 피싱 사이트 탐지 모델 및 대응 기술)

  • Eunbeen Lee;Jeongeun Cho;Wonhyung Park
    • Convergence Security Journal / v.23 no.3 / pp.29-36 / 2023
  • With the recent surge of fake advertising phishing sites exposed in search engines, the damage caused by degraded search quality and personal information leakage is increasing. In particular, the problem is worsening rapidly as tools such as ChatGPT make it more feasible to automate the creation of advertising phishing sites. In this paper, the source code of fake advertising phishing sites is statically analyzed to derive structural commonalities, and a detection crawler that filters sites step by step based on foreign domains and redirection is developed, confirming that fake advertising posts are ultimately detected. In addition, we demonstrate the need for new guidelines by verifying that the redirection pages of fake advertising sites fall into three types and return different sites depending on the situation. Furthermore, we propose new detection guidelines for fake advertising phishing sites that existing detection methods cannot detect.
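The step-by-step filter over foreign domains and redirection might be sketched like this, given a redirect chain the crawler has already collected. The TLD list and hop limit are assumptions for illustration, not the paper's criteria.

```python
from urllib.parse import urlparse

SUSPECT_TLDS = {".cn", ".ru", ".top", ".xyz"}   # hypothetical "foreign domain" list

def tld_suspicious(url):
    """True if the URL's host ends in a TLD from the suspect list."""
    host = urlparse(url).hostname or ""
    return any(host.endswith(t) for t in SUSPECT_TLDS)

def classify_site(redirect_chain, max_hops=2):
    """Flag a site whose redirect chain is long or lands on a suspect TLD."""
    if len(redirect_chain) - 1 > max_hops:
        return "suspicious: long redirect chain"
    if tld_suspicious(redirect_chain[-1]):
        return "suspicious: foreign landing domain"
    return "clean"
```

Checking the cheap structural signals first (hop count, landing domain) before any content analysis is what makes such a crawler practical at search-engine scale.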

Heart Rate Monitoring Using Motion Artifact Modeling with MISO Filters (MISO 필터 기반의 동잡음 모델링을 이용한 심박수 모니터링)

  • Kim, Sunho;Lee, Jungsub;Kang, Hyunil;Ohn, Baeksan;Baek, Gyehyun;Jung, Minkyu;Im, Sungbin
    • Journal of the Institute of Electronics and Information Engineers / v.52 no.8 / pp.18-26 / 2015
  • Measuring the heart rate during exercise is important for properly controlling the amount of exercise. With the recent spread of smart devices, interest in devices for real-time heart rate measurement during exercise has increased dramatically. During intensive exercise, accurate heart rate estimation from wrist-type photoplethysmography (PPG) signals is very difficult due to motion artifacts (MA). In this study, we propose an efficient algorithm for accurately estimating the heart rate from wrist-type PPG signals. On twelve data sets, the proposed algorithm achieves an average absolute error of 1.38 beats per minute (BPM), and the Pearson correlation between the estimates and the ground-truth heart rate is 0.9922. The proposed algorithm combines accurate estimation with fast computation, which makes it attractive for wearable devices.
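One common way to model motion artifact from an accelerometer reference is an adaptive (LMS) filter that predicts the artifact and subtracts it from the PPG signal. The sketch below is a generic single-reference illustration, not the paper's MISO formulation, and the tap count and step size are arbitrary.

```python
def lms_cancel(ppg, accel, taps=4, mu=0.01):
    """LMS adaptive noise cancellation: learn a FIR model mapping the
    accelerometer reference to the motion artifact, then subtract it."""
    w = [0.0] * taps                   # adaptive FIR weights
    cleaned = []
    for n in range(len(ppg)):
        # Current reference vector: last `taps` accelerometer samples.
        x = [accel[n - k] if n - k >= 0 else 0.0 for k in range(taps)]
        artifact = sum(wi * xi for wi, xi in zip(w, x))
        e = ppg[n] - artifact          # error = artifact-free PPG estimate
        w = [wi + mu * e * xi for wi, xi in zip(w, x)]
        cleaned.append(e)
    return cleaned
```

When the artifact really is a linear function of the reference, the weights converge and the residual `cleaned` signal retains mostly the cardiac component, which is then used for spectral heart-rate estimation.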

A SoC Design Synthesis System for High Performance Vehicles (고성능 차량용 SoC 설계 합성 시스템)

  • Chang, Jeong-Uk;Lin, Chi-Ho
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.20 no.3 / pp.181-187 / 2020
  • In this paper, we propose a register allocation algorithm and a resource allocation algorithm for the high-level synthesis process of an SoC design synthesis system for high-performance vehicles. We analyzed the operator characteristics and datapath structure that matter most in high-level synthesis, and introduced the concept of a virtual operator for scheduling multi-cycle operations; this allows multi-cycle operations to be handled in the resource allocation algorithm regardless of operation type. The algorithm assigns functional operators so that the number of connection signal lines repeatedly used between operators is minimized. When registers are allocated, it builds regional graphs prioritized by connection structure, and registers sharing a connection structure are allocated to the maximum cluster generated by a minimum cluster partition algorithm. It also minimizes the connection structure by removing duplicate multiplexer inputs and by arranging the inputs of the multiplexers connected to the operators. To evaluate the scheduling performance of the described algorithm, we demonstrate its utility by scheduling the fifth-order digital wave filter, a standard benchmark model.
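As a point of reference for the register-allocation step, the classical left-edge algorithm binds non-overlapping value lifetimes to a minimum number of registers. The paper's clustering-based approach refines this idea; the sketch below is the textbook version under that simplification, not the proposed algorithm.

```python
def allocate_registers(lifetimes):
    """Left-edge allocation: bind value lifetimes [birth, death) to registers
    so that overlapping lifetimes never share a register."""
    order = sorted(lifetimes, key=lambda iv: iv[0])   # by birth time
    registers = []                     # each register holds its assigned intervals
    binding = {}
    for iv in order:
        for r, reg in enumerate(registers):
            if reg[-1][1] <= iv[0]:    # last lifetime ends before this one starts
                reg.append(iv)
                binding[iv] = r
                break
        else:
            registers.append([iv])     # no free register: open a new one
            binding[iv] = len(registers) - 1
    return binding
```

Sorting by birth time and greedily reusing the first free register is provably optimal for interval lifetimes, which is why it serves as the baseline that connection-aware allocators try to improve on.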