• Title/Summary/Keyword: 매칭기법 (matching techniques)

Search Results: 762

Semantic Search System using Ontology-based Inference (온톨로지기반 추론을 이용한 시맨틱 검색 시스템)

  • Ha Sang-Bum;Park Yong-Tack
    • Journal of KIISE:Software and Applications
    • /
    • v.32 no.3
    • /
    • pp.202-214
    • /
    • 2005
  • The semantic web is a web paradigm that represents not merely links between documents but the semantics of documents and the relations among them, so that software agents can understand their meaning. We propose a semantic search based on inference over ontologies, with the following characteristics. First, the search engine uses explicit ontologies to reason, so relevant documents can be retrieved even when the search keyword differs from the terms used in the documents. Second, even when concepts in two ontologies do not match exactly, similar results can be found through a rule-based translator and ontological reasoning. Third, reasoning about the meaning of documents with explicit ontologies, rather than guessing meaning from keywords alone, increases accuracy and precision. Fourth, a domain ontology lets users pose more detailed queries through an ontology-based automated query generator whose search scope and accuracy approach those of natural-language processing. Fifth, agents can automatically retrieve not only documents matching a keyword but also user-preferred information and knowledge drawn from the ontologies. The system can therefore search more accurately than current retrieval systems based on database queries or keyword matching. We demonstrate that our system, which uses ontologies and inference over explicit ontologies, performs better than the keyword-matching approach.
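
As a rough illustration of the ontology-based query expansion the abstract describes, the following sketch uses a hypothetical hand-built taxonomy and plain keyword matching; it stands in for, and is not, the authors' ontologies and reasoner.

```python
# Hypothetical, minimal illustration of ontology-based query expansion:
# a query term is expanded with its subclasses before keyword matching,
# so documents that never mention the original keyword can still be found.
from collections import deque

# Toy ontology: concept -> list of narrower (subclass) concepts.
ONTOLOGY = {
    "vehicle": ["car", "bicycle"],
    "car": ["sedan", "suv"],
}

def expand(term: str) -> set[str]:
    """Return the term plus every concept reachable via subclass links."""
    seen, queue = {term}, deque([term])
    while queue:
        for child in ONTOLOGY.get(queue.popleft(), []):
            if child not in seen:
                seen.add(child)
                queue.append(child)
    return seen

def search(keyword: str, documents: list[str]) -> list[str]:
    terms = expand(keyword)
    return [d for d in documents if any(t in d.lower() for t in terms)]

docs = ["A sedan with a hybrid engine", "A mountain bicycle review"]
print(search("vehicle", docs))   # both documents match via inferred subclasses
```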

An Empirical Analysis on Determinant Factors of Patent Valuation and Technology Transaction Prices (특허가치 결정요인과 기술거래금액에 관한 실증 분석)

  • Sung, Tae-Eung;Kim, Da Seul;Jang, Jong-Moon;Park, Hyun-Woo
    • Journal of Korea Technology Innovation Society
    • /
    • v.19 no.2
    • /
    • pp.254-279
    • /
    • 2016
  • With the shift toward a knowledge-based economy, the importance of patent valuation has grown rapidly, because technology transactions aimed at practically utilizing R&D outcomes, such as technology commercialization and technology transfer, are increasing. Nevertheless, research on the determinants of patent value based on actual technology transactions is scarce, largely because such data are difficult to collect in practice. To suggest quantitative determinants of patent value that can be applied to scoring methods, 15 domestic and overseas patent valuation models were analyzed in order to lend objectivity to the subjective results of qualitative methods such as expert surveys and comparative assessment. From this analysis, six common determinants were derived and matched with patent information that prior research has used as proxy variables for the individual factors. To test whether the proposed model has a statistically meaningful effect, 517 technology transactions were collected from public and private technology transaction offices and examined with multiple regression analysis, which identified significant determinants of patent value. The results show that patent connectivity (number of cited documents) and the commercialization stage in the market significantly influence patent value. The contribution of this study is that it suggests quantitative determinants of patent value based on real transaction data; if industry-level results are systematically verified through continued collection and monitoring of transaction data, a customized patent valuation model by industry could be proposed, applicable both to strategic planning of patent registration and to performance assessment of research projects (with representative patents).
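
The regression step described above can be sketched as follows; the synthetic data, variable names, and coefficients are placeholders and do not reproduce the study's 517-transaction dataset or its six determinants.

```python
# Illustrative only: ordinary least squares on synthetic "transaction" data,
# mimicking a multiple regression of transaction price on patent determinants
# (e.g. citation counts, commercialization stage). Real variable definitions
# and data come from the study, not from this sketch.
import numpy as np

rng = np.random.default_rng(0)
n = 517                                  # number of transactions in the paper
citations = rng.poisson(5, n)            # proxy: patent connectivity
stage = rng.integers(1, 4, n)            # proxy: commercialization stage
noise = rng.normal(0, 1.0, n)
log_price = 1.0 + 0.08 * citations + 0.5 * stage + noise   # synthetic target

X = np.column_stack([np.ones(n), citations, stage])         # design matrix
beta, *_ = np.linalg.lstsq(X, log_price, rcond=None)        # OLS estimates

resid = log_price - X @ beta
r2 = 1 - resid.var() / log_price.var()
print("coefficients (intercept, citations, stage):", np.round(beta, 3))
print("R^2:", round(r2, 3))
```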

A Fast and Accurate Face Detection and Tracking Method by using Depth Information and color information (깊이정보와 컬러정보를 이용한 고속 고정밀 얼굴검출 및 추적 방법)

  • Kim, Woo-Youl;Seo, Young-Ho;Kim, Dong-Wook
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.16 no.9
    • /
    • pp.1825-1838
    • /
    • 2012
  • This paper proposes a fast face detection and tracking method that uses depth images as well as RGB images. It consists of a face detection procedure and a face tracking procedure. The face detection method is based on an existing method, Adaboost, but it reduces the size of the search area by using depth information and skin color. The proposed face tracking method uses template matching and incorporates an early-termination scheme to further reduce the execution time. Implementation and experiments showed that the proposed face detection method takes only about 39% of the execution time of the existing method, and the proposed tracking method takes only 2.48 ms per frame. In terms of accuracy, the proposed detection method achieved the same detection ratio as the previous method but a considerably lower error ratio of about 0.66%. In all but one special case, the tracking error ratio was as low as about 1%. We therefore expect the proposed face detection and tracking methods to be useful, individually or in combination, for applications that require fast execution and accurate detection or tracking.
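
A minimal sketch of sum-of-absolute-differences template matching with early termination, the tracking idea named in the abstract; the window sizes and data are illustrative, and the depth/skin-color search-area reduction is omitted.

```python
# Illustrative SAD template matching with early termination: the partial sum
# for a candidate position is abandoned as soon as it exceeds the best score
# found so far, which is what keeps the tracking loop fast.
import numpy as np

def match_template(frame: np.ndarray, template: np.ndarray):
    th, tw = template.shape
    fh, fw = frame.shape
    best_score, best_pos = np.inf, (0, 0)
    for y in range(fh - th + 1):
        for x in range(fw - tw + 1):
            score = 0.0
            for row in range(th):                       # accumulate row by row
                score += np.abs(frame[y + row, x:x + tw] -
                                template[row]).sum()
                if score >= best_score:                 # early termination
                    break
            else:
                best_score, best_pos = score, (y, x)
    return best_pos, best_score

rng = np.random.default_rng(1)
frame = rng.random((60, 80))
tmpl = frame[20:36, 30:46].copy()                       # known ground truth
print(match_template(frame, tmpl))                      # -> ((20, 30), 0.0)
```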

Development of C++ Compiler and Programming Environment (C++컴파일러 및 프로그래밍 환경 개발)

  • Jang, Cheon-Hyeon;O, Se-Man
    • The Transactions of the Korea Information Processing Society
    • /
    • v.4 no.3
    • /
    • pp.831-845
    • /
    • 1997
  • In this paper, we propose and develop a compiler and interactive programming environment for C++, one of the most notable object-oriented languages. To develop the compiler we adopted a front-end/back-end model based on the EM virtual machine. In developing the front end, we formalized the C++ grammar with the context-sensitive tokens that must be handled by the lexical scanner, and designed an AST class library consisting of a hierarchy of AST node classes with well-defined interfaces among them. In developing the back end, we proposed a model with three major components: code optimizer, code generator, and run-time environment. We emphasized a retargetable back end that can be systematically reconfigured to generate code for a variety of distinct target computers. We also developed a tree pattern matching algorithm and implemented a target code generator that produces SPARC code. In addition, we proposed a theory and model for constructing interactive programming environments: the AST is adopted as the internal representation of language features, and an incremental analysis algorithm and visual diagrams are proposed. We also studied unparsing schemes, visual diagrams, and graphical user interfaces for generating interactive environments automatically. The results of this research should be useful for developing compilers and programming environments, and can also be applied to compilers for parallel and distributed environments.
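
A minimal sketch of tree pattern matching for instruction selection, the back-end technique the abstract mentions; the node types, patterns, and emitted pseudo-instructions are hypothetical and unrelated to the authors' EM/SPARC implementation.

```python
# Illustrative tree pattern matcher: expression-tree nodes are matched against
# simple patterns bottom-up and rewritten into pseudo-instructions, sketching
# how a retargetable code generator can select target instructions.
from dataclasses import dataclass

@dataclass
class Node:
    op: str                 # e.g. "const", "reg", "add", "mul"
    kids: tuple = ()
    value: object = None

# (pattern, emitter): a pattern is an op name plus the expected child op names.
PATTERNS = [
    (("add", "reg", "const"), lambda n: f"ADDI r?, {n.kids[1].value}"),
    (("add", "reg", "reg"),   lambda n: "ADD  r?, r?"),
    (("mul", "reg", "reg"),   lambda n: "MUL  r?, r?"),
]

def select(node: Node, out: list[str]) -> Node:
    """Post-order walk: generate code for children, then match this node."""
    node = Node(node.op, tuple(select(k, out) for k in node.kids), node.value)
    for (op, *kid_ops), emit in PATTERNS:
        if node.op == op and [k.op for k in node.kids] == kid_ops:
            out.append(emit(node))
            return Node("reg")           # result now lives in a register
    return node                          # leaf or unmatched node

tree = Node("add", (Node("mul", (Node("reg"), Node("reg"))), Node("const", value=4)))
code: list[str] = []
select(tree, code)
print("\n".join(code))                   # MUL then ADDI, selected by pattern
```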


A Study on the Application of Urban Spatial Models Using SpatioTemporal GIS: Focusing on Population Distribution Modeling (SpatioTemporal GIS를 활용한 도시공간모형 적용에 관한 연구 / 인구분포모델링을 중심으로)

  • 남광우;이성호;김영섭;최철옹
    • Proceedings of the Korean Association of Geographic Information Studies Conference
    • /
    • 2002.03b
    • /
    • pp.127-141
    • /
    • 2002
  • When socio-economic data are used to apply an urban model in a GIS environment, the complexity and variability of urban phenomena make efficient spatial analysis impossible with a snapshot model alone, which stores only the situation at one particular point in time. Moreover, when an urban model is applied, the space, attributes, and time that are the objects of the GIS can be defined differently depending on the purpose of the analysis, and different results may follow. This study built a Temporal GIS that incorporates the time dimension in order to observe the dynamic change of the population distribution of Busan over 30 years, and, by applying a population density model and an accessibility model on top of it, sought to present a way of using GIS that yields more efficient and varied results. The data processing required to quantify spatial phenomena and apply statistical techniques can introduce many errors. Addressing this requires, first of all, a data definition that matches the purpose of the analysis, validation of the usefulness of the model to be applied, an appropriate unit of analysis, and an objective interpretation of the results, together with a methodology for efficiently handling time-series data to capture change. In other words, to maximize the efficiency and effectiveness of applying urban models in a GIS environment, the data model and the method of building the spatial database must fit the purpose of the analysis, and a cyclical decision-making process is needed that fully considers the types of data that can be analyzed and verifies in advance the factors that can significantly affect the results.
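
As a rough illustration of the kind of accessibility computation such a study might run on snapshot population layers, the sketch below uses a generic gravity-type formula; the decay exponent, zone data, and formulation are assumptions, not the study's models or the Busan dataset.

```python
# Illustrative gravity-type accessibility on snapshot population layers:
# accessibility of zone i = sum over j of P_j / d_ij^beta, computed per year
# so that change over time can be compared, as a Temporal GIS would allow.
import numpy as np

# Toy zone centroids (km) and population snapshots for two years.
coords = np.array([[0.0, 0.0], [3.0, 0.0], [0.0, 4.0]])
population = {1970: np.array([120_000, 80_000, 30_000]),
              2000: np.array([90_000, 150_000, 110_000])}
beta = 2.0                                        # assumed distance-decay exponent

dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
np.fill_diagonal(dist, 1.0)                       # avoid self-distance of zero

for year, pop in population.items():
    access = (pop[None, :] / dist ** beta).sum(axis=1)
    print(year, np.round(access).astype(int))     # accessibility per zone
```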


A Bloom Filter Application of Network Processor for High-Speed Filtering Buffer-Overflow Worm (버퍼 오버플로우 웜 고속 필터링을 위한 네트워크 프로세서의 Bloom Filter 활용)

  • Kim Ik-Kyun;Oh Jin-Tae;Jang Jong-Soo;Sohn Sung-Won;Han Ki-Jun
    • Journal of the Institute of Electronics Engineers of Korea TC
    • /
    • v.43 no.7 s.349
    • /
    • pp.93-103
    • /
    • 2006
  • Network solutions that protect against worm attacks and complement partial end-system patch deployment address a pressing problem. In content-based worm filtering, the key challenges are detection accuracy and performance. We present a worm filter architecture that uses a Bloom filter and is intended for deployment at high-speed transit points on the Internet, including firewalls and gateways. Content-based packet filtering at multi-gigabit line rates is, in general, a challenging problem because of the signature explosion problem that curtails performance. We show that for worm malware, and in particular for buffer-overflow worms, which comprise a large segment of recent outbreaks, scalable filtering performance that is accurate, cut-through, and extensible is feasible. We demonstrate the efficacy of the design by implementing it on an Intel IXP network processor platform with gigabit interfaces. We benchmark the worm filter network appliance on a suite of current and past worms, showing multi-gigabit line-speed filtering with minimal impact on end-to-end network performance.
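
A software-only sketch of the Bloom filter idea behind the signature pre-screening described above; the hash scheme and sizes are illustrative and unrelated to the IXP hardware implementation.

```python
# Illustrative Bloom filter: worm signatures are hashed into a fixed bit array;
# membership tests may yield false positives but never false negatives, which
# is why a compact filter can pre-screen packets at line rate.
import hashlib

class BloomFilter:
    def __init__(self, size_bits: int = 8192, num_hashes: int = 4):
        self.size = size_bits
        self.k = num_hashes
        self.bits = bytearray(size_bits // 8)

    def _positions(self, item: bytes):
        for i in range(self.k):
            h = hashlib.sha256(i.to_bytes(2, "big") + item).digest()
            yield int.from_bytes(h[:8], "big") % self.size

    def add(self, item: bytes):
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def might_contain(self, item: bytes) -> bool:
        return all(self.bits[p // 8] & (1 << (p % 8)) for p in self._positions(item))

signatures = BloomFilter()
signatures.add(b"\x90\x90\x90\x90\xeb\x10")        # toy "buffer overflow" pattern
print(signatures.might_contain(b"\x90\x90\x90\x90\xeb\x10"))  # True
print(signatures.might_contain(b"GET / HTTP/1.1"))            # almost surely False
```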

A 12b 130MS/s 108mW $1.8mm^2$ 0.18um CMOS ADC for High-Quality Video Systems (고화질 영상 시스템 응용을 위한 12비트 130MS/s 108mW $1.8mm^2$ 0.18um CMOS A/D 변환기)

  • Han, Jae-Yeol;Kim, Young-Ju;Lee, Seung-Hoon
    • Journal of the Institute of Electronics Engineers of Korea SD
    • /
    • v.45 no.3
    • /
    • pp.77-85
    • /
    • 2008
  • This work proposes a 12b 130MS/s 108mW $1.8mm^2$ 0.18um CMOS ADC for high-quality video systems, such as TFT-LCD displays and digital TVs, that simultaneously require high resolution, low power, and small size at high speed. The proposed ADC optimizes power consumption and chip area at the target resolution and sampling rate based on a three-step pipeline architecture. The input SHA, with gate-bootstrapped sampling switches and a properly controlled trans-conductance ratio of the two amplifier stages, achieves a high gain and phase margin for 12b input accuracy at the Nyquist frequency. A signal-insensitive, 3D fully symmetric layout reduces the capacitor and device mismatch of the two MDACs. The proposed supply- and temperature-insensitive current and voltage references are implemented on chip with a small number of transistors. The prototype ADC in a 0.18um 1P6M CMOS technology demonstrates a measured DNL and INL within 0.69LSB and 2.12LSB, respectively. The ADC shows a maximum SNDR of 53dB and 51dB and a maximum SFDR of 68dB and 66dB at 120MS/s and 130MS/s, respectively. The ADC, with an active die area of $1.8mm^2$, consumes 108mW at 130MS/s and 1.8V.
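
A small worked calculation from the figures quoted above, using the common textbook definitions of ENOB and the Walden figure of merit; the formulas are standard and are not taken from the paper itself.

```python
# Effective number of bits and Walden FOM from the reported 130MS/s numbers,
# using the standard definitions ENOB = (SNDR - 1.76) / 6.02 and
# FOM = P / (2^ENOB * fs). The input values come from the abstract above.
sndr_db = 51.0        # maximum SNDR at 130 MS/s
power_w = 108e-3      # 108 mW
fs = 130e6            # 130 MS/s

enob = (sndr_db - 1.76) / 6.02
fom_j_per_step = power_w / (2 ** enob * fs)

print(f"ENOB ~ {enob:.2f} bits")                      # about 8.2 bits
print(f"FOM  ~ {fom_j_per_step * 1e12:.2f} pJ/step")  # about 2.9 pJ/conversion-step
```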

A Study of Web Application Attack Detection extended ESM Agent (통합보안관리 에이전트를 확장한 웹 어플리케이션 공격 탐지 연구)

  • Kim, Sung-Rak
    • Journal of the Korea Society of Computer and Information
    • /
    • v.12 no.1 s.45
    • /
    • pp.161-168
    • /
    • 2007
  • Web attacks exploit structural, logical, and coding errors in the web application rather than vulnerabilities in the web server itself. The Open Web Application Security Project (OWASP) has published roughly ten categories of web application vulnerability, and the causes, risks, and severity of such attacks are well known. Detection and response capability are therefore essential in dealing with web hacking. Filtering methods such as pattern matching and code modification are used for defense, but they cannot detect new types of attacks. Dedicated security products such as an IDS or a web application firewall can be used, but they require considerable money and effort to operate and maintain, and they tend to generate false positives. This research uses a profiling method that extracts the structure of the web application and the attributes of its input parameters, such as their types and lengths. By building a structural database of the web application in advance, missing validation of user input can be compensated for, and attacks can be detected by checking illegal requests against the profiling identifiers in the database. Because integrated security management systems are already in use at most institutions, we present a model in which the log-gathering agent of the integrated security management system is combined with the web application attack detection function, so that attacks against web applications can be detected without deploying an additional dedicated security product.
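
A minimal sketch of the parameter-profiling check described above; the profile format, URLs, and field names are hypothetical and are not the paper's agent or database schema.

```python
# Illustrative profile-based request check: a pre-built profile records the
# expected type and length of each parameter of each URL, and incoming
# requests that violate the profile are flagged as potential attacks.
import re

# Hypothetical profile database built in advance from the application structure.
PROFILE = {
    "/login": {
        "user": {"pattern": r"^[A-Za-z0-9_]{1,16}$"},
        "pw":   {"pattern": r"^.{1,64}$"},
    },
    "/board/view": {
        "id": {"pattern": r"^[0-9]{1,10}$"},
    },
}

def check_request(url: str, params: dict[str, str]) -> list[str]:
    """Return a list of violations; an empty list means the request fits the profile."""
    profile = PROFILE.get(url)
    if profile is None:
        return [f"unknown URL {url}"]
    issues = [f"unexpected parameter '{k}'" for k in params if k not in profile]
    for name, rule in profile.items():
        value = params.get(name, "")
        if not re.match(rule["pattern"], value):
            issues.append(f"parameter '{name}' violates profile")
    return issues

print(check_request("/board/view", {"id": "42"}))          # [] -> request accepted
print(check_request("/board/view", {"id": "42 OR 1=1"}))   # flagged as a violation
```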


Relative RPCs Bias-compensation for Satellite Stereo Images Processing (고해상도 입체 위성영상 처리를 위한 무기준점 기반 상호표정)

  • Oh, Jae Hong;Lee, Chang No
    • Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography
    • /
    • v.36 no.4
    • /
    • pp.287-293
    • /
    • 2018
  • Generating epipolar resampled images that minimize y-parallax is a prerequisite for accurate and efficient processing of satellite stereo images. Minimizing y-parallax requires accurate sensor modeling, which is normally carried out with ground control points. However, this approach is not feasible over inaccessible areas where control points cannot easily be acquired. In that case, a relative orientation can be carried out using only conjugate points, but its accuracy for satellite sensors needs to be studied because their geometry differs from that of well-known frame-type cameras. We therefore carried out bias compensation of the RPCs (Rational Polynomial Coefficients) without any ground control points to study its precision and its effect on the y-parallax in epipolar resampled images. The conjugate points were generated by stereo image matching with outlier removal, and the RPC compensation was performed with affine and polynomial models. We analyzed the reprojection error of the compensated RPCs and the y-parallax in the resampled images. The experimental results showed y-parallax at the one-pixel level for Kompsat-3 stereo data.
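
A hedged sketch of fitting an affine bias-compensation model to image-space residuals at conjugate points; the data are synthetic and the parameterization is the generic affine RPC bias model, which may differ in detail from the paper's formulation.

```python
# Illustrative affine RPC bias compensation: image-space residuals at conjugate
# points are modeled as d = a0 + a1*sample + a2*line and solved by least squares,
# then subtracted from the RPC-projected coordinates of one image.
import numpy as np

rng = np.random.default_rng(2)
n = 50
sample = rng.uniform(0, 24000, n)             # synthetic image coordinates
line = rng.uniform(0, 18000, n)

true_bias = np.array([3.0, 1.5e-4, -2.0e-4])  # a0, a1, a2 (assumed, for the demo)
residual = (true_bias[0] + true_bias[1] * sample + true_bias[2] * line
            + rng.normal(0, 0.3, n))          # observed line residuals + noise

A = np.column_stack([np.ones(n), sample, line])
coeffs, *_ = np.linalg.lstsq(A, residual, rcond=None)

corrected = residual - A @ coeffs
print("estimated affine bias terms:", np.round(coeffs, 6))
print("RMS residual before/after :", round(residual.std(), 3), round(corrected.std(), 3))
```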

Damage Analysis and Accuracy Assessment for River-side Facilities using UAV images (UAV 영상을 활용한 수변구조물 피해분석 및 정확도 평가)

  • Kim, Min Chul;Yoon, Hyuk Jin;Chang, Hwi Jeong;Yoo, Jong Su
    • Journal of Korean Society for Geospatial Information Science
    • /
    • v.24 no.1
    • /
    • pp.81-87
    • /
    • 2016
  • When natural disasters damage river-side facilities such as dams, bridges, and embankments, analyzing the exact damage information is important for fast recovery. In this study, we present a method for effective damage analysis using UAV (unmanned aerial vehicle) images and assess its accuracy. The UAV images are captured over the area near the river-side facilities, and the core methodologies for damage analysis are image matching and a change detection algorithm. The point cloud produced by image matching reconstructs 3-dimensional data from the 2-dimensional images, and damage areas are extracted by comparing the height values at the same location with reference data. Absolute positional precision was tested against post-processed aerial LiDAR data used as the reference. The assessment shows that our matching results reach 10-20 cm precision when the exterior orientation parameters are accurate. The suggested method is therefore very useful for damage analysis of large structures such as river-side facilities, but it cannot be applied to complex buildings, which require a different damage analysis method.
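
A minimal sketch of the height-difference change detection step described above; the grids, threshold, and damage criterion are illustrative assumptions rather than the study's processing chain.

```python
# Illustrative change detection on gridded heights: the UAV-derived surface is
# compared cell by cell with a reference surface (e.g. from LiDAR), and cells
# whose height difference exceeds a threshold are marked as damaged.
import numpy as np

rng = np.random.default_rng(3)
reference = np.full((40, 60), 12.0) + rng.normal(0, 0.05, (40, 60))  # embankment crest
uav_surface = reference + rng.normal(0, 0.05, (40, 60))

uav_surface[15:20, 25:35] -= 1.8           # simulated washed-out section

threshold_m = 0.5                          # assumed minimum height change of interest
diff = uav_surface - reference
damage_mask = np.abs(diff) > threshold_m

cell_area_m2 = 0.2 * 0.2                   # assumed 20 cm grid spacing
print("damaged cells :", int(damage_mask.sum()))
print("damaged area  :", round(damage_mask.sum() * cell_area_m2, 2), "m^2")
print("mean height change in damaged area:", round(diff[damage_mask].mean(), 2), "m")
```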