• Title/Summary/Keyword: Performance Information Use

Search Results: 5,694

A Study on Iris Image Restoration Based on Focus Value of Iris Image (홍채 영상 초점 값에 기반한 홍채 영상 복원 연구)

  • Kang Byung-Jun;Park Kang-Ryoung
    • Journal of the Institute of Electronics Engineers of Korea SP / v.43 no.2 s.308 / pp.30-39 / 2006
  • Iris recognition identifies a user based on the unique iris texture patterns in the region that dilates and contracts with the pupil. Iris recognition systems extract the iris pattern from images captured by an iris recognition camera, so recognition performance is affected by the quality of the captured iris image. If the image is blurred, the iris pattern is distorted, which increases the FRR (False Rejection Rate). Optical defocusing is the main cause of blurred iris images. Conventional iris recognition cameras use one of two focusing methods: fixed focusing or auto-focusing. With fixed focusing, users must repeatedly align their eyes within the DOF (Depth of Field) until the system acquires a well-focused iris image, which is very inconvenient for users. With auto-focusing, the camera moves the focus lens under an auto-focusing algorithm to capture the best-focused image, but this requires additional hardware, such as a sensor measuring the distance between the user and the camera lens and a motor to move the focus lens. This increases the size and cost of the camera, making it unsuitable for small mobile devices. To overcome these problems, we propose a method that increases the DOF through an iris image restoration algorithm based on the focus value of the iris image. Testing the proposed algorithm with a Panasonic BM-ET100 camera, we increased the operating range from 48-53 cm to 46-56 cm.
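
The focus-value idea the abstract relies on can be illustrated with a generic sharpness measure. The sketch below scores focus as the variance of a Laplacian high-pass response; this is a common stand-in, not necessarily the kernel the paper uses:

```python
import numpy as np

def laplacian_focus_value(image):
    """Focus score as the variance of a Laplacian response.

    A sharper image has more high-frequency energy, so the variance
    of the high-pass response is larger. Generic focus measure only;
    the paper's actual focus-value kernel is not reproduced here.
    """
    img = np.asarray(image, dtype=np.float64)
    # 3x3 Laplacian high-pass kernel
    k = np.array([[0,  1, 0],
                  [1, -4, 1],
                  [0,  1, 0]], dtype=np.float64)
    h, w = img.shape
    resp = np.zeros((h - 2, w - 2))
    for dy in range(3):
        for dx in range(3):
            resp += k[dy, dx] * img[dy:dy + h - 2, dx:dx + w - 2]
    return resp.var()

# A sharp random-texture patch scores higher than its blurred copy.
rng = np.random.default_rng(0)
sharp = rng.random((64, 64))
blurred = (sharp[:-1, :-1] + sharp[1:, :-1]
           + sharp[:-1, 1:] + sharp[1:, 1:]) / 4  # 2x2 box blur
print(laplacian_focus_value(sharp) > laplacian_focus_value(blurred))  # True
```

A restoration pipeline like the paper's would use such a score both to decide whether a frame is usably focused and to parameterize the deblurring step.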

A Design of Integrated Scientific Workflow Execution Environment for a Computational Scientific Application (계산 과학 응용을 위한 과학 워크플로우 통합 수행 환경 설계)

  • Kim, Seo-Young;Yoon, Kyoung-A;Kim, Yoon-Hee
    • Journal of Internet Computing and Services / v.13 no.1 / pp.37-44 / 2012
  • Numerous scientists engaged in compute-intensive research require more computing facilities than before, even as computing resources and techniques become increasingly advanced. For this reason, many e-Science environments have been actively invested in and established around the world, but scientists still look for an intuitive experimental environment that guarantees improved facilities without additional configuration or installation. In this paper, we present an integrated scientific workflow execution environment for computational science applications that supports workflow design on high-performance computing infrastructure and is accessible from a web browser. The portal automatically executes computation jobs consecutively, in the order defined by the workflow design tool, through an execution service that considers the characteristics of each job when batching over distributed grid resources. The portal's workflow editor presents a high-level, easy-to-use front end together with a monitoring service that shows the status of workflow execution in real time, so users can inspect intermediate data during experiments. Scientists can therefore take advantage of this environment to improve the productivity of HTC-based studies.
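
The consecutive execution of jobs in the order defined by a workflow can be sketched as a plain topological-order scheduler (Kahn's algorithm). Function and job names here are hypothetical; the real service additionally batches jobs over distributed grid resources:

```python
from collections import deque

def execute_workflow(jobs, deps, run):
    """Run jobs in dependency order via Kahn's topological sort.

    jobs: iterable of job names; deps: dict mapping a job to the set
    of jobs that must finish first; run: callable invoked per job.
    Raises on cyclic workflows. Illustrative names only.
    """
    indeg = {j: len(deps.get(j, ())) for j in jobs}
    children = {j: [] for j in jobs}
    for j, prereqs in deps.items():
        for p in prereqs:
            children[p].append(j)
    ready = deque(j for j, d in indeg.items() if d == 0)
    order = []
    while ready:
        j = ready.popleft()
        run(j)
        order.append(j)
        for c in children[j]:       # a finished job unblocks its children
            indeg[c] -= 1
            if indeg[c] == 0:
                ready.append(c)
    if len(order) != len(indeg):
        raise ValueError("cycle in workflow")
    return order

order = execute_workflow(
    ["prep", "simulate", "analyze"],
    {"simulate": {"prep"}, "analyze": {"simulate"}},
    run=lambda job: None)
print(order)  # ['prep', 'simulate', 'analyze']
```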

Visible Light Responsive Titanium Dioxide (TiO2) (가시광 감응 산화티탄(TiO2))

  • Shon, Hokyong;Phuntsho, Sherub;Okour, Yousef;Cho, Dong-Lyun;Kim, Kyoung Seok;Li, Hui-Jie;Na, Sukhyun;Kim, Jong Beom;Kim, Jong-Ho
    • Applied Chemistry for Engineering / v.19 no.1 / pp.1-16 / 2008
  • Titanium dioxide ($TiO_2$) is one of the most researched semiconductor oxides and has revolutionised technologies in the fields of environmental purification and energy generation. It has found extensive application in heterogeneous photocatalysis for removing organic pollutants from air and water, and also in hydrogen production from photocatalytic water splitting. Its use is popular because of its low cost, low toxicity, and high chemical and thermal stability. However, one of the critical limitations of $TiO_2$ as a photocatalyst is its poor response to visible light. Several attempts have been made to modify the surface and electronic structure of $TiO_2$ to enhance its activity in the visible-light region, such as noble metal deposition, metal ion loading, cationic and anionic doping, and sensitisation. Most of these modifications improved photocatalytic performance under visible-light irradiation. This paper reviews and updates some of the information on $TiO_2$ photocatalytic technology and its progress toward the visible-light region.

Change detection algorithm based on amplitude statistical distribution for high resolution SAR image (통계분포에 기반한 고해상도 SAR 영상의 변화탐지 알고리즘 구현 및 적용)

  • Lee, Kiwoong;Kang, Seoli;Kim, Ahleum;Song, Kyungmin;Lee, Wookyung
    • Korean Journal of Remote Sensing / v.31 no.3 / pp.227-244 / 2015
  • Synthetic Aperture Radar (SAR) provides wide-coverage imaging by day or night and in all weather conditions. Recently, as SAR image resolution has improved to the sub-meter level, applications are expanding rapidly. In particular, there is growing interest in using the geographic information in high-resolution SAR images, and change detection will be one of the most important techniques for these applications. In this paper, an automatic threshold-tracking change detection algorithm applicable to high-resolution SAR images is proposed. To detect changes within a SAR image, a reference image is generated using the log-ratio operator, and its amplitude distribution is estimated through a K-S (Kolmogorov-Smirnov) test. Assuming the SAR image has a non-Gaussian amplitude distribution, a generalized thresholding technique is applied using Kittler and Illingworth's minimum-error estimation. In addition, the MoLC (Method of Log-Cumulants) parametric estimation method is adopted to improve the algorithm's performance on rough ground targets. The implemented algorithm is tested and verified on simulated SAR raw data, then applied to spaceborne high-resolution SAR images taken by COSMO-SkyMed and KOMPSAT-5, and its performance is analyzed and compared.
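
The core of such a pipeline, a log-ratio change image followed by Kittler-Illingworth minimum-error thresholding, can be sketched under a Gaussian-mixture assumption (the non-Gaussian MoLC refinement the abstract mentions is omitted here):

```python
import numpy as np

def log_ratio(img1, img2, eps=1e-6):
    """Log-ratio change image between two co-registered SAR amplitudes."""
    return np.log((np.asarray(img2, float) + eps) /
                  (np.asarray(img1, float) + eps))

def kittler_illingworth_threshold(values, bins=256):
    """Kittler-Illingworth minimum-error threshold.

    Fits two Gaussians (unchanged / changed) to the histogram and
    picks the threshold minimizing the classification-error criterion
    J(t). A sketch under a Gaussian assumption only.
    """
    hist, edges = np.histogram(values, bins=bins)
    p = hist / hist.sum()
    centers = (edges[:-1] + edges[1:]) / 2
    best_j, best_t = np.inf, centers[0]
    for t in range(1, bins - 1):
        p1, p2 = p[:t].sum(), p[t:].sum()
        if p1 <= 0 or p2 <= 0:
            continue
        m1 = (p[:t] * centers[:t]).sum() / p1
        m2 = (p[t:] * centers[t:]).sum() / p2
        v1 = (p[:t] * (centers[:t] - m1) ** 2).sum() / p1
        v2 = (p[t:] * (centers[t:] - m2) ** 2).sum() / p2
        if v1 <= 0 or v2 <= 0:
            continue
        j = (1 + 2 * (p1 * np.log(np.sqrt(v1)) + p2 * np.log(np.sqrt(v2)))
               - 2 * (p1 * np.log(p1) + p2 * np.log(p2)))
        if j < best_j:
            best_j, best_t = j, centers[t]
    return best_t

# Synthetic bimodal "log-ratio" values: a large unchanged class near 0
# and a small changed class near 2. The threshold lands between them.
rng = np.random.default_rng(1)
vals = np.concatenate([rng.normal(0, 0.2, 5000), rng.normal(2, 0.2, 500)])
t = kittler_illingworth_threshold(vals)
print(0 < t < 2)
```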

Development of a Verification System for Enhancing BIM Design Based on Usability (활용성을 고려한 BIM 설계 오류 검증시스템 개발)

  • Yang, Dong-Suk
    • Land and Housing Review / v.8 no.1 / pp.23-29 / 2017
  • BIM design is expected to spread through the domestic and overseas construction industries owing to its effect on construction productivity and quality. However, since the Public Procurement Service made BIM design mandatory, deliverables often contain design errors, and the 3D designs see little use in the construction and maintenance phases because practitioners choose simple 2D-to-3D remodeling that does not yield proper 3D models. BIM design results are largely underutilized and are often not even checked for errors. To resolve this, a verification system is needed that secures the quality of BIM designs and makes their results reliable. In this study, we develop a program that automatically verifies BIM design results for problems such as design-rule violations and design flaws, and that improves the usability of BIM designs. In particular, the program was developed not only to provide checks unavailable in commercial programs but also to validate drawings in low-specification computer environments. The developed program (LH-BIM) stores attribute information extracted from Revit files (ArchiCAD and IFC files included) in an integrated database. This makes it possible to look up the features and properties of delivered drawings using the LH-BIM program alone, without Revit tools. In this way, the difficulties of using conventional commercial programs were resolved, and the system runs on ordinary PC hardware. Furthermore, results from various BIM tools can be readily validated, which avoids the IFC conversion errors that occur with SMC. The program also automatically checks drawings against error and design criteria and calculates area estimations. These functions let practitioners carry out BIM modeling tasks simply and easily. The developed system (LH-BIM) was verified by testing it on a BIM design model review of the Korea Land & Housing Corporation. It is hoped that this verification system will not only secure the quality of BIM designs but also contribute to the expansion of BIM in future construction.
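
The rule-checking idea can be illustrated with a small sketch: element attributes extracted from the model are matched against design-rule predicates. All names here (attribute keys, the door-width rule) are hypothetical and not LH-BIM's actual schema:

```python
def check_elements(elements, rules):
    """Apply design-rule predicates to BIM element attribute records.

    elements: list of dicts of attributes extracted from the model
    (hypothetical keys); rules: list of (rule_name, predicate) pairs
    where the predicate returns True when the element passes.
    Returns a list of (element_id, rule_name) violations.
    """
    violations = []
    for el in elements:
        for name, passes in rules:
            if not passes(el):
                violations.append((el["id"], name))
    return violations

elements = [
    {"id": "door-01", "type": "Door", "width_mm": 900},
    {"id": "door-02", "type": "Door", "width_mm": 700},
]
rules = [
    # Example rule: doors must be at least 800 mm wide.
    ("door-min-width-800mm",
     lambda el: el["type"] != "Door" or el["width_mm"] >= 800),
]
print(check_elements(elements, rules))  # [('door-02', 'door-min-width-800mm')]
```

Because the checks run on extracted attribute records rather than on the live model, this style of verification needs only an ordinary PC, which matches the system's design goal.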

Antioxidant Activities of Ethanol and Water Extracts from Propolis (프로폴리스 에탄올 및 물 추출물의 항산화 활성)

  • Jeong, Chang-Ho;Shin, Chang-Sik;Bae, Young-Il;Shim, Ki-Hwan
    • Journal of the Korean Society of Food Science and Nutrition / v.39 no.12 / pp.1725-1730 / 2010
  • To obtain basic information on the potential use of propolis as a raw material for functional foods, the proximate composition, total phenolics content, and antioxidant activities of different propolis extracts from Korea were investigated. Propolis had the highest level of crude fat and the lowest level of crude fiber. The total phenolics contents of the ethanol and water extracts of propolis from Geochang (GEE and GWE) and the ethanol and water extracts of propolis from Jeju (JEE and JWE) were 184.17, 316.19, 204.33, and 47.83 mg gallic acid equivalent/g, respectively. GWE contained relatively higher levels of total phenolics than the other extracts. The antioxidant potential of the extracts was assessed by different in vitro assays: DPPH and ABTS radical scavenging, reducing power, ferric reducing/antioxidant power (FRAP), and inhibition of lipid peroxidation in a linoleic acid emulsion system. The DPPH and ABTS radical scavenging activities of all extracts were dose dependent. GWE exhibited the best performance in reducing power, FRAP, and inhibition of lipid peroxidation as measured by the ferric thiocyanate (FTC) assay. These results demonstrate that GWE has excellent antioxidant activities and thus great potential as a raw material for functional foods.

CC-GiST: A Generalized Framework for Efficiently Implementing Arbitrary Cache-Conscious Search Trees (CC-GiST: 임의의 캐시 인식 검색 트리를 효율적으로 구현하기 위한 일반화된 프레임워크)

  • Loh, Woong-Kee;Kim, Won-Sik;Han, Wook-Shin
    • The KIPS Transactions: Part D / v.14D no.1 s.111 / pp.21-34 / 2007
  • With the recent rapid drop in price and growth in capacity of main memory, the number of applications built on main memory databases is increasing dramatically. A cache miss, the phenomenon in which data required by the CPU is not resident in cache and must be fetched from main memory, is one of the major causes of performance degradation in main memory databases. Several cache-conscious trees have been proposed to reduce cache misses and make the most of the cache in main memory databases. Since each cache-conscious tree has its own unique features, more than one may be used in a single application depending on its requirements. Moreover, if no existing cache-conscious tree satisfies an application's requirements, a new one must be implemented solely for that application. In this paper, we propose the cache-conscious generalized search tree (CC-GiST). The CC-GiST extends the disk-based generalized search tree (GiST) [HNP95] to be cache-conscious, and provides all the features and algorithms common to existing cache-conscious trees, including pointer compression and key compression techniques. To implement a cache-conscious tree based on the CC-GiST, one need only implement the few functions specific to that tree. We show how to implement the most representative cache-conscious trees, such as the CSB+-tree, the pkB-tree, and the CR-tree, based on the CC-GiST. The CC-GiST eliminates the trouble of managing more than one cache-conscious tree in an application, and provides a framework for efficiently implementing arbitrary cache-conscious trees with new features.
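
The extension-point idea, where the framework owns structure and search while a concrete tree supplies only a few key-handling callbacks, can be sketched as follows. Class and method names are illustrative only, not the CC-GiST API:

```python
class TreeSpec:
    """Extension points a specific tree supplies to the framework.

    Mirrors the GiST idea: the framework never touches keys directly,
    only through these callbacks. Hypothetical method names.
    """
    def compress(self, key):          # e.g. key/pointer compression
        return key
    def consistent(self, ckey, query):
        return ckey == query

class GeneralizedIndex:
    """A flat index standing in for the tree: all key handling is
    delegated to the TreeSpec, so swapping specs swaps the 'tree'."""
    def __init__(self, spec):
        self.spec = spec
        self.entries = []             # (compressed key, value) pairs
    def insert(self, key, value):
        self.entries.append((self.spec.compress(key), value))
    def search(self, query):
        return [v for ck, v in self.entries
                if self.spec.consistent(ck, query)]

class PrefixTruncatingSpec(TreeSpec):
    """Toy key compression: keep only the first 4 characters, so the
    consistency check matches the query against the stored prefix."""
    def compress(self, key):
        return key[:4]
    def consistent(self, ckey, query):
        return query.startswith(ckey)

idx = GeneralizedIndex(PrefixTruncatingSpec())
idx.insert("cache-conscious", 1)
idx.insert("disk-based", 2)
print(idx.search("cache-conscious"))  # [1]
```

Only `PrefixTruncatingSpec` is tree-specific; the `GeneralizedIndex` logic is shared, which is the economy the CC-GiST framework aims for across the CSB+-tree, pkB-tree, and CR-tree.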

Partial Denoising Boundary Image Matching Based on Time-Series Data (시계열 데이터 기반의 부분 노이즈 제거 윤곽선 이미지 매칭)

  • Kim, Bum-Soo;Lee, Sanghoon;Moon, Yang-Sae
    • Journal of KIISE / v.41 no.11 / pp.943-957 / 2014
  • Removing noise, called denoising, is essential for more intuitive and accurate results in boundary image matching. This paper deals with a partial denoising problem that allows a limited amount of partial noise embedded in boundary images. To solve this problem, we first define the partial denoising time-series that can be generated from an original image time-series by removing various partial noises, and we propose an efficient mechanism that quickly obtains those partial denoising time-series in the time-series domain rather than the image domain. We next present the partial denoising distance, the minimum distance from a query time-series to all possible partial denoising time-series generated from a data time-series, and use it as the similarity measure in boundary image matching. Using the partial denoising distance, however, incurs a severe computational overhead, since a large number of partial denoising time-series must be considered. To solve this problem, we derive a tight lower bound for the partial denoising distance and formally prove its correctness. We also propose range and k-NN search algorithms that exploit the partial denoising distance in boundary image matching. Through extensive experiments, we show that our lower-bound-based approach improves search performance by up to an order of magnitude in partial denoising-based boundary image matching.
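
The distance measure can be sketched under a toy denoising operator (smoothing one contiguous window with its mean); the paper's actual partial denoising operator and its lower bound differ:

```python
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def partial_denoising_candidates(series, window=3):
    """Generate denoised variants of a time-series by replacing each
    contiguous window with its mean. A toy stand-in for the paper's
    partial denoising operator on boundary-image time-series."""
    yield list(series)                      # the original, untouched
    for start in range(len(series) - window + 1):
        variant = list(series)
        m = sum(series[start:start + window]) / window
        for i in range(start, start + window):
            variant[i] = m
        yield variant

def partial_denoising_distance(query, data, window=3):
    """Minimum distance from the query to any partially denoised
    variant of the data series: the matching similarity measure."""
    return min(euclidean(query, v)
               for v in partial_denoising_candidates(data, window))

query = [1.0, 1.0, 1.0, 1.0, 1.0]
data = [1.0, 1.0, 3.0, -1.0, 1.0]   # a noise spike in the middle
# Denoising the noisy segment lets the series match the query exactly,
# while the plain distance stays large.
print(partial_denoising_distance(query, data) < euclidean(query, data))  # True
```

Enumerating all variants is exactly the overhead the abstract describes, which is why the paper's tight lower bound, used to prune candidates before computing exact distances, matters.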

Contents Routing in the OpenFlow-based Wireless Mesh Network Environment (OpenFlow기반 무선 메쉬 네트워크 환경에서의 컨텐츠 라우팅)

  • Kim, Won-Suk;Chung, Sang-Hwa;Choi, Hyun-Suk;Do, Mi-Rim
    • Journal of KIISE / v.41 no.10 / pp.810-823 / 2014
  • The wireless mesh network based on IEEE 802.11s provides routing based on a destination address, as it inherits the legacy Internet architecture. However, this architecture is concerned with 'where' rather than with 'what', which is the user's actual goal. Furthermore, with the recent rapid increase in the number of mobile devices, mobile traffic is growing geometrically. Network effectiveness is reduced as many packets carrying the same payload are generated when many users access the same content. In this paper, we propose OpenFlow-based contents routing for the wireless mesh network (WMN) to solve this problem. We add a contents layer on top of the legacy network layer used by the mesh network and implement a routing technique based on content identifiers for efficient contents routing. In addition, using OpenFlow provides flexibility, which we exploit to implement a caching technique that improves network effectiveness by reducing the number of packets with identical payloads in the WMN. We measure network usage in comparison with a flooding technique, and we measure delay with and without caching. The delay measurements show a 20% performance improvement, and controller messages decrease by up to 89%.
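
The en-route caching idea can be sketched with a toy content router: nodes forward requests by content identifier and cache payloads on the way back, so repeated requests stop crossing the mesh. Field names and the forwarding model are assumptions, not the paper's OpenFlow rule layout:

```python
class ContentRouter:
    """Toy mesh node that routes by content identifier and caches
    payloads en route. Illustrative only."""
    def __init__(self, name, fib=None, store=None):
        self.name = name
        self.fib = fib or {}        # content-id -> next-hop router
        self.cache = dict(store or {})
        self.forwarded = 0          # requests sent upstream

    def request(self, content_id):
        if content_id in self.cache:        # cache hit: no mesh traffic
            return self.cache[content_id]
        self.forwarded += 1                  # cache miss: forward upstream
        data = self.fib[content_id].request(content_id)
        self.cache[content_id] = data        # cache on the way back
        return data

origin = ContentRouter("origin", store={"video/1": b"payload"})
edge = ContentRouter("edge", fib={"video/1": origin})
edge.request("video/1")
edge.request("video/1")   # second request is served from the edge cache
print(edge.forwarded)     # 1
```

In the paper's setting, an OpenFlow controller would install the equivalent of the `fib` entries as flow rules, keeping the forwarding decision programmable.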

Scalable RDFS Reasoning using Logic Programming Approach in a Single Machine (단일머신 환경에서의 논리적 프로그래밍 방식 기반 대용량 RDFS 추론 기법)

  • Jagvaral, Batselem;Kim, Jemin;Lee, Wan-Gon;Park, Young-Tack
    • Journal of KIISE / v.41 no.10 / pp.762-773 / 2014
  • As the Web of Data increasingly produces large RDFS datasets, building scalable reasoning engines over large triple stores becomes essential. Much research has used expensive distributed frameworks, such as Hadoop, to reason over large sets of RDFS triples. In many cases, however, only hundreds of millions of triples must be handled, and it is then unnecessary to deploy expensive distributed systems, because a logic-programming-based reasoner on a single machine can achieve reasoning performance similar to that of a distributed reasoner using Hadoop. In this paper, we propose a scalable RDFS reasoner that uses logic programming methods on a single machine, and we compare our empirical results with those of distributed systems. We show that our logic-programming-based reasoner on a single machine performs as well as an expensive distributed reasoner for up to 200 million RDFS triples. In addition, we designed a metadata structure that decomposes the ontology triples into separate sectors: instead of loading all triples into a single model, we select an appropriate subset of the triples for each ontology reasoning rule. Unification makes it easy to handle conjunctive queries for RDFS schema reasoning, so we designed and implemented the RDFS axioms using logic-programming unification together with an efficient conjunctive query handling mechanism. The throughput of our approach reached 166K triples/sec on LUBM1500 with 200 million triples, which is comparable to WebPIE, a distributed reasoner using Hadoop and MapReduce that achieves 185K triples/sec. We show that a distributed system is unnecessary for up to 200 million triples, and that the performance of a logic-programming-based reasoner on a single machine is comparable to that of an expensive distributed reasoner employing the Hadoop framework.
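
Two of the core RDFS entailment rules such a reasoner must apply (rdfs9 and rdfs11) can be sketched as a small forward-chaining fixpoint. This is a generic illustration, not the paper's unification-based implementation, and real RDFS has more rules:

```python
def rdfs_closure(triples):
    """Forward-chain two core RDFS rules to a fixpoint:
      rdfs11: (A subClassOf B), (B subClassOf C) -> (A subClassOf C)
      rdfs9 : (x type A), (A subClassOf B)       -> (x type B)
    """
    SUB, TYPE = "rdfs:subClassOf", "rdf:type"
    facts = set(triples)
    changed = True
    while changed:
        changed = False
        new = set()
        for (s, p, o) in facts:
            if p == SUB:
                for (s2, p2, o2) in facts:
                    if p2 == SUB and s2 == o:
                        new.add((s, SUB, o2))       # rdfs11
                    if p2 == TYPE and o2 == s:
                        new.add((s2, TYPE, o))      # rdfs9
        if not new <= facts:                        # any genuinely new facts?
            facts |= new
            changed = True
    return facts

facts = rdfs_closure({
    (":alice", "rdf:type", ":Student"),
    (":Student", "rdfs:subClassOf", ":Person"),
    (":Person", "rdfs:subClassOf", ":Agent"),
})
print((":alice", "rdf:type", ":Agent") in facts)  # True
```

The naive nested scan over all facts is exactly what does not scale, which motivates the paper's design of selecting, per rule, only the relevant sector of triples instead of the whole model.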