• Title/Summary/Keyword: 결함 관리 기법 (defect management techniques)

Search Results: 2,857

Development of the Speed Limit Model for Harbour and Waterway(I) - Considerations Discrimination for Speed Limit Decision - (항만과 수로의 제한속력 설정 모델 개발에 관한 연구(I) - 제한속력 설정을 위한 고려요소 식별 -)

  • Kim, Deug-Bong;Jeong, Jae-Yong;Park, Jin-Soo;Park, Young-Soo
    • Journal of the Korean Society of Marine Environment & Safety
    • /
    • v.21 no.2
    • /
    • pp.171-178
    • /
    • 2015
  • This is the first study on developing a speed limit selection model; it identifies the factors to be considered when setting a speed limit and evaluates the importance of each factor. The Delphi method was used to identify the consideration factors and the AHP method to calculate their importance. The Delphi survey was conducted over three rounds and identified 5 upper-level consideration factors (Level 1) and 23 lower-level consideration factors (Level 2). In the third Delphi round, a factor was retained as a consideration factor for setting the speed limit when its CVR value fell in the range 0.4~1.0, and factors with values below 0.4 were eliminated. The second round had identified 33 consideration factors, which were reorganized into 23 categories through the third round. Based on these 23 categories from the third Delphi analysis, the AHP survey was conducted. Among the 5 upper-level factors (Level 1), traffic conditions were evaluated as the most important, followed by vessel conditions, waterway conditions, environmental conditions, supporting conditions, and other conditions. Among the 23 lower-level factors (Level 2), visibility was evaluated as the most important, followed by vessel maneuvering performance, objective factors within the harbour, traffic volume and density, distance between passing vessels, maneuvering speed, and tidal current.
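
The entry above describes screening factors with Delphi rounds and weighting them with AHP, but the survey matrices themselves are not given. The sketch below, assuming an invented 3x3 pairwise comparison of three Level-1 factors, only illustrates the standard AHP weighting step (geometric-mean priorities plus a consistency-ratio check); it is not the paper's data or code.

```python
# Hedged sketch of the AHP weighting step described above.
# The pairwise comparison matrix is invented for illustration only;
# it is NOT the survey data from the paper.
import numpy as np

def ahp_weights(pairwise: np.ndarray) -> tuple[np.ndarray, float]:
    """Return priority weights (geometric-mean method) and the consistency ratio."""
    n = pairwise.shape[0]
    # Geometric mean of each row approximates the principal eigenvector.
    gm = np.prod(pairwise, axis=1) ** (1.0 / n)
    weights = gm / gm.sum()
    # Consistency check: lambda_max, CI, and CR with Saaty's random index.
    lambda_max = float(np.mean((pairwise @ weights) / weights))
    ci = (lambda_max - n) / (n - 1)
    random_index = {3: 0.58, 4: 0.90, 5: 1.12}  # Saaty's RI for small n
    cr = ci / random_index[n]
    return weights, cr

# Hypothetical pairwise judgments for three Level-1 factors:
# traffic condition vs. vessel condition vs. waterway condition.
matrix = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])
w, cr = ahp_weights(matrix)
print("weights:", np.round(w, 3), "CR:", round(cr, 3))  # CR < 0.1 is conventionally acceptable
```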

Semantic Inference System Using Backward Chaining (후방향 추론기법을 이용한 시멘틱 추론 시스템)

  • 함영경;박영택
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 2003.10a
    • /
    • pp.97-99
    • /
    • 2003
  • Most web information expressed in HTML or XML is based only on syntactic structure, so software is limited in how it can process it: HTML is a tag-based representation aimed at document display, and XML is a representation designed to make document structure easy for humans to understand. Web agents that provide services over information expressed in HTML and XML therefore had to perform a great deal of offline manual work to offer meaningful services to users. To overcome this problem, research on the Semantic Web is being actively pursued in the United States and Europe. Unlike the conventional web, the Semantic Web represents information in a machine-processable form, so agents can understand and handle much of the work that used to be done offline. However, building an ontology inevitably introduces the 3I problems of information (Incorrect, Incomplete, Inconsistent), and the quality of the resulting service also depends on the ontology. This paper proposes an inference engine based on backward chaining with the following aims: first, an automated software-agent system that exploits the Semantic Web; second, a semantic inference engine that uses rule-based backward chaining to overcome the limitations of ontology information. The proposed semantic inference system takes a user query and performs backward inference over the ontology and Semantic Web documents, thereby mitigating the incompleteness of web information, reducing dependence on the ontology, and improving the quality of web services.

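The abstract of the entry above proposes a rule-based backward-chaining inference engine over ontologies and Semantic Web documents. None of that knowledge base is reproduced here; the following is only a minimal backward-chaining sketch over invented triples, showing the goal-driven recursion the abstract refers to: a query is proven by matching rule heads and recursively proving rule bodies down to known facts.

```python
# Minimal backward-chaining sketch over invented (predicate, subject, object)
# triples; the paper's engine works over OWL/RDF ontologies, not reproduced here.
# Variables start with '?'.
import itertools

facts = {("parent", "kim", "lee"), ("parent", "lee", "park")}

rules = [  # (head, body): the head holds if every goal in the body holds
    (("grandparent", "?x", "?z"), [("parent", "?x", "?y"), ("parent", "?y", "?z")]),
]

_fresh = itertools.count()

def rename(head, body):
    """Standardize apart: give the rule's variables a unique suffix per use."""
    n = next(_fresh)
    sub = lambda t: tuple(f"{x}#{n}" if x.startswith("?") else x for x in t)
    return sub(head), [sub(g) for g in body]

def walk(term, binding):
    """Follow variable bindings until a constant or an unbound variable remains."""
    while term.startswith("?") and term in binding:
        term = binding[term]
    return term

def unify(a, b, binding):
    """Unify two triples under binding; return the extended binding or None."""
    binding = dict(binding)
    for x, y in zip(a, b):
        x, y = walk(x, binding), walk(y, binding)
        if x == y:
            continue
        if x.startswith("?"):
            binding[x] = y
        elif y.startswith("?"):
            binding[y] = x
        else:
            return None
    return binding

def prove(goal, binding=None):
    """Backward chaining: yield every binding under which the goal holds."""
    binding = binding or {}
    for fact in facts:                       # base case: the goal matches a fact
        b = unify(goal, fact, binding)
        if b is not None:
            yield b
    for head, body in rules:                 # recursive case: prove a rule's body
        head, body = rename(head, body)
        b = unify(goal, head, binding)
        if b is None:
            continue
        def prove_all(goals, env):
            if not goals:
                yield env
            else:
                for env2 in prove(goals[0], env):
                    yield from prove_all(goals[1:], env2)
        yield from prove_all(body, b)

# "Query": who are the grandchildren of kim?
for b in prove(("grandparent", "kim", "?who")):
    print(walk("?who", b))                   # -> park
```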

Discovery of Active Nodes and Reliable Transmission of Active Packets in IP Networks (IP 망에서의 액티브 노드 발견 및 액티브 패킷의 신뢰성 전송 기법)

  • Kim, Bang-Eun;Chae, Ki-Joon;Kim, Dong-Young;Na, Jung-Chan
    • The KIPS Transactions:PartC
    • /
    • v.11C no.3
    • /
    • pp.361-370
    • /
    • 2004
  • Active nodes that have no direct physical connection with each other in an IP network must be able to build and manage network topology information. In addition, an active program can only execute correctly when every active packet belonging to it is delivered without loss, and active packets should be transmitted efficiently, to minimize transmission delay, and securely against threats. In this paper, an active node discovery scheme is adopted so that active nodes in IP networks can build and manage topology information, and a scheme for the efficient, reliable, and secure transmission of active packets is proposed. A sequence number is assigned to every active packet; when a receiver detects the loss of a packet by checking the sequence numbers, it requests retransmission of the lost packet from the previous active node. After receiving an active packet, intermediate active nodes not only forward the packet immediately but also apply security and reliability mechanisms to it. An active packet transmission engine is proposed to provide these transmission schemes. Simulations of the adopted active node discovery scheme and the proposed transmission engine show that the discovery scheme is efficient and that the proposed engine achieves low latency and high performance.
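
The transmission scheme above assigns a sequence number to every active packet and has the receiver request retransmission from the previous active node when a gap is detected. The paper's transmission engine is not shown; the sketch below, with invented packet and callback names, only illustrates that gap-detection and in-order delivery logic.

```python
# Hedged sketch of sequence-number based loss detection, as described above.
# Packet format and the retransmission callback are invented for illustration;
# the paper's active-packet transmission engine is more elaborate.
from dataclasses import dataclass, field
from typing import Callable, Dict

@dataclass
class ActivePacket:
    seq: int
    payload: bytes

@dataclass
class Receiver:
    request_retransmit: Callable[[int], None]   # asks the previous active node
    expected: int = 0
    buffer: Dict[int, ActivePacket] = field(default_factory=dict)

    def on_packet(self, pkt: ActivePacket) -> list[ActivePacket]:
        """Deliver packets in order; request any missing sequence numbers."""
        if pkt.seq > self.expected:
            # Gap detected: ask the previous node to resend the missing packets.
            for missing in range(self.expected, pkt.seq):
                if missing not in self.buffer:
                    self.request_retransmit(missing)
        self.buffer[pkt.seq] = pkt
        delivered = []
        while self.expected in self.buffer:          # release the in-order prefix
            delivered.append(self.buffer.pop(self.expected))
            self.expected += 1
        return delivered

# Usage: packet 1 is lost, so receiving packet 2 triggers a retransmission request.
rx = Receiver(request_retransmit=lambda seq: print(f"retransmit request for seq {seq}"))
rx.on_packet(ActivePacket(0, b"a"))
rx.on_packet(ActivePacket(2, b"c"))          # prints: retransmit request for seq 1
print([p.seq for p in rx.on_packet(ActivePacket(1, b"b"))])   # -> [1, 2]
```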

Development of Alignment Information Extraction System on Highway by Terrestrial Laser Scanning Technique (지상 레이저 스캐닝 기법에 의한 도로선형정보 추출 시스템 개발)

  • Kim, Jin-Soo
    • Journal of the Korean Association of Geographic Information Studies
    • /
    • v.10 no.4
    • /
    • pp.97-110
    • /
    • 2007
  • Laser scanning has been attracting much attention as a new technology for acquiring location information, and it is applicable to a wide range of areas, most notably geomatics, owing to its high positional accuracy and automated, high-density data acquisition. Taking advantage of these strengths, this study develops a highway alignment information extraction system that can accurately interpret highway alignment information and be applied to practical work. For this system, an algorithm was developed that automatically separates a horizontal alignment into straight lines, transition curves, and circular curves, which is more efficient than conventional methods. In addition, an algorithm that automatically extracts the design elements of the horizontal and vertical alignments was developed and applied to a target highway, yielding more accurate values and higher practicality than previous studies on extracting highway alignment design elements. The extracted design elements were then used to run a virtual driving simulation of the target highway, providing data for visually judging whether the topography and structures are harmonized in three dimensions. The study also presents data that can serve as a basis for identifying road-surface freezing sections and for analyzing three-dimensional sight distance models. By building a systematic database of diverse highway data and developing web-based operating programs, efficient highway maintenance can be ensured, and important information can be provided for future highway safety assessments.

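The entry above mentions an algorithm that automatically separates a horizontal alignment into straight lines, transition curves, and circular curves, without giving its details. The sketch below only illustrates the common curvature-based criterion such a separation can rest on (near-zero curvature for tangents, roughly constant curvature for circular arcs, varying curvature for clothoid transitions); the three-point curvature estimate, thresholds, and toy centre line are assumptions, not the paper's method.

```python
# Hedged sketch of curvature-based alignment segmentation (not the paper's algorithm).
# Curvature per point is estimated from the circumscribed circle of three
# consecutive centre-line points; thresholds below are illustrative only.
import math

def curvature(p0, p1, p2):
    """Curvature (1/R) of the circle through three 2D points."""
    a = math.dist(p0, p1); b = math.dist(p1, p2); c = math.dist(p0, p2)
    area2 = abs((p1[0]-p0[0])*(p2[1]-p0[1]) - (p2[0]-p0[0])*(p1[1]-p0[1]))
    return 0.0 if area2 == 0 else 2.0 * area2 / (a * b * c)

def classify(points, k_straight=1e-4, dk_flat=1e-5):
    """Label each interior point as 'straight', 'circular', or 'transition'."""
    ks = [curvature(points[i-1], points[i], points[i+1]) for i in range(1, len(points)-1)]
    labels = []
    for i, k in enumerate(ks):
        dk = abs(ks[i] - ks[i-1]) if i > 0 else 0.0
        if k < k_straight:
            labels.append("straight")        # negligible curvature -> tangent section
        elif dk < dk_flat:
            labels.append("circular")        # curvature roughly constant -> circular arc
        else:
            labels.append("transition")      # curvature varying -> clothoid
    return labels

# Toy centre line: a tangent followed by an arc of radius ~500 m (synthetic data).
line = [(x, 0.0) for x in range(0, 200, 20)]
arc = [(200 + 500*math.sin(t/500), 500*(1 - math.cos(t/500))) for t in range(0, 200, 20)]
print(classify(line + arc))
```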

New Vehicle Classification Algorithm with Wandering Sensor (원더링 센서를 이용한 차종분류기법 개발)

  • Gwon, Sun-Min;Seo, Yeong-Chan
    • Journal of Korean Society of Transportation
    • /
    • v.27 no.6
    • /
    • pp.79-88
    • /
    • 2009
  • The objective of this study is to develop a new vehicle classification algorithm that minimizes classification errors. The existing vehicle classification algorithm collects data from loop and piezo sensors according to the specification ("Vehicle classification guide for traffic volume survey", 2006) issued by the Ministry of Land, Transport and Maritime Affairs. The new vehicle classification system collects the vehicle length, distance between axles, axle type, wheel-base, and tire type to minimize classification error. Its main difference is the "wandering" sensor, which is capable of measuring the wheel-base and tire type (single or dual) by detecting both the left and right tire imprints. Verification tests of the new classification algorithm were carried out on a month of traffic totalling 762,420 vehicles; among them, only 47 vehicles (0.006%) could not be classified into the 12 vehicle types, which demonstrates a very high level of classification accuracy for the new system. The new vehicle classification algorithm will improve accuracy and can be broadly applied to road planning, design, and management, and it can also raise the quality of traffic research on road and transportation infrastructure.
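
The abstract lists the measurements the new system fuses (vehicle length, axle spacing, axle type, wheel-base, and single vs. dual tires from the wandering sensor), but not the decision rules of the 12-type scheme. The skeleton below is therefore only an invented illustration of how such measurements could drive a rule-based classifier; the thresholds and class labels do not reproduce the Ministry's specification.

```python
# Illustrative skeleton only: thresholds and class labels are invented and do NOT
# reproduce the Ministry of Land, Transport and Maritime Affairs 12-type scheme.
from dataclasses import dataclass

@dataclass
class AxleRecord:
    vehicle_length_m: float
    axle_count: int
    wheelbase_m: float        # distance between first and second axle
    dual_rear_tires: bool     # detected from the wandering sensor's tire imprint

def classify_vehicle(r: AxleRecord) -> str:
    """Toy rule-based classification from loop/piezo/wandering measurements."""
    if r.axle_count == 2 and not r.dual_rear_tires:
        return "passenger car / small van" if r.vehicle_length_m < 5.5 else "light truck"
    if r.axle_count == 2 and r.dual_rear_tires:
        return "bus" if r.wheelbase_m > 4.5 else "medium truck"
    if r.axle_count == 3:
        return "heavy truck (3-axle)"
    return "combination / special vehicle"

print(classify_vehicle(AxleRecord(4.4, 2, 2.7, False)))   # -> passenger car / small van
print(classify_vehicle(AxleRecord(11.0, 2, 5.9, True)))   # -> bus
```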

Reengineering Template-Based Web Applications to Single Page AJAX Applications (단일 페이지 AJAX 애플리케이션을 위한 템플릿 기반 웹 애플리케이션 재공학 기법)

  • Oh, Jaewon;Choi, Hyeon Cheol;Lim, Seung Ho;Ahn, Woo Hyun
    • KIPS Transactions on Software and Data Engineering
    • /
    • v.1 no.1
    • /
    • pp.1-6
    • /
    • 2012
  • Web pages in a template-based web application (TWA) are automatically populated from a template shared by the pages, combined with contents specific to each page. Users can therefore easily obtain information guided by the consistent structure of the template, and the reduced code duplication also improves maintainability. However, TWA still has the interaction problem of classic web applications: each time a user clicks a hyperlink, a whole new page is loaded, even though a partial update of the page would suffice. This paper proposes a reengineering technique that transforms the multi-page structure of legacy Java-based TWA into a single-page structure with partial page refresh. In this approach, hyperlinks in the HTML code are refactored into AJAX-enabled event handlers to achieve the single-page structure, and the JSP and Servlet code is transformed so that data unnecessary for the partial update is not sent. The new single page consists of individual components that can be updated independently when interacting with the user. The approach therefore improves interactivity and responsiveness while reducing CPU and network usage. Measurements of the technique applied to a typical TWA show that it improves the response time of user requests by 1% to 87% over the original TWA.
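
The reengineering step described above refactors hyperlinks into AJAX-enabled event handlers so that only part of the page is refreshed. The paper transforms JSP/Servlet code with tool support, which is not reproduced here; the Python sketch below is only a conceptual illustration of the hyperlink rewrite, and the attribute name, CSS class, and client-side handler are invented.

```python
# Conceptual sketch only: rewrite full-page hyperlinks into elements that a
# client-side AJAX handler can intercept for a partial update. The attribute
# name "data-partial-url" and the handler below are invented for illustration.
import re

def refactor_hyperlinks(html: str) -> str:
    """Turn <a href="page.jsp">Label</a> into an AJAX-enabled anchor."""
    pattern = re.compile(r'<a\s+href="([^"]+\.jsp[^"]*)">(.*?)</a>', re.DOTALL)
    return pattern.sub(
        r'<a href="#" data-partial-url="\1" class="ajax-nav">\2</a>', html)

AJAX_HANDLER = """
<script>
// Invented handler: fetch only the content fragment and patch it into #content.
document.addEventListener('click', async (e) => {
  const a = e.target.closest('a.ajax-nav');
  if (!a) return;
  e.preventDefault();
  const res = await fetch(a.dataset.partialUrl, {headers: {'X-Partial': '1'}});
  document.getElementById('content').innerHTML = await res.text();
});
</script>
"""

legacy = '<div id="content"><a href="news.jsp?cat=3">News</a></div>'
print(refactor_hyperlinks(legacy) + AJAX_HANDLER)
```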

A Study of DEM Generation in the Ganghwado Southern Intertidal Flat Using Waterline Method and InSAR (수륙경계선 방법과 위상간섭기법을 이용한 강화도 남단 갯벌의 DEM 생성 연구)

  • Lee, Yoon-Kyung;Ryu, Joo-Hyung;Hong, Sang-Hoon;Won, Joong-Sun;Yoo, Hong-Rhyong
    • Journal of Wetlands Research
    • /
    • v.8 no.3
    • /
    • pp.29-38
    • /
    • 2006
  • A digital elevation model (DEM) of an intertidal flat is widely useful not only for scientific research, coastal management, fisheries, ocean safety, and military purposes, but also for understanding natural and artificial topographic changes of the tidal flat. In this study, we generated a DEM of the Ganghwado southern intertidal flat, the largest tidal flat on the west coast of the Korean Peninsula, using the waterline method and interferometric synthetic aperture radar (InSAR). The DEM constructed by applying the waterline method to Landsat-5 TM and Landsat-7 ETM+ images closely represents the overall topographic relief of the tidal flat, and we found that its accuracy was determined by the number of waterlines reflecting various tidal conditions. The application of InSAR to ERS-1/2 and ENVISAT images showed that only the ERS-1/2 tandem pairs successfully generated a DEM, in part of northern Yeongjongdo, while DEM construction in the other areas was difficult because of the low coherence caused by extensive surface remnant water. In the near future, Kompsat-2 will provide multi-spectral, high-spatial-resolution images acquired at different sea levels within a relatively short period; applying the waterline method to these images will help construct a high-precision tidal flat DEM. A DEM generation method using single-pass microwave satellite images should also be developed.

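The waterline method used above is only named in the abstract: each scene is split into water and land, the boundary is tagged with the tide height at acquisition time, and the stacked height-tagged waterlines are interpolated into a DEM. The sketch below runs that pipeline on synthetic data; the simple thresholding and scipy griddata interpolation are assumptions, not the processing chain applied to the Landsat imagery.

```python
# Hedged sketch of the waterline method on synthetic data (not the paper's chain):
# 1) threshold each scene into water/land, 2) take the boundary pixels as the
# waterline and tag them with the tide height at acquisition, 3) interpolate.
import numpy as np
from scipy.interpolate import griddata

def waterline_points(scene: np.ndarray, tide_height: float):
    """Return (row, col, height) samples on the land/water boundary of one scene."""
    water = scene < 0.0                      # toy water index: negative = water
    # Boundary = water pixels with at least one land neighbour (4-connectivity).
    land_neighbour = np.zeros_like(water)
    land_neighbour[1:, :] |= ~water[:-1, :]
    land_neighbour[:-1, :] |= ~water[1:, :]
    land_neighbour[:, 1:] |= ~water[:, :-1]
    land_neighbour[:, :-1] |= ~water[:, 1:]
    rows, cols = np.nonzero(water & land_neighbour)
    return [(r, c, tide_height) for r, c in zip(rows, cols)]

# Synthetic tidal flat sloping from -2 m to +2 m, observed at three tide levels.
ny, nx = 60, 60
truth = np.linspace(-2.0, 2.0, nx)[None, :].repeat(ny, axis=0)
samples = []
for tide in (-1.0, 0.0, 1.0):
    scene = truth - tide                     # "water index": below tide => negative
    samples += waterline_points(scene, tide)

pts = np.array([(r, c) for r, c, _ in samples], dtype=float)
hts = np.array([h for _, _, h in samples])
grid_r, grid_c = np.mgrid[0:ny, 0:nx]
dem = griddata(pts, hts, (grid_r, grid_c), method="linear")
print("DEM range:", np.nanmin(dem), "to", np.nanmax(dem))
```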

An Exploratory Study for Analyzing the Needs of the Customers Who Use Academic Information Service (학술정보 서비스 이용고객의 니즈 분석을 위한 탐색적 연구)

  • Yoon, Jong-Wook
    • Journal of the Korea Society of Computer and Information
    • /
    • v.17 no.2
    • /
    • pp.215-224
    • /
    • 2012
  • This study is an exploratory investigation of the needs of customers who use the academic information service of research institute K, which provides information services to domestic academic institutions in science and technology. K institute is planning customized services to improve customer satisfaction with its academic information service, and has therefore begun research on customer needs analysis and customer segmentation; the work is timely, as CRM implementation in public organizations has recently gained momentum. Data mining and data warehousing techniques were used for pilot analyses. For customer segmentation, a mixed segmentation model was applied that adds the product life cycle concept to the 'balanced customer segmentation' model, which simultaneously considers the value of customers from the organization's viewpoint and the value of the organization from the customer's viewpoint. The results indicate that, in the case of K, 'balanced customer segmentation' and a 'contents reach approach' based on a data warehouse and OLAP are more effective than the customer segmentation techniques commonly used in industry. This exploratory case study is expected to provide a useful guideline for deriving an organizationally unique CRM model, which has recently become one of the hot topics in the CRM field.
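
The 'balanced customer segmentation' model mentioned above weighs the value of a customer to the organization against the value of the organization to the customer, extended with a product life cycle dimension; the actual scoring variables used for K institute are not published in the abstract. The snippet below is only a toy interpretation of that idea, with invented field names, median cut-offs, and lifecycle stages.

```python
# Toy interpretation only: field names, scores and cut-offs are invented;
# the abstract does not publish K institute's actual segmentation variables.
from statistics import median

customers = [
    # (id, value_to_org, value_to_customer, months_since_signup)
    ("c01", 0.9, 0.2, 40), ("c02", 0.8, 0.7, 3),
    ("c03", 0.3, 0.9, 18), ("c04", 0.1, 0.2, 60),
]

def lifecycle(months: int) -> str:
    return "introduction" if months < 6 else "growth" if months < 24 else "maturity"

def balanced_segment(rows):
    m_org = median(r[1] for r in rows)      # median split on each value axis
    m_cust = median(r[2] for r in rows)
    out = {}
    for cid, v_org, v_cust, months in rows:
        quadrant = {
            (True, True): "win-win (retain)",
            (True, False): "org-favoured (improve service)",
            (False, True): "customer-favoured (grow value)",
            (False, False): "low value (automate or phase out)",
        }[(v_org >= m_org, v_cust >= m_cust)]
        out[cid] = (quadrant, lifecycle(months))
    return out

for cid, seg in balanced_segment(customers).items():
    print(cid, seg)
```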

Convergence Study in Development of Severity Adjustment Method for Death with Acute Myocardial Infarction Patients using Machine Learning (머신러닝을 이용한 급성심근경색증 환자의 퇴원 시 사망 중증도 보정 방법 개발에 대한 융복합 연구)

  • Baek, Seol-Kyung;Park, Hye-Jin;Kang, Sung-Hong;Choi, Joon-Young;Park, Jong-Ho
    • Journal of Digital Convergence
    • /
    • v.17 no.2
    • /
    • pp.217-230
    • /
    • 2019
  • This study was conducted to develop a customized severity-adjustment method for acute myocardial infarction (AMI) patients and to evaluate its validity, in order to complement the limitations of existing comorbidity-based severity-adjustment methods. Subjects with a principal diagnosis of acute myocardial infarction under KCD-7 codes I20.0~I20.9 were extracted from the Korean National Hospital Discharge In-depth Injury Survey data from 2006 to 2015. Three tools were used for comorbidity severity adjustment: the CCI (Charlson Comorbidity Index), the ECI (Elixhauser Comorbidity Index), and the newly proposed CCS (Clinical Classification Software). The results showed that CCS was the best tool for severity correction and that the support vector machine model was the most predictive. We therefore propose using the customized severity-correction method and machine learning techniques from this study in future research on severity adjustment, such as the assessment of medical service outcomes.
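
The survey data and fitted models from the study above are not available here; the sketch below, on synthetic arrays, only shows the general shape of the modelling step the abstract describes: encode comorbidity groups (e.g. CCS categories) as indicator features and fit a support vector machine to predict in-hospital death, comparing models by a discrimination metric.

```python
# Hedged sketch of the modelling step described above, on synthetic data.
# Real work would use CCS comorbidity groupings from discharge diagnoses;
# here the comorbidity indicator matrix is randomly generated.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n_patients, n_comorbidity_groups = 2000, 30

# X: one column per comorbidity group (e.g. a CCS category), 1 if present.
X = rng.integers(0, 2, size=(n_patients, n_comorbidity_groups))
# Synthetic outcome: death risk rises with a weighted comorbidity burden.
logit = X @ rng.normal(0.0, 0.6, n_comorbidity_groups) - 2.0
y = (rng.random(n_patients) < 1 / (1 + np.exp(-logit))).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0, stratify=y)
model = make_pipeline(StandardScaler(), SVC(kernel="rbf", probability=True))
model.fit(X_tr, y_tr)
auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
print(f"AUC on held-out patients: {auc:.3f}")   # c-statistic, as used for model comparison
```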

Design and Evaluation of an Efficient Flushing Scheme for key-value Store (키-값 저장소를 위한 효율적인 로그 처리 기법 설계 및 평가)

  • Han, Hyuck
    • The Journal of the Korea Contents Association
    • /
    • v.19 no.5
    • /
    • pp.187-193
    • /
    • 2019
  • Key-value storage engines are an essential component in growing demand across many computing environments, including social networks, online e-commerce, and cloud services. Recent key-value storage engines offer many features such as transactions, versioning, and replication. In a key-value storage engine, transaction processing provides atomicity through write-ahead logging (WAL), and the synchronous commit method flushes log data before a transaction completes. According to our observation, flushing log data to persistent storage is a performance bottleneck for key-value storage engines because of the significant overhead of fsync() calls, despite the various optimizations in existing systems. In this article, we propose a group synchronization method to improve the performance of the key-value storage engine, and we design and implement a transaction scheduling method that lets other transactions run while the system processes fsync() calls. The proposed method efficiently reduces the number of fsync() calls in synchronous commit while providing the same transaction guarantees as the existing system. We implement our scheme on the WiredTiger storage engine, and our experimental results show that the proposed system improves the performance of key-value workloads over existing systems.
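
WiredTiger internals are not shown in the abstract above; the sketch below only illustrates the general group-synchronization idea it describes, under invented class and method names: committers append their WAL records and block, while a single flusher writes the batch and issues one fsync() for the whole group, so many synchronous commits share one durable flush.

```python
# Hedged sketch of group commit / grouped fsync (not WiredTiger's implementation).
# Many committers append WAL records and block; one background flusher batches
# them and issues a single os.fsync() per group.
import os, tempfile, threading

class GroupCommitLog:
    def __init__(self, path: str):
        self._fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_APPEND)
        self._lock = threading.Condition()
        self._pending = []              # records waiting to be flushed
        self._flushed_upto = 0          # count of records made durable
        self._appended = 0
        threading.Thread(target=self._flusher, daemon=True).start()

    def commit(self, record: bytes) -> None:
        """Synchronous commit: returns only after the record is durable."""
        with self._lock:
            self._pending.append(record)
            self._appended += 1
            my_seq = self._appended
            self._lock.notify_all()               # wake the flusher
            while self._flushed_upto < my_seq:    # wait until my group is flushed
                self._lock.wait()

    def _flusher(self) -> None:
        while True:
            with self._lock:
                while not self._pending:
                    self._lock.wait()
                batch, self._pending = self._pending, []
                target = self._appended
            os.write(self._fd, b"".join(batch))   # write the whole group
            os.fsync(self._fd)                    # ONE fsync for the batch
            with self._lock:
                self._flushed_upto = target
                self._lock.notify_all()           # release all waiting committers

# Usage: 50 concurrent transactions commit while sharing a handful of fsync calls.
fd, path = tempfile.mkstemp()
os.close(fd)
log = GroupCommitLog(path)
threads = [threading.Thread(target=log.commit, args=(f"txn-{i}\n".encode(),)) for i in range(50)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print("all transactions durable")
```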